Before reading Rand Fishkin's article on SEOmoz, I thought that cloaking in any form was a strictly black-hat promotion method for which the all-seeing, all-knowing Google would punish you. But I was wrong (which, in general, is not at all surprising). I simply had never thought to connect the defining characteristic of cloaking, serving different content to search bots and to ordinary users, with the perfectly white-hat sites I use every day.
* This post is devoted entirely to the various types of cloaking, which of them Google tolerates and which it does not. There are no ready-made "schemes" here, just interesting information to broaden your SEO horizons.
Here is a rather impressive list of well-known websites that successfully use cloaking for their own purposes:
This list goes on and on, but the point is already clear: cloaking is not always evil, especially if you are:
Here’s a classification of cloaking types from Rand Fishkin:
1. Crystal white:
SEOmoz itself has a paid subscription whose content only members can see. Technically, search bots and some visitors see different things, but the distinction is based on cookies and implemented in strict accordance with search engine requirements.
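A minimal sketch of how such cookie-gated content might be served, assuming Flask; the `member_session` cookie name and the `is_valid_member` check are hypothetical and not SEOmoz's actual implementation. The key point is that bots and anonymous visitors see the same teaser, so nothing is cloaked:

```python
# Hypothetical sketch: full articles for logged-in members (identified by a
# cookie), teasers for everyone else. Search bots get the same teaser as any
# anonymous visitor.
from flask import Flask, request

app = Flask(__name__)

ARTICLES = {"cloaking-guide": {"teaser": "First 200 words...", "full": "Full text..."}}

def is_valid_member(session_token):
    # Placeholder for a real session lookup (database, cache, etc.).
    return session_token == "demo-valid-token"

@app.route("/articles/<slug>")
def article(slug):
    token = request.cookies.get("member_session")  # assumed cookie name
    if token and is_valid_member(token):
        return ARTICLES[slug]["full"]
    return ARTICLES[slug]["teaser"]

if __name__ == "__main__":
    app.run()
```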
2. Almost white:
Craigslist.org dabbles in geo-targeting. Google representatives have openly stated that as long as Craigslist shows bots the same pages as visitors, this behavior does not violate the rules. The catch is that bots see pages that visitors, logically, should never see: a bot needs to be fed all of the site's content, not just the listings for New York or San Francisco. Still, this violation of the rules has no serious consequences, because, by and large, both users and search engines get what they need.
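A rough sketch of the kind of geo-targeting described, not Craigslist's actual code: human visitors are sent to their local city index, while recognized crawlers get the full index so all listings can be crawled. The `lookup_city` helper, the domain, and the naive User-Agent check are assumptions for illustration; real crawler detection and geolocation are more involved:

```python
# Hypothetical sketch: geo-targeted redirects for humans, full index for crawlers.
from flask import Flask, request, redirect

app = Flask(__name__)

KNOWN_BOTS = ("Googlebot", "Bingbot", "Slurp")  # naive User-Agent matching, illustration only

def lookup_city(ip_address):
    # Placeholder for a real IP-geolocation lookup (e.g. a GeoIP database).
    return "newyork"

@app.route("/")
def index():
    user_agent = request.headers.get("User-Agent", "")
    if any(bot in user_agent for bot in KNOWN_BOTS):
        # Crawlers see one full index with listings from every city.
        return "Full index with listings from every city"
    # Human visitors are redirected to their local city site.
    city = lookup_city(request.remote_addr)
    return redirect(f"https://{city}.example.org/", code=302)

if __name__ == "__main__":
    app.run()
```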
3. Slightly gray:
Sites that publish long articles usually offer links like "read the full article" or "print this article" and so on. When links from different sources (blogs, social sites) start pointing at these alternate versions of the same article, search engines index them and penalize the site for duplicate content. To avoid this, 301 redirects are placed on the secondary versions, so all of the link weight flows to the article itself. Again, despite technically violating the rules, this method benefits both visitors (they end up on the original article) and search engines (Yahoo and MSN still have trouble detecting duplicate content), so authoritative sites have nothing to fear from it.
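One hedged way to implement what is described, again as a Flask sketch with an assumed URL scheme and a naive bot list: human readers still get the print-friendly page, while recognized crawlers receive a 301 to the canonical article, which is exactly the bot/human distinction that makes the technique "slightly gray":

```python
# Hypothetical sketch: print-friendly URL serves a stripped-down page to humans,
# but 301-redirects recognized crawlers to the canonical article so link weight
# is consolidated there.
from flask import Flask, request, redirect

app = Flask(__name__)

KNOWN_BOTS = ("Googlebot", "Bingbot", "Slurp")  # naive detection, illustration only

@app.route("/article/<slug>/print")
def print_version(slug):
    user_agent = request.headers.get("User-Agent", "")
    if any(bot in user_agent for bot in KNOWN_BOTS):
        # A permanent redirect tells the search engine the canonical URL owns the content.
        return redirect(f"/article/{slug}", code=301)
    return f"Print-friendly version of '{slug}'"

@app.route("/article/<slug>")
def canonical(slug):
    return f"Canonical article page for '{slug}'"

if __name__ == "__main__":
    app.run()
```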
4. Dark gray:
Some affiliates, instead of passing the link weight from affiliate links to the pages they are supposed to lead to, redirect it to the pages they want to feed with links and push up in the rankings. Such moves are very close to black-hat methods, so search engines punish them.
5. Black:
Here Rand gives an example: he searched for inurl:sitemap.xml and in 9th place in the results found the link www.acta-endo.ro/new/viagra/sitemap.xml.html, which led him... well, you can guess where such links lead. For this, search engines ban sites as soon as they are discovered.
Conclusion: Everything that is white or slightly gray is accepted by search engines without complaint, despite the public statements of some talking heads, since it does not hurt anyone's interests. Everything else gets banned.