Is cloaking evil? - Profit Hunter

Before reading Rand Fishkin's article on SEOmoz, I thought that cloaking in any form was a purely black-hat promotion method for which the all-seeing, all-knowing Google would punish you. But I was wrong (which, in general, is hardly surprising). I simply hadn't thought to apply the defining characteristic of cloaking - serving different content to search bots and to ordinary users - to the white-hat sites I use every day.

* This post is devoted entirely to the various types of cloaking and to which of them Google tolerates and which it does not. There are no money-making schemes here, just interesting information for broadening your SEO horizons.

Here is a sizable list of far-from-weak websites that successfully use cloaking for their own purposes:

  • Google (yes, Google itself). Search for Google's own services such as Google Translate, AdWords, AdSense, etc., and note which URLs the search engine shows. Now follow the link and watch what the page address turns into as it loads. Moreover, the content of the landing page sometimes differs from what is stored in the cache, regardless of whether you are logged into Google.
  • NYTimes.com. Interstitial ad pages, a prompt to log in or register after five clicks, archive material - a person sees all of this differently than the search bots do.
  • Forbes.com. You can't even get to the home page without first enjoying a half-screen commercial. Cached pages also often differ from what people see on the site.
  • Wine.com. Even leaving the redirects aside, every user must choose a state before seeing the prices (or, for that matter, the page itself). Search bots are hardly asked to do the same.
  • WebmasterWorld.com. This site was one of the first to introduce the now-common practice of a few free clicks, after which you are invited to join the site. This does not apply to Google's bots.
  • Yelp.com. Active use of cookie-based geo-targeting.
  • Amazon.com. Thanks to cookies and data from past visits, Amazon can vary, for example, which recommended products it shows.
  • iPerceptions.com. Strictly speaking, the site does not cloak, but its pop-up is shown only to browsers that support cookies (and not only on this site, but on a hundred or so others). Not to mention that the site belongs to a Google employee.
  • ComputerWorld.com. If you were a bot, you would never see the full-screen ads, the pop-ups, or even some of the JavaScript that runs only when a real person visits the site.
  • Scribd.com. The machine sees only HTML text on the site; a person sees what a person sees.
  • Nike.com. Google reports that the site consists of more than a million and a half pages, yet for some reason all of those pages start with the same Flash movie.
  • Wall Street Journal. On some articles, the Google bot does not see the pay-to-access link that appears after the first paragraph at all.

This list could go on and on, but the point is already clear: cloaking is not always evil, especially if you are:

  • a large, world-famous brand; or
  • masking pages in a way that benefits not only search bots but also the site's live visitors.

Here is Rand Fishkin's classification of cloaking types:

1. Crystal white:

  • Permitted techniques: cookie detection, JavaScript.
  • Purpose: optimizing landing pages, delivering content to registered users.

SEOmoz itself has a paid subscription whose content only members can see. Technically, search bots and some visitors see different things, but everything is based on cookies and done in strict accordance with search engine guidelines.
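To make the mechanism concrete, here is a minimal sketch of cookie-based gating in the spirit of the "crystal white" category. It is not SEOmoz's actual setup; the framework (Flask), the route, the cookie name, and the token check are all illustrative assumptions.

```python
# Minimal sketch of cookie-based content gating ("crystal white" cloaking).
# Bots and anonymous visitors get the same public teaser; only visitors
# carrying a valid membership cookie get the full article.
from flask import Flask, request

app = Flask(__name__)

# Hypothetical data; a real site would check a session or database instead.
FULL_ARTICLE = "Full members-only article text..."
TEASER = "First paragraph only. Join to read the rest."
VALID_MEMBER_TOKENS = {"abc123"}

@app.route("/guides/<slug>")
def guide(slug):
    token = request.cookies.get("member_token")
    if token in VALID_MEMBER_TOKENS:
        return FULL_ARTICLE  # logged-in member sees everything
    return TEASER            # everyone else, including search bots, sees the teaser

if __name__ == "__main__":
    app.run()
```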

2. Almost white:

  • Allowed tricks: all of the above + the User-Agent header.
  • Purpose: geo-targeting, detecting the browser type, minimizing the traffic served to bots.

Craigslist.org dabbles in geo-targeting. Google representatives have openly stated that as long as Craigslist shows bots the same pages as visitors, this behavior does not violate the rules. Except that the bots do not see the same pages as visitors - and logically they should not, because a bot needs to be fed the site's entire content, not just the listings for New York or San Francisco. Still, this violation of the rules has no serious consequences, because, by and large, both users and search engines get what they need.
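A hedged sketch of what such "almost white" handling might look like: known crawlers get the full, un-localized index, while human visitors are geo-targeted. The bot list, the GeoIP lookup, and the URL scheme are assumptions for illustration, not Craigslist's actual implementation.

```python
# Sketch of "almost white" cloaking: humans are geo-targeted, while known
# crawlers see the full, un-localized index so all content gets indexed.
from flask import Flask, request, redirect

app = Flask(__name__)

KNOWN_BOTS = ("googlebot", "bingbot", "yandexbot")  # assumed User-Agent markers

def looks_like_bot(user_agent: str) -> bool:
    ua = (user_agent or "").lower()
    return any(bot in ua for bot in KNOWN_BOTS)

def city_for(ip: str) -> str:
    # Placeholder: a real site would consult a GeoIP database here.
    return "newyork"

@app.route("/")
def index():
    if looks_like_bot(request.headers.get("User-Agent", "")):
        return "Full site index with links to every city"  # crawlers get everything
    # Humans are sent to their local section based on IP.
    return redirect(f"/{city_for(request.remote_addr)}/", code=302)

if __name__ == "__main__":
    app.run()
```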

3. Slightly gray:

  • Allowed tricks: all of the above + User-Agent / IP address information.
  • Purpose: redirecting link weight to equivalent pages, displaying hidden content.

Sites that publish long articles usually also have links like "read the full article" or "print this article" and the like. When several links from different sources (blogs, social sites) point to these alternate versions of the same article, search engines start indexing them and penalizing the site for duplicate content. To avoid this, a 301 redirect is placed on the secondary versions, and all of the link weight is passed to the article itself. Again, despite the violation of the rules, this method benefits both visitors (they end up on the original article) and search engines (Yahoo and MSN still have trouble detecting duplicate content). So an authoritative site risks essentially nothing by doing it.
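A minimal sketch of that 301 trick, assuming a made-up URL scheme: the "print" version of an article permanently redirects to the canonical page, so any links pointing at the print URL pass their weight to the original article.

```python
# Sketch of consolidating duplicate article versions with a 301 redirect.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/articles/<slug>/print")
def print_version(slug):
    # Bots and humans alike are sent to the canonical URL; link weight follows.
    return redirect(f"/articles/{slug}", code=301)

@app.route("/articles/<slug>")
def article(slug):
    return f"Canonical article page for '{slug}'"

if __name__ == "__main__":
    app.run()
```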

4. Dark gray:

  • Allowed tricks: all of them.
  • Purpose: showing less-optimized content, redirecting link weight to unrelated pages.

Some affiliates, instead of passing the link weight from affiliate links on to the pages they are supposed to lead to, redirect it to pages that need to be fed with links and pushed up in the rankings. Moves like this come very close to black-hat methods, so search engines punish them.
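Purely to make the mechanism concrete (and, as noted, search engines penalize it), here is a sketch of how such a scheme looks: the same outbound URL 301-redirects crawlers to a page the owner wants to boost, while real visitors continue on to the affiliate offer. All URLs and the bot check are hypothetical.

```python
# Illustration of the "dark gray" redirect scheme described above; shown
# only to clarify the mechanism, since search engines penalize it.
from flask import Flask, request, redirect

app = Flask(__name__)

BOT_MARKERS = ("googlebot", "bingbot")  # assumed User-Agent markers

@app.route("/go/<offer_id>")
def outbound(offer_id):
    ua = request.headers.get("User-Agent", "").lower()
    if any(marker in ua for marker in BOT_MARKERS):
        # Crawlers get a permanent redirect to a page being "fed" link weight.
        return redirect("/pages/needs-a-boost", code=301)
    # Real visitors go on to the actual affiliate destination.
    return redirect(f"https://affiliate.example.com/offer/{offer_id}", code=302)

if __name__ == "__main__":
    app.run()
```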

5. Black:

  • Allowed tricks: all of them.
  • Purpose: showing content that has nothing to do with the search query.

Here Rand gives an example: he searched for inurl:sitemap.xml and found, in 9th place in the results, the link www.acta-endo.ro/new/viagra/sitemap.xml.html, which led him... well, you can guess where links like that lead. Search engines ban sites for this as soon as they are caught.

Conclusion: everything that is white or slightly gray is accepted by search engines without a fuss, despite the public statements of some talking heads, since it does not harm the interests of either side. Everything else gets banned.
