In this video, Andy Jenkins explains how Google detects duplicate content and what needs to change on your site so that search engines regularly index its pages instead of penalizing them for non-original content.
Fighting duplicate content, along with the doubling of search traffic that can come with it, is sure to bring tangible results.
Right now you may not even suspect that Google or another search engine has imposed duplicate-content sanctions on individual pages of your site. You will find out only when your positions in the search results begin to slide down, slowly but surely. Worse, if you take no action, Google may remove the site from the main index altogether, so that it can be found only by its domain name.
Andy considers duplicate content one of the main problems facing both newbies and experienced webmasters. According to him, about 75% of his clients have suffered from it in one way or another.
The most vulnerable are online stores that sell hundreds or thousands of physical or informational goods from various brands.
The fact is that you can create duplicate pages without even knowing it. But that is not what matters. What matters is that, after watching this video, you rid your site of duplicate pages once and for all, and thereby only increase your earnings on the Internet.
At the beginning of the video, Andy makes a few key points:
1. The most dangerous type of duplicate content is duplication within a single site.
Duplicates from other sites can also harm your resource, but repetitions within one site definitely will.
2. By content we mean the text that search bots see. When a page is evaluated for duplicate content, images, and even the text inside the "alt" attribute, are not taken into account.
3. The key to success, as with any other aspect of SEO, is testing. Andy gives you a baseline to build on, but only by testing and tracking the results can you squeeze the maximum out of your site.
What is duplicate content?
These are two or more pages on the same site that look so similar to search engines that they do not consider it necessary to index them all.
The search bot, according to Andy, reacts like this: "Why the hell would I clutter up my hard drive with identical pages, and from the same domain at that?"
When a search engine refuses to index such pages, it is fraught with unpleasant consequences for you:
Taken together, this means the site cannot rise in the results for its key queries. It is even worse if the search engine starts to consider half (or more) of the site's pages duplicates. In many cases this leads to the slow but sure death of the site, after which it can be found only by typing the domain name into the search bar.
To solve a problem, you must first look at it through the eyes of the one who directly creates it, that is, the robot. The robot sees a page a little differently than we do:
(Don't worry, you will hardly have to fiddle with the code at all.)
Each page of a site, Andy's site in particular, consists of these basic elements:
Together these elements make up the site template. Each of them, except the content, is repeated on all pages, and on almost all pages they look the same. The only thing that distinguishes the pages is their content.
(Andy Jenkins himself was the model for the photo.)
Once on the site, we quickly get used to the template, and after that we notice only what looks different from page to page: we see the content and stop noticing the template elements (red).
Among other things, this feature lets the visitor concentrate on your message (in this case, the description of the chain mail).
Search bots, unlike humans, cannot memorize a site's repeated elements, so when evaluating a new page for duplicate content, the robot takes into account the text content of all page elements.
If two pages look like this:
then a person will be able to notice the difference. But the robot will not. It will simply write off the couple of lines of text that distinguish the pages as noise and give the new page an unambiguous diagnosis: "Duplicate Content."
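Search engines do not publish their duplicate-detection algorithms, but the effect Andy describes can be sketched with a simple word-overlap (Jaccard) similarity check. This is a hypothetical simplification for illustration, not Google's actual method; the page texts and the 60-word template are made up:

```python
def jaccard(a, b):
    """Fraction of distinct words the two texts share (0.0 to 1.0)."""
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / len(wa | wb)

# Hypothetical pages: a shared 60-word template plus a few words of unique content.
template = " ".join(f"templateword{i}" for i in range(60))
page_a = template + " sturdy steel chain mail shirt"
page_b = template + " soft leather arm guard bracer"

# The unique content is drowned out by the template, so the
# pages look almost identical to this naive "bot".
print(round(jaccard(page_a, page_b), 2))  # → 0.86
```

With only five unique words against sixty template words, the two pages come out about 86% identical, which is exactly why a small block of content inside a large template gets flagged.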
So we have the following:
However, it is the robot that decides what place the page will occupy in the search results, and whether it will be indexed at all, so you have to give it what it wants.
Now, point by point:
1. Count the number of words in the page template (all elements except content).
To do this, select all the text on the page (Edit → Select All, or Ctrl+A) and copy it (Edit → Copy, or Ctrl+C) into Word. Now open Tools → Word Count and see how many words your template contains.
For example, Andy's template contains just over 220 words. How did he manage that? More on that a little later.
In the meantime, write down or memorize the number you just learned.
Your task is to create text content whose word count exceeds the number of words in the template elements.
Thanks to this simple step, the page will become unlike any other, and the chance that the search robot will mark it as a duplicate will drop to a minimum.
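If you would rather not paste every page into Word, the same comparison can be scripted. A minimal sketch, assuming you have already separated the template text from the content text (the sample strings below are invented for illustration):

```python
def word_count(text: str) -> int:
    """Roughly what Word's word count does: split on whitespace."""
    return len(text.split())

# Hypothetical page parts: template (navigation, footer, etc.) vs. unique content.
template_text = "Home Products About Contact Copyright 2008 All rights reserved"
content_text = ("This hand-riveted chain mail shirt is made from sixteen-gauge "
                "galvanized steel rings and weighs about twenty pounds")

tpl = word_count(template_text)
body = word_count(content_text)
# The goal from step 1: content should outweigh the template.
print(tpl, body, body > tpl)  # → 9 17 True
```

The same check, run over every page of a site, quickly shows which pages are mostly template and therefore at risk.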
2. Change the page title (the contents of the <title> tag). You should not have two pages with the same title.
This is the only time you have to work a little with the code.
Important: do you think the following titles are the same or not?
To a person, yes. But to a machine, not necessarily.
In SEO there is such a thing as stop words. These are words or short phrases that search engines do not take into account when determining relevance and ranking pages.
What does this have to do with you? If you have made the content of your pages unique, but the search engine stubbornly refuses to index a page, the problem may lie in its title, namely in the stop words it contains. If page titles differ only in stop words, you can assume you have found the source of your troubles. Here is a list of them.
The same list can be found in the Stompernet blog. Better yet, copy it to your hard drive and print it.
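Whether two titles "differ only in stop words" is easy to check mechanically. A sketch with a small illustrative sample of stop words (the real list is the one from the Stompernet blog; the example titles are invented):

```python
# A small illustrative sample; substitute the full list from the Stompernet blog.
STOP_WORDS = {"a", "an", "the", "and", "or", "of", "for", "to", "in", "on", "with", "how"}

def significant_words(title: str) -> tuple:
    """The title reduced to the words a search engine would actually weigh."""
    return tuple(w for w in title.lower().split() if w not in STOP_WORDS)

t1 = "How to Buy Chain Mail"
t2 = "Buy the Chain Mail"
# To a person these titles differ; with stop words stripped they collapse
# to the same thing, which is exactly the trap described above.
print(significant_words(t1) == significant_words(t2))  # → True
```

If this prints True for two of your pages, their titles are effectively identical to the machine and at least one of them needs rewording.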
3. Where possible, replace text with an image.
That is why Andy has so few words in his template. Much of the text is rendered as images that a person can "read" but a machine cannot. Look at the picture.
That's right, it's text. But try to select it: you can't. So what is it? That's right, a picture. This is how Andy "deceives" the search robots and keeps the word count of his template down. The robot can neither see, nor understand, nor count these words.
Another example: a site dedicated to housing mortgages. US law requires that every page of a mortgage company's site display a huge disclaimer like the one in the picture.
That is 800 words of absolutely identical text on every page. Naturally, it is better to replace it with an image. But...
The disclaimer must still be present on the site as text, so that search bots can find it and so that visitors can copy it to their computers if they want to. In that case, create a separate page and put the text there. And so that visitors can find the text (the bot will find it on its own), attach an "alt" attribute with appropriate content to each graphic image of the disclaimer.
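Once the disclaimer (or any other text) is swapped out for images, it is worth verifying that none of those images is missing its "alt" attribute. A small sketch using Python's standard HTML parser; the file names and alt text below are made up:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects the src of every <img> tag that lacks a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            alt = attr_map.get("alt") or ""
            if not alt.strip():
                self.missing.append(attr_map.get("src", "(no src)"))

# Hypothetical page: one disclaimer image with alt text, one bare image.
html = ('<img src="disclaimer.gif" alt="Full legal disclaimer, see the disclaimer page">'
        '<img src="logo.gif">')
checker = MissingAltChecker()
checker.feed(html)
print(checker.missing)  # → ['logo.gif']
```

Any src that turns up in the list is an image the visitor has no text pointer for, so fix its alt attribute.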
And the wolves are fed and the sheep are safe.
In the final part of the video, Andy debunks the myth that links from other sites carry more weight than links from a site's own internal pages.
Andy claims that, all other things being equal, their weight is the same. (If anyone can be trusted, it's Andy. ;)) That is why you need to make every effort to keep the search robot from considering your internal pages duplicates of the main one. It is likely that you already have a significant number of pages with duplicate content that could be returned to the index right now. And if you do this, those pages (including the main one) will almost immediately begin to rise in the search results. And along with the main page, the minor ones will start to rise too (see the translation of the previous video).
Stay Connected! This is not the last video from Andy Jenkins.
P.S. I hope you enjoyed this article. If so, I would be very grateful if you helped spread it: write a comment, bookmark it, vote for it on News2.0, mention the article in your blog, or copy it to your blog (keeping the links).
Thanks in advance!