Just like Yandex, Google hands out its own "gifts" to low-quality sites: filters imposed for various offenses, which we will discuss below. They differ from Yandex's filters, but the principles behind applying them are similar in both search engines. Let's look at what Google's filters exist for, how to avoid falling under them, and how to treat a site if a penalty has already been applied.
- Panda. Over-optimization, non-unique articles, off-topic advertising, duplicate pages, identical meta descriptions on different pages: this is what attracts "Panda". If you notice a drop in traffic and sliding positions in the results, this beast may well have visited your site. Getting out from under the filter is not easy, because you have to find and eliminate every factor that triggered it.
- Penguin. It punishes links, both incoming and outgoing. Links from spam sites, link exchanges, site-wide links, link mass spread unevenly across pages, too fast a rate of link growth: all of this is punished with a sharp drop in the search results. The cure is to clean the site of its bad link mass. For incoming links, Google offers a tool called Disavow links. It lets you reject poor incoming links so that the search engine no longer counts them. You can find it in Google's webmaster tools.
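The file that the Disavow links tool accepts is plain text: one domain or full URL per line, with "#" starting a comment. A minimal sample (the domains and URL below are invented):

```
# Links we asked site owners to remove, with no response
domain:spammy-example.com
http://bad-example.org/links/page.html
```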
- Supplemental Results. The filter applies to pages with duplicate content. Duplicates that fall under it are nicknamed "snot" among optimizers. Clean the site of such pages and redirect them to the main versions; over a few months Google will gradually pull the pages out of the supplemental index, and the site will start growing again.
- Bombing. If you buy links with identical anchor text, Google simply stops counting those links, and position growth stalls. The treatment is simple: remove the links with identical anchors and buy new ones with varied anchor texts.
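Whether a purchased link profile suffers from identical anchors can be checked in a few lines. A minimal sketch; the anchor list is invented for illustration:

```python
# Count anchor texts in a backlink list and flag the ones that repeat:
# repeated anchors are exactly what the "Bombing" filter reacts to.
from collections import Counter

anchors = ["buy widgets", "buy widgets", "widget shop", "buy widgets", "our site"]
repeated = {a: n for a, n in Counter(anchors).items() if n > 1}
print(repeated)  # {'buy widgets': 3}
```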
- "Too many links at once". This is the so-called link explosion, which Yandex punishes as well. In this case Google ignores the purchased links. There is only one way out: build up link mass as carefully and naturally as possible.
- Sandbox. It catches young resources up to about 3 months old. You can fight the sandbox in several ways: for example, buy links carefully so that they look natural and high quality and there is no link explosion. Internal linking between pages also helps.
- Domain Name Age. Sites on a domain younger than a year may be indexed poorly. The filter exists to prevent a resource from being pushed up the results artificially. To get out of it, try buying links from trusted, older sites, or start with a domain older than 12 months.
- Bowling. Here competitors' envy comes into play: they steal your content, fake user signals, point links at your site from banned resources, and so on. The filter is especially dangerous for a young site with little trust from the search engine. To pull a site out of Bowling, you will have to talk to Google support.
- Filter for broken links. When it turns out that the site has a huge number of links returning a 404 error, the site is demoted. The treatment is to remove all the useless links that lead nowhere.
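Hunting for such dead links can be partly automated. A rough standard-library sketch (the function names and sample HTML are mine, not the article's): it pulls hrefs out of a page and would let you test each one for a 404 response.

```python
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.error import HTTPError

class LinkExtractor(HTMLParser):
    """Collect href values from every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def is_broken(url, timeout=5):
    """True if the URL answers with HTTP 404 (a link leading nowhere)."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status == 404
    except HTTPError as err:
        return err.code == 404
    except OSError:
        return False  # unreachable for another reason, not a confirmed 404

print(extract_links('<a href="/ok">fine</a> <a href="/old-page">gone</a>'))
# ['/ok', '/old-page']
```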
- Links. If the site is a link donor and has a heap of outgoing links on the same page, it sinks in the results, and the benefit of all the outgoing links to acceptor sites is nullified. If a page carries more than 25 outgoing links, remove them or close the page from indexing.
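One standard way to close such a link-heavy page from indexing without deleting it is the robots meta tag (a generic snippet, not something from the article):

```html
<!-- Keeps the page out of the index while still letting robots follow links -->
<meta name="robots" content="noindex, follow">
```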
- Page Load Time. If the site takes too long to load, it falls in the results. To lift the filter, strip the resource of heavy code and elements and speed it up by other means.
- Omitted Results. Lots of duplicates, poor internal linking, low content uniqueness: all of this tells Google that the site should be dropped in the results. To shed the filter, put the resource in order and build quality links.
- Minus 30. The site drops about 30 positions in search for cloaking, doorways, hidden JavaScript redirects, a link explosion, or link farms. To escape the filter, simply remove all these negative factors.
- Over Optimization. Too many keywords cause a noticeable drop in the search results. The solution is to lower the keyword density in the site's texts.
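Keyword density is easy to measure before publishing a text. A minimal sketch; the sample sentence is invented, and Google publishes no official density threshold:

```python
import re

def keyword_density(text, keyword):
    """Share of words in the text equal to the keyword (0.0 to 1.0)."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    return sum(1 for w in words if w == keyword.lower()) / len(words)

text = "buy shoes online, best shoes, cheap shoes here"
print(keyword_density(text, "shoes"))  # 0.375: far too dense for a real page
```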
- Duplicate Content. Protect the site well against content theft and publish only unique texts. This will spare it from being demoted in the results.
As you can see, it is more important to work on the quality of the site than on the number of links and the race for meaningless "black-hat" optimization. The site should appeal to people, not search robots. Behavioral factors now take priority, and this trend keeps strengthening along with the rapidly growing competition between sites.