On February 24, 2011, Google turned the English-language web upside down. The United States was affected first: site rankings underwent major changes, and many sites lost more than 50% of their traffic. The new Google Panda algorithm had made its own adjustments. On April 11, Panda arrived in the UK. It was a Monday, and many optimizers who analyzed the rankings over the weekend received a “pleasant” surprise.
What kind of animal is this, and should the Russian segment of the web be afraid of it? Put simply, Google Panda is similar to the well-known Yandex AGS filter, whose main task is to ban low-quality sites, copy-paste sites, and other junk resources. But unlike Yandex’s, Google’s set of filtering parameters is somewhat larger.
Panda is not such a terrible beast compared to the unceremonious AGS. By making a number of simple changes to a site, you can get out from under the filter. You could say that Google Panda is a strict set of SEO recommendations. So let us prepare for a meeting with the panda. What do you need to know?
The first thing I want to point out is that these recommendations are not endorsed by Google representatives or other SEO gurus. Panda is still being refined, and although version 2.2 is already running, it is too early to talk about stable tools for influencing it. In addition, each site has its own set of parameters that Google’s algorithms take into account. This means that some of this advice may help, and some may not.
1. Manual audit of the entire site
Forget for a while about all the utilities, software, and so on. Take screenshots of the main page, article pages, archives, etc. Then take a red pen and start analyzing: what can be changed, deleted, added, simplified, or rearranged?
2. The simpler the better.
3. Do you really need all these pages?
Google has stated clearly that low-quality pages can drag down an entire site’s rankings. So what counts as a low-quality page?
4. Greed for advertising revenue
This refers to an excess of ads on the page. Advertisements should inform visitors, not be forced on them. Too many ad units on a page is a bad signal for Panda.
5. Content quality is key
Much has been said about this lately in the Russian segment of the web, and the question of content optimization now carries more weight. But what about ordinary users who create sites for themselves and are not versed in all the wisdom of SEO? We all understand that any search engine exists primarily for commercial gain, and such changes are needed primarily by commercial resources or those close to them. In any case, the content should be “white,” i.e. produced by white-hat methods.
6. META data: description and others
There is a theory that meta data has no effect on rankings. Hmm... but there is no proof of that either. Therefore, it is still worth writing the description and keywords.
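As an illustration of keeping meta data in order, here is a minimal Python sketch (my own example, not a Google tool) that extracts a page’s meta description and flags it when it is missing or longer than the roughly 160 characters that search snippets typically show (the 160-character limit is an assumption, not an official figure):

```python
from html.parser import HTMLParser

class MetaDescriptionChecker(HTMLParser):
    """Collects the content of <meta name="description"> from an HTML page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "description":
            self.description = a.get("content", "")

html = ('<html><head>'
        '<meta name="description" content="Short guide to the Google Panda update.">'
        '</head></html>')
checker = MetaDescriptionChecker()
checker.feed(html)

# ~160 characters is a rough snippet-length guideline, not a Google rule.
if checker.description is None:
    print("missing description")
elif len(checker.description) > 160:
    print("description too long")
else:
    print("description ok")
```

The same checker can be run over every page of a site to find templates that forgot the description entirely.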
7. Duplicate content
Everything is clear here: duplicated content, whether on subdomains or on the site itself, is bad. Pages must be original.
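To catch near-duplicates on your own site before Panda does, you can compare pages with a simple word-shingle similarity measure. This is a generic sketch of the technique, not anything Google has published:

```python
def shingles(text, k=3):
    """Split text into overlapping k-word sequences ('shingles')."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard similarity of two pages' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page1 = "google panda is a filter that targets low quality pages"
page2 = "google panda is a filter that targets thin content pages"
print(round(similarity(page1, page2), 2))  # → 0.45
```

Pages scoring close to 1.0 against each other are the ones worth merging, rewriting, or closing from indexing.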
8. Mobile version of the site
Tough measures require tough changes. Unfortunately, this item can fall under duplicate content: a simplified version of a site is, to some extent, a clone. That is why many optimizers have abandoned separate mobile versions until they find an alternative solution. One option is server-side scripting that detects the type of visitor and substitutes the appropriate template for mobile devices. The downside is the generation of dynamic pages.
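The script-based substitution described above can be sketched in a few lines of Python. The User-Agent markers and template names here are illustrative assumptions, not a complete detection solution:

```python
# Substrings that commonly appear in mobile browser User-Agent headers.
# This list is illustrative only; real detection needs a maintained database.
MOBILE_MARKERS = ("iphone", "android", "blackberry", "opera mini", "windows ce")

def pick_template(user_agent):
    """Return the (hypothetical) template name to render for this request."""
    ua = (user_agent or "").lower()
    if any(marker in ua for marker in MOBILE_MARKERS):
        return "mobile.html"
    return "desktop.html"

print(pick_template("Mozilla/5.0 (iPhone; CPU iPhone OS 4_3 like Mac OS X)"))
print(pick_template("Mozilla/5.0 (Windows NT 6.1) Firefox/4.0"))
```

Because the same URL serves different markup per device, there is no second copy of the site for the algorithm to treat as a duplicate, which is exactly the trade-off described above.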
9. Errors in the code
If the site is built by hand (or on a self-written engine), there may be errors in its code. Google notices this too. The same applies to obsolete code and obsolete elements (in CSS and HTML). It is worth periodically analyzing the code for opportunities to optimize and shrink it. The simpler the site, the better.
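A quick way to catch one of the most common hand-coding errors, unclosed tags, is a small checker built on Python’s standard html.parser module. This is a rough sketch: it skips void elements like <br> and <img> and does not replace a full validator such as the W3C one:

```python
from html.parser import HTMLParser

# Void elements never take a closing tag, so they are excluded from the check.
VOID = {"br", "img", "hr", "meta", "link", "input", "area", "base", "col"}

class TagBalanceChecker(HTMLParser):
    """Flags tags that are opened but never properly closed."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Pop back to the matching open tag; anything skipped was unclosed.
            while self.stack:
                top = self.stack.pop()
                if top == tag:
                    break
                self.errors.append(f"<{top}> not closed before </{tag}>")
        else:
            self.errors.append(f"stray </{tag}>")

checker = TagBalanceChecker()
checker.feed("<html><body><div><p>Hello</div></body></html>")
print(checker.errors)  # → ['<p> not closed before </div>']
```

Anything left in `checker.stack` after feeding the whole document is a tag that was never closed at all.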
10. Design

Design is not just the visual presentation of a site. It affects page load speed, usability, and user loyalty. Simple website designs are now in fashion.