Why are pages on the site not indexed, or why do they take so long to appear in search results? This is a perennial problem worth analyzing. Below are the main issues that prevent normal indexing.
Causes of indexing problems
No. 1. The robot pays no attention to your site or page. This can be due to various factors.
If the resource is young, it may take some time for the robot to notice it. You can help: submit the site or page through the search engines' "Add URL" forms, and place a few links to it from other pages, sites, and social networks. The robot, crawling along those links, will come across your resource.
No. 2. The site has been blacklisted.
Your domain may have a bad reputation. You, or a previous owner, may have used "black hat" SEO techniques for which the site was penalized. If the site was created exclusively for robots and provides no value to people, or is an affiliate of another resource, it may stop being indexed. Buying a domain with an unknown history is very risky, so it is best to register a fresh, "clean" domain name.
In addition to bad karma, the presence of malicious code can affect indexing. It can appear after the site is hacked or when a free template is installed. Monitor your resource through the Yandex and Google webmaster panels.
No. 3. Errors in the code.
- The server response differs from "200". To check how your server responds to robots, use this Yandex tool: https://webmaster.yandex.ru/server-response.xml.
- Incorrect DOCTYPE. Make sure the site's code meets HTML standards and contains no errors.
- A 302 redirect. This is a temporary redirect, and with it old pages are not replaced by new ones in the index. Use a permanent 301 redirect instead.
- Incorrect encoding. Make sure search engines can read the encoding correctly. Otherwise the robot will receive a meaningless jumble of characters instead of the Russian text and treat it as low-quality content not worth indexing.
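The checks above can be scripted. Below is a minimal sketch (the `diagnose` helper and its messages are illustrative, not from the article) that flags the response-code, redirect, and encoding problems from this list, given a status code and response headers:

```python
def diagnose(status, headers):
    """Return a list of indexing warnings for a fetched page.

    status  -- HTTP status code returned by the server
    headers -- dict of response headers
    """
    problems = []
    if status == 302:
        # Temporary redirect: old URLs stay in the index.
        problems.append("302 is temporary; use a permanent 301 redirect instead")
    elif status != 200 and status != 301:
        problems.append(f"server answered {status}, but robots expect 200")

    # If no charset is declared, robots may misread non-ASCII (e.g. Russian) text.
    content_type = headers.get("Content-Type", "")
    if "charset=" not in content_type.lower():
        problems.append("no charset declared: robots may see garbled characters")
    return problems


if __name__ == "__main__":
    # A 302 redirect with no declared charset triggers both warnings.
    for warning in diagnose(302, {"Content-Type": "text/html"}):
        print(warning)
```

In practice you would feed `diagnose` the status and headers from a real fetch (for example, via `urllib.request` with redirects disabled), or simply use the Yandex tool mentioned above.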
No. 4. The robot cannot index the site.
Check whether all the necessary pages are allowed to be indexed and whether they are accessible. A page can be closed off in many ways; here are the most common.
- The robots meta tag may restrict access. In the page's <head> section, this meta tag can contain the forbidding directive content="noindex, nofollow", because of which the robot refuses to index the document or site.
- The domain is not delegated. Make sure the site is actually accessible to visitors. Also check whether the domain's delegation was suspended because of pirated content (for example, movies or audio tracks).
- The site is closed in robots.txt. Accidentally or not, sections and pages you would like indexed may be blocked. For example, the line "Disallow: /" means the site is completely closed to robots.
- Problems with IP or User-Agent. In this case the problem lies with your hosting provider. You can check whether a search bot has visited the site by analyzing the server logs (the access.log file). Typically the log folder is located in the site's root directory.
- X-Robots-Tag. Check whether this header is returned with your pages. It is a fairly rare control method, but pay attention to it. X-Robots-Tag works at the level of HTTP response headers and can affect content of any type; for example, a server can use it to hide an image from indexing.
- Closing with the noindex tag. If the unique content is wrapped in this tag, the robot may not see enough page content to add the page to the search results.
No. 5. Other indexing problems.
- Template texts, snippets, and headlines in which only a couple of words change;
- Internal content duplicates;
- The page nesting level is too deep (pages important for indexing should sit at level 2-3);
- Non-unique texts;
- Slow site performance due to the CMS or hosting;
- Articles whose unique text does not exceed 500 characters;
- A large number of 404 errors.
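Two of these content checks, thin pages under 500 characters and internal duplicates, are easy to automate. A minimal sketch (the `pages` mapping and thresholds are illustrative) that flags short texts and detects duplicates by hashing normalized text:

```python
import hashlib

# Hypothetical crawl result: URL -> extracted page text.
pages = {
    "/news/1": "Short announcement.",
    "/blog/a": "A long, original article about indexing problems.",
    "/blog/b": "A long, original  article about indexing problems.",  # duplicate
}

seen = {}       # normalized-text digest -> first URL carrying that text
warnings = []   # collected problems

for url, text in pages.items():
    if len(text) < 500:
        warnings.append(f"{url}: only {len(text)} characters of text")
    # Normalize whitespace and case so trivial variations still match.
    digest = hashlib.md5(" ".join(text.split()).lower().encode()).hexdigest()
    if digest in seen:
        warnings.append(f"{url}: duplicates {seen[digest]}")
    else:
        seen[digest] = url

for w in warnings:
    print(w)
```

Here the duplicate is caught despite the extra whitespace, because the text is normalized before hashing; a real audit would also compare near-duplicates, which this exact-match sketch does not attempt.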
Indexing problems can arise for other reasons not described here. If you cannot find them yourself, it is advisable to consult specialists.
Beyond that, for pages to get into the index faster, you need to train the robot to visit the site at predictable times. To do this, publish content on a regular schedule, and more often than once or twice a month. Also, remember to place high-quality links to new material both within the project and beyond it.
Like our articles? Subscribe to the newsletter!