This article will be useful to site owners whose "online property" has grown beyond a hundred or two pages and who are starting to think seriously about improving its internal optimization, since the popular engines (CMSes) usually do not provide such functionality out of the box.
No self-respecting website owner still wonders whether or not to build an external link network to promote their resources. Everyone knows it is a necessity, without which any investment in creating or buying and developing a site loses its meaning — unless, of course, you are doing it out of pure altruism.
So why, on many sites that hold their niche positions quite well, does internal optimization remain unresolved? Do the owners of these sites not want more page views, or more time spent by visitors on their resources? For many site owners these metrics remain "higher mathematics", although a competent manager would see in them higher advertising rates and many other opportunities to increase the profit from an online property.
Let's enumerate the kinds of work that webmasters usually classify as internal site optimization:
I think readers now have no doubt that internal site optimization is no less important than building an external link network. Below we will discuss the methods, and one of the tools, for optimizing the internal link structure.
In this section we will consider only those methods that, if implemented successfully, help achieve two main goals: improving the visibility of the site's important pages for their keywords, and improving the usability of the site (which in turn increases its marketing attractiveness).
This method is successfully used by many bloggers — there is no shortage of plug-ins that display a list of top posts. By adding a block with the most significant, interesting, or most-commented materials, we let visitors discover the best content on the site, and we also increase the number of internal links to the most important materials, which makes those pages more significant in the "eyes" of search engines. For best results, place the block with links to top materials near the top of the sidebar. If you decide to add such a block to your sites, you will find plug-ins implementing this functionality for most common CMSes.
After reading an article and its comments, a user may want additional information on the subject, or on thematically related topics. Even if the reader has no such need, a list of related materials can spark interest and keep them from "fleeing" the site. The list of related materials is placed immediately after the article, or in the sidebar near the end of the article or post. Recently many resources add a small image next to each title in the related list — it is good at catching the attention of a user looking for "what else to read here". Like the previous technique, this one not only improves the distribution of internal link weight but also reduces bounces and increases the number of pages viewed and the time spent on the site. Plug-ins implementing related-material lists are easy to find in the plug-in catalogs of common CMSes.
First, let's look at the difference between a semantic cloud and a tag cloud.
A tag cloud is composed of relevant words assigned to the site's materials by their authors. Links from a tag cloud lead to an intermediate page listing the materials for that tag, since the same tag can be assigned to a large number of materials. The physical size of a tag-cloud element (that is, the font size of a particular tag) depends on the number of materials associated with the tag; it does not reflect user interest in those materials, only that the resource's author finds the topics behind that tag more interesting. Tag clouds increase the amount of duplicate content: it is not uncommon for the same article or post to be available at several different URLs. Therefore, to avoid a search-engine penalty for techniques that multiply duplicate content, you should close tag clouds, category lists, and other navigation blocks from indexing whenever they serve content through URLs that are not the canonical URLs of that content.
With a semantic cloud the situation is exactly the opposite: it is built from the queries users actually typed into search engines to reach your site — real keywords with which users find pages on your site, reflecting genuine user interest in a given topic. Links from a semantic cloud lead directly to the target page, not to an intermediate one: the link uses the URL of the final page with the content. Thus a keyword cloud with links can be a very useful tool both for search engine optimization and for usability.
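To make the distinction concrete, here is a minimal sketch of the data behind the two cloud types (the structures and field names are my own illustration, not HTracer's internals): a tag points at an intermediate listing page, while a semantic-cloud entry points straight at the final content URL.

```python
# Hypothetical illustration of the two cloud types, not HTracer code.

tag_cloud = {
    # tag -> intermediate listing URL plus the materials behind it
    "seo": {"listing_url": "/tag/seo/", "materials": ["/post/1", "/post/7"]},
}

semantic_cloud = [
    # real search query -> final content URL, with visit counts
    {"query": "internal linking tips", "url": "/post/7", "hits": 42},
    {"query": "semantic cloud plugin", "url": "/post/1", "hits": 15},
]

def cloud_links(cloud):
    """Build (anchor_text, href) pairs for a semantic cloud:
    each link goes directly to the target content page."""
    return [(entry["query"], entry["url"]) for entry in cloud]

print(cloud_links(semantic_cloud))
```

Notice that nothing in the semantic cloud requires an intermediate page: the query itself is the anchor text and the content URL is the destination.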
Speaking of link clouds, it is important to note that too many internal links on one page may fail to bring the expected result, or even make things worse, since the page may be perceived by a search engine as a "link farm". There is no universal rule for the maximum number of internal links per page, but it is advisable to keep it under 40-50. Given that some of those links are taken up by service elements — menus, plug-ins, and so on — the link cloud is left with a modest 10-20 links, which is still quite good.
From the descriptions of the previous two methods we know that each common CMS has a number of plug-ins implementing those techniques in practice, but plug-ins for building a semantic cloud are not yet as mature. Third-party tools come to the rescue here, and I want to tell you about one of them.
So, no matter what CMS your site runs on — even a CMS written in PHP by some Vasya Pupkin — HTracer will help improve the site's internal optimization, and therefore bring more traffic from search engines.
We have already identified the main types of internal-optimization work. HTracer will not optimize server performance or CMS template markup for you, but with its help you will be able to:
Note that the list of HTracer features above is ordered by descending importance for search engine optimization. The first two places are occupied by the most significant functions, which deserve your attention first of all: the script's capabilities for automating internal linking between the site's pages.
…with all this "stuff". So what needs to be done to get these magical semantic clouds and blocks of internal links working on your website? Obviously: buy, install, and configure the script. Installation is described in detail by the author and should not cause any problems.
First of all, after installing the script you need to build a semantic core. This can be done in several ways, from entering keys manually or importing them from text files, to more advanced methods: importing data from the Google Analytics and LiveInternet statistics systems. In addition, while running on the site, HTracer detects visits from search engines and adds new queries to the semantic core, keeping it up to date.
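As a rough illustration of the text-file route, here is a sketch of turning a keyword file into a semantic core. It assumes a simple one-entry-per-line "query TAB url" format of my own invention; HTracer's actual import format may differ.

```python
# Sketch: build a semantic core (url -> list of queries) from lines in
# a hypothetical "query<TAB>url" text format. Not HTracer's real format.

def load_semantic_core(lines):
    core = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        query, url = line.split("\t", 1)
        core.setdefault(url, []).append(query)
    return core

sample = [
    "internal optimization\t/articles/internal-seo",
    "semantic cloud\t/articles/semantic-cloud",
    "# a comment line",
    "keyword cloud plugin\t/articles/semantic-cloud",
]
core = load_semantic_core(sample)
print(core)
```

Keeping the core keyed by URL makes it easy to answer the question the cloud needs answered: "which real queries lead to this page?"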
When keywords are added by importing from analytics systems or by live monitoring of incoming traffic, obscene and "specific" keys are filtered out (what the script's author means by "specific keys" is not specified).
The automatic import of keywords from analytics systems raised some questions in my own use: the import brought in rather puzzling queries that, in theory, should not belong to those pages. So for small sites I settled on importing queries from text files, with all queries selected and filtered by hand. This is more labor-intensive and less automated, but it seemed more effective to me because it leaves no parasitic phrases. On the other hand, on sites with tens of thousands of queries and as many pages this method will hardly work: even after the script builds the semantic core automatically, you will probably still need to "sift out" parasitic queries manually.
The script settings provide additional filters. For example, the "Services filter" ignores visits containing words like "rent", "repair", and so on; the "Free words" filter ignores visits with "free", "crack", "keygen" and the like; and the "Sex filter" (oh, what a charming name) "will ignore visits containing censored words related to sex, for example 'oral', 'vagina', etc.". What words hide behind the laconic "etc." in the filter descriptions is unclear. In my opinion, it would be nice to be able to view the complete stop-word list of each filter, and the ability to edit those lists would also be useful. In addition, among the script's features I would like to see the option of creating custom filters with one's own stop-word lists.
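The kind of user-editable filtering I am wishing for here is easy to picture. A minimal sketch, with made-up word lists (not HTracer's actual lists): each named filter carries its own stop-word set, and a query is dropped if it contains any stop word from any filter.

```python
# Sketch of named, editable stop-word filters. The filter names and
# word lists are illustrative, not HTracer's real lists.

FILTERS = {
    "services": {"rent", "repair"},
    "free_words": {"free", "crack", "keygen"},
}

def passes_filters(query, filters=FILTERS):
    """Return True if the query shares no word with any filter list."""
    words = set(query.lower().split())
    return all(words.isdisjoint(stop_words) for stop_words in filters.values())

queries = ["buy hosting", "free crack download", "apartment rent cheap"]
clean = [q for q in queries if passes_filters(q)]
print(clean)
```

With the lists exposed as plain data like this, adding a custom filter would be a one-line change — which is exactly the flexibility the script currently lacks.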
Well, only two steps now separate us from beautiful and useful keyword clouds on the site: configuring the cloud's appearance and choosing where to insert it. The appearance part is straightforward. If you want a keyword cloud where the size of each element (keyword) reflects user interest in that query, select the "Cloud" style on the Cloud tab; if you prefer to display a plain block of links (perhaps to apply your own styles), use one of the three other options: a UL list, an OL list, or the usual "pseudo-list" built with BR line-break tags.
Now we set the number of links in the cloud. Remember, above we discussed the number of internal links per page, and that the cloud is left with a modest 10-20 links? For obvious reasons it is best not to overdo it and not to use the "to infinity" option the author offers: 10-20 is quite enough. Everything in moderation, as someone once said...
Randomness is another cloud setting; it controls how varied the set of key phrases in the cloud will be across different pages of the site. It is difficult to give universal advice here, but if the semantic core contains many keywords, I would increase this value.
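One practical way to get per-page variety without the cloud jumping around on every visit is to seed the random choice with the page URL. This is my own sketch of the idea, not HTracer's algorithm: each page gets its own, reproducible subset of the semantic core.

```python
# Sketch: vary the cloud per page, but keep it stable between visits,
# by seeding the sampler with the page URL. Illustrative only.

import random

def pick_cloud_keys(all_keys, page_url, cloud_size=5):
    rng = random.Random(page_url)  # deterministic seed per page
    return rng.sample(all_keys, min(cloud_size, len(all_keys)))

keys = [f"keyword {i}" for i in range(100)]
first_visit = pick_cloud_keys(keys, "/post/1")
second_visit = pick_cloud_keys(keys, "/post/1")
other_page = pick_cloud_keys(keys, "/post/2")
print(first_visit == second_visit)  # same page -> same cloud
```

Stability matters for users (the block does not reshuffle under them) while the per-URL seed still spreads different keywords across different pages.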
The next two settings are the minimum and maximum element size in the cloud — in effect, the sizes of the least and most popular keywords in a given cloud set. Here a thought occurred to me: does the script track clicks on the keywords in the cloud? After all, clicks on cloud keywords also reflect user interest in particular queries. The formula that sizes the keywords could therefore incorporate user interest derived from statistics of clicks on the cloud itself (Achtung! Feature spotted!).
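Here is a sketch of how the min/max sizes could work, including the click-weight idea I just proposed. The formula and weights are my own illustration, not anything HTracer documents: each keyword's score is its search hits plus weighted cloud clicks, scaled linearly between the minimum and maximum pixel sizes.

```python
# Sketch: map keyword popularity (hits, optionally plus cloud clicks)
# onto the configured min/max element sizes. Formula is illustrative.

def font_sizes(hits, clicks=None, min_px=10, max_px=24, click_weight=0.5):
    clicks = clicks or [0] * len(hits)
    scores = [h + click_weight * c for h, c in zip(hits, clicks)]
    lo, hi = min(scores), max(scores)
    if hi == lo:                        # all keywords equally popular
        return [min_px] * len(scores)
    span = max_px - min_px
    return [round(min_px + span * (s - lo) / (hi - lo)) for s in scores]

# three keywords: 5, 50 and 20 search hits; 0, 10 and 40 cloud clicks
sizes = font_sizes([5, 50, 20], clicks=[0, 10, 40])
print(sizes)  # -> [10, 24, 20]
```

Folding clicks into the score lets the cloud "learn": a keyword that visitors keep clicking grows, even if search engines do not send much traffic for it yet.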
Now only the last step remains: telling the script where on the page to insert the cloud. To avoid any changes to the templates, you simply specify a CSS selector and the script inserts the cloud in the right place. Moving the cloud elsewhere on the page is no problem either: just change the CSS selector in the cloud settings.
So, we save the settings and check the result in the browser. If you like the look, placement, and contents of the semantic cloud, accept my congratulations and start monitoring search statistics to track the site's positions for the target mid-frequency and low-frequency queries. I think you will see the first ranking changes within a couple of weeks.
If you did not like the look or placement of the cloud, try playing with its appearance settings. If you did not like the contents of the cloud, check, and if necessary correct, the semantic core the script has compiled.
The admin panel lets you control the look and placement of only one cloud — the so-called default cloud (i.e., the single pretty cloud made for users). If you want to place more than one cloud (for example, an additional one in the footer) that differs from the default, you have to call the function get_keys_cloud() with a set of parameters (described in detail in the documentation), which is not always convenient. I would add to the script's TODO list the ability to add clouds and manage their look and placement directly from the admin panel. You must agree it is much more pleasant to make a couple dozen mouse clicks in a nice interface than to paste function calls straight into the code and, if something does not work, dig through it again to rewrite the call parameters.
The second important HTracer feature I would advise mastering is the insertion of contextual links. My recommendation for using this function: 1-2 contextual links per article of 2-3 thousand characters is enough. But since the settings only let you cap the maximum number of contextual links per page, I would first determine the average article length on the site and set the required per-page link limit based on that.
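The rule of thumb above is just arithmetic, and can be sketched in a few lines (the numbers are illustrative): roughly 2 links per ~2,500-character block, scaled to the site's average article length.

```python
# Back-of-the-envelope sketch of the contextual-link budget:
# about 1-2 links per 2-3 thousand characters of article text.

def contextual_link_limit(avg_article_chars, links_per_block=2, block_chars=2500):
    # guarantee at least one link even for very short articles
    return max(1, round(avg_article_chars / block_chars * links_per_block))

print(contextual_link_limit(2500))  # typical 2.5k-character article -> 2
print(contextual_link_limit(7500))  # a long-read three times that -> 6
```

Since HTracer only exposes a single per-page maximum, this computed value is what I would put into that setting.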
Be sure to use the function that inserts alt attributes for images and titles for links. Its effect is weaker than using keywords in page titles or heading tags, but it still has some value for search engine optimization.
Use the functions that highlight keywords in bold and fill in meta keywords at your own discretion.
From my point of view, such highlighting only worsens the readability of the text. That said, you can have it both ways by restyling the strong tag via CSS: important keywords remain emphasized for search engines, while users see the "clean" text. The highlighting frequency setting is implemented quite simply in the script — first occurrence or all occurrences — so if I used this function I would highlight only the first occurrence, since bolding every keyword would be too "fat", and "some" might not like it. 🙂
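The "first occurrence only" mode is simple to picture. A minimal sketch, not HTracer's implementation (real code would also need to avoid matching inside existing HTML tags): wrap just the first case-insensitive match of each keyword in a strong tag and leave the rest of the text alone.

```python
# Sketch: bold only the FIRST occurrence of each keyword,
# preserving the original casing of the matched text.

def bold_first(text, keywords):
    for kw in keywords:
        idx = text.lower().find(kw.lower())
        if idx != -1:
            match = text[idx:idx + len(kw)]  # keep original casing
            text = (text[:idx] + "<strong>" + match + "</strong>"
                    + text[idx + len(kw):])
    return text

html = bold_first("SEO basics: seo is not magic, seo is work.", ["seo"])
print(html)  # only the first "SEO" gets wrapped
```

The later occurrences stay untouched, which keeps the page from turning into the "too fat" wall of bold text mentioned above.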
Filling keywords into the meta keywords tag seems to me far less effective than the methods above, so I would use this option, as they say, "for good measure" — but do not expect any tangible gain in positions from it.
I have tried to highlight the most significant internal-optimization methods, and the HTracer features that support them, in terms of "results gained per time spent".
Although the main tools implementing the methods described in this article are automated, and most of your time goes into configuring these scripts and plug-ins, do not count on your positions changing immediately after installation and configuration. Let HTracer work with the compiled semantic core and collect fresh statistics on real search visits; audit the compiled core and correct it if necessary. Users also need time to get used to the new elements on your site's pages (top posts, related-post lists, the query cloud).
Search engines are no exception: they too will need some time to register the important optimizations you have made on the site.
Significant changes in search positions and in marketing metrics (bounce rate, number of page views, time spent on the site) will come no earlier than 3-4 weeks after you implement the techniques and tools described here.
Good luck conquering the low-frequency tops and increasing the user appeal of your sites!