Capture a niche and top search results - Profit Hunter

This is a translation of the first of three posts by Eli, author of the blog BlueHatSeo, about how he builds a niche empire and captures the search results for competitive key phrases.

Eli is a true techie, so his approaches and strategies for earning online will not suit everyone in their pure form (they certainly don't suit a humanities type like me). On the other hand, hardly anyone will apply them verbatim anyway, since any method is best used in the context of a specific niche and a specific site.

The translation is presented as is, without any additions or comments on my part. (The next two articles will be posted soon.)


In this post there will be no fluff and no secret techniques - I just want to share the strategy that allows me to seriously and permanently capture virtually any niche I take on. This post is not a step-by-step guide: it reflects the general essence of my strategy, although it does touch on some essential details. Implementing this strategy requires a lot of effort, so it is not for everyone. I am not one of those laid-back marketers who work a few hours a week. My working week is 60-70 hours without respite. If such a schedule doesn't suit you, don't judge.

Every time I want to promote my resources in a moderately or highly competitive niche, I carefully study the current search results. Ideally, search engines should rank sites by their quality and relevance. In reality, they pay more attention to a site's authority and topical proximity. It would seem these are almost the same thing, but the difference is like the difference between "a person with good habits" and "a good person". It is barely perceptible, yet it is precisely this difference that gives us - those who didn't manage to register a domain back in 1998 - a chance. Given that even the best algorithms cannot build perfect rankings, my sites don't need to be huge and mega-trusted - they just need to be a little better than the rest.

Many people try to conquer a niche in one swoop. I fell for that bait many times myself. I would create a site, pick several key phrases and build, build, build, until I hit a blank wall. Older sites have a margin of safety that accumulates fast enough to keep young competitors at bay. You can run this rat race for months and years, but in most cases the result is the same - disappointment. But that is no reason to give up. There is a way out of this situation. To find it, though, you need to soberly assess the position you are in.

For example, take a hypothetical niche in which the top 15 search results contain 4-5 large old niche communities, several informational sites with a huge number of pages and backlinks (7k to 45k), and a couple of "about anything and everything" pages a la Wikipedia, About.com, Amazon, etc. The average domain age in these hypothetical results is 6 years, ranging from 5 to 10 years.

There is a lot of work to do. But first a little analysis to decide what we need. The presence of the "about anything and everything" pages suggests that topical proximity is not the most important factor here, because in this case Google values domain authority and the writings of some nerdy moderator above the topicality and relevance of the site as a whole. Still, I won't discard topical relevance, because it will be the last nail I drive into the lid of my competitors' coffin.

Either way, my competitors are serious. Even if I create a huge website and point several thousand backlinks at it, I would still have to battle the authority sites for a place in the sun for at least 3-4 years. But I need the money yesterday, so I need a way to earn enough authority in a shorter time.

This can be done in different ways. Here's a rhetorical question as a hint: which will clear away garbage faster - one big bulldozer or 5 small ones?

Did you notice the catch? The question is flawed: it's not about the bulldozers, it's about the garbage. In this case I cannot use one powerful site - I simply cannot overcome the old authoritative sites that way. So I have no choice but to beat them with numbers. Yes, in my mind I will picture a large multi-faceted site that would undoubtedly become the niche leader if not for this competition. But given the current situation, I have to abandon that idea and revise the structure of the project.

First of all, I divide the "ideal site" into its basic blocks. If, for example, the site is supposed to have a forum, I make it a separate site. If a blog is supposed to exist, I separate it too. I do the same with articles, news, a shop and everything else. The result is a mini-network of narrowly themed sites. For our example, let's assume this yields 15-20 medium-sized sites, interlinked with one another and forming a dense thematic mini-grid. For each site I pick a separate template and create several pages of content. If some sites lack content, I take one of the sections, break it up and share it between sites. That's it for structure. Now I need to take care of authority.

So, each site has a certain authority. Call it X. If we have 20 sites, their total authority is 20X. If we point a backlink at the first site, its authority grows to X + 1. Since the sites are interlinked, the additional authority spreads evenly across all of them. Consequently, if one of the sites rises in the results, it automatically pulls the others up. This is where the synergy effect kicks in: the total authority of the network exceeds the sum of the authority of the individual sites. Perhaps you believe authority is a relative measure - an indicator of a site's weight compared to other sites. If that were the case, the total authority of all sites would be zero, and the authority of half of them would be below zero. But that's not true, right? Therefore, if the authority of one site in the network grows, the authority of another site at the very least remains unchanged. It does not decrease.
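The arithmetic above can be made concrete with a toy model. This is purely illustrative - the `boost` and `share` numbers and the spill mechanism are my own assumptions, not a real ranking algorithm - but it shows the two properties the argument relies on: the network's total authority only grows, and no individual site loses authority when another one gains.

```python
# Toy model of the "grid authority" argument: 20 interlinked sites,
# each starting at baseline authority X = 1. All parameters below
# (boost, share) are illustrative assumptions, not real ranking math.

def add_external_link(authority, target, boost=1.0, share=0.1):
    """One external backlink raises the target's authority by `boost`;
    dense interlinking then passes a fraction `share` of that boost
    on to the rest of the grid, split evenly."""
    n = len(authority)
    authority[target] += boost
    spill = boost * share
    for i in range(n):
        if i != target:
            authority[i] += spill / (n - 1)
    return authority

sites = [1.0] * 20            # 20 sites, each with authority X = 1
total_before = sum(sites)     # 20X
add_external_link(sites, target=0)

assert sum(sites) > total_before       # total authority only grows
assert all(a >= 1.0 for a in sites)    # no site's authority decreases
```

Note that if authority were a zero-sum relative measure, the second assertion would fail for some site whenever another rose; the model is absolute, which is the post's point.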

Now I need to accumulate backlinks. And that is exactly what needs doing first. At first you will be able to feed the grid 50-200 links per day from site directories, reciprocal links, links from social networks, etc. You may even get a few links from reputable niche sites. But sooner or later the rate of link building will slow down. The free supply of backlinks will be exhausted. You will hit the wall. By then each site should have 2-3 thousand inbound links, giving you up to 40k links for the entire network. And that is something. At this stage the grid's sites should sit somewhere around positions 20-50 in the results. To climb higher, additional authority is needed. For that, I create another grid, larger than the first.
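A quick back-of-the-envelope check ties the figures in this paragraph together. The numbers come straight from the text (the 20-site grid size is carried over from the earlier example); only the arithmetic is mine.

```python
# Sanity check of the link targets mentioned in the post.
sites = 20                  # grid size from the earlier example
links_per_site = 2000       # lower bound of "2-3 thousand inbound links"
network_links = sites * links_per_site
assert network_links == 40_000   # matches the "up to 40k links" figure

links_per_day = 200         # upper end of the 50-200 links/day rate
days_needed = network_links / links_per_day
assert days_needed == 200   # ~200 days of work even at the fastest rate
```

In other words, even at the top free-link rate, reaching the wall takes on the order of seven months of daily directory and reciprocal-link work, which is why the post calls this strategy labor-intensive.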

For the second grid, volume is the main thing. I am not interested in the sites themselves, and I'm not even going to build backlinks for them. All I need is for search engines to index the pages of the new sites and find the links on them. I can also get additional links from my other sites (not necessarily relevant ones). My goal is to create a grid of blogs on free platforms hosted on reputable domains. At this stage you will have to work hard, but the result is worth it. As a result, most sites from the first grid should reach the top 30 of the results.

After that I point a few links at the grid from my sites that already rank well in the search engines I need. This raises the link weight of both the individual grid sites and the grid as a whole. In such cases I do not place reciprocal links to third-party sites, so as to concentrate the maximum link weight within the grid. Better still, send links from third-party sites to a single grid site in order to raise it even higher in the results.

At this stage my grid of sites should occupy positions 7-30 in the results. If I fail to break through this glass ceiling and push the sites higher, I can arrange a little sabotage. For example, to shake the position of a Wikipedia page, I can remove a few outgoing links from it and flag them as spam. I understand this is contrary to net etiquette and that the links may return to the page after a while, but ideally a situation arises where a webmaster, noticing a sharp drop in traffic from Wikipedia, discovers that his site has been flagged as spam and, in a fit of anger, removes his own link to Wikipedia.

Sometimes you have to do what you have to do. And if, in order to promote your sites, you have to convince an About.com author that his article sits in the wrong category, do it. Without going too far, of course. And certainly do not take a harsh tone in correspondence or private conversation just because you think your site deserves to rank above its competitors.

Since by this time I have come close to the top of the results, I have to prove that my sites are relevant to the topic in order to push the competition even lower. To do this, I add relevant content to the 2-5 sites that reached the top 15 positions. This is the best time to shake up the niche a little and offer visitors a completely unique product or service.

By this time I will probably already be known to the webmasters and visitors of 3-4 niche forums and communities, so I need to give them a good topic for discussion. Meanwhile, I direct all my efforts to the few leading sites in order to further boost their authority. Perhaps for this I will reconsider how I use the nofollow tag.

At the same time, I begin to "steal" content from those sites in the network that failed to reach the top and republish it on the leading 5-10 sites. Perhaps the most advanced site will need a blog, and I will stop updating so frequently the single blog created at the start of the journey. I will definitely spend some time on link baiting.

When, after some time, 4-7 sites from my network make it into the top 10 of the search results, all that remains is to keep the sites in good shape and collect the profits. After that I can safely move on to the next niche.


P.S. After installing the Sitemap-XML plugin, encoding problems appeared in the comments. I have already deactivated the plugin and deleted it from its folder, but the problem seems to remain. A Google search did not give me a solution. If any reader knows how to fix this trouble, I would be very grateful if you drop a note in the comments or write to bogdan (at) profithunter.ru.

Added: the previous paragraph is no longer relevant. The glitch has been fixed.

Related Posts:

  • Search for unique content
  • Backlinks, backlinks, backlinks ...
  • Linkbaiting - Separating the wheat from the chaff
  • Day 29 - Part 1 - Registration in catalogs

Do you like articles? Subscribe to the newsletter!
