Sales Generator

Reading time: 14 minutes


From this article you will learn:

  • How to build the semantic core of a site
  • Which programs to use for this
  • How to analyze the semantic core of a competitor's website
  • Which mistakes are most often made when assembling a semantic core
  • How much it costs to order a ready-made semantic core

The semantic core is the foundation of any Internet resource, the key to its successful promotion and to attracting the target audience. In this article you will learn how to create the semantic core of a site and which mistakes to avoid.

What is the semantic core of the site

The simplest and yet most effective way to attract visitors to your site is to make sure they show interest in it themselves by clicking a link in Yandex or Google search results. To do this, you need to find out what your target audience is interested in and how, with which words, users search for the information they need. The semantic core will help you with this.

The semantic core is a collection of individual words and phrases that characterize the subject matter and structure of your site. Semantics was originally the branch of philology dealing with the meaning of words; nowadays it is more often understood as the study of meaning in general.

In other words, the "semantic core" and the "meaning core" of a site refer to the same thing; the terms are used as synonyms.

The purpose of creating a semantic core is to fill the site with content that is attractive to users. To do that, you need to know which keywords they will use to look for the information posted on your site.



Selecting a site's semantic core means distributing search queries, or groups of queries, across pages in such a way that they satisfy the target audience as fully as possible.

This can be achieved in two ways. The first is to analyze users' search phrases and build the site structure on their basis. The second is to first design the framework of the future site and then, after analysis, distribute keywords across it.

Each method has a right to exist, but the second is more logical: first you create the site structure, then you fill it with the search queries through which potential clients will be able to find the content they need.

This way you act proactively: you decide for yourself what information to convey to site visitors. If instead you build the site structure from keywords alone, you merely adapt to your surroundings.

There is a fundamental difference between how an SEO specialist and a marketer approach creating a site's semantic core.

A classic optimizer will tell you: to create a website, select the phrases and words from search queries for which you can reach the TOP of search results, then build the structure of the future site around them and distribute the keywords across its pages. Page content is then written for the selected keywords.

A marketer or entrepreneur approaches the task differently. First, he thinks about why the site exists and what information it should carry to users. Then he sketches an approximate site structure and a list of pages. Only at the next stage does he build the semantic core, in order to understand which search queries potential customers use to look for that information.

What are the drawbacks of working with the semantic core from the SEO specialist's position? First of all, with this approach the quality of information on the site deteriorates significantly.

A company should decide for itself what to tell customers rather than serve up content in response to the most popular search queries. Such blind optimization can also eliminate promising queries with low frequency.

The result of creating a semantic core is a list of keywords distributed across the pages of the site, indicating the page URLs, the keywords, and their query frequency.

An example of the semantic core of the site

How to compose the semantic core of a site: step-by-step instructions

Step 1. Compile the initial list of requests

First, select the most popular search queries on your site's subject. There are two ways to do this:

1. Brainstorming method — over a short period of time, on your own or with colleagues, write down all the words and phrases with which, in your opinion, users will search for the information posted on your site.

Write down all possible options, including:

  • variations in the spelling of the name of a product or service, synonymous words, ways of writing the name in Latin and Cyrillic letters;
  • full names and abbreviations;
  • slang words;
  • references to the constituent elements of a product or service, for example, building materials - sand, brick, corrugated board, putty, etc.;
  • adjectives that reflect significant characteristics of a product or service (quality repairs, fast delivery, painless dental treatment).
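A minimal sketch of how such brainstormed variations can be combined programmatically (the base names and modifiers below are purely illustrative):

```python
from itertools import product

def generate_variations(bases, modifiers):
    """Combine base product/service names with qualifying words
    to seed the initial keyword list."""
    phrases = set(bases)  # the bare names are queries in their own right
    for base, mod in product(bases, modifiers):
        phrases.add(f"{mod} {base}")
    return sorted(phrases)

seeds = generate_variations(
    bases=["dental treatment", "teeth whitening"],
    modifiers=["painless", "affordable"],
)
```

This only covers modifier-plus-name combinations; synonyms, abbreviations, and Latin/Cyrillic spellings still have to be added to the base lists by hand.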

2. Analyze your competitors' sites. Open your browser in incognito mode for your region and look at the competitor sites that search results show for your topic. Find all potential keywords. You can determine a competitor's semantic core using services such as bukvarix.com.

Also analyze contextual advertising. On your own or with the help of specialized services (for example, spywords.ru or advodka.com), study the semantic core of someone else's site and find out which keywords your competitors use.

By applying all three approaches, you will get a fairly large list of keywords. But it will still not be enough to create an effective semantic core.

Step 2. Expanding the resulting list

At this stage, the Yandex.Wordstat and Google AdWords services will help. Enter the words from the keyword list you generated at the first stage into the search bar of either service, one at a time, and you will get a list of refined and associative search queries.

Refined queries include other words or phrases in addition to your keyword. For example, if you enter the keyword "dog", the service will return 11,115,538 queries containing this word, including last month's queries such as "photos of dogs", "treatment of dogs", "breeds of dogs", etc.


Association queries are the words or phrases that users searched for along with your query. For example, along with the keyword “dog”, users entered: “dry food”, “royal canin”, “Tibetan mastiff”, etc. These search queries can also be useful to you.


In addition, there are special programs for building a site's semantic core, for example KeyCollector and SlovoEB, and online services such as Topvisor, serpstat.com, etc. They allow you not only to select keywords but also to analyze them and group search queries.

To expand the list of keys as much as possible, look at the service's search suggestions: there you will find the most popular queries that start with the same letters or words as yours.

Step 3. Remove unnecessary requests

Search queries can be classified in different ways. Depending on the frequency, requests are:

  • high-frequency (more than 1500 requests per month);
  • mid-frequency (600-1500 requests per month);
  • low-frequency (100-200 requests per month).

This classification is highly arbitrary. Assigning a request to one category or another will depend on the subject of a particular site.
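The thresholds above can be expressed as a small helper. Note that, as written in the article, the bands leave gaps (e.g. 200–600 queries per month), so anything outside them is left unclassified here:

```python
def frequency_band(monthly_hits):
    """Bucket a query by the article's thresholds: high > 1500,
    mid 600-1500, low 100-200 queries per month. The thresholds
    are arbitrary and leave gaps, so other values are unclassified."""
    if monthly_hits > 1500:
        return "high-frequency"
    if 600 <= monthly_hits <= 1500:
        return "mid-frequency"
    if 100 <= monthly_hits <= 200:
        return "low-frequency"
    return "unclassified"
```

In practice you would tune the cutoffs to your site's subject, as the article notes.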

In recent years, low-frequency queries have been trending upward. Therefore, to promote a site, the semantic core should include mid- and low-frequency queries.

There is less competition for them, so it is much easier to get the site onto the first page of search results than with high-frequency queries. In addition, many search engines favor sites that use low-frequency keywords.

Another classification of search queries is by search objectives:

  1. Informational — keywords that users enter when searching for specific information. For example: "how to tile a bathroom yourself", "how to connect a dishwasher".
  2. Transactional — keywords that users enter when they plan to perform some action. For example: "watch a movie online for free", "download a game", "buy building materials".
  3. Vital — queries that users enter when looking for a specific site. For example: "Sberbank online", "buy a refrigerator on Yandex.Market", "vacancies on HeadHunter".
  4. Other (general) — all remaining queries, from which you can only guess what the user is looking for. For example, the query "car" may be entered by a user who wants to sell, buy, or repair a car.
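A rough way to label queries with these four intent classes is to look for marker words. The marker sets below are illustrative guesses, not an exhaustive taxonomy:

```python
# Marker words are illustrative only; extend them for a real project.
TRANSACTIONAL = {"buy", "order", "download", "price", "cheap", "watch"}
INFORMATIONAL = {"how", "what", "why", "diy"}

def classify_intent(query, brands=frozenset()):
    """Very rough intent labeling by marker words. `brands` is the
    set of brand names whose queries count as vital."""
    words = set(query.lower().split())
    if words & brands:
        return "vital"
    if words & TRANSACTIONAL:
        return "transactional"
    if words & INFORMATIONAL:
        return "informational"
    return "general"
```

Real classifiers also use search-result analysis; this heuristic only gives a first pass for manual review.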

Now it is time to remove from the list all unnecessary keywords that:

  • do not correspond to the theme of your site;
  • include competitor brand names;
  • include the names of other regions (for example, "buy an iPhone in Moscow" when your site serves only Western Siberia);
  • contain typos or misspellings (a search engine treats a misspelled word as a separate search query).
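The cleanup described above can be sketched as a simple substring filter (all keyword and stop-term data below is hypothetical):

```python
def clean_keywords(keywords, stop_terms):
    """Drop any query containing an off-topic term, a competitor
    brand, or a wrong-region name (case-insensitive substring match)."""
    stop = [t.lower() for t in stop_terms]
    return [kw for kw in keywords
            if not any(t in kw.lower() for t in stop)]

kept = clean_keywords(
    ["buy iphone moscow", "iphone repair novosibirsk", "samsung galaxy price"],
    stop_terms=["moscow", "samsung"],
)
```

Substring matching is deliberately aggressive; review the dropped queries by hand so you do not lose valid phrases.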

Step 4. Define competitive requests

To distribute keywords effectively across the site's pages, you need to rank them by importance. To do this, use the Keyword Effectiveness Index (KEI). The calculation formula is:

KEI = P² / C,

where P is the frequency of impressions of the keyword in the last month; C - the number of sites that are optimized for this search query.

The formula shows that the more popular the keyword, the higher the KEI and the more targeted traffic you will attract to your site. High competition for a query makes promotion difficult, and this is reflected in the KEI value.

Thus, the higher the KEI, the more effective the search query; and the lower the index, the higher the competition for the keyword relative to its popularity.

There is a simplified version of this formula:

KEI = P² / U,

where instead of C, the indicator U is used - the number of pages optimized for this keyword.

Let's look at an example of how to use the Keyword Effectiveness Index (KEI). First, determine the query frequency using the Yandex Wordstat service:


Next, let's see how many pages appear in the search results for this query.


Substitute the found values into the formula and calculate the keyword effectiveness index:

KEI = (206,146 × 206,146) / 70,000,000 ≈ 607

How to evaluate KEI values:

  • if KEI is less than 10, then search queries are ineffective;
  • if KEI is from 10 to 100, then search queries are effective, they will attract the target audience to the site;
  • if KEI is from 100 to 400, then search queries are very effective, they will attract a significant share of traffic;
  • with a KEI of more than 400, search queries have maximum efficiency and will attract a huge number of users.

Keep in mind that the gradation of KEI values depends on the site's subject matter. The scale above cannot be applied to all Internet resources: for some, even KEI > 400 may be insufficient, and for highly specialized sites this classification is not applicable at all.
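The calculation and the rating scale above can be put into two small functions. This follows the article's formula and thresholds directly; remember that the thresholds themselves are topic-dependent:

```python
def kei(frequency, competitors):
    """Keyword Effectiveness Index: KEI = P^2 / C, where P is the
    monthly query frequency and C the number of competing pages."""
    return frequency ** 2 / competitors

def kei_rating(value):
    """Map a KEI value onto the article's (topic-dependent) scale."""
    if value < 10:
        return "ineffective"
    if value < 100:
        return "effective"
    if value < 400:
        return "very effective"
    return "maximally effective"

# The article's worked example: 206,146 impressions, 70M result pages.
score = kei(206_146, 70_000_000)
```

For the worked example this yields approximately 607, i.e. a maximally effective query on the generic scale.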

Step 5. Group keywords on the site

Clustering the semantic core is the process of grouping search queries both on logical grounds and based on search engine results. Before grouping, it is important to make sure that the specialist who will carry it out understands all the intricacies of the company and its product.

This work is expensive, especially for a multi-page Internet resource, but it does not have to be done by hand. You can cluster the semantic core automatically using special services such as Topvisor, Seranking.ru, etc.

Still, it is better to double-check the results, since a program's logic for splitting keys into groups may not match yours. In the end you will get the final site structure and a clear understanding of which pages to create and which to eliminate.
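The "logical grounds" part of clustering can be approximated by grouping queries that share words. This greedy sketch is a first approximation only; real clusterers (and the services named above) also compare search-result overlap:

```python
def cluster_by_shared_words(queries, min_shared=1):
    """Greedy grouping: a query joins the first cluster with which
    it shares at least `min_shared` words."""
    clusters = []
    for q in queries:
        words = set(q.lower().split())
        for cluster in clusters:
            if len(words & cluster["words"]) >= min_shared:
                cluster["queries"].append(q)
                cluster["words"] |= words
                break
        else:
            clusters.append({"words": set(words), "queries": [q]})
    return [c["queries"] for c in clusters]

groups = cluster_by_shared_words(
    ["dog food", "dry dog food", "dog breeds", "tibetan mastiff"]
)
```

Note how "dog breeds" lands in the same cluster as the food queries just because they share "dog"; this is exactly the kind of result you should double-check by hand.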

When is it necessary to analyze the semantic core of a competitor's website?

  1. When starting a new project.

You are working on a new project and building the site's semantic core from scratch, so you have decided to analyze the keywords competitors use to promote their sites.

Many of them will suit you, so you use them to fill out your semantic core. But consider the niche in which those competitors operate: if you plan to occupy a small market share while they operate at the federal level, you cannot simply copy their semantic core wholesale.

  2. When expanding the semantic core of a working site.

Suppose you have a website that needs promotion. Its semantic core was formed long ago but performs poorly; the site needs optimization, restructuring, and updated content to increase traffic. Where do you start?

First of all, you can analyze the semantic core on competing sites using specialized services.

How do you use keywords from competitor sites most effectively?

Here are a few simple rules. First, take into account the percentage of overlap between the keys on your site and those on other resources. If your site is still under development, choose any competing site, analyze it, and use its keywords as the basis for your semantic core.

Later, you will simply compare how much your reference keys intersect with keys from competitor sites. The easiest way is to use a service to download a list of all competing sites and filter them by the percentage of intersections.

Then download the semantic cores of the first few sites into Excel or Key Collector and add the new keywords to your own semantic core.
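The overlap check described above reduces to set arithmetic once both keyword lists are exported (the brick-related keys below are made up for illustration):

```python
def overlap_percent(ours, theirs):
    """Share of a competitor's keys already present in our core."""
    ours, theirs = set(ours), set(theirs)
    return 100.0 * len(ours & theirs) / len(theirs) if theirs else 0.0

our_core = {"buy brick", "brick price", "brick delivery"}
competitor = {"buy brick", "brick price", "facing brick", "brick wholesale"}

pct = overlap_percent(our_core, competitor)   # 2 of 4 keys match
candidates = sorted(competitor - our_core)    # new keys worth reviewing
```

A high percentage confirms the site is a relevant donor; the set difference is the list of candidate keys to review before adding.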

Second, before copying keys from a donor site, be sure to inspect it visually.

  3. When buying a ready-made site for subsequent development or resale.

Consider an example: you want to buy a certain site, but before making a final decision you need to assess its potential. The easiest way is to study its semantic core, comparing the site's current coverage with competitors' sites.

Take the strongest competitor as a benchmark and compare its visibility with that of the resource you plan to purchase. If the gap from the benchmark is significant, that is a good sign: it means the site has room to expand its semantic core and attract new traffic.

Pros and cons of analyzing the semantic core of competitors through special services

The principle of operation of many services for determining keywords on other people's sites is as follows:

  • a list of the most popular search queries is formed;
  • for each key, 1-10 search results pages (SERPs) are collected;
  • this collection of key phrases is repeated at a certain interval (weekly, monthly, or yearly).

Disadvantages of this approach:

  • services show only the visible part of search queries on competitors' websites;
  • services retain a kind of "snapshot" of the search results made at the time of keyword collection;
  • services can determine visibility only for the search queries that are in their databases — they show only the keywords they know;
  • to get reliable data about a competitor's keywords, you need to know when its search queries were collected (visibility analysis);
  • not all queries appear in search results, so the service does not see them; the reasons vary — the site's pages may not be indexed yet, or the search engine may not rank them because they load slowly, contain viruses, etc.;
  • there is usually no information about which keys are in the database the service uses to collect search results.

Thus, a service reconstructs not the real semantic core underlying a site, but only a small visible part of it.

Based on the foregoing, the following conclusions can be drawn:

  1. The semantic core of a competitor's website formed with special services does not give a complete, up-to-date picture.
  2. Checking a competitor's semantic core helps you supplement the semantics of your own resource or analyze competitors' marketing policy.
  3. The larger a service's keyword base, the slower it processes search results and the less relevant its semantics: while the service is collecting results for the beginning of its database, the data at the end becomes obsolete.
  4. Services do not disclose how relevant their databases are or when they were last updated, so you cannot know how well the keywords collected from a competitor's site reflect its real semantic core.
  5. A significant advantage of this approach is access to a large list of competitor keywords, many of which you can use to expand your own semantic core.

TOP 3 paid services where you can find out the semantic core of competitors

Megaindex Premium Analytics


This service has a rich arsenal for analyzing the semantic cores of competing sites. Using the Site Visibility module, you can find and download a list of keywords and identify sites with similar themes that can be used to expand your own semantic core.

One drawback of Megaindex Premium Analytics is that key lists cannot be filtered within the program itself; you first have to export them to Excel.

Brief description of the service:

Keys.so


To analyze a semantic core with keys.so, paste in the URL of a competitor's site, select suitable sites by the number of matching key phrases, analyze them, and download the list of search queries they are promoted for. The service makes this quick and easy. A nice bonus is its modern interface.

Cons: a small database of search phrases and insufficiently frequent visibility updates.

Brief summary of the service:

Spywords


This service not only analyzes visibility but also provides statistics on Yandex.Direct advertisements. The spywords.ru interface is hard to grasp at first and overloaded with functionality, but overall it does its job well.

With its help you can analyze competing sites, identify intersections in key phrases, and upload a list of competitors' keys. The main disadvantage is the service's small database (about 23 million search phrases).

Brief summary of the service:

Thanks to special programs, competitors' sites and their semantic cores are no longer a mystery. You can easily analyze any Internet resources you are interested in. Here are some tips for using the information you obtain:

  1. Use keywords only from sites with similar topics (the more intersections with yours, the better).
  2. Do not analyze portals: their semantic cores are too large. You will not supplement your own core so much as endlessly expand it — and that, as you already know, can go on forever.
  3. When buying a site, look at its current visibility in search engines and compare it with the TOP sites to assess its development potential.
  4. Take keywords from competitor sites to supplement your site's semantic core, not to build it from scratch.
  5. The larger the service's database, the more complete your semantic core will be. But pay attention to how often its search-phrase databases are updated.

7 services that will help you create the semantic core of the site from scratch online

Google Keyword Planner


If you are wondering how to create a site's semantic core, pay attention to this service. It can be used not only in Runet but in any segment where AdWords operates.

Open Google AdWords. In the top panel, under "Tools", click Keyword Planner. In the menu that appears, select "Search for new keywords by phrase, site or category." Here you can configure the following settings:

  • the keyword or phrase to search for;
  • subject matter of the product or service;
  • region of search queries;
  • the language in which users enter search queries;
  • keyword search engine;
  • negative keywords (should not be present in keywords).

Then click the "Get Options" button, and Google AdWords will suggest possible synonyms for your keyword or phrase. The resulting data can be exported to Google Docs or CSV.

Benefits of using Google AdWords service:

  • the ability to select synonyms for the key phrase;
  • use of negative keywords to refine the search query;
  • access to a huge database of search queries Google systems.

The main disadvantage: with a free account, Google AdWords provides inaccurate data on query frequency. The error is so significant that you cannot rely on these figures when promoting a site. The way out is to buy access to a paid account or use another service.

Serpstat


This service comprehensively collects user search queries by keywords and site domains. Serpstat is constantly expanding its set of regional databases.

The service lets you identify your site's key competitors, determine the search phrases they are promoted for, and compile a list of them for use in your own semantic core.

Benefits of the Serpstat service:

  • a large selection of tools for analyzing the semantic core of competitor sites;
  • informative reporting forms reflecting the frequency indicators for the selected region;
  • option to upload search queries for specific pages of the site.

Cons of the Serpstat service:

  • although the service's databases are constantly updated, there is no guarantee that query-frequency data is realistic between updates;
  • not all low-frequency search phrases are displayed;
  • the set of languages and countries the service works with is limited.

Key Collector


This service will help you not only assemble a site's semantic core but also expand, clean, and cluster it. Key Collector can collect search queries, report their frequency in selected regions, and process semantics.

The program searches for key phrases from starting lists and can work with databases in various formats.

Key Collector can show keyword frequencies using data downloaded from Serpstat, Yandex Wordstat, and other services.

Semrush


Compiling a semantic core in Semrush is free, but you will not get more than 10 keywords with frequency data for the selected region. The service also shows which other search queries users in other regions enter for your keyword.

Advantages of the Semrush service:

  • works worldwide; you can collect query-frequency data for Western regions;
  • for each key phrase it shows the TOP sites in search results, which you can use as a guide when forming your own semantic core.

Cons of the Semrush service:

  • to get more than 10 keywords, you must purchase the paid version for $100;
  • the complete list of key phrases cannot be downloaded.

Keyword Tool


This service collects key phrases for a semantic core from foreign Internet resources in broad match. Keywordtool also selects search suggestions and phrases that contain the base key.

In the free version, one session returns no more than 1,000 search phrases, without frequency data.

Advantages of the Keywordtool service:

  • works with many languages and in many countries of the world;
  • shows search queries not only from search engines but also from popular online stores (Amazon, eBay, App Store) and the largest video hosting service, YouTube;
  • covers a wider range of search phrases than Google AdWords;
  • the generated list of queries is easy to copy into a table of any format.

Disadvantages of the Keywordtool service:

  • the free version does not provide query-frequency data;
  • keywords cannot be loaded as a list at once;
  • it searches only for phrases that can contain your keyword and does not account for possible synonyms.

Ubersuggest


In Ubersuggest, a semantic core can be built from the search queries of users in almost any country and language. The free version returns up to 750 search phrases per query.

The service's advantage is the ability to sort the keyword list alphabetically, taking prefixes into account. All search queries are grouped automatically, which simplifies working with them when forming the semantic core.

Its disadvantages are inaccurate query-frequency data in the free version and the inability to search by keyword synonyms.

Ahrefs Keywords Explorer


This service collects keywords for your semantic core in broad, phrase, and exact match for the selected region, with frequency data.

There is an option to select negative keywords and to view the Google TOP results for your main keywords.

The main drawbacks of Ahrefs Keywords Explorer are that it is paid-only and that its data accuracy depends on how current its databases are.

Frequently asked questions on compiling the semantic core of the site

  • How many keys are enough for a site's semantic core: 100, 1,000, 100,000?

This question cannot be answered unambiguously. It all depends on the specifics of the site, its structure, and the actions of competitors. The optimal number of keys is determined individually.

  • Is it worth using ready-made key-phrase databases to form a site's semantic core?

On the Internet you can find many resources with thematic keyword databases, such as the Pastukhov databases, UP Base, Mutagen, KeyBooster, etc. It is not that you should avoid such sources: they contain significant archives of search queries that will be useful for website promotion.

But remember indicators such as competitiveness and relevance of keys, and keep in mind that your competitors can use the same ready-made databases. Another drawback of such sources is the likelihood that key phrases meaningful to you will be missing.

  • How to use the semantic core of the site?

The key phrases selected for the semantic core are used to compile a relevance map: the title and description tags and the h1-h6 headings needed to promote the site. The keys also serve as the basis for the SEO texts on the site's pages.

  • Is it worth including zero-frequency queries in the semantic core?

This is useful in the following cases:

  1. When creating pages for such keys takes minimal time and resources — for example, automatically generated SEO filter pages in online stores.
  2. When the zero frequency is not absolute: at the moment of data collection the frequency is zero, but the search engine's history shows queries for the word or phrase.
  3. When the frequency is zero only in the selected region, while in other regions it is higher.

5 typical mistakes when collecting a semantic core for a site

  1. Avoiding keyword phrases with high competition. Including such a phrase does not oblige you to push the site into the TOP for it at all costs; you can use it to supplement the semantic core, as a content idea.
  2. Refusing to use low-frequency keys. Such search queries can likewise be used as content ideas.
  3. Creating a separate web page for each search query. You have surely seen sites where similar queries (for example, "buy a wedding cake" and "order a wedding cake") each have their own page, yet users entering these queries want the same thing. There is no point in making multiple pages.
  4. Building the semantic core exclusively with services. Automatic key collection makes life easier, of course, but its value is minimal unless you analyze the result: only you understand the specifics of your industry, the level of competition, and your company's activities.
  5. Over-focusing on key collection. If you have a small site, start by collecting semantics with Yandex or Google services. Do not immediately analyze competitors' semantic cores or collect keys from several search engines; those methods will come in handy once you realize it is time to expand the core.

Or is it better to order a ready-made semantic core?

You can try to create a semantic core on your own using the free services we have discussed; Google's Keyword Planner, for example, can give a good result. But if you want a large, high-quality semantic core, plan this item into your budget.

On average, developing a site's semantic core costs from 30 to 70 thousand rubles. As you remember, the final price depends on the subject of the business and the optimal number of search queries.

How not to buy a pig in a poke

A high-quality semantic core will not be cheap. To make sure the contractor understands this work and will do it well, ask him to collect trial semantics for one query; this is usually done free of charge.

To check the result, run the list of keys through Mutagen and see how many of them are high-frequency and low-competition. Contractors often deliver lists with a huge number of key phrases, many of which are completely unsuitable for further use.



Many web publications talk about the importance of the semantic core.

There are similar texts on our Chto Delat website. They often cover only the general theoretical side of the issue, while the practice remains unclear.

All experienced webmasters say that you need to form a basis for promotion, but only a few explain how to use it in practice. To lift the veil of secrecy, we decided to highlight the practical side of using the semantic core.

Why do we need a semantic core

This is, first of all, the basis and plan for further filling and promoting the site. The semantic basis, distributed across the structure of the web resource, serves as a set of signposts for systematic, purposeful site development.

If you have such a basis, you do not have to think about the topic of each next article, you just need to follow the list items. With the core, site promotion moves much faster. And the promotion acquires clarity and transparency.

How to use the semantic core in practice

To begin with, it is worth understanding how the semantic basis is generally compiled. In fact, this is a list of key phrases for your future project, supplemented by the frequency of each request.

It will not be difficult to collect such information using the Yandex Wordstat service:

http://wordstat.yandex.ru/

or any other special service or program. In this case, the procedure will be as follows ...

How to make a semantic core in practice

1. Collect in a single file (Excel, Notepad, Word) all queries on your key topic taken from statistics. Also include phrases "from the head", that is, logically valid phrases, morphological variants (how you yourself would search for your topic), and even variants with typos!

2. Sort the list of semantic queries by frequency: from the queries with maximum frequency down to those with minimal popularity.

3. Remove from the semantic basis all junk queries that do not match the subject or direction of your site. For example, if you teach people about washing machines for free but do not sell them, do not use words like:

  • "buy"
  • "wholesale"
  • "delivery"
  • "order"
  • "cheap"
  • "video" (if there are no videos on the site) ...

Meaning: Do not mislead users! Otherwise, your site will receive a huge number of bounces, which will affect its rankings. And this is important!

4. When the main list has been cleared of unnecessary phrases and queries and contains a sufficient number of items, you can put the semantic core to practical use.
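Steps 2 and 3 above are easy to automate. The sketch below sorts collected queries by frequency and drops phrases containing junk stop words; the query list, frequencies, and stop-word set are invented for illustration.

```python
# Illustrative stop words for a non-commercial site (see step 3).
STOP_WORDS = {"buy", "wholesale", "delivery", "order", "cheap"}

def clean_and_sort(queries):
    """queries: list of (phrase, monthly_frequency) tuples."""
    kept = [
        (phrase, freq)
        for phrase, freq in queries
        if not STOP_WORDS & set(phrase.lower().split())
    ]
    # Highest frequency first, as described in step 2.
    return sorted(kept, key=lambda item: item[1], reverse=True)

queries = [
    ("washing machine repair", 900),
    ("buy washing machine", 5000),       # junk for a non-commercial site
    ("how to choose a washing machine", 1400),
    ("washing machine wholesale", 300),  # junk
]
print(clean_and_sort(queries))
# [('how to choose a washing machine', 1400), ('washing machine repair', 900)]
```

For a real core, the frequencies would come from Wordstat or a similar service rather than being typed in by hand.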

IMPORTANT: a semantic list can never be considered completely ready and complete. In any subject, you will have to update and supplement the core with new phrases and queries, periodically tracking innovations and changes.

IMPORTANT: the number of articles on the future site will depend on the number of items in the list. Consequently, this will also affect the volume of the necessary content, the working time of the author of the articles, and the duration of filling the resource.

Overlaying the semantic core on the site structure

In order to get a sense out of the entire list received, you need to distribute requests (depending on frequency) according to the structure of the site. It is difficult to name specific figures here, since the scale and frequency difference can be quite significant for different projects.

If, for example, you take a query with a million impressions per month as a basis, even a phrase with 10,000 requests will seem average.

On the other hand, when your main query has 10,000 impressions, the average frequency will be about 5,000 requests per month. That is, a certain relativity is taken into account:

"high - mid - low" (HF - MF - LF)

But in any case (even visually) you need to divide the whole core into 3 categories:

  1. high-frequency queries (HF - short phrases with maximum frequency);
  2. mid-frequency queries (MF - all average queries that sit in the middle of your list);
  3. low-frequency queries (LF - rarely requested phrases with low frequency).

At the next stage, 1 or more (maximum 3) queries are assigned to the home page. These phrases should have the highest possible frequency. High-frequency queries go on the main page!

Further, following the general logic of the semantic core, it is worth highlighting several main key phrases from which sections (categories) of the site will be created. Here you could also use high-frequency queries with a lower frequency than the main one, or better, mid-frequency queries.

The remaining low-frequency phrases are sorted into categories (under the created sections) and become topics for future site publications. This is easier to understand with an example.

EXAMPLE

A good example of using the semantic core in practice:

1. Main page (HF) - high-frequency request - "website promotion".

2. Section pages (MF) - "website promotion to order", "self-promotion", "website promotion with articles", "website promotion with links". Or simply (if adapted for the menu):

Section No. 1 - "to order"
Section No. 2 - "on your own"
Section No. 3 - "article promotion"
Section No. 4 - "link promotion"

All this is very similar to the data structure in your computer: logical drive (main) - folders (partitions) - files (articles).

3. Pages of articles and publications (LF) - "quick promotion of the site for free", "promotion to order cheap", "how to promote the site with articles", "promotion of the project on the Internet to order", "inexpensive promotion of the site with links", etc.

In this list, you will have the largest number of various phrases and phrases, according to which you will have to create further site publications.

How to use a ready-made semantic core in practice

Using a query list is an internal content optimization. The secret is to optimize (adjust) each page of the web resource for the corresponding core item. That is, in fact, you take a key phrase and write the most relevant article and page for it. To assess the relevance, a special service will help you, available at the link:

In order to have at least some guidelines in your SEO work, it is better to first check the relevance of sites from the TOP results for specific queries.

For example, if you write text for the low-frequency phrase “inexpensive website promotion with links”, then first just enter it in the search and evaluate the TOP-5 sites in the search results using the relevance assessment service.

If the service showed that sites from the TOP-5 for the query “inexpensive website promotion with links” have relevance from 18% to 30%, then you need to focus on the same percentages. Even better is to create unique text with keywords and about 35-50% relevance. By slightly beating competitors at this stage, you will lay a good foundation for further promotion.
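The relevance service itself is not named here, so the following is only a crude stand-in for the idea: the share of words in a text that belong to the key phrase. Real services use far more elaborate formulas; treat this as a rough proxy for comparing your draft against competitors' pages.

```python
def keyword_share(text, key_phrase):
    """Percentage of words in `text` that occur in `key_phrase` (naive proxy)."""
    words = text.lower().split()
    key_words = set(key_phrase.lower().split())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in key_words)
    return round(100 * hits / len(words), 1)

text = ("inexpensive website promotion with links works when links "
        "come from relevant donor pages")
print(keyword_share(text, "inexpensive website promotion with links"))
# 46.2
```

A real check would also account for word forms, synonyms, and placement in headings, which this toy function ignores.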

IMPORTANT: the use of the semantic core in practice implies that one phrase corresponds to one unique resource page. The maximum here is 2 requests per article.

The more fully the semantic core is revealed, the more informative your project will be. But if you are not ready for long-term work and thousands of new articles, do not take on wide thematic niches. Even a narrow specialized niche, covered 100%, will bring more traffic than an unfinished large site.

For example, you could take as the basis of the site not the high-frequency key “site promotion” (where there is tremendous competition), but a phrase with a lower frequency and narrower specialization - “article site promotion” or “link promotion”, but expand this topic to the maximum in all articles of the virtual platform! The effect will be higher.

Useful information for the future

Further use of your semantic core in practice comes down to the following:

  • correct and update the list;
  • write optimized texts with high relevance and uniqueness;
  • publish articles on the site (1 request - 1 article);
  • increase the usefulness of the material (edit ready-made texts);
  • improve the quality of articles and the site as a whole, keep an eye on competitors;
  • mark in the kernel list those requests that have already been used;
  • supplement optimization with other internal and external factors (links, usability, design, usefulness, videos, online help tools).

Note: All of the above is a very simplified version of activities. In fact, on the basis of the core, sublevels, deep nested structures, and branches to forums, blogs, and chats can be created. But the principle will always be the same.

GIFT: a useful tool for collecting the core in the Mozilla Firefox browser -

Hi all! Today's article is devoted to how to assemble the semantic core (SC) correctly. If you are engaged in SEO promotion in Google and Yandex and want to increase organic traffic, website visits and sales, this material is for you.

To get to the bottom of the truth, we will study the topic from "A to Z":

In conclusion, we will consider the general rules for compiling a semantic core. So let's get started!

Semantic core: what is it and what are the queries

The semantic core of a site (also called the "semantic kernel") is a set of words and phrases that exactly matches the structure and subject matter of the resource. Simply put, these are the queries by which users can find the site on the Internet.

It is the correct semantic core that gives search engines and the audience a complete picture of the information presented on the resource.

For example, if a company sells ready-made postcards, then the semantic core should include such queries: “buy a postcard”, “postcard price”, “postcard to order” and the like. But not: “how to make a postcard”, “do-it-yourself postcard”, “homemade postcards”.

Interesting to know: LSI copywriting. Will the methodology replace SEO?

Classification of requests by frequency:

  • High-frequency queries (HF) - the ones most often "hammered" into the search bar (for example, "buy a postcard").
  • Mid-frequency (MF) - less popular than HF keys, but still of interest to a wide audience ("postcard buy price").
  • Low-frequency (LF) - phrases that are requested very rarely ("buy a picture postcard").

It is important to note that there are no clear boundaries separating HF from MF and LF, since they vary by subject matter. For example, for the query "origami", the HF indicator is 600 thousand impressions per month, while for "cosmetics" it is 3.5 million.

If we turn to the anatomy of a key, an HF query consists only of the body, while MF and LF queries are supplemented by a specifier and a "tail".

When forming the semantic core, you should use all frequency types, but in different proportions: a minimum of HF, a maximum of LF, and an average amount of MF.

To make it clearer, let's draw an analogy with a tree. The trunk is the most important query, on which everything rests. The thick branches closer to the trunk are MF keys: also popular, but not as popular as HF. The thin branches are LF words, which are also used to search for the desired product or service, but rarely.

Separation of keys by competition:

  • highly competitive (HC);
  • medium competitive (MC);
  • low competitive (LC).

This criterion shows how many web resources use a given query for promotion. It is simple: the higher the competitiveness of a key, the harder it is to break into the top 10 with it and stay there. Low-competitive keys are also not worth much attention, as they are not very popular on the network. The ideal option is to promote with medium-competitive queries, with which it is realistic to take first place in a stable business niche.

Classification of requests according to user needs:

  • Transactional - keys associated with an action (buy, sell, upload, download).
  • Informational - for obtaining some kind of information (what, how, why, how much).
  • Navigational - help to find information on a specific resource ("buy a phone socket").

The remaining keywords, when it is difficult to understand the user's intention, are assigned to the "Other" group (for example, just the word "postcard" raises a lot of questions: "Buy? Make? Draw?").
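One simple way to sort queries into these intent groups is by marker words, as sketched below. The marker lists are illustrative; real classification would also match brand names for the navigational group and look at SERP composition, which this sketch omits.

```python
# Illustrative intent markers; extend these lists for your own niche.
TRANSACTIONAL = {"buy", "sell", "order", "download", "price"}
INFORMATIONAL = {"how", "what", "why", "which"}

def intent(query):
    """Classify a query as transactional, informational, or other."""
    words = set(query.lower().split())
    if words & TRANSACTIONAL:
        return "transactional"
    if words & INFORMATIONAL:
        return "informational"
    return "other"

print(intent("buy a postcard"))          # transactional
print(intent("how to make a postcard"))  # informational
print(intent("postcard"))                # other
```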

Why does a website need a semantic core?

Collecting a semantic core is painstaking work that requires a lot of time, effort and patience. Composing a correct semantic core that actually works cannot be done in no time.

Here a quite reasonable question arises: is it worth spending the effort on selecting a semantic core for the site? If you want your Internet project to be popular, constantly grow its client base and, accordingly, the company's profit, the answer is an unequivocal "YES".

Because collecting the semantic core helps:

  • Increase the visibility of the web resource. Search engines such as Yandex and Google will find your site by the keys you choose and offer it to users interested in those queries. As a result, the influx of potential customers grows, and the chances of selling a product or service increase.
  • Avoid competitors' mistakes. When creating a semantic core, you analyze the semantic cores of competitors occupying the first positions in the search results. By studying the leading sites, you can determine which queries keep them in the top, which topics they write about, and which ideas fail. Competitor analysis may also spark ideas for developing your own business.
  • Shape the site structure. The semantic core is a good "assistant" for building the site structure. By collecting the full core, you can see all the queries users enter when searching for your product or service. This helps define the main sections of the resource; most likely, you will need pages you had not even thought about initially. Keep in mind that the core only suggests users' interests. Ideally, the site structure matches the business area and contains content that meets the audience's needs.
  • Avoid overspam. By analyzing the semantic cores of competing sites from the top, you can set the optimal keyword frequency: there is no universal query-density figure for all pages of a resource; it depends on the topic and type of page, as well as the language and the key itself.

How else can you use the semantic core? To draw up the right content plan. Properly collected keywords will suggest topics for texts and posts that are of interest to your target audience.

Conclusion. It is practically IMPOSSIBLE to create an interesting, popular and profitable Internet project without a semantic core.

Topic material:

Preparing to collect the semantic core for the site

Before creating the semantic core of the site, you need to do the following:

I. Study the activities of the company ("brainstorming")

Here it is important to write out ALL the services and goods the organization offers. For example, to assemble a semantic core for an online furniture store, you can use queries such as: sofa, armchair, bed, hallway, cabinet + restoration, repair. The main thing is not to miss anything and not to add anything extra. Only up-to-date information matters, i.e. if the company does not sell pouffes or repair furniture, those queries are not needed.

In addition to brainstorming, you can use Google Analytics and Yandex.Metrika (Fig. 1) or personal accounts in Google Search Console and Yandex.Webmaster (Fig. 2). They will show which queries are most popular with your target audience. Such help is available only for sites that are already running.

Help texts:

  • Advego - works on the same principle as Istio.com.

  • Simple SEO Tools- a free service for SEO-analysis of the site, including the semantic core.

  • Lenartools. It works simply: load the pages from where you need to “pull out” the keys (max 200), click “Let's go” - and you get a list of words that are most often used on resources.

II. To analyze the semantic core of a competitor site:

  • SEMrush - add the resource address, select a country, click "Start Now" and get the analysis. The service is paid, but 10 free checks are provided upon registration. It is also suitable for collecting keys for your own business project.

  • Searchmetrics - a very convenient tool, but paid and English-only, so it is not accessible to everyone.

  • SpyWords - a service for analyzing a competitor's activity: budget, search traffic, ads, queries. A reduced set of functions is available for free; for a fee you get a detailed picture of the promotion of the company you are interested in.

  • Serpstat - a multifunctional platform that reports on keywords, rankings, competitors in Google and Yandex search results, backlinks and more. Suitable both for collecting a semantic core and for analyzing your own resource. The only downside is that the full range of services is available only after paying for a tariff plan.

Another effective way to expand the semantic core is to use synonyms. Users may search for the same product or service in different ways, so it is important to include all alternative keys in the core. Search suggestions in Google and Yandex will help you find synonyms.

Advice. If the site is informational, first select the queries that are central for the resource and for which promotion is planned, and only then the seasonal ones. For example, for a web project about fashion trends in clothing, the key queries will be: fashion, women's, men's, children's. And the "seasonal" ones: autumn, winter, spring, etc.

How to build a semantic core: detailed instructions

Having decided on the list of queries for your site, you can start collecting the semantic core.

It can be done:

I. FREE, using:

Wordstat Yandex

Yandex Wordstat is a very popular online service with which you can:

  • collect the semantic core of the site with statistics for the month;
  • get words similar to the query;
  • filter keywords entered from mobile devices;
  • find out statistics by city and region;
  • determine the seasonal fluctuations of the keys.

Big disadvantage: you have to unload the keys manually. But if you install the Yandex Wordstat Assistant extension, work with the semantic core speeds up many times over (relevant for the Opera browser).

It is easy to use: click "+" next to the desired key or click "add all". Queries are automatically sent to the extension's list. After collecting the core, transfer it to a spreadsheet editor and process it. Important advantages of the extension: duplicate checking, sorting (by alphabet, frequency, order of addition), and the ability to add keys manually.

Step-by-step instructions on how to use the service are given in the article: Yandex. Wordstat (Wordstat): how to collect keywords?

Google Ads

A keyword planner from Google that lets you collect a semantic core online for free. The service finds keywords based on the queries of Google search users. A Google account is required.

The service offers:

  • find new keywords;
  • see the number of requests and forecasts.

To collect the semantic core, you need to enter a query, choosing the location and language. The program shows the average number of requests per month and the level of competition. There is also information about ad impressions and the bid for displaying an ad at the top of the page.

If necessary, you can set a filter by competition, average position and other criteria.

It is also possible to request a report (the program shows step-by-step instructions for doing this).

To study the traffic forecast, just enter a query or a set of keys in the "View the number of requests and forecasts" box. The information will help determine the effectiveness of the semantic core for a given budget and bid.

The service's drawbacks include the following: there is no exact frequency (only a monthly average); it does not show Yandex keys and hides some Google data. But it determines competitiveness and allows you to export keywords in Excel format.

SlovoEB

This is a free version of the Key Collector program, which has a lot of useful features:

  • quickly collects the semantic core from the right and left columns of WordStat;
  • performs a batch collection of search hints;
  • defines all types of frequency;
  • collects data on seasonality;
  • allows you to perform batch collection of words and frequency from Rambler.Adstat;
  • calculates KEI (Keyword Effectiveness Index).

To use the service, it is enough to enter your Yandex.Direct account details (login and password).

If you want to know more, read the article: Slovoeb (Slovoeb). Basics and instructions for use

Bukvarix

An easy-to-use and free semantic core collection program with a database of more than 2 billion queries.

It is distinguished by fast operation, as well as useful features:

  • supports a large list of exception words (up to 10 thousand);
  • allows you to create and apply lists of words directly when forming a sample;
  • offers to make lists of words by multiplying several lists (Combinator);
  • removes duplicate keywords;
  • shows the frequency (but only "worldwide", without selecting a region);
  • analyzes domains (one or several, comparing the semantic cores of resources);
  • exports in .csv format.

The program's only significant drawback is its large installation size (≈ 28 GB archived, ≈ 100 GB unpacked). But there is an alternative: collecting the core online.

II. PAID, using programs:

Base of Maxim Pastukhov

A Russian service with a database of more than 1.6 billion keywords with data from Yandex WordStat and Direct, plus an English database of more than 600 million words. It works online and helps not only in creating a semantic core but also in launching an advertising campaign in Yandex.Direct. Its main disadvantage is its high cost.

Key Collector

Perhaps the most popular and convenient tool for collecting the semantic core.

Key Collector:

  • collects keywords from the right and left columns of Yandex WordStat;
  • weeds out unnecessary queries using the "Stop words" option;
  • searches for duplicates and identifies seasonal keywords;
  • filters keys by frequency;
  • exports to Excel spreadsheet format;
  • finds pages relevant to a query;
  • collects statistics from Google Analytics, AdWords, etc.

You can evaluate how Key Collector collects the semantic core for free in the demo.

Rush Analytics

A service that can be used to collect and cluster the semantic core.

In addition, Rush Analytics:

  • searches for suggestions in YouTube, Yandex and Google;
  • offers a convenient stop-word filter;
  • checks indexing;
  • determines the frequency;
  • checks the position of the site for desktops and mobile;
  • generates technical specifications for texts, etc.

Great tool, but paid: no demo and limited free checks.

Mutagen

The program collects key queries from the first 30 sites in the Yandex search engine. It shows the frequency per month, the competitiveness of each search query and recommends using words with an indicator of up to 5 (because high-quality content is enough to effectively promote such keywords).

Useful article: 8 types of texts for the site - we write correctly

A paid program for collecting a semantic core, but there is a free limit - 10 checks per day (available after the first replenishment of the budget, at least for 1 rub.). Open to registered users only.

Keyword Tool

A reliable service for creating a semantic core that:

  • in the free version - collects more than 750 keys for each query using suggestions from Google, YouTube, Bing, Amazon, eBay, App Store and Instagram;
  • in the paid version - also shows query frequency, competition, AdWords cost and dynamics.

The program does not require registration.

In addition to the presented tools, there are many other services for collecting the semantic core of the site with detailed video reviews and examples. I settled on these, because I consider them the most effective, simple and convenient.

Conclusion. If possible, it is advisable to purchase licenses for the paid programs, since their functionality is much wider than that of their free counterparts. But for simple collection of a semantic core, the "open" services are quite suitable.

Semantic core clustering

The finished semantic core, as a rule, includes many keywords (for example, for the request "upholstered furniture", services give out several thousand words). What to do next with such a huge number of keywords?

Collected keys need:

I. Clean it of "garbage", duplicates and "dummies"

Requests with zero frequency, as well as errors, are simply deleted. To eliminate keys with unnecessary "tails", I advise you to use the "Sort and Filter" function in Excel. What can be rubbish? For example, for a commercial site, words such as “download”, “free”, etc. will be superfluous. Duplicates can also be automatically removed in Excel using the “remove duplicates” option (see examples below).

Remove keys with zero frequency:

Delete unnecessary "tails":

Getting rid of duplicates:
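The same cleanup steps (zero-frequency keys, duplicates) can also be scripted outside Excel. A minimal sketch with invented sample data:

```python
def clean(queries):
    """queries: list of (phrase, frequency); keeps the first copy of duplicates
    and drops zero-frequency "dummies"."""
    seen, result = set(), []
    for phrase, freq in queries:
        norm = phrase.strip().lower()
        if freq <= 0 or norm in seen:
            continue
        seen.add(norm)
        result.append((phrase, freq))
    return result

raw = [("sofa buy", 700),
       ("sofa buy", 700),                 # duplicate
       ("sofa gratis chartreuse", 0),     # zero-frequency dummy
       ("corner sofa", 250)]
print(clean(raw))
# [('sofa buy', 700), ('corner sofa', 250)]
```

Junk "tails" (like "free" on a commercial site) would still need a stop-word filter on top of this.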

II. Remove highly competitive queries

If you do not want the "path" to the top to stretch out for years, exclude highly competitive (HC) keys. With such keywords it is not enough just to reach the first positions in the search results; the harder and more important task is to stay there.

An example of determining highly competitive keys via the Google Keyword Planner (the filter lets you keep only low- and medium-competitive keys):

III. Group the semantic core into clusters

You can do this in two ways:

1. PAYING:

  • KeyAssort - a semantic core clusterer that helps create a site structure and find niche leaders. It works with data from the Yandex and Google search engines and clusters 10,000 queries in just a couple of minutes. You can evaluate its benefits by downloading the demo version.

  • SEMparser performs automatic grouping of keys, creation of the site structure, identification of leaders, generation of briefs for copywriters, parsing of Yandex highlights, and determination of geo-dependence, "commercial" queries and page relevance. In addition, the service checks how well a text matches the top results by SEO parameters. How it works: you collect the semantic core and save it in .xls or .xlsx format, create a new project on the service, select a region, upload the query file - and in a few seconds you receive the words sorted into semantic groups.

In addition to these services, I can advise more Rush Analytics, which we have already met above, and Just Magic.


2. FREE:

  • Manually - using Excel and the "Sort and Filter" function. To do this: set a filter, enter a query for the group (for example, "buy" or "price"), and highlight the resulting list of keys with color. Then set up "Custom sorting" (under "Sort by color") with "sort within the specified range". The final touch is adding names to the groups.


  • SEOQUICK is a free online program for automatic clustering of the semantic core. To "scatter" the keys into groups, just upload a file with requests or add them manually and wait a minute. The tool is fast, detecting the frequency and type of the key. Allows you to remove unnecessary groups and export the document in Excel format.

  • Keyword Assistant. The service works online like an Excel table, i.e. you will have to distribute the keywords manually, but it takes much less time than working in Excel.
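The manual Excel grouping described above can also be approximated in a few lines: each key goes into the group of the first marker word it contains. The group names and marker lists below are invented for illustration and would come from your own niche.

```python
# Hypothetical groups echoing the earlier "website promotion" example.
GROUPS = {
    "to order": ["order", "custom"],
    "on your own": ["yourself", "diy"],
    "articles": ["article", "articles"],
    "links": ["link", "links"],
}

def group_keys(keys):
    """Assign each key phrase to the first group whose marker it contains."""
    grouped = {name: [] for name in GROUPS}
    grouped["other"] = []
    for key in keys:
        words = key.lower().split()
        for name, markers in GROUPS.items():
            if any(m in words for m in markers):
                grouped[name].append(key)
                break
        else:
            grouped["other"].append(key)
    return grouped

keys = ["website promotion to order", "promote a website yourself",
        "website promotion with links", "website promotion"]
print(group_keys(keys))
```

Real clusterers compare search-result overlap between queries rather than raw words, so treat this only as a sketch of the idea.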

How to cluster the semantic core and which methods to use is up to you. I believe that grouping exactly the way you need can only be done manually. It takes longer, but it is effective.

After collecting and distributing the semantic core into sections, you can start writing texts for the pages.

Read a related article with examples: How to enter keywords in the text correctly?

General rules for creating a semantic core

Summing up, it is important to add tips that will help to assemble the right semantic core:

The semantic core should be designed so that it meets the needs of as many potential customers as possible.

The semantics must exactly match the theme of the web project, i.e. focus only on targeted queries.

It is important that the finished semantic core includes only a few high-frequency keys, the rest is filled with medium- and low-frequency keys.

You should regularly expand the semantic core to increase natural traffic.

And most importantly: everything on the site (from the keys to the structure) must be made "for people"!

Conclusion. A well-assembled semantic core gives a real chance to quickly promote and keep the site in top positions in the search results.

If you doubt that you can assemble a correct semantic core yourself, it is better to order one for the site from professionals. This will save time and energy and bring more benefit.

It will also be interesting to know: How to place and speed up the indexing of an article? 5 secrets of success

That's all. I hope the material is useful to you in your work. I would be grateful if you share your experience and leave comments. Thank you for your attention! Until new online meetings!

The semantic core of a site is the full set of keywords corresponding to the subject of the web resource, by which users can find it in a search engine.


For example, the fairy-tale character Baba Yaga will have the following semantic core: Baba Yaga, Baba Yaga fairy tales, Baba Yaga Russian fairy tales, a woman with a fairy tale mortar, a woman with a mortar and a broom, an evil sorceress woman, a woman's hut, chicken legs, etc.

Why does a site need a semantic core

Before starting promotion work, you need to find all the keys by which targeted visitors may search for the site. Based on the semantics, the structure is drawn up, keys are distributed, meta tags, document titles and image descriptions are written, and an anchor list is developed for link-building work.

When compiling semantics, you need to solve the main problem: determine what information should be published in order to attract a potential client.

Compiling a list of keys solves another important task: for each search phrase, you define a relevant page that can fully answer the user's question.

This problem is solved in two ways:

  • You create the site structure based on the semantic core.
  • You distribute the selected terms according to the finished structure of the resource.

Types of key queries (KQ) by the number of impressions

  • LF - low-frequency: up to 100 impressions per month.
  • MF - mid-frequency: from 101 to 1,000 impressions.
  • HF - high-frequency: over 1,000 impressions.

According to statistics, 60-80% of all phrases and words are LF. Working with them is cheaper and easier. Therefore, you should build the most voluminous core of phrases possible and constantly top it up with new LF queries. HF and MF should not be ignored either, but the main focus should be on expanding the LF list.

Types of key queries by search intent

  • Informational - used when searching for information: "how to fry potatoes" or "how many stars are in the sky".
  • Transactional - used to perform an action: "order a downy scarf", "download Vysotsky's songs".
  • Navigational - used for searches tied to a specific company or site: "bread maker MVideo" or "Svyaznoy smartphones".
  • Others - an extended list for which the ultimate goal of the search cannot be determined. For example, with the query "Napoleon cake" the person may be looking for a recipe, or may want to buy the cake.

How to compose semantics

You need to identify the main terms of your business and user needs. For example, laundry customers are interested in washing and cleaning.

Then define the specifiers and "tails" (making queries longer than 2 words) that users add to the main terms. This will increase your target-audience reach and lower the frequency of the terms (washing blankets, washing jackets, etc.).
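The "term + tail" expansion described above is essentially a Cartesian product, the same idea behind the list-multiplying Combinator feature mentioned earlier. The word lists below are invented examples for a laundry business.

```python
from itertools import product

terms = ["washing", "cleaning"]
tails = ["blankets", "jackets", "curtains"]

# Multiply the lists: every term combined with every tail.
phrases = [f"{t} {tail}" for t, tail in product(terms, tails)]
print(phrases)
# ['washing blankets', 'washing jackets', 'washing curtains',
#  'cleaning blankets', 'cleaning jackets', 'cleaning curtains']
```

The generated phrases would then be checked against Wordstat, since many mechanical combinations have zero real demand.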

Collecting the semantic core manually

Yandex Wordstat

  • Select the region of the web resource.
  • Enter a key phrase. The service will show the number of queries containing this keyword over the last month and a list of "related" terms that interested visitors. Keep in mind that if you enter, for example, buy windows without any operators, you will get broad-match results: queries like "buy windows in Voronezh" and "buy a plastic window" are also counted in this figure. Enclosing the phrase in quotes ("buy windows") narrows the count to that phrase alone, in any word form. To refine the figure further, use the "!" operator before each word: "!buy !windows". This fixes the word forms as well and gives the most accurate count for the query.
  • Collect the words from the left column and analyze each of them. Write down the initial semantics. Pay attention to the right-hand column, which contains queries that users entered before or after searching for the words in the left-hand column. You will find many more phrases you need there.
  • Click on the Request History tab. On the graph, you can analyze seasonality and the popularity of phrases in each month. Good results also come from working with Yandex search suggestions: enter each key query into the search field and expand the semantics based on the suggestions.
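The operator logic described above can be captured in a small helper. This is only an illustration of how the quote and "!" operators combine into the three match types; the function is hypothetical and makes no actual call to Wordstat:

```python
def wordstat_variants(phrase: str) -> dict:
    """Build the three query forms for Yandex Wordstat:
    broad match, phrase match (quotes), and exact match (quotes + '!')."""
    words = phrase.split()
    return {
        "broad": phrase,                       # extra words and any forms counted
        "phrase": f'"{phrase}"',               # only these words, any word form
        "exact": '"' + " ".join("!" + w for w in words) + '"',  # fixed word forms
    }

print(wordstat_variants("buy windows"))
```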

Google Keyword Planner

  • Enter the main high-frequency query.
  • Select Get Options.
  • Select the most relevant options.
  • Repeat this action with each selected phrase.

Studying competitor sites

Use this method as an additional check that particular key queries were chosen correctly. Tools such as BuzzSumo, Searchmetrics, SEMrush, and Advse will help you with this.

Programs for compiling a semantic core

Consider some of the most popular services.

  • Key Collector. If you are compiling a very large semantic core, you cannot do without this tool. The program selects semantics from Yandex Wordstat, collects the search suggestions of this search engine, filters out key queries with stop words, very low frequencies, and duplicates, determines the seasonality of phrases, studies counter and social network statistics, and selects a relevant page for each query.
  • SlovoEb. A free tool from the makers of Key Collector. It selects keywords, groups them, and analyzes them.
  • Allsubmitter. Helps to select key queries and shows competing sites.
  • KeySO. Analyzes the visibility of a web resource and its competitors and helps in compiling the semantic core.

What to consider when choosing keywords

  • Frequency indicators.
  • Most of the key queries should be LF, the rest MF and HF.
  • Search-relevant pages.
  • Competitors in the TOP.
  • Phrase competition.
  • Projected number of clicks.
  • Seasonality and geodependence.
  • Queries with typos and errors.
  • Associative keys.

Correct semantic core

First of all, let's define the terms: "keywords", "keys", and "key or search queries" are all words or phrases that potential visitors of your site use to look for the information they need.

Make the following lists: categories of goods or services (hereinafter referred to as TU), TU names, their brands, commercial tails ("buy", "order", etc.), synonyms, transliteration into Latin (or into Russian, respectively), professional jargon (slang forms of "keyboard", etc.), technical characteristics, words with possible typos and errors (a misspelled "Orenburg", etc.), and references to the area (city, streets, etc.).

When working with the lists, be guided by the key queries from the promotion agreement, the structure of the web resource, information and price lists, competitor sites, and previous SEO experience.

Proceed to the selection of semantics by mixing the phrases selected in the previous step, using the manual method or using services.

Generate a list of stop words and remove unsuitable key queries.
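Removing queries that contain stop words amounts to a simple set-intersection check per phrase. A sketch under the assumption of a hand-made stop list (the stop words and sample queries here are invented):

```python
# Hypothetical stop list; in practice you build it from your own niche
STOP_WORDS = {"free", "download", "torrent"}

def remove_stopword_queries(queries, stop_words=STOP_WORDS):
    """Drop any query that contains at least one stop word."""
    return [q for q in queries if not (set(q.lower().split()) & stop_words)]

cleaned = remove_stopword_queries(["buy plastic windows", "download windows free"])
print(cleaned)  # only the query without stop words survives
```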

Group key queries by relevant pages. For each key, select the most relevant page or create a new document. It is advisable to do this work manually. For large projects, there are paid services such as Rush Analytics.

Go from largest to smallest. First distribute the HF queries across the pages. Then do the same with the MF queries. LF queries can be added to pages that already have HF and MF queries distributed over them, or they can be given separate pages of their own.
After analyzing the first results of the work, we can see that:

  • the site being promoted is not visible for all declared keywords;
  • for some key queries, the search engine returns pages other than those you considered relevant;
  • the wrong structure of the web resource interferes;
  • several web pages are relevant for some key queries;
  • missing relevant pages.

When grouping key queries, work with all possible sections of the web resource, fill each page with useful information, and do not create duplicate text.

Common mistakes when working with key queries

  • only obvious semantics were chosen, without word forms, synonyms, etc.;
  • the optimizer distributed too many key queries to one page;
  • identical key queries are distributed across different pages.

As a result, ranking worsens and the site can be penalized for spam; and if the web resource has the wrong structure, it will be very difficult to promote.

It doesn't matter exactly how you collect the semantics. With the right approach, you will get the correct semantic core needed for successful site promotion.

The semantic core is a rather hackneyed topic, isn't it? Today we will fix it together by collecting the semantics in this tutorial!

Don't believe me? See for yourself: just type the phrase "the semantic core of the site" into Yandex or Google. I think that today I will correct this annoying mistake.

But really, what does perfect semantics mean to you? You might think this is a stupid question, but in fact it is not stupid at all. Most webmasters and site owners firmly believe that they can compose semantic cores, that any student can handle the job, and they even try to teach others... But in reality, everything is much more complicated. I was once asked what should come first: the site itself and its content, or the semantic core. The question came from a person who does not consider himself a newcomer to SEO, and it made me appreciate the complexity and ambiguity of this problem.

The semantic core, the foundation of foundations, is the very first step before launching any advertising campaign on the Internet. At the same time, compiling a site's semantics is the most tedious part of the process and will require a lot of time, but it will more than pay off in any case.

Well then... let's create it together!

A small preface

To create the semantic core of the site, we need a single program: Key Collector. Using the Collector as an example, I will walk through collecting a small semantic core. Apart from this paid program, there are also free analogues like SlovoEb and others.

Semantics is collected in several basic stages, among which we should highlight:

  • brainstorming - analysis of basic phrases and preparation for parsing
  • parsing - extension of the basic semantics based on Wordstat and other sources
  • screening - weeding out junk phrases after parsing
  • analysis - analysis of frequency, seasonality, competition, and other important indicators
  • refinement - grouping, separation of the commercial and informational phrases of the core

The most important stages of the collection will be discussed below!

VIDEO - compiling a semantic core by competitors

Brainstorming when creating a semantic core - we strain our brains

At this stage, you need to mentally outline the semantic core of the site and come up with as many phrases as possible for our topic. So, we launch Key Collector and select Wordstat parsing, as shown in the screenshot:

A small window opens in front of us, where you need to enter as many phrases on our subject as possible. As I already said, in this article we will create an example set of phrases for this blog, so the phrases could be:

  • seo blog
  • seo blog
  • blog about seo
  • blog about seo
  • promotion
  • promotion project
  • promotion
  • promotion
  • blog promotion
  • blog promotion
  • blog promotion
  • blog promotion
  • article promotion
  • article promotion
  • miralinks
  • work in SAP
  • buying links
  • buying links
  • optimization
  • page optimization
  • internal optimization
  • self-promotion
  • how to promote resource
  • how to promote your site
  • how to promote your site
  • how to promote a website yourself
  • self-promotion
  • free promotion
  • free promotion
  • search engine optimization
  • how to promote a website in yandex
  • how to promote a site in Yandex
  • promotion under Yandex
  • google promotion
  • promotion in google
  • indexing
  • speed up indexing
  • site donor selection
  • donor screening
  • promotion by guards
  • use of guards
  • promotion by blogs
  • Yandex algorithm
  • TIC update
  • search database update
  • Yandex update
  • links forever
  • eternal links
  • link rental
  • leased link
  • monthly payment links
  • compiling a semantic core
  • promotion secrets
  • promotion secrets
  • SEO secrets
  • secrets of optimization

I think that's enough; as it is, the list takes half a page ;) In general, the idea is that at the first stage you need to analyze your niche as fully as possible and select as many phrases as possible that reflect the theme of the site. If you miss something at this stage, do not despair: the missing phrases will definitely come up in the next steps, you will just have to do some extra work, but that's okay. We take our list and copy it into Key Collector. Next, click on the button Parse with Yandex.Wordstat:

Parsing can take quite a long time, so be patient. A semantic core is usually assembled in 3-5 days, and the first day will be spent on preparing the basic semantics and parsing.

I wrote detailed instructions on how to work with a resource, how to choose keywords. And you can find out about the promotion of the site for low-frequency requests.

In addition, instead of brainstorming we can use the ready-made semantics of competitors via one of the specialized services, for example SpyWords. In the interface of this service, we simply enter the keyword we need and see the main competitors who are in the TOP for this phrase. Moreover, the semantics of any competitor's site can be unloaded in full using this service.

Further, we can select any of them and pull out its queries, which will then only need to be cleaned of garbage and used as basic semantics for further parsing. Or we can do it even simpler and use .

Cleaning up semantics

As soon as the Wordstat parsing finishes, it's time to weed out the semantic core. This stage is very important, so treat it with due attention.

So, my parsing is over, but there turned out to be a lot of phrases, so screening out words could take extra time. Therefore, before determining frequencies, we need to carry out a primary cleaning of the words. We will do this in several steps:

1. Filter out requests with very low frequencies

To do this, click on the frequency-sorting symbol and start clearing out all queries with a frequency below 30:

I think that you can easily deal with this item.
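In code, this cut-off (frequency below 30, the threshold used in this walkthrough) is a one-line filter over (phrase, frequency) pairs; the sample data is invented:

```python
MIN_FREQUENCY = 30  # the threshold used in this walkthrough

def drop_rare(parsed):
    """Keep only phrases whose Wordstat frequency meets the threshold."""
    return [(phrase, freq) for phrase, freq in parsed if freq >= MIN_FREQUENCY]

sample = [("seo blog", 950), ("seo blog handmade", 7)]
print(drop_rare(sample))  # the 7-impression phrase is removed
```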

2. Remove inappropriate queries

There are queries that have sufficient frequency and low competition but are completely irrelevant to our theme. Such keys must be removed before checking the exact occurrences, since that check can take a very long time. We will delete such keys manually. For my blog, the following turned out to be superfluous:

  • search engine optimization courses
  • sell a promoted site

Semantic core analysis

At this stage, we need to determine the exact frequencies of our keys, for which you need to click on the magnifying glass symbol, as shown in the image:

The process is quite long, so you can go and make yourself some tea)

When the check has completed successfully, we need to continue cleaning our core.

I suggest removing all keys with a frequency of less than 10 queries. For my blog, I will also delete all queries with values higher than 1,000, since I do not plan to target such queries yet.

Export and grouping of the semantic core

Do not think that this stage will be the last. Not at all! Now we need to transfer the resulting list to Excel for maximum clarity. Next, we will sort it by pages, and then we will see many shortcomings, which we will fix.

Exporting the semantics of the site to Excel is not difficult at all. To do this, you just need to click on the corresponding symbol, as shown in the image:

After pasting into Excel, we will see the following picture:

Columns marked in red must be deleted. Then we create another table in Excel, which will contain the final semantic core.

The new table will have three columns: page URL, key phrase, and its frequency. As the URL, select either an existing page or a page that will be created in the future. First, let's choose the keys for the main page of my blog:
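The same three-column table can be produced with the standard csv module instead of hand-editing in Excel; a sketch in which the URLs, phrases, and frequencies are invented for illustration:

```python
import csv

# Hypothetical (url, key phrase, frequency) rows for the final table
rows = [
    ("https://example.com/", "seo blog", 720),
    ("https://example.com/", "blog about seo", 310),
    ("https://example.com/promotion/", "blog promotion", 260),
]

with open("semantic_core.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["url", "key_phrase", "frequency"])  # header row
    writer.writerows(rows)
```

The resulting file opens directly in Excel, and sorting by the url column groups the keys page by page.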

After all the manipulations, we see the following picture. And several conclusions immediately arise:

  1. frequency queries such as should have a much larger tail of less frequent phrases than we see;
  2. seo news;
  3. a new key has surfaced that we did not take into account earlier: SEO articles. This key needs to be parsed.

As I said, not a single key can hide from us. Our next step is to brainstorm these three phrases, and after that brainstorm, repeat all the steps from the very first point for these keys. All this may seem too long and tedious to you, but that is how it is: compiling a semantic core is very responsible and painstaking work. On the other hand, a well-composed core will greatly help in promoting the site and can significantly save your budget.

After all the operations done, we were able to get new keys for the main page of this blog:

  • best seo blog
  • seo news
  • SEO articles

And some others. I think you understand the method.

After all these manipulations, we will see which pages of our project need to be changed () and which new pages need to be added. Most of the keys we found (with a frequency of up to 100, and sometimes much higher) can be easily promoted with one .

Final elimination

In principle, the semantic core is almost ready, but there is one more rather important step that will help us significantly improve our semantic group. For this we need SeoPult.

*In fact, here you can use any of the similar services that allow you to find out the competition by keywords, for example, Mutagen!

So, we create another table in Excel and copy only the key names there (the middle column). To save time, I will copy only the keys for the main page of my blog:

Then we check the cost of getting one click for our keywords:

For some phrases, the cost per click exceeded 5 rubles. Such phrases must be excluded from our core.

Perhaps your preferences will be somewhat different, then you can exclude less expensive phrases, or vice versa. In my case, I deleted 7 phrases.
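This budget cut-off is again a simple filter. A sketch using the 5-ruble threshold from the text; the CPC values per phrase are invented:

```python
MAX_CPC = 5.0  # rubles per click, the threshold used above

def affordable(keywords_cpc: dict) -> list:
    """Keep only phrases whose predicted cost per click is within budget."""
    return [kw for kw, cpc in keywords_cpc.items() if cpc <= MAX_CPC]

# Hypothetical per-phrase CPC estimates
print(affordable({"seo blog": 3.2, "seo secrets": 7.8}))  # the 7.8-ruble phrase is dropped
```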

Useful information!

on compiling a semantic core, with an emphasis on screening out the most low-competitive keywords.

If you have your own online store - read, which describes how the semantic core can be used.

Semantic core clustering

I am sure that you have heard this word before in relation to search promotion. Let's figure out what kind of animal this is and why it is needed when promoting a site.

The classic model of search promotion looks like this:

  • Selection and analysis of search queries
  • Grouping requests by site pages (creating landing pages)
  • Preparation of seo texts for landing pages based on a group of queries for these pages

To facilitate and improve the second stage above, clustering is used. In essence, clustering is a software method that simplifies this stage when working with large semantics, but not everything is as simple as it might seem at first glance.

To better understand the theory of clustering, you should make a short digression into the history of SEO:

Just a few years ago, when the term "clustering" was not yet peeking out from behind every corner, SEO specialists in the vast majority of cases grouped semantics by hand. But when grouping huge sets of 1,000, 10,000, or even 100,000 queries, this procedure turned into real hard labor for an ordinary person. So the method of grouping by semantic similarity came into use everywhere (and many people still use this approach today). It means combining into one group queries that have a semantic relationship; for example, the queries "buy a washing machine" and "buy a washing machine up to 10,000" would be combined into one group. And everything would be fine, but this method contains a whole range of critical problems, and to understand them we need to introduce a new term into our narrative, namely, "query intent".

The easiest way to describe this term is as the user's need, their desire. An intent is nothing more than the desire of the user entering a search query.
The basis of grouping semantics is to collect into one group the queries whose intents are the same or as close as possible, and here two interesting features pop up at once, namely:

  • The same intent can be shared by queries that have no semantic similarity at all, for example, "car service" and "sign up for MOT".
  • Queries with absolute semantic proximity can carry radically different intents; a textbook example is the pair "mobile phone" and "mobile phones". In one case the user wants to buy a phone, and in the other to watch a film.

So, grouping semantics by semantic correspondence does not take query intents into account, and groups composed this way will not let you write a text that reaches the TOP. In the days of manual grouping, to eliminate this problem, people with the job title "assistant SEO specialist" analyzed the search results by hand.

The essence of clustering is comparing the generated search results in search of patterns. From this definition you should immediately note that clustering itself is not the ultimate truth: the generated results may not fully reveal the intent (the Yandex database may simply not contain a site that has correctly combined the queries into a group).

The mechanics of clustering are simple and look like this:

  • The system enters each submitted query into the search engine in turn and remembers the results from the TOP.
  • After entering the queries one by one and saving the results, the system looks for intersections in the output. If the same site with the same document (site page) is in the TOP for several queries at once, these queries can theoretically be combined into one group.
  • A parameter called grouping strength becomes relevant: it tells the system exactly how many intersections there must be for queries to be added to one group. For example, a grouping strength of 2 means that the results for two different queries must contain at least two intersections. To put it even more simply, at least two pages of two different sites must simultaneously be in the TOP for both queries. An example is below.
  • When grouping large semantics, the logic of relationships between queries becomes relevant, on the basis of which three basic types of clustering are distinguished: soft, middle, and hard. We will talk about the types of clustering in the next entries of this diary.
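The mechanics described above fit in a few lines of code. This is a simplified greedy sketch (closest to the soft variant mentioned above): the SERP snapshots are hard-coded and invented, whereas in reality you would fetch the TOP-10 for each query, and grouping strength is the minimum number of shared result URLs:

```python
def cluster(serps: dict, strength: int = 2) -> list:
    """Greedy SERP-intersection clustering: a query joins a seed's group
    only if their TOP results share at least `strength` URLs (pages, not domains)."""
    queries = list(serps)
    groups, used = [], set()
    for q in queries:
        if q in used:
            continue
        group = [q]
        used.add(q)
        for other in queries:
            if other in used:
                continue
            if len(set(serps[q]) & set(serps[other])) >= strength:
                group.append(other)
                used.add(other)
        groups.append(group)
    return groups

# Hypothetical TOP-3 snapshots for three queries
serps = {
    "car service": ["a.com/1", "b.com/2", "c.com/3"],
    "sign up for MOT": ["a.com/1", "b.com/2", "d.com/4"],
    "buy a washing machine": ["e.com/5", "f.com/6", "g.com/7"],
}
print(cluster(serps, strength=2))
```

Note how the two queries with no semantic similarity but the same intent ("car service" and "sign up for MOT") end up in one group because their results intersect, exactly the situation the semantic-similarity method misses.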