How to assemble and ungroup the semantic core: a complete guide with a simple example


In 2008 I created my first Internet project.

It was an online electronics store that needed promotion.

Initially, the promotion work was handed over to the programmers who created it.

What to promote?

They compiled a list of keys in 5 minutes: mobile phones, camcorders, cameras, iPhones, Samsungs - all categories and products on the site.

These were common names that did not at all resemble a properly composed semantic core.

A long period passed without results.

Their incomprehensible reports forced me to look for contractors specializing in website promotion.

I found a local company, entrusted them with the project, but even here everything was to no avail.

Then the understanding came that real professionals should be engaged in promotion.

After reading a lot of reviews, I found one of the best freelancers who assured me of success.

Six months later, no results again.

It was two years without results in organic search that led me to SEO.

Subsequently, it became my main vocation.

Now I understand what went wrong in my initial promotion.

Even experienced SEO specialists who have spent years promoting websites repeat these mistakes.

The misses came down to working with keywords incorrectly.

In fact, there was no understanding of what we are promoting.

No time to collect the semantic core yourself? Quickly fill in the contact details.








Free Semantic Core Compilation Tools

Free key search tools are essential for finding interesting ideas.

You do not have to pay for them, though registration is sometimes required.

I will tell you in detail the secrets of how to get ready-made semantics from these tools.

A list of keys is quite easy to assemble: there are many free and paid tools for it.

Let's start with 4 popular free resources that I use myself all the time.

1.1. Google Keyword Planner is the most versatile tool; it lets you filter by region and language.

It is interesting because it generates a large selection of related keywords and shows traffic and the level of competition in contextual advertising.

To use it, you need to register in Google AdWords.

You also need to create at least one campaign, which you do not have to pay for.

All these processes are visually clear, so let's jump right into the keyword planner.

To get started with the tool, click on the wrench (upper right corner) and select “Keyword Planner”.

The screenshot shows the new version of the design.

After that, you get to a page where you can enter many options for keys, search on a relevant page, or select the desired category.

In the new design interface, we see such a window.

We will consider both options for selecting keywords.

OPTION 1

You see 2 modules.

  1. Find keywords
  2. Get request volume data and forecasts

When you go to the "Find keywords" module, you get a form for entering key-phrase options, which must be written separated by commas.

As we can see, the number of received options has already expanded significantly.

In the old interface there were no more than 700 of them, in the new one we got 1365 options.

The number of options received is still inferior to paid services, which select a wider list of low-frequency queries.

In the same window, you can adjust the following functions.

  1. Search region
  2. Search network: Google or Google + partners
  3. Download of the resulting options in CSV (Excel) format
  4. Yearly data shown by default; adjustable for seasonal queries
  5. Number of matches found
  6. Correction of the data or adding filters (by default only one filter is active: hide adult content).

Monthly data is broken into beautiful infographics, which is very convenient for viewing seasonal queries.

Another important factor is which devices these keys are searched from: desktop or mobile.

We go below and get directly a list of keywords with frequency, minimum and maximum bid per click.

When moving to the “Get request volume data and forecasts” module, we enter the queries considered earlier.

We get conversion data for the selected keywords: spend, number of conversions, conversion value, clicks.

This is valuable information for budget planning in Google Adwords and a rough comparison with SEO.

I want to immediately upset those who plan to use only this tool.

The correctness of the data is questionable.

A well-known expert in the SEO world, Rand Fishkin, criticized the accuracy of traffic and the correctness of clustering.

Therefore, it is better to additionally use other well-known resources.

1.2. Wordstat.yandex.ru is an analogue from Yandex, which also shows traffic and related queries.

To work, you need to log in using Yandex mail or social networks.

Question words such as why, who, what, how, and where are frequently used in this segment.

See below for a list of popular words for voice search in the English segment.

At the same time, I want to warn you - do not overoptimize!

John Mueller, one of the Google analysts, warned about it.

There is no need to specially rework content for voice search if doing so reduces its quality.

Think about behavioral factors, these are the most important parameters.

1.4. Predict keys. To do this, use the free key collection utility.

I understand that the terminology is complex, so let's look at an example.

Just create one query in the first column of this kind (SYNONYM1|Synonym2|Synonym3) (Synonym4|Synonym5|Synonym6).

For example: (repair|fix|service) (engines|motors).

In other columns, enter the regions: Moscow, MSK (in the GEO column).

In the "Region" column, write down the number of the region according to Wordstat.

Then press the “GET KEYWORDS (1)” button; the “FIND MORE KEYWORDS (2)” button will appear, and the system will show Wordstat results without the words you have already used.

You can also click on the lines listed below (3) to check the results for the selected groups.

Write unnecessary words in the MINUS-words column.

The necessary ones are in other columns (for convenience, they are labeled as Properties, Types, Nomenclature, Brands, Transactional Requests, Regions, etc.).

For example, it is clear here that “do-it-yourself, video, mechanics” go to the negative keywords, while “diesel, capital, turbines, block, injectors” will be useful to us for subpages and subsections (4).

After each update of the list, click “GET KEYWORDS (5)” and “FIND MORE KEYWORDS (6)” again, and continue the cycle until only rubbish remains in the search results.

The system will substitute already used queries into the minus operator.

The convenience of the utility lies in the fact that it excludes repetitions in the Yandex search query, which greatly simplifies the work.

Ready-made lists can be transferred to Excel by clicking on each line or simply dropping them directly into the KeyCollector (previously adding a list of negative keywords to the appropriate section).

The speed of parsing semantics can be reduced from several hours to several minutes.
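The combinatorial idea behind the (Synonym1|Synonym2) (Synonym3|Synonym4) pattern can be sketched in a few lines of Python. This is only an illustration of the expansion step, not the utility's actual code, and the synonym words are placeholders:

```python
import itertools
import re

def expand_pattern(pattern):
    """Expand a "(a|b) (c|d)" synonym pattern into every query combination."""
    groups = [part.split("|") for part in re.findall(r"\(([^)]+)\)", pattern)]
    return [" ".join(combo) for combo in itertools.product(*groups)]

print(expand_pattern("(repair|fix|service) (engines|motors)"))
# → ['repair engines', 'repair motors', 'fix engines', 'fix motors',
#    'service engines', 'service motors']
```

Each expanded combination can then be checked against Wordstat frequencies, which is exactly the step the utility automates.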

1.5. Ubersuggest - This tool was bought by famous SEO guru Neil Patel for $120,000.

After that, he invested another 120 thousand US dollars in its improvement, and he does not stop there.

He also promised that Ubersuggest will always be free.

The data for this tool is pulled from Google Keyword Planner and Google Suggest.

When using it, no registrations are needed, which is also a big plus.

This tool does not have a Russian-language version, but it can still return data for Russian-language keys.

To search for a list of keys, enter a high-frequency query, select a language and a search engine.

An additional option is to add a list of negative keywords to the field on the right.

The received data can be downloaded in CSV (Excel) format.

This functionality is implemented at the bottom of the resulting list.

Paid Key Finder Tools

Paid tools are important for providing a more complete list of keys.

They also provide additional important parameters for the analysis of search keys.

I will tell you about 3 paid tools that I personally use.

Many low-frequency queries can also be picked up using SEO resources: serpstat.com, ahrefs.com, semrush.com, moz.com, keywordtool.io and others.

You do not need to pay for everything, choose the ones that suit you best.

These tools are paid, with different monthly plans.

If you need to get a one-time access, contact me for freelance.

For a small fee (from $5) I will provide you with information on your keys.

The free versions of these tools offer limited functionality.

To search for low-frequency keys, enter a high-frequency query; the selected systems expand it into possible variants on their own.

For the query “plastic windows” using Serpstat, we received 5200 options for Yandex.Moscow, and 3500 for Google Russia.

For the same query, Ahrefs generated 7,721 variants of different keys.

By the way, Tim Soulo, the Ukrainian marketing specialist at Ahrefs, announced that he would give a half-year subscription to anyone who shows him a service that generates more keys.

The same query in keywordtool.io collected only 744 keyword options, and this tool specializes only in keywords.

I use it mainly to search for keywords for YouTube, Amazon, ebay.

After collecting the list of keys, it is important to distribute them across the pages of the site, that is, to cluster them.

I have mentioned this hard-to-pronounce word “clustering” several times already.

Let's look at it in more detail.

Let's start with a remix of a well-known tongue twister to make pronunciation easier :-)

Keyword Clustering

Grouping keywords by site pages is one of the most time-consuming tasks.

Some do it manually, some pay the appropriate services.

This is time consuming and costly.

I will show you a free quick way to group the semantic core.

One of the most common mistakes is the incorrect grouping of keywords on the pages of the site being promoted or the clustering of the semantic core.

It's like building a house and not having a building plan.

The breakdown of the list of keys by site pages is the root of any promotion.

A search key is a question asked by an Internet user to which he wants to receive a relevant answer.

Requests must match the content on the page.

Otherwise, users will start leaving your site.

The search engine will not show in the search results a resource that has bad behavioral factors.

All of the above tools shorten the time for grouping keys when you use 3-4 words in a key phrase, but they lose many combinations along the way.

And what if there are really a lot of keys?

Manual clustering of several thousand keys sometimes takes up to several days.

It is necessary to compare the search results for different related keys.

If the pages in the TOP match, then the keys can be combined into one group.

It is best to consider this issue with an example.

As you can see, the same URLs are in the TOP, so there is no need to create separate pages for these requests, because users are looking for the same content.

Even if several pages match in the search results, then the keys can be combined into one group.
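The overlap check described above is easy to prototype. Below is a rough sketch in Python: a greedy grouping that merges a keyword into a cluster when its TOP-10 URLs share at least three entries with the cluster's seed. The threshold of three is my assumption, and the SERP data is invented; real services use more sophisticated logic:

```python
def cluster_by_serp(serps, min_shared=3):
    """Group keywords whose TOP URL sets share at least `min_shared` URLs
    with the seed keyword of an existing cluster."""
    clusters = []  # list of (seed_urls, member_keywords)
    for keyword, urls in serps.items():
        urls = set(urls)
        for seed_urls, members in clusters:
            if len(urls & seed_urls) >= min_shared:
                members.append(keyword)
                break
        else:
            clusters.append((urls, [keyword]))
    return [members for _, members in clusters]

# Toy SERPs: the first two queries share 3 URLs, the third shares none.
serps = {
    "buy iphone":    ["a.com", "b.com", "c.com", "d.com"],
    "iphone price":  ["a.com", "b.com", "c.com", "e.com"],
    "engine repair": ["x.com", "y.com", "z.com"],
}
print(cluster_by_serp(serps))  # → [['buy iphone', 'iphone price'], ['engine repair']]
```

The same comparison done by hand for thousands of keys is what makes manual clustering take days.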

The main difficulty in clustering is the verification of several tens or even hundreds of thousands of keys.

In this situation, mistakes are inevitable.

People are not robots, they get tired.

Somewhere they are pressed by deadlines, they have to do the work incompletely.

This applies even to experienced SEOs.

For many years, seeing the same errors, I wanted to find a solution to this issue.

Several paid tools have appeared on the Internet that automate the work of key clustering.

But it also raises the question of quality, price and lead time.

For example, clustering a list of up to 4,000 keys is included in plan B at serpstat.com.

Anything you need to check beyond the plan costs $20 per 3,000 keys.

I respect the work of our colleagues who have created indispensable SEO tools, but to be honest, even for one average project, this is very little.

A single page of a site can attract from several hundred to several thousand keys.

The pricing policy is understandable: the algorithms have to fetch the search results and compare them across similar pages.

These are the resources spent plus the commercial component.

At the same time, the search results are constantly changing, and the pages in the TOP change accordingly.

What was relevant may become irrelevant in a couple of months.

The second disadvantage is time, though this is offset by the fact that you can start the process and come back to it when it is finished.

As a rule, it takes up to several hours, depending on the download speed of the service.

We don't like to wait, let alone pay :-)

Therefore, we studied the problems of key grouping as much as possible and created our revolutionary keyword clusterer, which solves the main problems:

  • our tool offers free clustering of an unlimited list of keys (if the service is overloaded, we will introduce a limit of up to 10K keys per day);
  • performs clustering in seconds;
  • allows you to set individual settings depending on the search-results requirements;
  • removes junk and irrelevant queries;
  • combines synonyms into one group;
  • minimizes manual labor.

With the help of our clusterer, we created turnkey semantics for an English-language project from 80,000 keys in just 20 minutes!

The theme is “dating” (dating), while we did not lose sight of anything.

A month ago, I would have said that this is madness, today it is a reality.

The site has instructions on how to use the tool, as well as a “How it works” button.

Let's briefly talk about the main elements.

An important note: the fields are optional.

It all depends on the chosen keys.

For the primary test, I fill in only one field “Count as one word”.

The finished version is additionally clustered.

  • Copy the keys and paste them into the clusterer form in whatever form is convenient. For example, from wordstat.yandex.ru or from two Excel columns. The system recognizes keys and numbers as separate components, and in the final version the data is distributed correctly.
  • The second option is to load from a txt, csv, xls, or xlsx file. You can simply download semantics from Serpstat, Ahrefs, Google Keyword Planner, Key Collector, or other tools. There is no need to process them specially for the clusterer. The system itself will distribute everything according to the necessary parameters. If the clusterer does not understand which columns refer to what, a dialog box will appear asking you to clarify the selected columns.
  • Next, select the frequency level: HF (high-frequency), MF (mid-frequency), LF (low-frequency), or micro-frequency. Everything is individual here; try different options and check them against real results.
  • Decide whether to tick “Take into account geo-dependence”. For example, you are promoting websites in the city of Kharkov. Many pages in the TOP are not optimized for it, which means geo-dependence fades into the background. But if your main request is “repair of refrigerators in Kharkov”, then you do need to take geo-dependence into account.
  • “Extended Clusters for Semantics” groups non-clustered queries into the most relevant groups. If this function is disabled, keys without groups will be placed in the “Not grouped” section.
  • Next, fill out the “Count as one word” form. This is needed to combine several words into a single whole so that the system does not split the phrase into separate clusters. For example: washing machine. The system will not divide “washing” and “machine” into 2 clusters. Other examples: clothes for newborns, iPhone 8, online electronics store. If you have several such phrases, enter them separated by commas.
  • Negative keywords immediately weed out irrelevant keywords from the list. For example, the word "free". To keep inflected phrases from being filtered out along with it, use the exclamation-point operator “!”, which forbids the system from inflecting the word. For example: !free.
  • The list of ignored words contains words that do not affect the search results. The system automatically ignores prepositions in the Russian and English segments, so there is no need to enter them. For example, in the phrase “Apple iPhone X”, the word "Apple" does not affect the results in any way, because users are looking for data on the iPhone. To avoid creating an extra cluster, add it to this form.
  • The last form is synonyms. For example, the words “buy”, “price”, and “cost” mean the same thing for commercial queries. The system recognizes them as synonyms automatically, so there is no need to enter them. Enter other synonymous words, for example the transliterated and Latin spellings of “iPhone”, or different forms of “choose”: in the Russian-speaking segment they have the same meaning. If there are many synonyms, press the plus and add more options.
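A minimal sketch of how negative keywords with the "!" exact-form operator might behave. This is my own toy logic, not the clusterer's real implementation: a plain negative word also catches inflected forms (approximated here by crude prefix matching), while "!word" blocks only the exact form:

```python
def passes_filters(phrase, negatives):
    """Keep a phrase only if no negative keyword matches it.
    '!word' blocks the exact form; a plain 'word' also blocks
    inflections, approximated here by prefix matching."""
    words = phrase.lower().split()
    for neg in negatives:
        if neg.startswith("!"):
            if neg[1:].lower() in words:                          # exact form only
                return False
        elif any(w.startswith(neg.lower()) for w in words):       # any inflection
            return False
    return True

print(passes_filters("freely downloadable guide", ["free"]))   # → False
print(passes_filters("freely downloadable guide", ["!free"]))  # → True
```

With "!free", inflected forms like "freely" survive the filter while the exact word "free" is still removed.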

To get the final version, press "SEARCH" and get a clustered list.

Relevant keys are ticked.

We compared the results with paid clusterers, the accuracy of the data obtained in our tool is higher.

It is more convenient and faster to work in than even Excel, which slows down when you add a huge list of keys and a large number of formulas.

I would post the results of our comparisons, but I think it would be incorrect in relation to our colleagues.

Plus, on our part, it is biased to give examples that can be considered successful.

Therefore, I leave everything to the judgment of the readers.

I will be glad to hear your opinion in the comments.

Of course, our clusterer is not a magic pill that solves all problems.

Even Google tools don't show accurate data in clustering.

Our clusterer is a huge time saver.

Ready-made lists are easier to check and sort by site pages.

Promotion for low-frequency queries

Promotion on low-frequency queries is the start for any young project.

Do not try to knock out experienced large projects with a limited budget from the TOP 10.

I will show you effective ways to find low frequency keys.

The bulk of the owners of young sites initially selects high-frequency and mid-frequency queries.

These are keys like “buy iphone”, “apartments for rent”, etc.

For these queries, the TOP is occupied by highly trusted sites that clearly have no intention of leaving it.

SEO budgets for such resources are many times higher, plus additional trust helps to promote them with less effort.

You will never displace from the TOP the sites with millions of visitors that everyone knows about.

A young resource needs to concentrate on low-frequency queries, while, according to MOZ analysis, 80% of all sales on the Internet are made from low-frequency queries.

Low-frequency queries contain 4 or more words and have a frequency of up to 1,000 searches per month.

Create content for them and get traffic in the near future.
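That definition translates directly into a filter. A minimal sketch with invented volume numbers:

```python
def is_low_frequency(keyword, monthly_volume):
    """Low-frequency by the article's definition: 4+ words, up to 1,000 searches/month."""
    return len(keyword.split()) >= 4 and monthly_volume <= 1000

# Hypothetical queries with made-up monthly volumes.
queries = {
    "buy iphone": 90000,
    "buy iphone 8 64gb kiev": 320,
    "how to fix refrigerator door seal": 210,
}
print([q for q, v in queries.items() if is_low_frequency(q, v)])
# → ['buy iphone 8 64gb kiev', 'how to fix refrigerator door seal']
```

Running such a filter over an exported keyword list quickly isolates the low-frequency candidates a young site should target first.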

You can search for low-frequency queries using various tools.

Let's look at the main ones.

4.1. Use search suggestions: Google, Yandex, Bing, Facebook, Twitter, Pinterest, Wikipedia, Amazon, any other sites that have this feature.

This, of course, is a lot of manual work and a real headache, but it is this approach that allows you to find the real keys for promotion.

4.2. Use forums that present your topic, especially ones like Reddit.

Find threads that have received a lot of comments on your topic.

Copy the thread title and create content for these keys.

Let's look at an example of how to compete on well-known queries with such monsters as Amazon, Expedia, Yelp in the American segment.

For example, you are promoting the request “ticket fly” (air tickets).

With these keys, sites such as Expedia, Kayak are ranked, which have more than 4 million traffic for branded queries alone!

Check the search results: the first 4 positions are contextual advertising.

Below them, the organic results contain only monsters, each with at least several million visitors.

Believe me, it is not realistic to compete with them for these keys.

You need to look for queries that these resources do not promote.

Many Western SEO companies for small commercial sites do not use keyword selection tools at all.

Enter the main query in the Reddit search.

Check out popular threads that get a lot of points and comments.

Copy the name or its main part.

For example, I entered the key “fly ticket” into the Reddit search and browsed the popular threads.

Don't be fooled by predicted traffic based only on the keywords you see in the topic of the thread.

If your goal is to get to the TOP and get traffic, then you need to analyze this parameter.

Some experts check the cost per click and the level of competition in contextual advertising, but these data can differ significantly from SEO indicators.

This is more interesting for educational purposes, but not for determining the budget for SEO.

To analyze the level of competition in SEO, it is best to use ahrefs.com, majestic.com, moz.com, semrush.com.

Recently, SEMrush merged its link (donor) database with Majestic's, so the quality of donor checking there is also top-notch.

Do not try to move highly competitive requests with a small budget.

It is better to focus on keywords with a low level of competition.

LSI (Related Queries)

Related queries (LSI) increase the visibility of content and, consequently, traffic.

More traffic means more sales.

I will show you all the effective LSI search methods.

LSI (Latent Semantic Indexing) keywords are related queries that are shown at the bottom of the search results page.

The search engine uses them for readers who did not find useful information in the TOP 10, so that they can form a query in a different way.

With the help of such keys, you can expand the content or create a new one.

It already depends on your clustering.

When promoting a site for another region, related queries are shown based on your current IP.

In that situation, the suggestions are skewed toward queries for your own region.

If you don't want to play around with IP changes, use the Google Chrome app - GeoClever.

After installing it right in the search, you can select any city in the world, up to the little-known ones.

A quick list of search suggestions can be obtained using wordstat.yandex.ru.

To do this, after entering the main key, view the right block.

Let's check the query "SEO optimization".

As you can see, it returns more options than the Yandex and Google suggestion drop-downs.

Well, if you want to collect every related query for YouTube, Bing, Yahoo, eBay, Amazon, and Google that software can possibly gather, use Scrapebox (see point 5).

The disadvantage of this program is that it is paid, it costs $67.

It requires a base of proxy IPs to work, which can be bought on the Internet.

The advantage is that such a large number of search suggestions is hard to obtain anywhere else.

Also, the software is multifunctional, it helps to automate many other manual processes.

Using Scrapebox, I collected 7786 results for the search term “SEO optimization”.

Of course, many of these keys are junk.

Use the clusterer from point 3 to filter out unnecessary keys.

Also in the program, you can check the real traffic of the selected requests.

Pareto Principle

The choice of priorities is important for obtaining results.

To do this, use the Pareto principle.

I will show you the most effective methods for choosing priority keys for promotion.

The Italian economist Vilfredo Pareto in 1886 discovered the principle that 20% of efforts give 80% of results.

He found that 20% of the Italian population owns 80% of the land area, 20% of the pea bushes produce 80% of the crop.

This principle still works today.
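Applied to keywords, the principle suggests sorting by whatever value metric you track (traffic, revenue) and keeping the smallest head that covers about 80% of the total. A sketch with made-up traffic numbers:

```python
def pareto_head(values, target=0.8):
    """Smallest set of top items whose summed value reaches `target` of the total."""
    total = sum(values.values())
    head, accumulated = [], 0.0
    for key, val in sorted(values.items(), key=lambda kv: kv[1], reverse=True):
        head.append(key)
        accumulated += val
        if accumulated >= target * total:
            break
    return head

# Hypothetical monthly traffic per keyword group.
traffic = {"kw1": 500, "kw2": 300, "kw3": 100, "kw4": 60, "kw5": 40}
print(pareto_head(traffic))  # → ['kw1', 'kw2']
```

Here two of the five keyword groups deliver 80% of the traffic, so they are the ones worth prioritizing.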

For seasonal queries, demand spikes well in advance: people searching for holiday gifts, for example, start preparing their congratulations ahead of the date.

But SEOs should be ready to promote these keywords even earlier.

Do not try to promote a highly competitive request in a short time.

It's like pumping up a month before beach season.

Those who did not start in time are simply late.

Prepare, as I always do, for next year.

Meta information optimization

The purpose of meta information is to tell the user what your page is about.

Meta tags also help search engines match keywords with site content.

I will show you how to properly optimize the meta-information on the site.

Have you picked up a list of keys and divided it across pages?

And now proceed to the creation of meta-information - Title & Description.

Many people delegate this step and leave the writing of meta tags to copywriters.

Do not do this under any circumstances.

Even great copywriters get meta tags wrong.

As a result, there will be no traffic, because your keys do not match the content.

And, as we know, the more clicks, the more conversions.

Search engines will not show in the TOP sites that users do not click through to.

They regard such sites as irrelevant.

Let's first understand what meta tags are and where they can be found.

Title is a tag in the page code that looks like this: <title>This is the title of your page</title>.

It sits in the page code and is not displayed in the page content itself.

You can see it in your browser tab and when sharing a page on social networks, for example on Facebook.

Most importantly, it is displayed in the search results.

The Description meta tag, a short description of the page, appears in the code like this: <meta name="description" content="This is the description of your page">.

The size of the Description meta tag shown in Google is about 160 characters.

This meta tag can be omitted, unlike Title.

In such a situation, the search engine selects content from your page that will be most relevant to search keys.

If you are not sure about the automatic selection of a search engine, write Description.
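Lengths are worth checking automatically before publishing. A small sketch, assuming the ~160-character Description limit mentioned above and a commonly cited ~60-character display limit for Title (the Title figure is my assumption, not stated in this article):

```python
def check_meta(title, description, title_max=60, desc_max=160):
    """Return human-readable warnings for meta tags likely to be truncated in the SERP."""
    warnings = []
    if len(title) > title_max:
        warnings.append(f"Title is {len(title)} chars; Google may truncate after ~{title_max}.")
    if len(description) > desc_max:
        warnings.append(f"Description is {len(description)} chars; limit is ~{desc_max}.")
    return warnings

print(check_meta("x" * 70, "Short description."))
# → ['Title is 70 chars; Google may truncate after ~60.']
```

Running this over every page of a site catches truncated snippets before they cost clicks.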

How to increase the clickability of the Title?

The principle is simple: in the search form, insert the URL or write a potential Title.

In the results obtained, the system gives you a score from 0 to 100 and gives recommendations for optimization.

Let's take a closer look at Title optimization techniques.

9.1. Add odd numbers

Odd numbers attract more attention than the standard round ones: 10, 20, 50, etc.

Why do you think the title of this article is “The Semantic Core of a Website: 9 Examples of Compilation (Course 2018)”?

The number "9" is more real than 10, 20, 50, 100...

Odd numbers do not create a feeling of understatement or, on the contrary, of padded-out information, because here we listed our best methods for compiling the semantic core rather than stretching the list to a round ten.

Using brackets increases click-through rates by 38%, according to Hubspot analysis.

This is a huge plus, because you can insert synonyms in brackets or highlight important data.

Use different brackets: round, square.

9.4. Arouse curiosity

The best way to provoke a click is to arouse curiosity.

Think about what might make your audience feel that way.

Remember those breaking-news headlines claiming some star has died.

When you go to the site, it turns out the information was fake.

The main goal is a click!

The best example is Dale Carnegie and the titles of his books: How to Win Friends and Influence People, How to Stop Worrying and Start Living.

These titles have been provoking people to read his work for several generations.

9.5. Include encouraging words

A lot of motivational words are included in the content, but can they also be added to the Title?

To do this, use different options: discounts, cheap, get free, download.

To find motivating words, analyze the contextual advertising in the search results.

In Google Adwords and Yandex.Direct, it is very important to attract a lot of clicks.

If your ad is not clicked, each click costs you more, which is why PPC specialists pay special attention to this.

Let's look at examples of how to find motivating words.

Let's enter the search keys "buy iPhone 8 Kiev".

Collect a base of motivating words from here and choose those that match the content on your site.

Another technique is used when setting up remarketing.

Marketers lure those who left the cart without making a purchase with additional discounts.

This greatly increases the percentage of sales.

Try the same trick when filling out your Title.

Offer discounts, promotions. People love it very much!

9.6. Use Your Domain in Title

When I talked about the use of the domain in the Title in the Everest technique, some Internet users wrote in the comments that this is complete nonsense.

Honestly, I thought so too.

I did not understand why many sites use their brand in a short name.

Instead, you can add additional keys there.

My opinion changed drastically after I read a lot of research on the subject.

The bottom line is your openness to Internet users.

It is the addition of a brand that significantly enhances click-through.

It's best to use your branding at the end of the Title.

If it is entered at the beginning, it shifts attention away from the main keys.

9.7. Capitalize every word

It is this point that causes the most doubts among many SEOs.

They doubt that this complies with the rules of the Russian language.

In English, it is considered correct to capitalize every word in a title.

Let's see how everything happens in contextual advertising.

As you can see from the screenshot, this technique is used not only in the Title, but also in the description.

What is it for?

Capital letters attract more attention, and the percentage of clicks increases accordingly.

In Russian-speaking organics, this technique is rarely used, so I leave everything to the readers.

Personally, I have not found any rule of the Russian language saying that this is incorrect.

I invite you to discuss in the comments.

CONCLUSION

Initially, I wanted to write an article about SEO-optimization of the site and started with keywords.

But in the process of creating the material, it became clear that there is a lot of information on this issue.

And so it became an article about finding and compiling keywords.

To search for keys, you do not need to be limited to one tool.

This is somewhat similar to brainstorming (see point 1), where all the ideas are initially collected over several days by the whole team.

At the final stage, the realistic ideas, the ones you have time and resources for, are separated from the merely good ones.

It's the same with keys: initially you need to collect a huge list of queries.

To do this, it is important to use paid and free tools.

The next step is to eliminate a lot of irrelevant keys and keep only those that match your goals.

To do this, use a keyword clusterer that will collect all the keys into groups.

They must match your priorities.

Don't try to promote everything.

A bird in the hand is worth two in the bush.

Use the Pareto principle - 20% of products bring 80% of profit.

Focus on the low frequency keywords that drive 80% of all online sales.

Don't try to fight on a tight budget with big, experienced sites that put millions into promotion.

Better find your niche.

Use forums and search tips for this.

Use LSI (related queries) to expand the list of keys and your existing content.

Check the seasonality of selected keywords with Google Trends.

Prepare for promotion in advance.

Don't leave it until the last moment.

Optimize meta information for selected keywords, especially Title.

This is the second most important internal ranking factor, and its attractiveness determines whether a visitor comes to your site or not.

If you are reading these lines, then you have mastered the article, for which I am incredibly grateful to you.

I propose to continue the discussion in the comments.

The semantic core (abbreviated as SC) is a specific list of keywords that describe the theme of the site as fully as possible.

Why you need to make up the semantic core of the site

  • the semantic core characterizes the site's subject matter: thanks to it, the robots indexing a page determine not only the naturalness of the text but also its topic, so the page can be placed in the appropriate search section. Obviously, robots work fully autonomously once the address of the site page enters the search engine's database;
  • a well-written semantic core is the semantic basis of the site and reflects a structure suitable for SEO promotion;
  • each page of the site is accordingly tied to a certain part of the semantic core of the web resource;
  • thanks to the semantic core, a promotion strategy in search engines is formed;
  • from the semantic core, you can estimate how much promotion will cost.

Basic rules for compiling a semantic core

    To assemble the SC, you will need to gather sets of keywords. Here you need to realistically assess your resources in relation to promoting high- and mid-frequency queries. If you want the maximum number of visitors and have the budget for it, use high- and mid-frequency queries; if not, use mid- and low-frequency ones.

    Even with a large budget, it makes no sense to promote the site only for high-frequency queries. Often such queries are too general and carry an unspecified meaning, for example, "listen to music", "news", "sports".

When choosing search queries, they analyze a set of indicators that correspond to the search phrase:

  • number of impressions (frequency);
  • the number of impressions without morphological changes and word combinations;
  • pages that are issued by the search engine when entering a search query;
  • pages in the search TOP for key queries;
  • estimated cost of promotion for the query;
  • keyword competition;
  • predicted number of transitions;
  • bounce rate (closing the site after clicking on the link) and seasonality of the service;
  • keyword geo-dependency (geographic location of the company and its customers).

How to build a semantic core

In practice, the selection of the semantic core can be carried out by the following methods:

    Competitor websites can be a source of keywords for the semantic core. Here you can quickly pick up keywords and determine the frequency of their "environment" using semantic analysis: perform a semantic assessment of a page's text, and the most-mentioned words make up its morphological core;

    We recommend building your own semantic core based on the statistics of specialized services. Use, for example, Yandex Wordstat, the statistics system of the Yandex search engine. There you can see the frequency of a search query and find out what users search for together with that keyword;

    "Hint" systems appear when you start typing a search phrase into the search line. These words and phrases can also enter the SC as related ones;

    Closed databases of search queries, for example Pastukhov's databases, can be a source of keywords for the SC. These are special data arrays containing information about effective combinations of search queries;

    Internal site statistics can also be a source of data about the search queries that interest users: where the reader came from, how many pages they viewed, and which browser they used.

Free tools for compiling a semantic core:

Yandex.Wordstat is a popular free tool used in compiling a semantic core. With this service you can find out how many times visitors entered a specific query into the Yandex search engine. It also lets you analyze the dynamics of demand for the query by month.

Google AdWords is one of the most used systems for compiling the semantic core of a site. With the Google Keyword Planner, you can estimate and forecast impressions of specific queries in the future.

Yandex.Direct is used by many developers to select the most profitable keywords. If advertisements are later placed on the site, this approach will bring the resource owner good revenue.

Slovoeb is the younger brother of Key Collector, used to compile the semantic core of a site. Data from Yandex is taken as the basis. Among its advantages are an intuitive interface and accessibility not only to professionals but also to beginners who are just starting out in SEO analytics.

Paid tools for compiling a semantic core:

Pastukhov's databases, according to many experts, have no competitors. They display queries that neither Google nor Yandex shows. Max Pastukhov's databases have many other features, among them a convenient software shell.

SpyWords is an interesting tool for analyzing competitors' keywords. With it you can compare the semantic cores of the resources you are interested in and get full data on competitors' PPC and SEO campaigns. The service is in Russian, so mastering its functionality poses no problem.

A paid program created specifically for professionals. It helps compose the semantic core by identifying relevant queries and is used to estimate the cost of promoting a resource for the keywords of interest. Besides its high efficiency, the program stands out for its ease of use.

SEMrush allows you to determine the most effective keywords based on data from competing resources. It can be used to select low-frequency queries with a high level of traffic. As practice shows, such queries make it very easy to promote the resource to the first positions of the search results.

SeoLib is a service that has earned optimizers' trust. It has extensive functionality, allows you to competently compose a semantic core, and performs the necessary analytics. In free mode you can analyze 25 queries per day.

Promoter allows you to assemble a primary semantic core in just a few minutes. The service is used mainly for analyzing competing sites and for selecting the most effective key queries. Keyword analysis is available for Google in Russia or for Yandex in the Moscow region.

The semantic core is assembled fairly quickly if sources and databases are used as a hint.

The following processes should be distinguished:

- Based on the site's content and relevant topics, key queries are selected that most accurately reflect the semantic load of your web portal.
- From the selected set, superfluous queries are eliminated, i.e. those that could worsen the indexing of the resource. Keywords are filtered based on the results of the analysis described above.
- The resulting semantic core should be evenly distributed among the pages of the site; if necessary, texts with a specific theme and keyword volume are commissioned.

An example of collecting a semantic core using the Wordstat Yandex service

For example, you are promoting a nail salon in Moscow.

We think and select all kinds of words that fit the theme of the site.

Activity of the company

  • manicure salon;
  • nail service salon;
  • nail service studio;
  • manicure studio;
  • pedicure studio;
  • nail design studio.

General service name

  • pedicure;
  • manicure;
  • nail extension.

Now we go to the Yandex service and enter each request, after selecting the region in which we are going to move.

We copy all the words from the left column into Excel, plus the auxiliary phrases from the right.

We remove unnecessary words that do not fit the topic. The words that match are highlighted in red below.

The figure of 2320 requests shows how many times people typed this request not only in its pure form, but also as part of other phrases. For example: manicure and price in Moscow, price for manicure and pedicure in Moscow, etc.

If you enter our query in quotation marks, the figure will be different: it counts only the word forms of the key phrase itself, for example: "manicure price", "manicure prices", etc.

If you enter the same query in quotation marks with an exclamation mark before each word, we will see how many times users typed exactly the query "manicure price".
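The three levels of matching described above can be illustrated with a small sketch that builds the query strings for Wordstat. The phrase is taken from the article's example; the helper name is my own.

```python
# Sketch: composing Yandex Wordstat query variants with the
# "" (fix the word set) and ! (fix the word form) operators.

def wordstat_variants(phrase: str) -> dict:
    """Return the broad, phrase-match and exact-match query strings."""
    exact_words = " ".join("!" + w for w in phrase.split())
    return {
        "broad": phrase,             # all phrases containing these words
        "phrase": f'"{phrase}"',     # only these words, any word forms
        "exact": f'"{exact_words}"', # these words in these exact forms
    }

for kind, query in wordstat_variants("manicure price").items():
    print(kind, "->", query)
```

Each variant is pasted into Wordstat as-is; the narrower the operator, the lower (and more honest) the frequency figure.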

Next, we break the resulting list of words down into site pages. For example, we will leave high-frequency queries on the main page and in the main sections of the site, such as: manicure, nail service studio, nail extension. We will distribute the mid- and low-frequency queries across the remaining pages, for example: manicure and pedicure prices, gel nail extension design. The words should also be divided into groups by meaning.

  • Main page - studio, nail service salon, etc.
  • 3 sections - pedicure, manicure, prices for manicure and pedicure.
  • Pages - nail extension, hardware pedicure, etc.
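This page assignment can be sketched in a few lines of Python. The thresholds (100 and 1,000 impressions) follow the LF/MF/HF split used later in this article, and the sample phrases and numbers are invented for illustration.

```python
# Sketch: assigning queries to site levels by monthly frequency.
# Thresholds are an assumption based on the LF/MF/HF bands in the article.

def page_tier(frequency: int) -> str:
    if frequency > 1000:
        return "main page / main sections"   # high-frequency
    if frequency > 100:
        return "sections"                    # mid-frequency
    return "inner pages"                     # low-frequency

keys = [("manicure", 2320),
        ("manicure and pedicure prices", 140),
        ("gel nail extension design", 45)]

for phrase, freq in keys:
    print(f"{phrase} ({freq}) -> {page_tier(freq)}")
```

The exact cutoffs depend on the niche; the point is that the rarer the query, the deeper in the site structure it lands.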

What mistakes can be made when compiling a semantic core

When compiling a semantic core, no one is immune from errors. The most common include the following:

  1. There is always the danger of choosing inefficient queries that generate the minimum number of visitors.
  2. When re-promotion of the site, you should not completely change the content posted on the site. Otherwise, all previous parameters will be reset to zero, including ranking in search results.
  3. You should not use queries that are ungrammatical in Russian; search robots now identify such queries well and remove pages from the search when they are "spammed" with keywords.

We wish you good luck in promoting your site!

The semantic core of the site is a complete set of keywords corresponding to the subject of the web resource, by which users can find it in the search engine.



For example, the fairy-tale character Baba Yaga will have the following semantic core: Baba Yaga, Baba Yaga fairy tales, Baba Yaga Russian fairy tales, a woman with a fairy tale mortar, a woman with a mortar and a broom, an evil sorceress woman, a woman's hut, chicken legs, etc.

Why does a site need a semantic core

Before starting work on promotion, you need to find all the keys by which targeted visitors can search for it. Based on the semantics, a structure is compiled, keys are distributed, meta tags, document titles, descriptions for images are prescribed, and an anchor list is developed for working with the reference mass.

When compiling semantics, you need to solve the main problem: determine what information should be published in order to attract a potential client.

Compiling a list of keys solves another important task: for each search phrase, you define a relevant page that can fully answer the user's question.

This problem is solved in two ways:

  • You create the site structure based on the semantic core.
  • You distribute the selected terms according to the finished structure of the resource.

Types of key queries (KZ) by the number of views

  • LF - low-frequency. Up to 100 impressions per month.
  • MF - mid-frequency. From 101 to 1,000 impressions.
  • HF - high-frequency. Over 1,000 impressions.

According to statistics, 60-80% of all phrases and words are LF. Working with them is cheaper and easier. You should therefore build the most voluminous core of phrases possible and constantly supplement it with new low-frequency queries. HF and MF queries should not be ignored either, but the main focus should be on expanding the LF list.
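The bands just defined translate directly into code. The sketch below classifies a list of invented frequencies and checks what share of the core is LF; the sample numbers are mine, not from the article.

```python
# Sketch: the LF/MF/HF bands from the article as a classifier.

def band(freq: int) -> str:
    if freq <= 100:
        return "LF"    # up to 100 impressions per month
    if freq <= 1000:
        return "MF"    # 101 to 1,000 impressions
    return "HF"        # over 1,000 impressions

core = [1500, 820, 95, 40, 12, 310, 7, 66]   # invented sample frequencies
share_lf = sum(band(f) == "LF" for f in core) / len(core)
print(f"LF share: {share_lf:.0%}")
```

If the LF share of your core falls well below the 60-80% the article cites, that usually means the long tail has not been collected yet.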

Types of key queries by search intent

  • Informational: used when searching for information. "How to fry potatoes" or "how many stars are in the sky".
  • Transactional: used to perform an action. "Order a downy scarf", "download Vysotsky's songs".
  • Navigational: used for searches related to a particular company or site. "Breadmaker MVideo" or "Svyaznoy smartphones".
  • Others: an extended group from which the ultimate goal of the search cannot be determined. For example, for the query "Napoleon cake", a person may be looking for a recipe or may want to buy the cake.

How to compose semantics

It is necessary to highlight the main terms of your business and user needs. For example, laundry customers are interested in laundry and cleaning.

Then you should define the tails and specifiers (making queries longer than 2 words) that users add to the main terms. This way you increase the reach of the target audience and reduce the frequency of the terms (washing blankets, washing jackets, etc.).
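Separating tails from base terms is a one-line filter once the queries are collected. The sample phrases below are invented around the article's laundry example.

```python
# Sketch: "tail" queries are those longer than two words,
# per the definition above. Sample data is illustrative.

queries = ["laundry", "dry cleaning",
           "washing blankets at home",
           "down jacket washing price"]

tails = [q for q in queries if len(q.split()) > 2]
print(tails)
```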

Collecting the semantic core manually

Yandex Wordstat

  • Select the region of the web resource.
  • Enter a key phrase. The service will give you the number of queries with this keyword over the last month and a list of "related" terms that interested visitors. Keep in mind that if you enter the key without quotes, for example buy windows, you get the broad figure, which also includes queries like "buy windows in Voronezh" and "buy a plastic window". Entering "buy windows" in quotation marks restricts the count to that phrase alone, though still in any word forms ("buying windows", "bought windows", etc.). To narrow the figure further, use the "!" operator placed before each word: !buy !windows. This fixes the exact word forms. To obtain the absolute figure for the query "buy windows", combine the two: enter "!buy !windows" in quotation marks. You will get the most accurate data.
  • Collect the words from the left column and analyze each of them. Write the initial semantics. Pay attention to the right-hand column containing short-cuts that users entered before or after searching for words from the left-hand column. You will find many more phrases you need.
  • Click on the Query History tab. On the graph you can analyze seasonality and the popularity of phrases in each month. Working with Yandex search suggestions also gives good results: enter each key query into the search field and expand the semantics based on the hints.

Google Keyword Planner

  • Enter the main HF query.
  • Select Get Options.
  • Select the most relevant options.
  • Repeat this action with each selected phrase.

Studying competitor sites

Use this method as an additional way to verify your choice of particular key queries. The BuzzSumo, Searchmetrics, SEMRush and Advse tools will help you with this.

Programs for compiling a semantic core

Consider some of the most popular services.

  • Key Collector. If you are compiling very voluminous semantics, you cannot do without this tool. The program selects semantics via Yandex Wordstat, collects that search engine's suggestions, filters out key queries with stop words, very low frequency and duplicates, determines the seasonality of phrases, studies counter and social-network statistics, and selects relevant pages for each query.
  • SlovoEB. A free counterpart of Key Collector. The tool selects keywords, then groups and analyzes them.
  • Allsubmitter. Helps to choose key queries and shows competing sites.
  • KeySO. Analyzes the visibility of a web resource and its competitors and helps in compiling the semantic core.

What to consider when choosing keywords

  • Frequency indicators.
  • Most of the key queries should be LF, the rest MF and HF.
  • Search-relevant pages.
  • Competitors in the TOP.
  • Phrase competition.
  • Projected number of clicks.
  • Seasonality and geodependence.
  • Key queries with typos.
  • Associative keys.

Correct semantic core

First of all, you need to define the concepts of "keywords", "keys", "key or search queries" - these are words or phrases with which potential customers of your site are looking for the necessary information.

Make the following lists: categories of goods or services (hereinafter TU), TU names, their brands, commercial tails ("buy", "order", etc.), synonyms, transliteration in Latin (or in Cyrillic, respectively), professional jargon ("keyboard" - "clave", etc.), technical characteristics, words with possible typos and errors (a misspelled "Orenburg", etc.), references to the area (city, streets, etc.).

When working with the lists, be guided by the keyword list from the promotion agreement, the structure of the web resource, its content and price lists, competitor sites, and previous SEO experience.

Proceed to the selection of semantics by mixing the phrases selected in the previous step, using the manual method or using services.

Generate a list of stop words and remove the unsuitable key queries.
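A stop-word filter is simple to automate. The sketch below drops any query containing a stop word; both the stop list and the queries are illustrative, not from the article.

```python
# Sketch: removing queries that contain stop words.
# The stop list and sample queries are invented for illustration.

stop_words = {"free", "download", "torrent"}

def is_clean(query: str) -> bool:
    # A query survives if it shares no words with the stop list.
    return not (set(query.lower().split()) & stop_words)

queries = ["buy plastic windows",
           "plastic windows free download",
           "order windows moscow"]

clean = [q for q in queries if is_clean(q)]
print(clean)
```

In practice the stop list grows as you review the parsed keys: brand names you don't sell, cities you don't serve, "free", "DIY", and so on.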

Group the key queries by relevant pages. For each key, the most relevant page is selected or a new document is created. It is advisable to do this work manually. For large projects there are paid services such as Rush Analytics.

Go from largest to smallest. First distribute the HF queries across the pages. Then do the same with the MF ones. LF queries can be added to pages that already carry HF and MF queries, and separate pages can also be created for them.
After analyzing the first results of the work, we can see that:

  • the promoted site is not visible for all declared keywords;
  • the documents you assumed to be relevant are not the ones returned for the key queries;
  • the wrong structure of the web resource gets in the way;
  • several web pages are relevant for some key queries;
  • relevant pages are missing.

When grouping key queries, work with all possible sections of the web resource, fill each page with useful information, and do not create duplicate text.

Common mistakes when working with key queries

  • only obvious semantics were chosen, without word forms, synonyms, etc.;
  • the optimizer distributed too many key queries to one page;
  • the same key queries are distributed across different pages.

At the same time, ranking worsens, the site can be punished for spamming, and if the web resource has the wrong structure, then it will be very difficult to promote it.

It doesn't matter how you choose the semantics. With the right approach, you will get the correct semantic core needed for successful site promotion.


Like almost all other webmasters, I create a semantic core using the KeyCollector program - this is by far the best program for compiling a semantic core. How to use it is a topic for a separate article, although the Internet is full of information on this subject - I recommend, for example, a manual from Dmitry Sidash (sidash.ru).

Since the question was raised about the example of compiling the kernel, I give an example.

List of keys

Suppose we have a site dedicated to British cats. I drive the phrase "British cat" into the "List of phrases" and click on the "Parse" button.

I get a long list of phrases, which begins as follows (each phrase is given with its frequency):

british cats 75553
british cats photo 12421
british fold cat 7273
british fold cattery 5545
cats of british breed 4763
british shorthair cat 3571
colors of british cats 3474
british cats price 2461
cat blue british 2302
british fold cat photo 2224
mating 18 british cats 18
buy a cat 1179 british cats 1179
longhair british cat 1083
pregnancy of british cat 974
cat british chinchilla 969
cats of british breed photo 953
cattery of british cats moscow 886
color of british cats photo 882
british cats care 855
british shorthair cat photo 840
scottish and british cats 763
names of british cats 762
British blue cat photo 723
photo of British blue cat 723
British black cat 699
what to feed British cats 678

The list itself is much longer, I just gave the beginning of it.

Key grouping

Based on this list, I will have articles on the site about the varieties of cats (lop-eared, blue, short-haired, long-haired), there will be an article about the pregnancy of these animals, about what to feed them, about names, and so on down the list.

For each article, one main key query is taken (= the topic of the article). However, the article is not limited to that one query: other queries that fit the meaning are also added to it, as well as different variations and word forms of the main query, which can be found lower in the Key Collector list.

For example, with the word "lop-eared" there are the following keys:

British Fold cat 7273
British Fold cat photo 2224
British Fold cat price 513
cat photo 129
british fold cats character 112
british fold cat grooming 112
mating of british fold cats 98
british shorthair fold cat 83
color of british fold cats 79

To avoid overspam (which can also result from using too many keys in the text, in the title, etc.), I would not take all of them with the main query included, but individual words from them (photo, buy, character, care, etc.) make sense to use in the article so that it ranks better for a large number of low-frequency queries.

Thus, under the article about lop-eared cats, we will form a group of keywords that we will use in the article. Groups of keywords for other articles will be formed in the same way - this is the answer to the question of how to create the semantic core of the site.

Frequency and competition

There is also an important point related to exact frequency and competition; both must be collected in Key Collector. To do this, select all queries with checkboxes and, on the "Yandex.Wordstat frequencies" tab, click the "Collect "!" frequencies" button. The exact frequency of each phrase will be shown (that is, with this word order and in these word forms), which is a much more accurate indicator than the overall frequency.

To check the competition in the same Key Collector, you need to click the "Get data for Yandex PS" (or for Google), then click "Calculate KEI from the available data". As a result, the program will collect how many main pages for this query are in the TOP-10 (the more - the more difficult it is to get there) and how many pages in the TOP-10 contain such a title (similarly, the more - the more difficult it is to break into the top).
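The competition heuristic described above (main pages in the TOP-10 plus exact-title matches in the TOP-10) can be reduced to a toy scoring function. This is my own simplification for illustration, not Key Collector's actual KEI formula.

```python
# Sketch of the competition heuristic from the article: the more home
# pages and exact-title matches among the TOP-10 results, the harder
# the query. Toy scoring, assumed for illustration only.

def competition_score(main_pages_top10: int, title_matches_top10: int) -> int:
    """Return 0..20: the sum of the two counts, each capped at 10."""
    return min(main_pages_top10, 10) + min(title_matches_top10, 10)

# 3 home pages and 7 exact titles in the TOP-10:
print(competition_score(3, 7))
```

A higher score means a tougher SERP; queries scoring near 0 are the "happy exceptions" worth writing for first.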

Then we need to act on the basis of our strategy. If we want to create a comprehensive site about cats, then the exact frequency and competition are not so important to us. If we only need to publish a few articles, then we take queries that have the highest frequency and at the same time the lowest competition, and write articles based on them.

The semantic core is a scary name that SEOs have come up with to refer to a fairly simple thing. We just need to select the key queries for which we will promote our site.

And in this article, I will show you how to properly compose a semantic core so that your site quickly reaches the TOP, and does not stagnate for months. Here, too, there are "secrets".

And before we move on to compiling the SA, let's look at what it is, and what we should eventually come to.

What is the semantic core in simple words

Oddly enough, the semantic core is just a regular Excel file containing the list of key queries for which you (or your copywriter) will write articles for the site.

For example, here is how my semantic core looks like:

I have marked in green those key queries for which I have already written articles. Yellow - those for whom I am going to write articles in the near future. And colorless cells mean that these requests will come a little later.

For each key query I have determined the frequency and competitiveness, and invented a "catchy" title. You should end up with roughly the same file. Right now my SC consists of 150 keywords. That means I have "material" for at least 5 months in advance, even if I write one article a day.

A little lower we will talk about what to expect if you decide to order the collection of a semantic core from specialists. Here I will say briefly: they will give you the same list, but for thousands of "keys". However, in the SC it is not quantity that matters but quality. And that is what we will focus on.

Why do we need a semantic core at all?

But really, why do we need this torment? You can, in the end, just write high-quality articles just like that, and attract an audience with this, right? Yes, you can write, but you can’t attract.

The main mistake of 90% of bloggers is just writing high-quality articles. I'm not kidding, they have really interesting and useful materials. But search engines don't know about it. They are not psychics, but just robots. Accordingly, they do not put your article in the TOP.

There is another subtle point here, with the title. For example, you have a very high-quality article on the topic "How to do business in the 'muzzle book'" (a slang nickname for Facebook). There you describe everything about Facebook in great detail and professionally, including how to promote communities there. Your article is the highest-quality, most useful and interesting one on the Internet on this topic. Nobody else comes close. But it still won't help you.

Why quality articles fly out of the TOP

Imagine that your site was visited not by a robot but by a live checker (an assessor) from Yandex. He realized that you have the coolest article and manually put you in first place in the results for the query "Community promotion on Facebook".

Do you know what will happen next? You will be thrown out of there very soon. Because no one will click on your article, even in first place. People enter the query "Community promotion on Facebook", and your headline is "How to do business in the 'muzzle book'". Original, fresh, funny, but... not what they asked for. People want to see exactly what they were looking for, not your creativity.

Accordingly, your article will occupy its place in the TOP to no effect. And a living assessor, an ardent admirer of your work, can beg his superiors as long as he likes to keep you at least in the TOP-10. It won't help. All the first places will be taken by articles as empty as sunflower-seed husks, copied from one another by yesterday's schoolchildren.

But those articles will have the correct "relevant" title: "Community promotion on Facebook from scratch" (step by step, in 5 steps, from A to Z, free, etc.). Offensive? You bet. Well then, let's fight the injustice and make a competent semantic core so that your articles take the well-deserved first places.

Another reason to start compiling the SC right now

There is one more thing that for some reason people don't think much about. You need to write articles often - at least every week, and preferably 2-3 times a week to get more traffic and faster.

Everyone knows this, but almost no one does it. And all because they have “creative stagnation”, “they can’t force themselves”, “just laziness”. But in fact, the whole problem is precisely in the absence of a specific semantic core.

I entered one of my basic keys — “smm” into the search field, and Yandex immediately gave me a dozen hints about what else might be of interest to people who are interested in “smm”. I just have to copy these keys into a notebook. Then I will check each of them in the same way, and collect clues on them as well.

After the first stage of collecting the SC, you should have a text document containing 10-30 broad base keys, which we will work with further.

Step #2 - Parsing Basic Keys in SlovoEB

Of course, if you write an article for the query "webinar" or "smm", then a miracle will not happen. You will never be able to reach the TOP for such a broad query. We need to break the base key into many small queries on this topic. And we will do this with the help of a special program.

I use KeyCollector but it's paid. You can use a free analogue - the SlovoEB program. You can download it from the official site.

The most difficult thing in working with this program is setting it up correctly. How to properly set up and use Slovoeb, I show in another article. But there I focus on selecting keys for Yandex Direct.

And here let's take a look at the features of using this program for compiling a semantic core for SEO step by step.

First we create a new project and name it according to the broad key you want to parse.

I usually give the project the same name as my base key so I don't get confused later. And yes, I will warn you against another mistake. Don't try to parse all base keys at the same time. Then it will be very difficult for you to filter out “empty” key queries from golden grains. Let's parse one key at a time.

After creating the project, we carry out the basic operation: we parse the key through Yandex Wordstat. To do this, click on the "Wordstat" button in the program interface, enter your base key, and click "Start collecting".

For example, let's parse the base key for my blog "contextual advertising".

After that, the process will start, and after a while the program will give us the result - up to 2000 key queries that contain "contextual advertising".

Also, next to each request there will be a “dirty” frequency - how many times this key (+ its word forms and tails) was searched per month through Yandex. But I do not advise you to draw any conclusions from these figures.

Step #3 - Gathering the exact frequency for the keys

Dirty frequency will not show us anything. If you focus on it, then do not be surprised later when your key for 1000 requests does not bring a single click per month.

We need to find the net frequency. And for this, we first select all the found keys with checkmarks, and then click on the Yandex Direct button and start the process again. Now Slovoeb will look for the exact request frequency per month for each key.

Now we have an objective picture - how many times what request was entered by Internet users over the past month. Now I propose to group all key queries by frequency, so that it would be more convenient to work with them.

To do this, click the "filter" icon in the Frequency "!" column and specify that keys with a value "less than or equal to 10" should be filtered out.

Now the program will show you only those requests, the frequency of which is less than or equal to the value "10". You can delete these queries or copy them for the future to another group of keywords. Less than 10 is very low. Writing articles for these requests is a waste of time.
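The same "exact frequency ≤ 10" cut can be expressed in a few lines of Python over (phrase, frequency) pairs. The sample data is invented for illustration.

```python
# Sketch: splitting parsed keys by the exact-frequency threshold of 10,
# as described above. Phrases and frequencies are invented.

keys = [("contextual advertising setup", 880),
        ("contextual advertising history", 8),
        ("contextual advertising courses", 320),
        ("contextual advertising memes", 4)]

kept = [(p, f) for p, f in keys if f > 10]
dropped = [(p, f) for p, f in keys if f <= 10]

print("kept:", kept)
print("dropped:", dropped)
```

The dropped list can be saved to a separate group rather than deleted, in case those tails become useful later.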

Now we need to choose those keywords that will bring us more or less good traffic. And for this we need to find out one more parameter - the level of competition of the request.

Step #4 - Checking Query Competition

All "keys" in this world are divided into 3 types: high-frequency (HF), mid-frequency (MF) and low-frequency (LF). They can also be highly competitive (HC), medium competitive (MC) and low competitive (LC).

As a rule, HF queries are also HC. That is, if a query is searched often, there are many sites that want to rank for it. But this is not always the case; there are happy exceptions.

The art of compiling a semantic core lies precisely in finding such queries that have a high frequency, and their level of competition is low. Manually determining the level of competition is very difficult.

You can look at indicators such as the number of main pages in the TOP-10, the length and quality of the texts, and the trust level of the sites in the TOP of the results for the query. All of this will give you some idea of how tough the competition for positions is for a particular query.

But I recommend using the Mutagen service. It takes into account all the parameters I mentioned above, plus a dozen more that neither you nor I have probably even heard of. After the analysis, the service gives an exact value for the level of competition for the query.

Here I checked the query "setting up contextual advertising in google adwords". Mutagen showed that this key has a competition level of "more than 25", which is the maximum value it shows. And the query has only 11 views per month, so it doesn't suit us.

We can copy all the keys that we picked up in Slovoeb and run a mass check in Mutagen. After that, we only have to look through the list and take the queries that have many searches and a low level of competition.
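The selection after a mass check can be sketched like this. Everything here is an assumption for illustration: the queries, frequencies and competition scores are invented, and the thresholds follow the article's later advice for a young site (competition 3-5).

```python
# Pick high-frequency, low-competition queries from a mass-checked list.
# Each row holds an exact frequency and a Mutagen-style competition score.
checked = [
    {"query": "setting up contextual advertising", "freq": 95, "competition": 12},
    {"query": "how to set up yandex direct yourself", "freq": 150, "competition": 4},
    {"query": "contextual advertising in google adwords setup", "freq": 11, "competition": 25},
    {"query": "yan setup step by step", "freq": 60, "competition": 5},
]

MIN_FREQ = 30        # enough monthly searches to be worth an article
MAX_COMPETITION = 5  # 3-5 for a young site, 10-15 for an established one

picked = [row["query"] for row in checked
          if row["freq"] >= MIN_FREQ and row["competition"] <= MAX_COMPETITION]

print(picked)
```

With this sample data, only the two queries with high frequency and a competition score within the limit survive the filter.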

Mutagen is a paid service, but you can do 10 checks per day for free. Besides, the cost of a check is very low. In all the time I have worked with it, I have not yet spent even 300 rubles.

By the way, about the level of competition: if your site is young, it is better to choose queries with a competition level of 3-5. And if you have been promoting for more than a year, you can take 10-15.

And about the frequency of queries: we now need to take the final step, which will let you attract a lot of traffic even from low-frequency queries.

Step #5 - Collecting "tails" for the selected keys

As has been proven and verified many times, your site will receive the bulk of its traffic not from the main keys, but from the so-called "tails". This is when a person types odd search queries with a frequency of 1-2 per month into the search box; there are a great many of such queries.

To see the "tail" - just go to Yandex and enter your chosen key query in the search bar. Here's what you'll see.

Now you just need to write out these additional words in a separate document and use them in your article. Note that you don't have to place them right next to the main key every time. Otherwise, search engines will see "over-optimization" and your articles will drop in the search results.

Just use them in different places in your article, and then you will receive additional traffic from them as well. I would also recommend that you try to use as many word forms and synonyms as possible for your main key query.

For example, we have a request - "Setting up contextual advertising". Here's how you can reformulate it:

  • Setup = set up, make, create, run, launch, enable, host…
  • Contextual advertising = context, Direct, teaser, YAN, AdWords, display network…

You never know exactly how people will search for information. Add all these additional words to your semantic core and use them when writing texts.
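To see how quickly such reformulations multiply, you can combine the synonym lists mechanically. A minimal sketch with `itertools.product`; the two word lists roughly follow the example above and are purely illustrative.

```python
# Combine synonyms for each part of the key query to enumerate
# the phrasings people might actually type into a search box.
from itertools import product

actions = ["setting up", "creating", "launching", "enabling"]
objects = ["contextual advertising", "yandex direct", "google adwords"]

variants = [f"{action} {obj}" for action, obj in product(actions, objects)]

print(len(variants))  # → 12
```

Even two short lists yield a dozen word forms, which is exactly why scattering them through the article picks up extra "tail" traffic.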

So, we collect a list of 100 - 150 keywords. If you are compiling a semantic core for the first time, then it may take you several weeks to complete it.

Or maybe you shouldn't strain your eyes over it? Maybe you can delegate compiling the SC to specialists who will do it better and faster? There are such specialists, but you don't always need their services.

Is it worth ordering an SC from specialists?

By and large, specialists in compiling a semantic core will only do steps 1-3 of our scheme for you. Sometimes, for a large additional fee, they will also do steps 4-5 (checking query competition and collecting tails).

After that, they will give you several thousand key queries with which you will need to work further.

And the question here is whether you are going to write articles yourself, or hire copywriters for this. If you want to focus on quality, not quantity, then you need to write it yourself. But then it won't be enough for you to just get a list of keys. You will need to choose those topics that you understand well enough to write a quality article.

And here the question arises: why do we actually need SC specialists at all? Agree, parsing the base keys and collecting exact frequencies (steps #1-3) is not difficult at all. It will take you literally half an hour.

The most difficult thing is to pick out high-frequency queries that have low competition. And then, as it turns out, from those high-frequency, low-competition (HF-LC) queries you need the ones you can write a good article on. This is exactly what will take 99% of your time working on the semantic core. And no specialist will do this for you. So is it worth spending money on ordering such services?

When the services of SC specialists are useful

It's another matter if you plan from the start to hire copywriters. Then you don't need to understand the subject of the query. Your copywriters won't understand it either. They will simply take a few articles on the topic and compile "their" text from them.

Such articles will be empty, miserable, almost useless. But there will be many of them. On your own, you can write at most 2-3 quality articles per week, while an army of copywriters will deliver 2-3 shoddy texts a day. At the same time, they will be optimized for queries, which means they will attract some traffic.

In this case, yes, go ahead and hire SC specialists. Let them also draw up briefs for the copywriters at the same time. But you understand, that will also cost money.

Summary

Let's go over the main ideas in the article again to consolidate the information.

  • The semantic core is just a list of keywords for which you will write articles on the site for promotion.
  • It is necessary to optimize the texts for exact key queries, otherwise even your highest-quality articles will never reach the TOP.
  • The SC is like a content plan for social networks. It keeps you out of "creative block" and lets you know exactly what you will write about tomorrow, the day after tomorrow and in a month.
  • To compile the semantic core, it is convenient to use the free Slovoeb program; it is all you need.
  • Here are the five steps of compiling an SC: 1 - selecting basic keys; 2 - parsing basic keys; 3 - collecting exact frequencies for queries; 4 - checking key competition; 5 - collecting "tails".
  • If you want to write articles yourself, it is better to make the semantic core yourself, for yourself. SC compilation specialists will not be able to help you here.
  • If you want to work on quantity and use copywriters to write the articles, then it makes perfect sense to delegate compiling the semantic core as well. As long as there is enough money for everything.

I hope this guide was helpful to you. Save it to your favorites so as not to lose it, and share it with your friends. Don't forget to download my book. There I show you the fastest way from zero to the first million on the Internet (squeezed from personal experience over 10 years =)

See you later!

Your Dmitry Novoselov