No website is immune to search engine filters (with some exceptions like Wikipedia). Even a large, squeaky-clean site can fall under a filter; this may be the result of an algorithm error or of other causes (hacking, viruses), and no one is insured against it. For a good site the probability of being filtered is extremely small, and if erroneous sanctions are imposed, they are usually lifted fairly quickly after a request to the support service.

Filters were not invented without reason; they perform a number of tasks that can be divided into two groups:

  • Punishing sites for non-compliance with search engine recommendations and for attempts to manipulate the search results;
  • Freeing up the search engines' computing resources. This happens when a site is so bad that it is not even worth indexing.

The vast majority of filters belong to the first group. The Yandex AGS filter belongs to the second, as do Google's supplemental results (details below); Google may also simply decline to index sites or individual pages that it considers too bad.

Signs and effects of filters

The signs of search engine filters are as follows:

  • A drop in search traffic;
  • A simultaneous drop of all or a large group of positions in the search results;
  • The site, or individual pages, dropping out of the index;
  • New pages no longer being indexed;
  • The site missing from the first places in the search results for vital queries (such as its own name);
  • A number of other signs, which I will cover in the descriptions of the individual filters.

In any of these cases you will experience a partial or complete loss of organic search traffic. For commercial sites this is the main consequence. Any negative change of this kind may be a sign of a search engine filter and should prompt a check for possible penalties.

Should I be afraid of search engine filters?

If you are making a good site where everything is done properly, you have nothing to fear. You will not be filtered for minor errors, and if you do get sanctioned, it can usually be fixed, although in some cases this takes a long time. If the algorithms made a mistake and you were punished wrongly, everything can be corrected much faster.

Consider also that filters protect you from those who are not ready for a fair competitive struggle for positions and instead try to manipulate the factors that affect rankings. The search engines have their own logic here, by the way: they are quite happy when website promotion becomes ever more difficult and expensive, because they are waiting for you in Yandex Direct and Google AdWords.

Yandex filters

AGS

The most famous and one of the oldest filters of the Russian search engine. The first version was called AGS-17; the latest is AGS-2015 (an unofficial name, since Yandex itself did not assign it a number).

The main task of this filter is to remove bad sites from the search results. The sign of the first versions of AGS is the site dropping out of the index almost completely; sometimes only a few pages remained. AGS is imposed for:

  • Non-unique and low-quality content;
  • A large number of outgoing SEO links;
  • Technical problems (duplicates, cloaking);
  • Lots of ads, bad design.

The result is an almost complete loss of traffic from Yandex search. For some of these factors taken individually Yandex may not impose AGS, but if several of them are present, the risk is very high. For example, a site may have many SEO links but good behavioral factors, good content and no technical or other problems; such a site may avoid the filter. At least, that was the case until 2015.

In 2015 Yandex updated AGS, and now it can punish even a very good site for selling SEO links. Here the sign is the zeroing of the TIC: the pages remain in the search results, but the site may lose part of its traffic. In other words, this is an alternative, softer version of AGS. Here is an example of traffic on a site that briefly fell under AGS in the fall of 2015.

The decline in the number of visitors is clearly visible. After some of the links were removed, the traffic recovered, albeit not to its previous level.

If your site fell under AGS-2015 and your TIC was reset to zero, the only solution is to remove the SEO links (at least some of them). A site has to accumulate a certain critical mass of links before it is punished; the calculation algorithm is unknown, but the threshold number of SEO links is evidently determined individually for each site.

If pages of your site have dropped out of the index, the problem is no longer limited to links alone. Pay attention to all the factors that can lead to AGS and eliminate every problem; if you cannot find them, order an audit from specialists. To speed up the exit from AGS you can write to Yandex support, but only once you are sure that everything is fine with the site.

Minusinsk

This filter appeared in 2015 and punishes sites for purchased external SEO links. The signs are the following:

  • A sharp decrease in positions in the search results and a drop in traffic;
  • The site is not on the first page of the search results for vital queries.

The main reason for falling under Minusinsk is links from bad sites and spammy anchors. No one knows exactly how many such links are needed: there are cases where sites with tens of thousands of links fell under this filter, and others that were punished for a few hundred.

The only way to remove Minusinsk is to reduce the number of external links or remove them completely. At a minimum, remove the worst ones: links from sites under AGS and links with spammy anchors. Unfortunately, Yandex does not provide a rejection tool (as Google does), so if you bought permanent links you will have to negotiate their removal with the site owners. After Minusinsk was released, the GoGetLinks and Miralinks exchanges added the ability to send link-removal requests.

Cheating behavioral factors

As soon as optimizers learned for certain that behavioral factors can strongly influence positions, services appeared that offered to cheat them. Yandex did not turn a blind eye to such manipulations: it improved the filter it already had and toughened the punishment. The signs are:

  • A sharp drop in positions and traffic;
  • New pages are slowly indexed.

You can get out of the filter for cheating behavioral factors in Yandex only by completely stopping these manipulations. Yandex will lift the filter itself, but you may have to wait several months. I recommend not attempting to cheat behavioral factors at all, otherwise punishment from Yandex will be inevitable.

Affiliates

The filter is designed to deal with companies that create several sites and try to capture all the top places in the search results. There is only one sign of this filter, and it is very clear: only one of the sites remains in the search results. Yandex identifies affiliates by identical contact information, similar topics (content) and WHOIS data (if it is not hidden). It also takes into account the structure of the sites, the IP addresses they are hosted on, the CMS and other small details that can point to a common owner.

You can get out of this filter only if you change all the data by which Yandex can identify affiliates and write to the support service.

Adult content

If Yandex decides that your site contains adult content, it will completely exclude it from the search results for queries unrelated to this topic. Most often the cause is the site simply being hacked, after which pages with such content are uploaded to it. As soon as you see 18+ queries bringing visitors to your site, find and eliminate the cause urgently, without waiting for the filter.

Nepot filter

Yandex has never acknowledged the existence of this filter, yet almost all optimizers are inclined to believe it exists. It is imposed on sites that trade links too actively: after the nepot filter is applied, links from such sites are no longer taken into account. It is quite possible that today it has been replaced by AGS-2015, which resets the TIC and thereby clearly indicates that the site is being punished specifically for link trading.

Filter for reoptimization

Yandex will punish you for trying to manipulate the search results with too many key phrases. Unlike AGS, this filter leaves the site's pages in the index, but they will not reach high positions, and accordingly there will be no traffic. The exact percentage of keywords in the text for which Yandex punishes cannot be named; it is determined individually, but you should avoid too many keys, especially unnatural ones, and not overuse keywords in titles.

Filter for shock ads

At risk are sites that use teaser networks for monetization, since these often serve ads with so-called shock content. Such sites are lowered in the search results; you can get rid of this filter by removing the bad ads.

In the beta version of Yandex Webmaster, a section has appeared in which you can see the sanctions imposed on a site.

Google Filters

Penguin

Penguin is Google's filter for manipulation of the external link mass; Yandex's Minusinsk is essentially an analogue of Penguin. Punishment follows a large number of links from low-quality resources, hidden links, and spammy constructions in anchors. The signs are the following:

  • Decrease in positions and traffic;
  • New pages are indexed slowly;
  • Some pages may end up in an additional index;
  • Information about manual actions taken in Google Search Console.

Google announces the release of each new version of Penguin; company employees usually write about it on Twitter, and the information also appears on SEO forums. If traffic to your site began to drop sharply right at the time of a Penguin update, you can be practically certain that you fell under this filter.

The principle of getting out of this filter is exactly the same as with Yandex's Minusinsk: you need to remove the bad links. But Google has a handy tool that lets you disavow unwanted links; you can find it at this link. In individual cases this tool will not be enough and you will still have to remove the unwanted links; you will be told about this in the response to your site review request.
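To give an idea of the format that tool expects, here is a minimal sketch that assembles a disavow file from a list of unwanted sources. The domains and the URL are hypothetical placeholders, and the resulting file still has to be uploaded manually through the disavow tool itself.

```python
# Minimal sketch: build a disavow.txt for Google's disavow-links tool.
# Each line of the file is a comment (#), a whole domain (domain:...),
# or a single URL. The entries below are hypothetical examples.

bad_domains = ["spam-directory.example", "link-farm.example"]
bad_urls = ["http://old-partner.example/sponsored-page.html"]

lines = ["# Links we could not get removed by contacting the owners"]
lines += [f"domain:{d}" for d in bad_domains]  # disavow entire domains
lines += bad_urls                              # disavow individual URLs

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```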

Panda

This filter penalizes poor internal optimization, that is, it evaluates the site as a whole. The reasons may be as follows:

  • Content of very low quality or stolen;
  • The presence of duplicate pages;
  • Bad behavioral factors;
  • A large amount of advertising;
  • Design and usability issues.

There may be several signs:

  • Drop in traffic from Google search;
  • Falling positions; it is important to know that this filter can be applied to part of the site or to the whole site;
  • New pages are indexed slowly.

As with Penguin, Google employees announce the release of each new version of Panda.

There is only one recommendation for getting out from under Panda: conduct a serious audit of the site and eliminate any problems that could have caused the filter. Pay attention to the quality of the content, make sure there is no keyword overspam or grammatical errors, and analyze the behavioral factors. Bad behavioral metrics usually just signal the poor quality of the resource.

Additional results

Google has supplemental results, which are called "snot" in SEO jargon. These are pages that Google does not consider valuable but has indexed anyway. Supplemental results are shown to the user only after clicking "Show hidden results" at the very bottom of the page. Naturally, you should not expect traffic to pages that end up in the supplemental results.

To keep pages out of the supplemental results, publish only good and interesting content, do not abuse keyword density, and avoid spammy constructions in your texts.

Pages of young sites can end up in the supplemental results automatically; over time they will move into Google's main results if their content is of interest to users.

-5, -30 and -950

Officially, Google has said nothing about this filter, but it exists. Its characteristic feature is a drop in the search results by a fixed number of positions (5, 30 or 950). Sites are punished for non-compliance with Google's recommendations for creating and promoting sites, for using cloaking, doorways and JavaScript redirects, and for abusing non-recommended SEO methods such as stuffing articles with keywords, link spam, and so on.

Obviously, the severity of the punishment depends on the degree of the violations: if they are minor, the site gets -5; if everything is very bad, it gets -950. In essence this filter is an analogue of Panda; the difference is that Panda may lower only part of the pages, and the drop for different queries will be by different numbers of positions.

The way out of this filter is similar. Find and fix the problem, after which you can submit a request to Google to review the site. If all violations are eliminated, then the positions will return.

Mobile-friendly

In fact, this is not a filter but a component of the search algorithm that lowers resources that have no mobile version or adaptive layout. The corresponding warning can be seen in Google Search Console.

If you do not need mobile traffic, Google's recommendations can be ignored, but in reality the share of mobile users is very large and growing steadily, so today any site needs at least an adaptive layout.
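As a very rough first check (only a sketch, nothing like Google's full mobile-friendly test), you can at least verify that a page declares a responsive viewport. The address is a hypothetical placeholder, and the snippet assumes the third-party requests and beautifulsoup4 packages are installed.

```python
# Crude heuristic: does the page declare a responsive viewport meta tag?
# A real audit should use Google's mobile-friendly test or Search Console.
import requests
from bs4 import BeautifulSoup

def has_viewport_meta(url: str) -> bool:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "viewport"})
    return bool(tag and "width" in tag.get("content", ""))

# Hypothetical address; substitute your own page.
print(has_viewport_meta("https://example.com/"))
```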

What else can Google punish you for?

There are factors that are difficult to attribute to any particular filter, but punishment for them exists. For example, it is known that the sudden appearance of too many pages or of a large number of incoming links can be punished. If a site that had only 500 pages suddenly has 10,000, Google will consider this very suspicious. The situation with links is similar, although that is now part of Penguin. Growth should be natural and gradual.

A resource can also be punished for a large number of broken links; positions in the SERP may be lowered or pages removed from the index. Check for broken links regularly using services such as Link Checker or its analogues; a simple do-it-yourself check is sketched below.
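As an illustration of the idea rather than a replacement for a full crawler, here is a minimal sketch that checks the links on a single page and reports the ones that answer with an error status. The page address is a hypothetical placeholder, and the snippet assumes the third-party requests and beautifulsoup4 packages.

```python
# Minimal broken-link check for one page: collect <a href> targets
# and report those that respond with a 4xx/5xx status (or not at all).
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def broken_links(page_url: str):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    seen, broken = set(), []
    for a in soup.find_all("a", href=True):
        link = urljoin(page_url, a["href"])
        if not link.startswith("http") or link in seen:
            continue
        seen.add(link)
        try:
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            broken.append((link, status))
    return broken

# Hypothetical address; a real check should walk every page of the site.
for link, status in broken_links("https://example.com/"):
    print(status, link)
```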

About Sandboxes

A sandbox is a filter (or part of an algorithm) that acts on young sites. There is no official information, but the fact that young sites cannot immediately outrank older competitors is beyond doubt.

Google may give a young site high positions and then take them away by placing it in the sandbox. It will then watch the dynamics of the site's development, and if everything is fine, the site will begin to climb in the search results.

In Yandex, a young site can almost never immediately get high positions for competitive queries. With low-frequency queries, where there is little competition, the timeframe is shorter, but even there it can take several months for positions to reach the top. The sandbox in Yandex lasts from 2-3 months to a year; a picture where traffic grows significantly after the next update of the search database is typical for Yandex.

Nothing can be done about the search engines' sandboxes; you can only shorten the time it takes to get out of this conditional filter. Steady development of the site, regular addition of new, good content and natural growth of the link mass can significantly reduce the time spent in the sandbox.

This tool helps determine whether Yandex has imposed a spam filter on the site for the specified query. Diagnosing the presence of the filter clarifies the situation and answers the question of why the site's positions for its most frequent query are not growing.

Why can I get the "Respam" filter from Yandex?

The name speaks for itself. Most often this filter is triggered by an excessive number of exact occurrences of the query on the page, which need to be reduced. Also check the alt and title attributes of the images; perhaps the query was used there as well. Check the incoming internal and external links to the page too: if the query was used as an anchor too often, this can also be a reason for sanctions. A small sketch of such a check is given below.

The main distinctive feature of this filter is that it is query-dependent. This means that when the checked query is slightly modified, the positions become 20-40 places better, and this difference is what the detection tool is based on.
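To reduce exact occurrences, it helps to count them first. Below is a minimal sketch that counts exact occurrences of a query in a page's visible text, in image alt and title attributes, and in link anchors. The address and the query are hypothetical placeholders, and the snippet assumes the third-party requests and beautifulsoup4 packages.

```python
# Count exact occurrences of a query phrase in the visible text,
# image alt/title attributes and link anchors of a page.
import requests
from bs4 import BeautifulSoup

def exact_occurrences(page_url: str, query: str) -> dict:
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    q = query.lower()
    text = soup.get_text(" ").lower()
    alts = " ".join(img.get("alt", "") + " " + img.get("title", "")
                    for img in soup.find_all("img")).lower()
    anchors = " ".join(a.get_text(" ") for a in soup.find_all("a")).lower()
    return {
        "visible text": text.count(q),
        "image alt/title": alts.count(q),
        "link anchors": anchors.count(q),
    }

# Hypothetical page and query; substitute the ones you are checking.
print(exact_occurrences("https://example.com/page.html", "buy plastic windows"))
```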

Search engines "clean" the Internet of low-quality sites and of sites that use "black-hat" SEO promotion methods. To this end, new search algorithms are introduced from time to time, each aimed at a specific type of violation. Sanctions are then applied to the offending site, limiting its performance.

Timely diagnosis and removal of the filter help keep the site working and preserve its owner's profit.

How to know if a site has been filtered

Signs of a site falling under a filter are:

  • a drop in traffic from search;
  • a drop in positions in the search results;
  • pages dropping out of the index;
  • new pages not being indexed.

How to check search engine filters

There are special web analytics tools that can help confirm suspicions of filtering:

  • the number of pages in the index of search engines can be checked through Yandex.Webmaster or Google Search Console;
  • changes in traffic after Yandex and Google algorithm updates can be viewed with the SEOLib sanctions diagnostics tool; for Google there is also the Fruition tool;
  • the probability of applying filters to a site that does not use analytics systems can be found using the FEInternational service.

Filter types

There are several classifications of filters:

1. According to the search engine to which they belong, Yandex and Google filters are distinguished.

2. By the reason for imposing a filter, a distinction is made between:

  • filters for technical inconsistency of the resource: unadapted resources, lack of useful content, duplicate content, fraud, slow page loading, poor quality design, low level of functionality;
  • filters imposed for the manipulation of SERPs: cheating behavioral factors, overspam of keywords in the text, a large link mass.

3. By the way they are triggered:

  • manual: activated by a search engine employee after a robot detects a suspected violation; a person makes the verdict on imposing the filter;
  • algorithmic: applied automatically when search engine robots record a clear violation.

Yandex filters

For some time now Yandex has not taken external links into account when ranking search results, but it has other methods of detecting dishonest optimizers, and it has now focused its attention on attempts to cheat behavioral factors.

  • AGS is the most famous Yandex filter. Used to deal with low-quality sites.


A site's poor quality is determined by the following criteria:

  • non-unique texts;
  • problems with functionality, design;
  • the presence of outgoing links;
  • aggressive advertising, duplicate pages, cloaking;
  • non-unique page descriptions.

The imposition of this filter on the site can be determined by the following features:

  • new pages are not indexed;
  • the number of visitors is decreasing;
  • most of the pages are out of the index.

To restore the site after the impact of the AGS filter on it, you must take the following measures:

  • bring the site in line with the requirements;
  • remove all SEO links;
  • start regularly uploading only unique texts to the site;
  • revise navigation;
  • remove aggressive ads.

Getting the site out of AGS can take up to several months.

Filter for reoptimization

This filter includes sites that post texts with overspam of keywords.

What you need to do to avoid the "punishment" of Yandex:

  • texts should be created for people, not for robots;
  • no need to highlight keys with tags.

Filter for cheating behavioral metrics

The filter for cheating behavioral metrics identifies sites that use the services of emulating the actions of resource visitors.

Typically, the overlay of this filter manifests itself as follows:

  • traffic is decreasing;
  • the position of the site in the search results is falling;
  • new pages are indexed very slowly by the system.

To remove this filter, you need to stop manipulating the search results.

"Minusinsk"

This filter is imposed by the Yandex search engine for an excessive link mass.

The filter action can be identified by several features:

  • traffic drop;
  • the absence of the site on the first page of the search when requested by domain name;
  • a drop of several positions in the search results.

To remove this filter from the site, you will have to remove all incoming links of a manipulative nature.

Affiliate filter

With the affiliate filter, Yandex "marks" cases where someone creates three or more sites that attract customers to the same organization. Yandex does not approve of this: such sites occupy places in the search results that other organizations could take.

The only sign of this filter is that a single site out of all those created remains in the Yandex search results. Yandex does not ask which of the resources brings the most profit, and you will not be able to pick the most valuable one to keep in the results.

You can avoid this measure if the sites carrying the same company's contact details offer different products; then visitors will reach these sites through different search queries, and the search engine will be fine with that.

Nepot filter

The search engine does not acknowledge that this kind of filter exists, but among site owners there is speculation about an algorithm that blocks the transfer of link weight and holds the site back.

There are two ways to protect yourself from the nepot filter:

  • do not trade links;
  • when linking to a dubious source, mark the link with the rel="nofollow" attribute.

Filter for adult content

If Yandex detects "adult" materials on the site, it will cut the site off from traffic for all queries unrelated to adult topics.

The remedy is as old and simple as the world: do not post adult content on your site.

Doorway filter

The presence of doorways usually leads to the exclusion of the site from the search engine results.

The best way to protect your site from filtering is to not use doorways.

Google Filters

Google's filters are likewise aimed at improving the quality of sites, at giving young "good" sites a chance to reach the top of the search results, and at fighting violators.

"Panda"

This filter reduces the number of visitors to poor-quality resources so that they go to quality sites instead.

Resources are considered high quality if they meet several criteria:

  • High-quality site content.
  • Care for site visitors, reflected in functional convenience, usability and design.
  • No borrowed or duplicate content.

You can determine that the site has fallen under the "Panda" filter by the following signs:

  • traffic drop;
  • a decrease in positions in the search results, the positions of all or several pages of the resource may change, some pages may fall into an additional index;
  • slow page indexing.

Changing the marketing strategy will help overcome the barriers that have arisen as a result of the imposition of a filter. Particular attention should be paid to:

  • Content. It should be useful to the audience, of sufficient volume. It needs to be updated regularly. The site should not contain pages without text and duplicate pages.
  • Adjusting the site's technical characteristics.

"Penguin"

The "Penguin" algorithm was introduced to combat excessive and manipulative link mass:

  • spam;
  • publishing links that are not visible to the site visitor;
  • links to third-party resources;
  • buying backlinks.

This filter can be identified by the following features:

  • slow indexing of new pages;
  • some pages falling into an additional index;
  • a drop in the site's positions in the search results;
  • a sharp decrease in the number of visitors to the site.

Google issues a message about measures taken by its specialists in connection with a large number of links.

Removing and/or disavowing low-quality links to the site and publishing informative, unique content will help get a resource out from under Penguin. To reset the weight of the links, you can use the special disavow tool.

"Sandbox" and "Domain Age"

These are more likely parts of the algorithm that limit the indexing speed of newly created sites. You can ignore them and simply develop the site in terms of quality, convenience and informativeness for the user; gradually the site will come out from under the effect of these filters.

-5, -30, -950

Optimizers call a drop of 5 or 30 positions in the search results, or a drop into the ninth hundred of the results, the -5, -30 and -950 filters respectively, although there is still no official confirmation from Google that such algorithms exist.

The search engine imposes such filters on sites that violate its technical requirements and use manipulative techniques to attract visitors:

  • spamming by posting links on blogs and forums;
  • methods of promotion through doorways and cloaking;
  • uninteresting text saturated with keywords.

Removing filters and returning to the previous positions in the search results is possible after eliminating violations of Google requirements and re-indexing the site.

Additional results

"Search engine" can place site pages in an additional index:

  • if it "considers" the pages to be uninteresting, non-unique, similar to many others on the topic;
  • if they contain technical errors or attempt to manipulate the search results;
  • if the site is quite "young".

There is only one sign, and it is obvious: the pages are ranked not in the main index but in the supplemental one.

To remove the site from the additional index, you need to bring the site to the level of the technical requirements of the search engine, start publishing informative relevant content, and not manipulate the search results.

"Socitation"

If such a problem occurs, you need to remove such links to the site or reset their weight. If it becomes obvious that these are a competitor's machinations, you can report the situation to Google technical support.

"Too many links and pages"

Too many links to the site appearing on third-party resources, or publishing a large number of pages at once, can lead to consequences such as a drop in SERP positions and, as a result, a drop in site traffic.

To prevent this, avoid a sharp influx of external links and publish pages gradually.

Filter for broken links

In order to prevent search engine exposure measures, it is necessary to regularly check the site with the Link Checker tool and fix the broken links found.

Filter for copied content

Sites that allow themselves to copy content from other resources may fall under this filter, which results in the exclusion of the whole site, or some of its pages, from the search results.

After removing other people's texts from the site or adding links to the source, start filling the site with unique materials.

Filter for reoptimization

The system imposes the re-optimization filter if the site manipulates the search results:

  • an excessive number of keywords in the texts;
  • key phrases highlighted in bold or in color.

A sign of the system limitation is a decrease in the position of the site in the search results.

To remove the filter, you must abandon the manipulation and start creating materials on the site for people, and not for search engines.

Filter for slow page loading

If the site's pages load slowly, visitors are left dissatisfied. Google usually deprives such resources of traffic by lowering the site's positions in the general or the mobile search results.

Increasing the loading speed of the site will correct the current situation.
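As a rough sketch (server response time is only one component of overall page speed, and a real audit should rely on tools such as PageSpeed Insights), you can at least time how quickly the server answers. The address is a hypothetical placeholder and the snippet assumes the third-party requests package.

```python
# Rough check of server response time; repeated a few times and the best
# value taken to smooth out network noise.
import time
import requests

def response_time(url: str, runs: int = 3) -> float:
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        requests.get(url, timeout=30)
        timings.append(time.perf_counter() - start)
    return min(timings)

# Hypothetical address; substitute your own page.
print(f"{response_time('https://example.com/'):.2f} s")
```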

Filter for lack of optimization for mobile traffic

The lack of an adaptive layout also harms website promotion. A sign of this filter is a drop in the site's positions in the mobile search results while positions in the general results remain at the same level.

To avoid such a problem, experts recommend initially making a website with adaptive design.

Filter checklist

Each item below lists the reason for being filtered, the Yandex and Google filters involved, and how to remove the site from under the filter.

  • Purchased or low-quality SEO links. Filters: Minusinsk (Yandex), Penguin (Google). Way out: remove the low-quality links; in Google, reject them with the special disavow tool.
  • Borrowing someone else's content, non-unique texts. Filters: filter for duplicate pages, filter for duplicate content, additional results. Way out: remove non-unique texts and create high-quality content.
  • Low functionality, poor design, awkward navigation. Filter: additional results. Way out: adjust the site for greater convenience and user comfort.
  • Spamming with "keys". Filter: filter for re-optimization. Way out: focus not on SEO techniques but on the informativeness and relevance of the texts.
  • Non-compliance with the technical requirements of the search engines. Filters: filter for invalid mobile redirects, lack of optimization for mobile traffic. Way out: bring the site in line with the requirements of the search engines.
  • Selling links. Filter: nepot filter. Way out: stop selling links.
  • Cloaking, forum spam, doorways. Filter: Minusinsk. Way out: do not use these methods.
  • Changes in traffic for unknown reasons. Filter: Minusinsk. Way out: if competitors are to blame, ask the search engine's support service for help.

Filters are actually not only not scary but even useful, especially if you treat them as the result of a free site audit by the search engines. The best course of action is to identify the problem, correct it, and continue working even more efficiently!

Sincerely, Nastya Chekhova

Greetings dear readers and colleagues! Today I will raise a sore point - the imposition of search engine sanctions on sites. In the worst case, they lead to a complete loss of traffic. Therefore, it is important to diagnose and treat them in time. Do you know how it's done?

In the current publication I will describe in detail how to check the site for filters from Yandex or Google. I will also provide a number of recommendations for getting out of them and describe a couple of things in more detail. AGS, Minusinsk, Panda, Penguin - all this, when applied to the site, does not bode well for promotion.

The article turned out quite long, so for your convenience I have compiled a table of contents.

Haven't chosen yet? Then I suggest that you read the material in full. There is nothing superfluous in it.

The main signs of finding a site under the filters

To begin with, I will note the saddest outcome.

Complete loss of traffic from the search and the departure of almost all pages from the index.

It's hard not to notice this. There are also less obvious situations that indicate, with some probability, that the site is under Yandex or Google filters. Here is the list of signs that a site is under existing filters.

  • A drop of the TIC to 0.
  • Pages dropping out of the search engine's index.
  • A sharp decrease in traffic to one page, a group of pages or all pages without visible reasons.
  • Positions stuck or sinking in the search results despite all the site's merits.

How do you check all this? For example, the TIC can be checked at https://yandex.ru/yaca/cy/ch/ by adding your site's address without http after the last slash. For checking site positions I personally like the allpositions service. The number of pages in the index can be checked in the webmaster tools of Yandex and Google.

Why do such unpleasant situations arise? It's simple: errors on the site. Many of them can be detected with the set of checks on Labrika; its tools helped me a lot at the time.

Has the site fallen under a search engine filter: a simple check

Checking the site for a ban in Yandex and for its reputation in Google is definitely worth it if one or more of the above signs are noticed. How do you do it? For this there is the user-friendly xtool service.

Enter the address and click "check". There will be plenty of useful information, and some quite interesting details in it.

I checked several resources. As you can see, one of the sites fell under the nepot filter; the reason is poor-quality outgoing links. If there are five or so of them, especially if they are purchased and off-topic, such a situation is to be expected. To remove the sanction they will have to be taken down, and after the work is done you can additionally write to the search engine's support service.

Now I will demonstrate what results can be considered excellent.

The probability of sanctions is zero. This project is developing normally, with nothing hindering good search promotion.

Update from 02/28/17.

There is good news for those who care about website quality, traffic and revenue growth: setting the fairy tales aside, I have put together a quality checklist.

We do a comprehensive check of the site for the presence of filters from Yandex and Google

I will list several fairly serious algorithms that we will diagnose:

  • for PF manipulation (behavioral factors);
  • Minusinsk;
  • Panda;
  • Penguin.

The presence of these and some other filters on a site can seriously harm its development, so let's start diagnosing them. This will require a dedicated tool and access to your statistics system.

One project that helps me a lot with my blog is SEOlib. Among its many tools, I would like to draw your attention to a specific one - diagnostics of sanctions. Haven't taken advantage of it yet? Let's take a look in more detail.

For the tool to work, you will need access to the Yandex Metrica analytical system or Google Analytics. I will choose the first option.

Temporarily allow access and move on to the next step.

Now I can start checking the site for Google and Yandex filters. To make it clearer, I will immediately explain what is displayed on the graph.

  • The red and blue rising lines show the dynamics of visits. I did not understand it right away, but then concluded that the chart is built by weeks, not by days or months.
  • The colored vertical lines mark the moments when a search engine penalty could have been imposed on the resource being checked.

The color of the vertical line corresponds to algorithms capable of imposing sanctions. In the picture above, the scale is small, so I will show the block with colors larger.

By the way, you can see that for both search engines there is a link at the bottom with the phrase “chronology of algorithms”. It leads to a page with history, which is also extremely useful for studying.

Good, we have figured out the graphics. Who thinks that the green vertical lines in the picture above mean the project under review has definitely been sanctioned?

Actually, they do not. The point is that many filters share a common symptom, a drop in traffic, and this symptom is what can be used.

If traffic dropped sharply right after a vertical line, then with high probability we can talk about the presence of a filter on the site. Which one? Look at the color of the line.

Along the way, I will show an interesting possibility. There is a comparison table below the graph in question.

You can track the dynamics of changes in organic traffic that goes to specific pages or queries.

It is very convenient for identifying page addresses or key phrases that are losing traffic. If any are found, as in the first line of the example, it makes sense to determine the cause.

If it is simply a drop in the keywords' search frequency, everything is in order. However, behavioral factors, usability and design, content quality and much more can also have an effect; it is advisable to find and eliminate the causes.

We use webmaster tools and draw conclusions

I will briefly cover only the key points needed for the topic of this publication. Let's start with Yandex Webmaster.

  • We check whether the TIC has fallen sharply or whether the pages have started to fly out.
  • We look at the diagnosis of problems on the site.

Now let's go to Google Search Console.

  • We look at the crawl errors.

  • We go to the "Search Appearance" and "HTML Improvements" sections; no problems should be found there.

  • We pay particular attention to "Search Traffic" and "Manual Actions".

  • We check "Security Issues".
  • We check usability for mobile devices.

It remains to draw conclusions. Were errors found at any of these steps? The presence of one or more shortcomings does not always mean a filter has been imposed; it depends on the type and "severity" of the error. Even so, such shortcomings will certainly not help the resource's promotion.

AGS filter: how to check the site and get out of sanctions?

This is one of the most serious filters. There are references to different varieties: AGS-17, AGS-30, AGS-40. The topic of their differences is certainly fascinating, but that is not the point now. To begin with, I will highlight the main reason why the AGS filter from Yandex lands on a site, with all the ensuing consequences.

Uselessness for users.

It may show up to a greater or lesser extent, but that changes little. So, how do we check?

You can use one of the services. For example, I will check this blog on one of them.

Everything is fine with my project. A check on the AGS showed that there were no sanctions.

What about the sites where the check did reveal its presence? First you need to identify the reasons.

  • Copied content or poor rewriting (retelling someone else's text in your own words).
  • A large number of pages with minimal semantic value.
  • Duplicate content created intentionally or resulting from incorrect site settings.
  • The presence of sections, or a large number of materials, unrelated to the project's main theme.
  • Content aimed first of all at search robots and at making a profit rather than at being useful to people.
  • Deception, when the title does not match the content.
  • Poor usability.

Since the AGS algorithm is not disclosed, the points above are based on personal observations and on studying the mass of publicly available information. The list may not be limited to these points, but they all boil down to the same idea.

If the site offers little benefit to people and tries to influence promotion artificially, it may well get the filter.

Based on the identified reasons, you should also look at how to get out of the AGS from Yandex. Of course, it is desirable to avoid it initially.

Filter Minusinsk

SEO links: a familiar concept? I will briefly explain. They usually point to the target site and are bought on intermediary resources with selected anchors (link texts). The goal is simple: to artificially influence the ranking of the site in search, that is, to raise its positions.

For such actions the site may fall under the Minusinsk filter from Yandex. The consequences are a decrease in visits and a drop in many positions in the organic search results.

If this happens, the site can be cured. How? At first glance nothing complicated: you need to remove the SEO links. The difficulty is that this is not always possible, since the intermediary projects usually belong to other people. There are other complications as well, depending on the particular case.

How to get out of Yandex and Google filters?

The answer is simple. It is necessary to use the "white" promotion and correct the mistakes of the past.

I will describe the possible ways of getting out of search engine filters.

We find and eliminate the cause ourselves.

Now you have powerful information at your disposal - this material. In some cases, it can help to check the selected site in detail for Google and Yandex filters, as well as carry out work to get out of them.

Also, take the time to study in detail the recommendations for webmasters that search engines provide. I just don't understand why so many people don't put such valuable information into practice.

We wait up to several months and get the results.

Bugs fixed. Now we wait until search algorithms react to changes. If you wish, you can also write to the support of the search engine if the waiting time is too long.

If everything was done correctly, then the filter can be removed. It remains only to rejoice. However, the situation is not always so rosy.

The reason has not been identified. The filter cannot be removed. What to do?

Yes, that happens too. If you want to normalize the situation, then there is only one way out - to contact professionals. I have never brought my sites to the state of applying filters, so I can’t advise proven specialists in their removal yet.

By the way, there are 2 powerful information materials. Perhaps, it is for your case that they contain information that will help you adjust your promotion strategy for the better.

  • To help webmasters who care about the quality of their projects, I provide my course on an integrated approach to promotion. Details at .
  • If you decide to work on the design and its convenience for people, then I give one more material. It is located here.

With this I will wrap up the publication. Finally, one thought: if a check of the site for all kinds of filters reveals AGS, I would seriously consider large-scale changes to the resource and even a change of domain. However, everyone decides for themselves how to act in such a situation.

Have questions, additions? Write comments. The topic is quite interesting and relevant. There is something to discuss.

By the way, are you following my show about promoting a blog from scratch? A number of thematic publications have already been collected.

Interested in earning money with your site, or even without one? Many good ways are described in previous publications.

I continue to prepare new materials. Subscribe to updates by e-mail or follow the announcements on social networks. All the best. Until next time.