The AGS-17 Yandex filter. Yandex filters. Automatic filter detection

Many sites today fall under Yandex's AGS-17 or AGS-30 filter. At the same time, many people build websites and cannot understand why such a filter is applied. I will describe methods with which you can not only deal with the filter but also prevent it from appearing.

Let's start with a brief excursion into history. This filter is over three years old. It worked earlier too, but the mechanisms were different, and very few sites fell under it. In 2009, when the number of sites in SAPE grew sharply, the updated AGS-17 appeared. It used more than 100 parameters to identify low-quality sites. As a rule, after it was applied, only 1-11 pages of a site remained in the index. Another characteristic feature was that category pages remained in the index.

In 2010, AGS-30 appeared: an improved mechanism for finding low-quality sites (the number of parameters grew, and the filter became self-learning) that demolished tens of thousands of sites.

Let's examine the reasons for applying the AGS-17/30 filter:

1) Low-quality, non-unique content; a small amount of text on the page

2) Duplicate content within the site

3) A large number of pages created for search engines, not people

4) A large number of external links from the site

5) Standard website templates

Let's look at each point separately.

What about the content? The AGS filter was designed primarily to combat non-unique content. At the time, content parsing was at its peak: using RSS parsing, sites with thousands of pages were generated and loaded into SAPE to sell links. The blow fell primarily on such sites. Many sites back then also contained very little content, for example generated stores with a picture and a short (and of course non-unique) description. Parsers of Yandex.Market, MarketGid, Torg.Mail and Ozon were the most popular tools for churning out junk sites (GS) for SAPE. I will also say that they paid for themselves very quickly, and demand for them was very good.

Regarding duplication within the site: back then, to increase the number of pages for sale in SAPE, tags, tag clouds and the like were created. Thus a site with 100 posts could have 500-600 pages. It was precisely with such pages that the struggle began. Sites on the well-known CMSs, WordPress, Joomla and DLE, were hit first of all. They have the largest number of duplicates among the CMSs known to me. After the filter began to be applied, SEO specialists finally started paying attention to robots.txt and began closing duplicates, profiles, and other CMS-generated pages from indexing.
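For illustration, here is a minimal robots.txt sketch of the kind that came into use on WordPress at the time. The paths assume a default WordPress install and should be adjusted to the specific site:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /tag/
    Disallow: /author/
    Disallow: /*/feed/
    Disallow: /*?replytocom=
    Host: example.com
    Sitemap: http://example.com/sitemap.xml

The idea is simply to keep tag archives, author archives, feeds, and comment-reply URLs (all duplicates of post content) out of the index; the Host directive is a Yandex-specific hint about the preferred mirror.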

The filter also hit quite a few sites selling links. As a rule, such sites had already been filtered out of the search results, and selling links was their only income. AGS mowed down such sites first, especially those selling many links per page.
As for templates: the Yandex search engine can reliably detect both a site's structure and its CMS, and site templates are no exception. Based on such criteria, sites with standard designs often fell under the filter.

At the moment, the number of pages left in the index varies greatly. If previously at most 11 pages remained, I have now seen the AGS leave 22 and even as many as 56 pages. So talking about any specific number is pointless. As a rule, this number depends on factors such as the uniqueness of the content and the number of categories.

How to deal with the AGS filter? What to do if Yandex has already applied the filter?

a) First, look at your robots.txt and close all duplicates from indexing. Take a closer look at the structure: is a very large block of text duplicated across the entire site? Try to remove it or close it from indexing.

b) Try changing the structure of the site. Enable human-readable URLs (CNC) on the site.

c) If many links point to the site's main page, buy links to internal pages as well. Buy a few good trusted links.

d) Add unique content; this greatly improves your chances of exiting the AGS filter.

e) If you have a GS (a junk site) rather than an SDL (a site made for people), I do not recommend writing to Yandex support (the "Platons"). After an assessor checks the site, you will receive a standard response and will, in effect, sign your site's death sentence yourself.

As a rule, it takes up to 2-3 months to get out of the filter. So far I have personally managed to pull out 4 of the 5 sites of mine that fell under AGS-30.

If the site has not come out of the filter after 2-3 months, then take it down, put a good-looking thematic site in its place, preferably a pseudo-company site, and write to the Platons. The chance of getting out increases significantly.

Thus, from my own experience I can say that AGS-30 can and should be fought.


For today's webmasters, AGS is not just the designation of a firearm. It is a real threat that any site may become persona non grata in the polite society of the Yandex search engine. Note that the AGS filter can catch both absolutely useless resources created to sell links and inflate metrics, and portals created solely for people's benefit.

  • What is an AGS filter?
  • How to keep its attention away from your resources?
  • Is there life after AGS?

You can try to find answers to all these questions in this article.

How the AGS appeared and developed

According to responsible employees of the technical department of the Yandex search engine, the AGS filter was introduced more than two years ago. However, webmasters felt its destructive impact only in September 2009. Therefore, we cannot speak with certainty about when the AGS filter appeared; let's start counting from the first visible manifestations of its work. In the fall of 2009, the AGS-17 version was launched. This filter performed an automated calculation, without human intervention, of about a hundred site parameters which together indicated that the site belonged to resources not useful to users.

Literally a few months later, the mechanism was improved and AGS-30 came onto the scene. These decisive measures were taken because the webmaster community had adapted to the existing conditions, worked out the AGS-17 algorithms, and was actively countering them.

It should be noted that practical observations suggest the AGS filter is not a permanently running mechanism. Most likely, it runs periodically, on a certain schedule. This adds complexity when trying to get sites out of the filter.

Benefits of Yandex from using the AGS filter

Many users ask a legitimate question:

Why does the Yandex search engine need the AGS filter?

After its launch, search results did not improve noticeably. Owners of sites made for people received no practical benefit; many of them complain that most of their pages fall under the filter and drop out of the index.

The answer, as always, lies on the surface. The Yandex search engine, with its growing popularity, badly needs to offload its physical capacity periodically. Keep in mind that for a search engine, every Internet page means space on a server's physical disk, robot crawling time, and many other cost items.

Therefore, Yandex needs the AGS filter first of all to relieve its technical capacity. During the filter's operation, several million pages have been removed from the search database, which has made the search robots' job easier. As for improved relevance of the results, there is little to speak of.

Candidates for the AGS filter award

Today, no one can say with certainty that your site will never fall under the AGS filter. Note that there are no clear-cut parameters by which the AGS selects sites that, in its view, do not deserve users' attention. There are only approximate indicators by which we can say, with some degree of confidence, that this or that page is a contender for the AGS "award."

    What you should pay attention to on your resource:
  • duplicate content;
  • non-unique content;
  • a large number of links from the page;
  • content that has no meaning;
  • generated content;
  • materials with fewer than 1500 characters.

These are the main parameters that, in a certain combination, trigger the AGS filter. The main declared task of the AGS filter is to clear search results of pages that are of no practical use to visitors. Based on this statement, we can assume that the following resources mainly fall under the AGS filter:
  • those with non-unique content;
  • pages with small text materials;
  • sites with a large amount of non-thematic content, including links leading to pages with non-thematic content.

It is not uncommon for portals that neither sell links nor post non-unique, low-quality content to fall under this filter. The reason for this phenomenon is the active theft of materials and their replication across the Internet.

    Often the AGS filter is triggered on sites with:
  • excessive linking of pages,
  • presence of unnecessary navigation blocks,
  • duplication of pages within one resource.

Five decisive factors for the operation of the AGS filter

There are more than a hundred factors that influence the operation of the AGS filter. However, all of them can be present to some degree on almost any resource, and this does not always lead to most pages falling out of the index. The explanation is simple: the AGS filter works on a cumulative points system. For each non-compliance of the site with the standards adopted by the Yandex search engine, a certain number of points is awarded. When the threshold value is reached, the resource is partially excluded from the index.

    Practical research has revealed a close relationship: when a certain set of factors is present together, the AGS filter is triggered almost inevitably:
  • the share of non-unique content on the pages exceeds 40 percent, with no links to the source;
  • the total number of outgoing links from the site's pages exceeds the number of indexed pages by 20 percent;
  • more than 30 percent of the outgoing links are non-thematic, i.e., they do not correspond to the content of the material on the page hosting them;
  • the text on a page is smaller in volume than the navigation surrounding it;
  • no specific thematic focus can be detected in the text of the site or in part of its pages.

These five factors, taken together, in the vast majority of cases lead to the fact that more than 90 percent of the site’s pages are no longer taken into account by the Yandex search robot and are no longer included in search results.
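To make the cumulative-points principle concrete, here is a toy sketch in Python. The checks mirror the five factors above, but the weights and the trigger threshold are purely illustrative assumptions; Yandex's real values and formula are not public.

    # Toy model of the cumulative-points principle described above.
    # Weights and the trigger threshold are illustrative guesses,
    # not Yandex's real (unpublished) values.
    from dataclasses import dataclass

    @dataclass
    class SiteStats:
        nonunique_share: float      # 0.0-1.0, share of non-unique text
        links_to_source: bool       # does copied text link to its source?
        outgoing_links: int         # total outgoing links from the site
        indexed_pages: int          # pages currently in the index
        offtopic_link_share: float  # share of non-thematic outgoing links
        text_shorter_than_nav: bool # page text smaller than navigation
        no_clear_topic: bool        # no detectable thematic focus

    def ags_score(s: SiteStats) -> int:
        score = 0
        if s.nonunique_share > 0.40 and not s.links_to_source:
            score += 1                                   # factor 1
        if s.outgoing_links > 1.20 * s.indexed_pages:
            score += 1                                   # factor 2
        if s.offtopic_link_share > 0.30:
            score += 1                                   # factor 3
        if s.text_shorter_than_nav:
            score += 1                                   # factor 4
        if s.no_clear_topic:
            score += 1                                   # factor 5
        return score

    THRESHOLD = 4  # assumed trip point: 4 of the 5 factors

    site = SiteStats(0.55, False, 600, 400, 0.35, True, False)
    print("likely filtered:", ags_score(site) >= THRESHOLD)

In this example the site trips four of the five checks, so the toy model flags it; the point is only that no single factor decides anything by itself.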

Practice of removing sites from the AGS filter

There is an established practice of removing sites from the AGS filter. If you are convinced that the robot acted on your resource incorrectly, we can recommend contacting the technical service of the Yandex search engine with a request to review the results. They will definitely answer, but it is not a fact that the answer will be informative or satisfying. The main problem of inexperienced webmasters is the inability to correctly determine whether AGS sanctions have been applied to their sites; even for an experienced optimizer this is not always easy.

    When the AGS filter is applied:
  • from 1 to 10 site pages remain in the search, regardless of their actual number;
  • dropped pages are not prohibited from being submitted through the special re-indexing form;
  • pages under the filter are still regularly visited by Yandex search robots.

These are the main signs by which the AGS filter is identified. Now let's try to figure out whether a site that has come under fire has a prospect of survival. Our practice includes a fair number of projects successfully removed from the AGS filter. The most important thing is to assess:
  • the practical value of the resource,
  • its domain name,
  • existing indicators,
  • its current state.

Understand that removing a site from the AGS filter costs time, mental effort, and money. All the myths that you can:
  • simply provoke the server into issuing an erroneous response;
  • move the site to a new domain name;
  • run it through social bookmarks;
  • perform many other fantastic actions, and the AGS filter will be lifted
remain just that: fairy tales.

There is only one effective way to eliminate the consequences of the sanctions imposed by the Yandex search engine: bringing the site into strict (or almost strict) compliance with Yandex's idea of a resource useful to the user.

    Therefore, work on removing a site from the AGS filter includes:
  • a detailed audit identifying all the reasons that triggered the filter;
  • eliminating these shortcomings;
  • improving the site structure where possible;
  • adding new materials;
  • increasing the link mass;
  • negotiating with the Yandex technical service to speed up re-indexing;
  • waiting for results, which can take from several weeks to several months.

Hello friends. This is probably the end of my webmaster career. Today I went into the Yandex.Webmaster statistics and discovered that of 698 pages only 18 are left in the index.

As I understand it, my site has fallen under a Yandex filter? What should I do, how can I get it out of there? Save me, help!

The most interesting thing is that I don't have a single copy-pasted page; everything is pure unique content. My blog is a real SDL, and yet here's such a bummer. 🙁

I want to believe that this is some kind of glitch from Yandex, but I think it’s unlikely.

What I did wrong on the site:

- sold 80 links on an exchange;
- created the website;
- many identical pages (did not create a robots.txt file);
- left links to my site in various profiles (from a profile base).

You know what the paradox is? I have another site that is not for people at all, nothing but continuous copy-paste, and it hasn't dropped out of the Yandex index for a year!

It's no secret that any site that appears on search pages is optimized for a specific search engine.

Since domestic information resources are developed mainly for Yandex, thanks to its high degree of adaptation to the Russian language, this article discusses Yandex filters and ways around them.

Experts identify 4 types of filters:

  • Pre-filters (applied before site relevance is calculated);
  • Post-filters (applied after relevance is calculated);
  • Filtering before serving results (relevance is calculated, but the site drops out of the results for several queries);
  • Ban (complete blocking of the resource in search).

The detailed classification is as follows:

First. The over-optimization filter is designed to rid the search results of pages stuffed with keywords that in no way correspond to the actual content of the resource.

To bypass this filter, you need to eliminate spammy structures, remove unnecessary h1-h5 headings and superfluous links from internal pages to the main page, and be sure to analyze the relationship between keywords and content (a rough self-check is sketched below).
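As a rough self-check against this filter, one can count headings and estimate keyword density on a page. A sketch, where the 3% warning level and single-word keyword matching are my own simplifying assumptions:

    # Rough over-optimization self-check: h1-h5 count and keyword density.
    # The 3% warning level is an illustrative assumption, not a Yandex rule.
    import re
    from html.parser import HTMLParser

    class PageStats(HTMLParser):
        def __init__(self):
            super().__init__()
            self.headings = 0   # number of h1-h5 tags seen
            self.text = []      # text fragments (crudely: script/style too)

        def handle_starttag(self, tag, attrs):
            if tag in ("h1", "h2", "h3", "h4", "h5"):
                self.headings += 1

        def handle_data(self, data):
            self.text.append(data)

    def check(html: str, keyword: str) -> None:
        parser = PageStats()
        parser.feed(html)
        words = re.findall(r"\w+", " ".join(parser.text).lower())
        density = words.count(keyword.lower()) / max(len(words), 1)
        print("h1-h5 headings:", parser.headings)
        print(f"density of '{keyword}': {density:.1%}",
              "<- looks spammy" if density > 0.03 else "")

    with open("page.html", encoding="utf-8") as f:
        check(f.read(), "windows")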

Second. AGS removes sites with low-quality content from the Yandex index, at best leaving a few pages in it.

The reason a resource falls under the jurisdiction of this filter may lie in a free domain, free hosting, or the site engine.

Getting rid of its influence is not so easy. The best insurance is unique thematic content and good .

Third. Affiliation. Designed to exclude from search results multiple information resources of one owner competing for the same queries.

If this is a mistake, and the site owner has no other resources targeting identical queries, then write directly to Yandex support and explain the situation.

If the owner really is the same, then both resources should be thoroughly shaken up, both externally (design) and internally (diversify the product range and the articles). The best way is to change the domain or the owner.

Fourth. A filter for identical fragments, whose purpose is to keep sites with identical snippets out of the search results. Maximally unique content will help you avoid this filter, i.e., no copied descriptions of goods, services, and so on.
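A practical way to catch identical fragments on your own site is to compare titles and meta descriptions across pages. A sketch, assuming the pages have already been downloaded (the fetching code is omitted for brevity):

    # Group pages by (title, meta description) to find identical snippets.
    from collections import defaultdict
    from html.parser import HTMLParser

    class MetaGrabber(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
            self.description = ""

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self.in_title = True
            elif tag == "meta" and attrs.get("name", "").lower() == "description":
                self.description = (attrs.get("content") or "").strip()

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data.strip()

    def find_duplicates(pages):
        """pages: dict mapping URL -> raw HTML of that page."""
        groups = defaultdict(list)
        for url, html in pages.items():
            grabber = MetaGrabber()
            grabber.feed(html)
            groups[(grabber.title, grabber.description)].append(url)
        # keep only groups where several URLs share one snippet
        return {key: urls for key, urls in groups.items() if len(urls) > 1}

Any group returned by find_duplicates is a set of pages that will likely produce identical snippets in the results.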

Fifth. The adult-content filter. It exists to keep adult sites out of the results for regular queries. The prescribed treatment is removing all adult-related elements, after which you should write to Yandex technical support.

Sixth. A temporary filter for commercial queries. It serves to reduce the weight of optimized links in ranking.

Seventh. Pessimization of a site for "bad links." It serves to push out of the results low-quality sites promoted through links. It is recommended to remove all such links at once, and then slowly work on rebuilding the link profile.

P.S. I successfully beat the Yandex AGS filter; read the detailed report in the article “”, although it took me 11 months.

The strange, incomprehensible, frightening abbreviation AGS, found in SEO blogs and forums with the indexes 17 and 30, is just one component of the Yandex search algorithm, activated and successfully used in recent years.

It is an independent, self-adjusting, self-learning process that plays by its own rules; many optimizers compare it to a virus, saying their sites are "infected with AGS."

What's happening?

At one not-so-wonderful moment, you discover that only 1-15 pages of your resource remain in the Yandex index (and there used to be hundreds to several thousand!). As a result, your positions head confidently down and disappear somewhere in the abyss of the third hundred of the results. To cries for help, Yandex support gives a standard answer: "don't worry, the resource is in the search, and the fact that not all pages are there is a matter of time. Improve it, and the robot will add pages to the index."

It doesn't look like a ban: the site didn't drop out completely, and the Platons' letter isn't particularly harsh. Congratulations, you have fallen under the AGS.
How does it work?

AGS pays attention to the quality of content; the determining factor here is the uniqueness of the text. Bad content alone is a good enough reason for the AGS to throw "low-quality" pages out of the index. If the AGS finds shortcomings in the content, it starts working the site over by other criteria. For example, the AGS is calm about the mere presence of sold links, but pays great attention to how those sold SEO links influence search results. The AGS will throw out your links if, from Yandex's point of view, they distort the results. As mentioned above, the AGS looks not for the presence or number of "bad" links, but solely at their influence. And this is a flaw in the AGS: if you received a "black mark" only because of a bad impact on the search results, then any action you take, even removing those links, will be useless. You will be forced to wait passively for the AGS to "replace anger with mercy" and for the disgrace to pass on its own; unfortunately, this can take many months. True, your site will then return to the index, but after a while the AGS will block it again, for some other reason: having infected a site once, it no longer forgets it and will find fault again and again.

It is important to know

AGS (modifications: AGS-17 and AGS-30) is a filter that excludes from the search engine index sites, as well as individual pages, created for search spam; as a rule, it is applied to satellite sites. According to the developers, the filter takes more than 100 parameters into account.

Removing the site from the filter

The cost of this service depends on which filter your site has fallen under: a manual or an automatic one. The greatest effort goes into lifting Google's Penguin and Panda sanctions, as well as the over-optimization filter and the Yandex AGS. Pricing is also affected by the overall "weight" of your site: the larger the site, the more effort it takes to bring it back.

Cost from 55,000 rubles

How to stay safe?

The Yandex site for webmasters lists almost all the preventive measures against AGS infection. Let's derive a few basic rules.

Use unique content (order it from exchanges, write it yourself, or rewrite feature articles from other resources beyond recognition; the main thing is that it is readable);

Use a complex tree structure for the resource (if your satellite has a large number of categories and pages, even people may mistake it for an SDL);

Under no circumstances link from the main page to all pages (better: link to categories, from which the links spread further);

Article length: from 600-1000 characters;

Do not use pagination-style page addresses (that is, “non-speaking” URLs made of numbers);

Have fewer virtual pages: if a page address ends in .html (.htm), the robot will treat it as a static HTML page rather than a virtual one; and if you use something like mod_rewrite, which converts the engine's standard addresses into human-readable ones, the robot simply does not care (a typical rewrite rule is sketched after these rules);

Use robots.txt wisely;

Do internal linking of pages (contextual links from one page to another; post announcements, if it is a blog; links to older posts and to "interesting" items that may appeal to the reader of a particular page).
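For reference, here is a typical mod_rewrite rule of the kind meant above: it maps a "speaking" .html address onto the engine's internal query-string URL. The directory and parameter names are hypothetical:

    # .htaccess: serve /articles/my-post.html via index.php?page=my-post
    # (directory and parameter names are made up for illustration)
    RewriteEngine On
    RewriteRule ^articles/([a-z0-9-]+)\.html$ index.php?page=$1 [L,QSA]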

First of all, calm down. There is not a single problem for which the resourceful mind of an optimizer cannot find a solution.

  • So, the first method of treatment is called "soft"; although it takes a lot of time, it is absolutely safe and will not harm the resource. The idea is to wait until the site begins returning to the index, and at that exact moment post a large amount of new, unique, high-quality content amounting to at least 10% of the volume of all articles, better more. After such treatment, the AGS usually leaves the site alone and does not return to it. Note that this soft method only works if it is applied precisely at the moment of return to the index; it keeps its power only for a very limited time. If you do this later, the effect decreases significantly; earlier, it is even more useless. After all, as we have already said, the indexing robot and the AGS are mutually independent and work completely out of step. So if you carry out this treatment before the return to the index, you will get zero result: even if the indexer regularly finds new pages, the filter imposed by the AGS will keep functioning until the quarantine period ends.
  • The second method of treatment is considered "radical," as it cures the site in the shortest possible time, but be prepared to suffer some losses. It consists of changing the addresses of all pages at once. Since you are already out of the index, this radical measure will in no way harm the site in Yandex's eyes. After this, you are guaranteed to return to the index after several updates of the search database, which is about a month. Ask your programmer how to change the addresses of all pages. But before these manipulations, you absolutely must eliminate everything that led to the AGS infection in the first place. And when rehabilitation in the index begins, treat the site with the gentle method described above. Unfortunately, this radical method can only be used when you need indexing specifically in Yandex, since other search engines, such as Google and Yahoo!, may simply not understand why the addresses changed. In addition, external links used to lead to the site's pages, and now those same links will lead visitors nowhere.
  • Another method of treatment is even more radical and is used when you cannot change the addresses of your pages or you need indexing in all search engines. The idea is to simulate a technical error on the site: the web server, along with each page, sends the browser some failure code, for example 404 Not Found, instead of the standard response code (200 OK). Browsers will still display these pages to visitors, who will not even realize that the server's response code is an error. But search engines, on receiving such a code, will stop loading your pages and register an error. In the end, after some time, the number of loaded pages will drop to zero (you can see this in the webmaster panel), and the AGS will gradually let go. After that, you can safely return the 200 OK code, and everything will be indexed from scratch. This method is dangerous, first, because the site will drop out of the Google and Yahoo! indexes, even if only briefly. Second, Yandex has another robot that "closes" resources based on the unavailability criterion. True, it is a fairly leisurely robot, and to be banned by this criterion a resource must stay unavailable for quite a long time. Nevertheless, the danger of a complete ban exists.

But you have to understand: if everyone, or most webmasters, start using this method to treat their sites, search engines will most likely learn to detect this deception (and it is a deception) and adjust their algorithms. With that in mind, we advise using this method only in extreme cases.
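Purely to illustrate the mechanism of the third method (not a recommendation), here is a minimal sketch on Python's standard WSGI server; a real site would do this in the web server or CMS configuration:

    # Illustration only: a toggle that makes the server answer 404 instead
    # of 200 while "quarantining" the site from the index, as described above.
    from wsgiref.simple_server import make_server

    SIMULATE_ERROR = True  # flip back to False to return 200 OK again

    def app(environ, start_response):
        body = b"<html><body>Page content as usual</body></html>"
        if SIMULATE_ERROR:
            # search engines see an error; browsers still render the body
            start_response("404 Not Found", [("Content-Type", "text/html")])
        else:
            start_response("200 OK", [("Content-Type", "text/html")])
        return [body]

    make_server("", 8000, app).serve_forever()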

The site lives and develops little by little, and then, bam! The traffic graph drops sharply, and the positions won with sweat and blood fly far out of the TOP 10. There can be several reasons, but today we will talk about only one of them: the AGS filter.

First, let's figure out what AGS is and how it decodes. As is known, in the SEO community, sites with poor design, over-optimized and/or non-unique content (read about the importance of unique texts), hidden text, lots of advertising and external links, and other signs of low quality and uselessness are called by the cacophonous name "shit sites." So the automatic Yandex filter aimed at removing such resources from the search results was called "Anti-GovnoSite," or AGS for short.

Development of the AGS filter

Optimizers first learned about AGS in 2009, although the filter began operating even earlier. Since then, three of its versions have replaced one another:

  • AGS-17. It operated until 2009, which became known only after its replacement by AGS-30. The filter was applied to sites that bought and sold links in large quantities, to resources with stolen or generated content, and to sites with template design and low-quality layout. It also caught sites on recently registered domains.
  • AGS-30. This filter was used by Yandex from 2009 to 2013. It was an improved version of the AGS-17 with a significantly expanded base of factors (about 100).
  • AGS-40. In operation from 2013 to the present day. Yandex's AGS-40 filter pursues the same goal as its predecessors: excluding sites of little use from the results. However, it differs from AGS-17 and AGS-30 in a more advanced method of assessing resources and an even wider list of trigger factors.

Since AGS-40 is the Yandex filter relevant today, it is the one we will discuss below.

Who is at risk?

There are more than a hundred factors by which sites are assessed for AGS; the full list is known only to Yandex itself. But there are a number of key characteristics whose presence sharply increases the likelihood that a resource will receive the GS label. Let's briefly list them:

  • a young domain name;
  • parking on free hosting or using a nulled (pirated) paid CMS;
  • low traffic (a few people per day) and the absence of a core of regular audience. There is an opinion that sites with a large number of unique visitors per day are less susceptible to the AGS filter;
  • an excessive number of external and internal links, including overly active interlinking designed only for search robots;
  • placement of paid links and aggressive advertising (pop-up banners, teasers with shocking content, etc.);
  • low-quality content (non-unique, generated, duplicated, of little use to users, written solely to target keywords, etc.);
  • resources created solely for earning money or for promoting other resources.

Of course, these are not all the reasons why a site may fall under the AGS, but they are the ones that most often lead to the application of a filter.

How to check the AGS filter?

You suspect that your site has fallen under the AGS. How do you confirm or dispel these doubts? First, check the site's indexing. To do this, open the Yandex.Webmaster panel, select the "Site Indexing" tab, and then the "History" section. If the indexing graph is heading down, this is a serious reason to suspect the AGS filter and continue checking. And if only the home page and a few internal pages are left in the index, that is a direct indication of sanctions from the search engine.
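As a quick supplement to the panel check, you can count the URLs in your sitemap and compare that number by hand with the indexed count Yandex.Webmaster reports. A sketch, assuming the site exposes a standard sitemap.xml at its root:

    # Count URLs in a standard sitemap.xml to compare against the number
    # of pages Yandex.Webmaster reports as indexed.
    import urllib.request
    import xml.etree.ElementTree as ET

    def sitemap_url_count(site: str) -> int:
        with urllib.request.urlopen(f"{site}/sitemap.xml") as resp:
            tree = ET.parse(resp)
        ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
        return len(tree.findall(".//sm:loc", ns))

    total = sitemap_url_count("http://example.com")
    print(f"sitemap lists {total} pages; compare with the Webmaster figure")

If the sitemap lists hundreds of pages while Webmaster shows only a handful indexed, the gap itself is the red flag.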

Since April 21, 2014, AGS-40 does not remove sites from the search results, but resets the TIC.

The first impulse of any webmaster who suspects his site has fallen under the AGS is to write to the Yandex support service. Of course, you can do this. But in most cases, the answer will be standard: "Develop your site, and it will appear in the search results." It is much more effective to check for the AGS filter with special tools, for example the Xtool.ru service.

If the check shows a "red light" in the relevant column, you need to take measures to get the site out from under the Yandex sanctions.

But it is more convenient to analyze sites with a more serious tool that is always at hand: the free RDS Bar plugin for Google Chrome and Mozilla Firefox. The plugin will show whether the site's TIC has been reset, along with plenty of other useful information: PageRank, indexing, number of links, presence in directories, and much more.

If you want to check a large selection of resources, you can use the free tool "AGS Checker".

How to remove a site from the AGS filter?

Let's say right away that pulling a site out of the AGS is not an easy task even for real professionals. Sometimes pages can be returned to the index within a few months, and sometimes it takes several years. Some SEO specialists even believe that a site that has fallen under the filter should not be revived at all, and that it is better to create a new one instead.

We do not think the idea of removing a site from the AGS is futile. It is at least worth a try. To do so, use the following instructions:

Step 1 – get the site in order. To get out from under the filter, you need to eliminate the problems that caused it to be applied. First, audit the site yourself or with professional help, then identify and eliminate the most glaring errors, such as placement on free hosting, duplicate pages, huge tag clouds, non-unique design, and so on. After that, focus on making your resource genuinely useful. Here are some tips on how to achieve this:

  • review the structure and concept of the site. An unclear structure makes any project, even the most unique and interesting one, useless. What is the point of original articles and talented photos if the user simply cannot find them among dozens or hundreds of assorted materials? So, to escape the filter, structure the site and sort the content into separate categories. If some categories contain only 1-2 items, it is better to delete them altogether: Yandex loves narrowly themed sites and considers them useful, unlike "everything about everything" resources with dozens of half-empty subsections;
  • add useful services to the site. Text content, photos, and videos are of course good. But a modern user wants more: not only to see a table of prices for goods, but also to calculate the cost of delivery to his region; not only to read the list of services, but also to clarify with a consultant how they are provided. The site should therefore offer various technical services: a calculator, online chat, search and product-comparison forms, etc.;
  • work with the existing content. Yandex, like Internet users, does not like huge "sheets" of text, poor-quality photos, broken videos, and "crooked" tables. Review all your content for clear structure, workability, and ease of perception. At the same time, check its uniqueness: even if you personally wrote or commissioned the texts and took the photos and videos, it is not a fact that they were not stolen from you while the site existed (a rough self-check is sketched after this list);
  • work with behavioral factors. Let's say right away that ordering behavioral-factor (PF) manipulation on various exchanges to fake an improvement is not worth it; on the contrary, it will lead to new sanctions (the filter for PF manipulation is no less severe). It is much more correct to analyze data from Yandex.Metrica's Webvisor to identify the site's weak points, order usability testing, and make improvements based on the results.
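The uniqueness self-check mentioned in the list above might look like this. Comparing your text with a suspected copy via word shingles is a standard approximation of how duplicate detectors work, though the shingle length of 5 is my own assumption:

    # Estimate how much two texts overlap using word shingles
    # (shingle length 5 is an arbitrary choice for illustration).
    import re

    def shingles(text: str, k: int = 5) -> set:
        words = re.findall(r"\w+", text.lower())
        return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

    def similarity(a: str, b: str) -> float:
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / max(len(sa | sb), 1)  # Jaccard index

    with open("my_article.txt", encoding="utf-8") as f1, \
         open("suspect_copy.txt", encoding="utf-8") as f2:
        print(f"overlap: {similarity(f1.read(), f2.read()):.0%}")

A high overlap between your page and someone else's copy is exactly the situation where it is worth proving authorship (for example, via Yandex's original-texts tool) before asking for the filter to be lifted.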

After working on the site, analyze it again. If it seems to you that all the problems have been resolved, you can move on to the second part of the plan to return to Yandex search results.

Step 2 – write to Yandex. In the letter to the search engine's support service, describe why you believe the AGS filter should be lifted from your site. List in detail all the improvements made, point out user activity, new useful services, and so on. For clarity, you can attach screenshots from statistics systems. There is no point in embellishing: assessors will check the site manually and easily see through any trick.

How long should you wait for a response from Yandex? It is individual: some get one right away, others wait a month. Don't expect the site to return to the index immediately; this process definitely won't be quick. With high probability, the "Platons" will again send a standard reply: "Work on the site...". In that case, you will have to comb through the entire resource again and write to Yandex again. And so on, until you finally win.

No strength, time, or desire left to fight this long for a site under the Yandex AGS filter? Then it is better to register a new domain and move the idea of the resource to it. But first work on the quality of the site, otherwise it will quickly end up on Yandex's blacklist under its new name too.