Changes in how Yandex processes the robots.txt file

Yandex has announced a change in how its search robot processes the robots.txt file.

Until now, an 'Allow' directive with no path specified was treated as a complete ban: the search robot skipped the site entirely, and the site was excluded from search results because it was inaccessible to the robot. Site owners who did not want to allow access and indexing relied on this behavior and deliberately left the directive empty.
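
For illustration, under the old rules a file like the following blocked the entire site from the Yandex robot (the 'User-agent: Yandex' group and the comment are illustrative, not from the announcement):

    User-agent: Yandex
    # Empty path: previously interpreted as "allow nothing", i.e. a full block
    Allow: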

From now on, the robot will ignore an empty 'Allow' directive. If you want to block the site or part of it from indexing, use the 'Disallow' directive instead: 'Disallow: *' or 'Disallow: /', as in the example below.
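
A minimal robots.txt that explicitly blocks all robots from the entire site now looks like this:

    User-agent: *
    Disallow: /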

Full information on using robots.txt for search engine optimization, including its syntax and the rules for allowing and blocking access, is available at webmaster.yandex.ru.

