Changes in processing the robots.txt file


Yandex has changed the way its robot processes the robots.txt file, which affects search engine optimization.

Until now, an 'Allow' directive with no path specified was treated as a complete block, so the search engine robot skipped the site. As a result, such a site was excluded from search results because it was inaccessible to the robot. At the same time, some site owners who did not want to allow access and indexing left this directive empty deliberately.
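For illustration, here is a minimal robots.txt that, under the old behavior, would have blocked the robot from the entire site because of the empty 'Allow' line (the 'User-agent' value here is just an example):

    User-agent: Yandex
    Allow: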

Now the robot will ignore a blank 'Allow' directive. If you want to block access to anything, use the 'Disallow' directive instead, for example 'Disallow: *' or 'Disallow: /'.
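A minimal sketch of a robots.txt that deliberately blocks all robots from the whole site under the new rules:

    User-agent: *
    Disallow: /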

All the information you need on using robots.txt for search engine optimization, including its syntax, capabilities, and restrictions, is available on webmaster.yandex.ru.

