Changes in processing the robots.txt file
Yandex has changed how its search robot processes the robots.txt file, and the change affects search engine optimization.
Until now, an 'Allow' directive with no path specified was treated as a block: the search robot skipped the site, and as a result the site was excluded from search results because it was inaccessible to the robot. Site owners who did not want to allow access and indexing relied on this and deliberately left the directive empty.
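For illustration, a minimal hypothetical robots.txt like the one below would have blocked the entire site under the old behavior, because the 'Allow' line specifies no path:

    User-agent: *
    Allow: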
From now on, the robot will ignore a blank 'Allow' directive. If you want to block access, use the 'Disallow' directive instead, for example 'Disallow: *' or 'Disallow: /'.
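For example, to deliberately block robots from the entire site under the new rules, the file above would be rewritten as:

    User-agent: *
    Disallow: /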
You can find all the necessary information on using robots.txt for search engine optimization, including its syntax, capabilities, and restrictions, at webmaster.yandex.ru.
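If you want to sanity-check how a given robots.txt file will be interpreted, one option is a quick sketch with Python's standard urllib.robotparser. Note that this implements the common robots.txt convention rather than Yandex-specific behavior, and the domain used here is purely illustrative:

    from urllib.robotparser import RobotFileParser

    # Parse a robots.txt that blocks the whole site with 'Disallow: /'.
    rp = RobotFileParser()
    rp.parse([
        "User-agent: *",
        "Disallow: /",
    ])

    # can_fetch() reports whether the given user agent may crawl the URL.
    print(rp.can_fetch("*", "https://example.com/page"))  # -> False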