What is Robots.txt?

A robots.txt file tells search engine crawlers which URLs they can and cannot access on a website. It’s mainly used to keep search engines from wasting crawl budget on unimportant pages. The robots.txt file doesn’t, however, prevent webpages from being indexed: a URL blocked in robots.txt can still appear in search results if other pages link to it.

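To make the rules concrete, here is a minimal sketch of how a crawler evaluates robots.txt directives, using Python’s built-in urllib.robotparser. The user agent, the Disallow paths, and the example.com URLs are illustrative placeholders, not rules from any real site.

```python
# Minimal sketch: how a crawler interprets robots.txt rules, using
# Python's standard-library parser. The user agent, paths, and domain
# below are illustrative placeholders.
from urllib.robotparser import RobotFileParser

# A typical robots.txt: block low-value sections for every crawler,
# but leave the rest of the site open. On a live site this file
# sits at the root of the domain, e.g. https://www.example.com/robots.txt
sample_robots_txt = """
User-agent: *
Disallow: /cart/
Disallow: /internal-search/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(sample_robots_txt)

# can_fetch() answers the question a crawler asks before requesting a URL.
print(parser.can_fetch("Googlebot", "https://www.example.com/cart/checkout"))    # False
print(parser.can_fetch("Googlebot", "https://www.example.com/blog/robots-txt"))  # True
```

The same parser can also read a live file: calling set_url() with the robots.txt address and then read() fetches and parses it before you query can_fetch().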
