What is Robots.txt?

A robots.txt file tells search engine crawlers which URLs they can or cannot access on a website. It's mainly used to prevent search engines from wasting crawl budget on unimportant pages. The robots.txt file doesn't, however, prevent webpages from being indexed: a disallowed URL can still appear in search results if other pages link to it. To keep a page out of the index, use a noindex directive instead.
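As a simple illustration, a robots.txt file placed at a site's root might look like this (the paths shown are hypothetical examples, not recommendations):

```
# Apply these rules to all crawlers
User-agent: *
# Block crawling of internal search results and cart pages (example paths)
Disallow: /search/
Disallow: /cart/
# Explicitly allow everything else
Allow: /

# Point crawlers to the XML sitemap (hypothetical URL)
Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` group declares which crawler the following `Disallow` and `Allow` rules apply to; `*` matches all crawlers.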

