Understanding Robots.txt

What is Robots.txt?

Robots.txt is a plain-text file that webmasters create to instruct web robots (typically search engine crawlers) which pages on their site to crawl and which to skip.

It is placed at the root of the site (e.g., https://example.com/robots.txt) and specifies which parts of the site should or shouldn't be crawled. Compliance is voluntary: well-behaved crawlers follow the rules, but robots.txt is not an access-control mechanism and will not stop crawlers that ignore it.
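
Below is a minimal sketch of how a polite crawler consults the file before fetching a page, using Python's standard-library urllib.robotparser; the site URL and crawler name are placeholders:

from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")  # the file always lives at the site root
robots.read()  # fetch and parse the live file

page = "https://example.com/private/report.html"
if robots.can_fetch("MyCrawler", page):
    print("allowed to crawl", page)
else:
    print("robots.txt disallows", page)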

How to Use Robots.txt

Save the file as robots.txt and upload it to the root of your domain so crawlers can find it at /robots.txt; it has no effect anywhere else. Each group of rules begins with a User-agent line naming the crawlers it applies to (* matches all of them), followed by Disallow and Allow directives listing path prefixes.

Example Robots.txt

# The * applies these rules to every crawler
User-agent: *
# Do not crawl anything under /private/
Disallow: /private/
# Crawling under /public/ is explicitly permitted
Allow: /public/
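
To see how these rules evaluate, the same standard-library parser can be fed the file's contents directly, with no network fetch; the paths below are illustrative:

from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
Allow: /public/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())  # parse the rules from a string

print(parser.can_fetch("*", "/private/page"))  # False: matches Disallow: /private/
print(parser.can_fetch("*", "/public/page"))   # True: matches Allow: /public/
print(parser.can_fetch("*", "/"))              # True: no rule matches, so allowed

One caveat: parsers resolve conflicting Allow and Disallow rules differently (Python's applies the first matching rule in file order, while Google's documented behavior prefers the most specific path), so keep rule groups simple where possible.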