What is Robots.txt?
Robots.txt is a text file webmasters create to instruct web robots (typically search engine crawlers) how to crawl pages on their website.
It is placed at the root of the site (e.g., https://example.com/robots.txt) and specifies which parts of the site should or shouldn't be crawled.
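For illustration, here is a minimal Python sketch that downloads a site's robots.txt from that root location. The domain example.com is a placeholder; substitute the site you want to inspect, assuming it actually serves a robots.txt file.

```python
from urllib import request

# robots.txt always lives at the root of the host,
# e.g. https://example.com/robots.txt (placeholder domain)
url = "https://example.com/robots.txt"

try:
    with request.urlopen(url) as resp:
        # Print the raw rules so you can read them directly
        print(resp.read().decode("utf-8"))
except Exception as exc:
    # The site may not publish a robots.txt, or the request may fail
    print(f"Could not fetch {url}: {exc}")
```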
How to Use Robots.txt
- Specify crawl rules for different crawlers.
- Discourage crawlers from accessing certain pages or directories (note that blocking crawling does not guarantee a page stays out of search indexes).
- Allow or disallow crawling of specific files or directories.
Example Robots.txt
User-agent: *
Disallow: /private/
Allow: /public/
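To see how a crawler would evaluate these rules, here is a short Python sketch using the standard library's urllib.robotparser. The user-agent name MyCrawler and the example.com URLs are made up for illustration.

```python
from urllib import robotparser

# The rules from the example robots.txt above
rules = """
User-agent: *
Disallow: /private/
Allow: /public/
""".strip().splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# /public/ is explicitly allowed, /private/ is disallowed
print(rp.can_fetch("MyCrawler", "https://example.com/public/page.html"))   # True
print(rp.can_fetch("MyCrawler", "https://example.com/private/page.html"))  # False
```

Well-behaved crawlers perform exactly this kind of check before requesting a page; robots.txt itself does not enforce anything.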