
Robots.txt Generator

Generate robots.txt files for search engines


How it works

Build a robots.txt file to control crawler access:

  • User-agent — which crawler the rules apply to (* matches all crawlers)
  • Allow/Disallow — permit or block crawling of specific paths
  • Sitemap — tell crawlers where to find your sitemap
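As a sketch, the three directives above combine into a file like the one below (the domain and paths are hypothetical). Python's standard urllib.robotparser can check what such a file permits:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block all crawlers from /admin/,
# allow everything else, and advertise a sitemap.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("*", "https://example.com/admin/settings"))  # False (blocked)
print(parser.can_fetch("*", "https://example.com/blog/post"))       # True (allowed)
print(parser.site_maps())  # ['https://example.com/sitemap.xml']
```

Rules are matched per path prefix, so Disallow: /admin/ blocks everything under that directory while Allow: / leaves the rest of the site crawlable.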

Want to learn more?

Read the complete guide with examples and tips
