Robots.txt Generator


[Generator interface: Allow All / Disallow All / Custom settings, bot selection, disallowed and allowed directories, sitemap URL, and a robots.txt preview.]

How does the Robots.txt Generator work?

The Robots.txt Generator lets you easily create a robots.txt file. You can set custom rules and quickly generate a ready-to-use file that you can upload to your server.
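
For example, a generated robots.txt file might look like the snippet below. The directories and sitemap URL are placeholders for illustration, not values suggested by the tool:

    User-agent: *
    Disallow: /admin/
    Allow: /admin/public/
    Sitemap: https://www.example.com/sitemap.xml

Here all crawlers are blocked from /admin/ except for the /admin/public/ subfolder, and the sitemap location is declared at the end.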

You don't have to be a developer to use the Robots.txt Generator. Follow a few simple steps and you're done: in less than a minute you'll have a customized robots.txt file that you can save and use on your site right away. It's suitable for beginners and experts alike.

Create a robots.txt file in 3 steps (see the sketch after the list):

  1. Fill in the directories that you want disallowed or allowed for the selected bot.
  2. Enter the full URL of your sitemap.xml file (leave it empty if you don't have a sitemap yet).
  3. Click the "Save to file" button and download your new robots.txt file!
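
As a rough idea of what such a generator does, here is a minimal Python sketch that assembles a robots.txt file from the same inputs the form asks for. It is illustrative only; the function name, parameters, and example paths are assumptions, not part of the tool:

    # Minimal illustration of assembling robots.txt content from the
    # generator's inputs: bot name, disallowed/allowed directories, sitemap URL.
    def build_robots_txt(user_agent="*", disallowed=None, allowed=None, sitemap_url=None):
        lines = [f"User-agent: {user_agent}"]
        for path in disallowed or []:
            lines.append(f"Disallow: {path}")
        for path in allowed or []:
            lines.append(f"Allow: {path}")
        if sitemap_url:
            # The Sitemap directive may appear anywhere in the file; it is placed last here.
            lines.append(f"Sitemap: {sitemap_url}")
        return "\n".join(lines) + "\n"

    # Example: disallow /tmp/ for all bots and point crawlers to a sitemap.
    content = build_robots_txt(
        disallowed=["/tmp/"],
        sitemap_url="https://www.example.com/sitemap.xml",
    )
    with open("robots.txt", "w", encoding="utf-8") as f:
        f.write(content)

Uploading the resulting robots.txt file to the root of your domain (e.g. https://www.example.com/robots.txt) makes it visible to crawlers.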

Robots.txt best practices

If you are looking for more information on how to customize your robots.txt file, read the Mastering Google's Crawl chapter from our Textbook. You can also check out Google's documentation on creating a robots.txt file.