Robots.txt Generator

Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)

Search Robots:
  Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch

Restricted Directories: (the path is relative to the root and must contain a trailing slash "/")

Now, create a 'robots.txt' file in your site's root directory, then copy the text above and paste it into that file.


About Robots.txt Generator

This free robots.txt generator tool makes it easy to create the robots.txt file that search engines use when crawling your site.

What Is a Robots.txt File?

The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (crawlers), most often search engines such as Google, Bing, or Yahoo, which pages on your site to crawl. It also tells web robots which pages not to crawl.
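For illustration, a minimal robots.txt might look like the following; example.com and /admin/ are placeholders:

    User-agent: *
    Disallow: /admin/

    Sitemap: https://www.example.com/sitemap.xml

Here every crawler is told not to fetch anything under /admin/, the rest of the site stays crawlable, and the Sitemap line points crawlers to the sitemap.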

How to Use

Select whether all robots are allowed or not allowed by default.
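In a robots.txt file, allowing all robots is written as an empty Disallow rule, while refusing them all disallows the entire site:

    # all robots allowed
    User-agent: *
    Disallow:

    # all robots not allowed
    User-agent: *
    Disallow: /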

Select a crawl-delay option, if needed.
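A crawl-delay asks a crawler to pause between requests; it is honored by some engines (for example Bing and Yandex) but ignored by Google. For instance, a 10-second delay is written as:

    User-agent: *
    Crawl-delay: 10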

Enter the sitemap URL of your website.
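The sitemap is referenced by its absolute URL, for example (placeholder domain):

    Sitemap: https://www.example.com/sitemap.xml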

Select options for the individual search robots.
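Per-robot choices become separate User-agent blocks. For example, to refuse Google Image while leaving the main Google crawler unrestricted (Googlebot-Image and Googlebot are the respective user-agent tokens):

    User-agent: Googlebot-Image
    Disallow: /

    User-agent: Googlebot
    Disallow: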

Enter the directories you want to restrict.
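Each restricted directory becomes a Disallow rule with the required trailing slash, for example (illustrative paths):

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/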

Click the Create Robots.txt button. This generates the robots.txt text in the output window.
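Putting the options above together, the generated output might look something like this (all values are illustrative):

    User-agent: Googlebot-Image
    Disallow: /

    User-agent: *
    Crawl-delay: 10
    Disallow: /cgi-bin/
    Disallow: /tmp/

    Sitemap: https://www.example.com/sitemap.xml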

Alternatively, click Create and Save as Robots.txt to save the file directly.

Review the generated text or the saved file.

Create a 'robots.txt' file in your site's root directory.

Copy the above text and paste it into the text file.
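Once saved, the file must be reachable at the root of your domain, for example https://www.example.com/robots.txt (placeholder domain); crawlers only look for it at that location.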