This free robots.txt generator tool makes it easy to create a robots.txt file, which is used by search engines.
The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots or crawlers, most often search engine bots such as Google, Bing, or Yahoo, which pages on your site to crawl. It also tells web robots which pages not to crawl.
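For illustration, a minimal robots.txt using the protocol's core directives might look like this (the directory names are placeholders, not output from this tool):

```
User-agent: *
Disallow: /private/
Allow: /
```

Here `User-agent: *` addresses all crawlers, `Disallow` blocks the listed path, and `Allow` permits everything else.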
Select the All Robots allowed or not allowed option.
Select a crawl-delay option.
Enter the sitemap URL of your website.
Select the search robot options.
Enter the directories to restrict.
Click the Create Robots.txt button. This will generate the result in the output window.
Alternatively, click Create and Save as Robots.txt to download the file.
Review the generated output or file.
Create a 'robots.txt' file in the root directory of your site.
Copy the generated text and paste it into that file.
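Following the steps above, the generated file might resemble the sketch below (the sitemap URL, crawl-delay value, and blocked directory are hypothetical examples, not fixed output of the tool):

```
User-agent: *
Crawl-delay: 10
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```

Once this file is placed at the root of your domain (e.g. https://www.example.com/robots.txt), crawlers that honor the robots exclusion standard will read it before crawling your pages.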