Robots.txt Generator


Generator options:

  • Default - All Robots are:
  • Crawl-Delay:
  • Sitemap: (leave blank if you don't have one)
  • Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
  • Restricted Directories: the path is relative to root and must contain a trailing slash "/"



Now, create a 'robots.txt' file in your website's root directory, copy the generated text above, and paste it into that file.


About Robots.txt Generator

A robots.txt file is a plain text file placed in your website's root folder that tells search engines which parts of your site they may crawl. Google and other search engines use robots, or crawlers, to review your web content. You may not want the crawl to include some parts of your website in the search results; parts such as your admin pages can be listed in the file so they are completely ignored.

When crawling a website, search engines check the robots.txt file first and recognize which files and directories are blocked.
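For example, a minimal robots.txt that blocks all crawlers from a hypothetical /admin/ directory (the path is only an illustration; use your own restricted paths) could look like this:

    User-agent: *
    Disallow: /admin/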

Why Use a Robots.txt Generator

Many webmasters can attest to how much impact a robots.txt generator has had on their work. This free tool generates the file in no time and is very easy to use, and it lets you add or remove entries in the robots.txt file before you publish it.

With this tool, you can create a robots.txt file in a few steps:

  • Select the robots you wish to allow or deny access to your site
  • Set the crawl-delay, choosing between 5 and 120 seconds; this option determines how long a crawler should wait between requests to your site.
  • Paste your website's sitemap URL in the box if you already have one; otherwise, leave it blank.
  • Select the search robots you prefer to crawl your site from the list provided. Remember, you can block any robot you don't want on your site.
  • Restrict the directories you don't want crawled (each path is relative to root and must end with a trailing slash "/").
  • After generating the robots.txt file, upload it to your website's root directory (a sample of the generated output is sketched below).
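As a rough sketch of what the generated file might contain (the 10-second delay, the example.com sitemap URL, the /admin/ restriction, and the blocked Baiduspider agent are all placeholder choices, not required values), the output could look like this:

    User-agent: *
    Crawl-delay: 10
    Disallow: /admin/

    User-agent: Baiduspider
    Disallow: /

    Sitemap: https://www.example.com/sitemap.xml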

Still not convinced about how it works? Feel free to generate a sample robots.txt and review it before trying it out on your site.