Robots.txt generator

By default all robots are:

Crawl delay

Sitemap URL. Leave blank for none.

Robots allowed to crawl your site:

Restricted directories or files. Paths are relative to the root; add a trailing slash for directories.

Your Robots.txt

The robots.txt file on a website tells search engine spiders which files or directories to skip when crawling the site.

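To see how crawlers apply these rules, here is a minimal sketch (in Python, using the standard urllib.robotparser module) of a well-behaved spider checking robots.txt before fetching a page. The domain and paths are placeholders, not rules from this page:

from urllib.robotparser import RobotFileParser

# Placeholder site; substitute the site whose robots.txt you want to check.
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # download and parse the site's robots.txt

# can_fetch() returns True only if the named user agent may crawl the given URL.
print(rp.can_fetch("Googlebot", "https://example.com/private/"))
print(rp.can_fetch("*", "https://example.com/index.html"))
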
The Robots.txt generator makes it easier to create a robots.txt file: select which search engines are allowed, set the crawl delay time, enter the sitemap URL, and choose the robots to allow, then get your code. The generated code looks like the example below:

User-agent: *
Disallow: /

User-agent: ia_archiver
Disallow: 
User-agent: Slurp
Disallow: 
User-agent: Aport
Disallow: 
User-agent: teoma
Disallow: 
User-agent: baiduspider
Disallow: 
User-agent: twiceler
Disallow: 
User-agent: robozilla
Disallow: 
User-agent: gigabot
Disallow: 
User-agent: Googlebot
Disallow: 
User-agent: googlebot-image
Disallow: 
User-agent: googlebot-mobile
Disallow: 
User-agent: Mail.Ru
Disallow: 
User-agent: msnbot
Disallow: 
User-agent: psbot
Disallow: 
User-agent: naverbot
Disallow: 
User-agent: yeti
Disallow: 
User-agent: nutch
Disallow: 
User-agent: StackRambler
Disallow: 
User-agent: scrubby
Disallow: 
User-agent: asterias
Disallow: 
User-agent: yahoo-slurp
Disallow: 
User-agent: Yandex
Disallow: 
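
Rules in this shape can also be assembled programmatically. Below is a minimal sketch, not the generator's actual code, that builds a robots.txt the same way: disallow everything by default, then allow the selected robots, with an optional crawl delay and sitemap URL. The robot names and values here are example assumptions:

# Example selections; adjust to taste.
allowed_robots = ["Googlebot", "Slurp", "baiduspider"]
crawl_delay = 10          # seconds; use None to omit
sitemap_url = ""          # leave blank for none

def build_robots_txt(allowed, delay=None, sitemap=""):
    # Block every robot by default, mirroring the "User-agent: * / Disallow: /" record above.
    lines = ["User-agent: *", "Disallow: /", ""]
    for robot in allowed:
        lines.append("User-agent: " + robot)
        if delay:
            lines.append("Crawl-delay: " + str(delay))
        lines.append("Disallow: ")
    if sitemap:
        lines.append("")
        lines.append("Sitemap: " + sitemap)
    return "\n".join(lines)

print(build_robots_txt(allowed_robots, crawl_delay, sitemap_url))
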
Thinkcalculator.com provides helpful and handy calculator resources.