Crawl-Delay in Robots.txt

Somehow or other, I had never noticed the extensions to the robots.txt file. I’ve grown fed up with crawlers that hit my blog 45 times a minute, which can be quite debilitating when five different ones are all hammering me at the same time.

Although it seems the “official” keepers of the robots.txt pseudo-standard don’t acknowledge it, most crawlers now respect the Crawl-Delay: directive. I set mine to 30 (seconds) in my robots.txt, which I hope will prevent some of these bastards from slamming the hell out of my poor server.
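For anyone who wants to do the same, here is a minimal sketch of the change. The wildcard user-agent and the 30-second value match what I describe above; keep in mind the directive is non-standard, so support varies from crawler to crawler.

```
# robots.txt — ask well-behaved crawlers to wait 30 seconds
# between successive requests. Crawl-Delay is not part of the
# original robots.txt spec, so some crawlers will ignore it.
User-agent: *
Crawl-Delay: 30
```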

Published by dave

Dave Slusher is a blogger, podcaster, computer programmer, author, science fiction fan and father. Member of the Podcast Hall of Fame class of 2022.

One thought on “Crawl-Delay in Robots.txt”

  1. vlkn says:

    Setting a crawl-delay might limit the coverage and freshness of your content’s representation in search results.

    Set it to a maximum of 10.
