The search-engine company Yahoo has released some useful tips and tricks on how to limit the sometimes aggressive behavior of its spider robot “Slurp”. Over the past months this particular robot has become one of the most annoying and invasive visitors to my Blogspirit account. I had already tried the suggested method of adding a “crawl delay” option to the meta tags, though the robot still gets past this small line of defense. Dave Simpson suggests enabling gzipped files and smart caching to help rein in the robot’s aggressiveness. When I tested my subdomain with Netcraft, it reported that the Apache server was already running mod_gzip/1.3.26.1a. Since most of the image files for the HTML layout and formatting reside on my own server, smart caching is of little concern. Overall, Slurp is already restricted in its behavior, but it remains very aggressive. I will try to analyse the whole situation in the near future, once I am done with the final tests.
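For the record, Yahoo documents the Crawl-delay rule as a robots.txt directive rather than a meta tag, which may be why my little line of defense keeps getting ignored. A minimal sketch of such a robots.txt might look like the following; the 60-second delay and the disallowed path are only illustrative values, not my actual settings:

    # Slow down Yahoo's Slurp specifically
    User-agent: Slurp
    Crawl-delay: 60

    # All other robots keep their normal pace,
    # but stay out of an example private directory
    User-agent: *
    Disallow: /private/

The file has to sit at the root of the (sub)domain as /robots.txt for the crawler to pick it up.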
Topics: Blogosphere. 12.02.2005, 09:55
Keywords: Bot, Robot, Search Engine, Suchmaschine, Tipp, Yahoo