Yahoo’s Slurp-Bot: Official Tips

The search-engine company Yahoo released some useful tips and tricks on how to limit the sometimes aggressive behavior of its spider robot "Slurp". In recent months, this particular robot has become one of the most annoying and invasive robots hitting my Blogspirit account. I have already used the suggested method of setting a "crawl-delay" directive, though the robot still penetrates this small line of defense. Dave Simpson suggests enabling gzipped files and smart caching to help restrict the robot's aggressiveness. When I tested my subdomain with Netcraft, I was informed that its Apache installation was already using mod_gzip. Since most image files for the HTML layout and formatting reside on my own server, smart caching is of little concern. Overall, Slurp is already restricted in its behavior but still very aggressive. I will try to analyse the whole situation in the near future once I'm done with the final tests.
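For reference, the crawl-delay directive mentioned above is conventionally placed in the site's robots.txt file, which Yahoo's Slurp documented support for. A minimal sketch (the 10-second value is purely illustrative, not a recommendation from Yahoo) might look like this:

```text
# robots.txt — served from the site root, e.g. http://example.com/robots.txt
# "Slurp" is the user-agent token of Yahoo's crawler
User-agent: Slurp
# Ask the bot to wait the given number of seconds between requests
Crawl-delay: 10
```

Note that Crawl-delay was never part of the original robots exclusion standard, so crawlers other than Slurp may simply ignore it.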


My name is Mike Schnoor, and I present the frivolous internet user with a first-class, privately run media blog. The topics center on the media world: Web 2.0, weblogs, video on demand, TV, radio, print, media, marketing, and communication.

Anyone who disagrees can reach me quickly via the contact form, or may read on a little further about everything.

  • Online since: December 11, 2003
  • Number of posts: 2247
  • Number of comments: 6178
  • Number of categories: 28
  • Number of tags: 3227

© Copyright 1997-2008 by Mike Schnoor. All rights reserved. Telagon Sichelputzer is powered by WordPress: RSS Posts and RSS Comments