Yahoo! Slurp – Please do what Google is doing

Matt Cutts posted this on his blog, and it’s pretty much exactly what I remember him explaining in a session at Pubcon. It also explains why I’ve been seeing less of Googlebot at my sites recently. The short version (see his site for the long version) is that instead of Googlebot, Newsbot, and Mediabot (AdSense) each crawling your websites separately, Google now runs something called a “Crawl Caching Proxy” that lets all of the bots share the same fetched content, reducing the number of times Google needs to visit your site. So if Mediabot has just crawled a page, Googlebot doesn’t need to fetch it again right away. The benefit? Fewer crawls should save both you and Google on bandwidth, and probably save Google some processing power too. I’ve already noticed the difference, and I used to see the Googlehump a lot on my sites.

Have a look at this site if you want to see the Slurphump. I have it set up to identify each of the major bots as well as registered members, and over the past few months I’ve been seeing Slurp at a ratio of ten to one over registered users. I hope Yahoo! is working on a more efficient bot.
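To make the idea concrete, here’s a minimal sketch of how a crawl caching proxy could work. This is just my illustration, not Google’s actual design: the class name, the in-memory cache, and the one-hour freshness window are all assumptions.

```python
import time
import urllib.request

class CrawlCachingProxy:
    """Illustrative shared fetch cache: every bot goes through one proxy,
    so a page fetched for one bot can be reused by the others."""

    def __init__(self, ttl_seconds=3600):
        self.ttl = ttl_seconds  # how long a cached copy counts as fresh (assumed)
        self.cache = {}         # url -> (fetched_at, body)

    def fetch(self, url, bot_name):
        entry = self.cache.get(url)
        if entry and time.time() - entry[0] < self.ttl:
            # A fresh copy already exists (maybe fetched for a different
            # bot), so this bot skips hitting the site again.
            print(f"{bot_name}: served {url} from cache")
            return entry[1]
        # Only one real request reaches the site, no matter how many bots ask.
        body = urllib.request.urlopen(url).read()
        self.cache[url] = (time.time(), body)
        print(f"{bot_name}: fetched {url} from the origin server")
        return body

proxy = CrawlCachingProxy()
proxy.fetch("http://example.com/", "Mediabot")   # hits the site once
proxy.fetch("http://example.com/", "Googlebot")  # reuses Mediabot's copy
```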
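As for the bot identification I mentioned, that’s really just user-agent sniffing. Here’s a rough Python version of that kind of check; the substrings are the crawlers’ well-known user-agent tokens, but the function and table names are my own.

```python
# Map well-known user-agent substrings to the crawler they identify.
KNOWN_BOTS = {
    "Googlebot": "Googlebot",
    "Mediapartners-Google": "Mediabot (AdSense)",
    "Slurp": "Yahoo! Slurp",
    "msnbot": "MSNbot",
}

def identify_visitor(user_agent):
    """Return the bot name for a known crawler, or None for a regular visitor."""
    for token, name in KNOWN_BOTS.items():
        if token in user_agent:
            return name
    return None

ua = "Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)"
print(identify_visitor(ua))  # -> "Yahoo! Slurp"
```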
