The Live Search Webmaster Center blog just posted a follow-up addressing numerous complaints about the behavior of MSNbot, which last month was reportedly wreaking havoc on some of the websites it crawled.
Last week, a patch to the crawler was deployed that should significantly reduce the problems. Interestingly, while part of the problem involved "cloaking detection" — catching websites that try to trick search engines into better positioning — some of the problems were also caused by attempts to fetch RSS feed updates much more quickly:
Unfortunately, things can and do go wrong from time to time. The initial complaints, that we were over-crawling some servers with our cloaking detector, were compounded by and also confused with the new release of our feed crawler, which was also overzealous in its attempt to crawl and provide up-to-the-minute results. However, we have taken all of the feedback you have provided and made some improvements.
The blog post goes on to suggest ways webmasters can optimize their sites to help get feed content out quickly, including aggregating feeds and using sitemaps and robots.txt.
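For site owners who want to rein in an overly aggressive crawler along these lines, a robots.txt file is the usual mechanism. As a hedged sketch (the exact directives the blog post recommends aren't quoted here), MSNbot is known to honor the non-standard `Crawl-delay` directive, and a `Sitemap` line can point crawlers at a feed or URL list so they poll the site less aggressively:

```
# Hypothetical robots.txt sketch — paths and values are illustrative only

User-agent: msnbot
# Ask MSNbot to wait N seconds between requests (non-standard,
# but supported by MSNbot at the time)
Crawl-delay: 10
# Keep the crawler out of areas that don't need indexing
Disallow: /cgi-bin/

# Autodiscovery pointer so crawlers can fetch one index
# instead of re-crawling every page for updates
Sitemap: http://www.example.com/sitemap.xml
```

The `Sitemap` directive follows the sitemaps.org autodiscovery convention; the example.com URL and `/cgi-bin/` path are placeholders, not values from the blog post.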