Brett Tabke, owner of WebmasterWorld, has given me the privilege of asking him a number of questions about the recent news that WebmasterWorld has banned search engine bots from crawling. So here it is....
Barry: Brett...Thank you for taking the time during this hectic period at WebmasterWorld to answer several questions about the recent changes you have made to disallow spiders from accessing your site.
Barry: The big change was that on November 18th you changed your robots.txt file to disallow all bots from accessing your Web site. In a thread you started in the Foo forum at WebmasterWorld, named "lets try this for a month or three...", you elegantly linked to your robots.txt file to show people, and subtitled the thread "last recourse against rogue bots." Why was this the last course of action? I have spoken with dozens of site owners who run sites as large as yours. Most tell me that you can fight off these rogue bots one by one, but you need to factor the costs of these bots into your hosting prices. How would you respond to that?
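[For readers unfamiliar with the mechanism: a robots.txt file that disallows all bots is only a couple of lines. This is a generic sketch of a disallow-all file, not necessarily the exact file WebmasterWorld served at the time.]

```
User-agent: *
Disallow: /
```

The `*` matches every user-agent, and `Disallow: /` tells compliant crawlers to skip the entire site. The catch, of course, is that only well-behaved bots honor robots.txt in the first place.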