[ubuntu] How to limit server access?


http://ubuntuforums.org – I have a problem with a search engine crawler. It is nice when they come, but now they are killing my site: Bing, for example, found my blog and is crawling back to day 1 – not with 1, but with xx instances. The server load I saw was 96!

I want to limit access to the server so that the load stays in a "normal range". Is there a way to limit search engines to one instance only? Is there another way to limit access when it is getting critical?

The WordPress blog uses a database, and at this moment it is still on the same cloud instance. I hope that I can move it out onto another instance (hardware).
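One common first step (a sketch, not a full answer) is to ask the crawler itself to slow down via robots.txt; Bingbot honors the Crawl-delay directive, though the exact delay value here is just an illustrative choice:

```
# /var/www/html/robots.txt  (path assumed -- place it at your site's web root)
# Ask Bing's crawler to wait 10 seconds between requests.
# Note: Googlebot ignores Crawl-delay; its rate is set in Search Console instead.
User-agent: bingbot
Crawl-delay: 10

# Optionally apply a smaller delay to all other crawlers.
User-agent: *
Crawl-delay: 5
```

This only helps with well-behaved bots; for a hard cap when load gets critical, server-side rate limiting (e.g. nginx's limit_req module or an iptables connection limit) would be needed in addition.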