After my CPU usage suddenly went over 400% because bots were swamping my site, I created a robots.txt as follows and placed the file in my document root, e.g. "www.example.com/":
    User-agent: *
    Disallow: /
Google now respects this file and Googlebot no longer appears in my log file. However, BingBot and BaiduSpider still show up in my log, and in large numbers.
Because of this huge increase in CPU usage and bandwidth, my hosting provider was about to suspend my account. So I first deleted all my pages (in case a malicious script had been planted), uploaded clean pages, blocked all the bots by IP address in .htaccess, and then created the robots.txt file above.
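For reference, the IP-blocking rules in my .htaccess look roughly like this (the addresses below are placeholders from the documentation ranges, not the bots' real ranges, and this is the older Apache 2.2-style syntax; Apache 2.4 would use `Require not ip` instead):

    # Deny requests from known crawler IP ranges (placeholder addresses)
    Order Allow,Deny
    Allow from all
    Deny from 203.0.113.0/24
    Deny from 198.51.100.17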
I have searched everywhere to confirm that these were the right steps (I haven't tried the "RewriteRule" option in .htaccess yet).
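From what I have read, that rewrite-based approach would look something like the sketch below. I haven't tested it; it assumes mod_rewrite is enabled, and the bot names are the ones from my logs:

    # Return 403 Forbidden to any request whose User-Agent
    # contains one of these bot names (case-insensitive)
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (bingbot|baiduspider) [NC]
    RewriteRule .* - [F,L]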
Can anyone confirm that what I have done should do the job? Since I started this, my CPU usage has come down to 120% within six days, but I would have expected blocking the IP addresses alone to bring it back to my usual 5-10%.