The term you are looking for is 'cloaking', and it's harshly penalized by search engines such as Google: https://support.google.com/webmasters/answer/66355?hl=en&ref_topic=6001971
Cloaking refers to the practice of presenting different content or
URLs to human users and search engines. Cloaking is considered a
violation of Google’s Webmaster Guidelines because it provides our
users with different results than they expected.
Some examples of cloaking include:
- Serving a page of HTML text to search engines, while showing a page of images or Flash to users
- Inserting text or keywords into a page only when the User-agent requesting the page is a search engine, not a human visitor
If your site uses technologies that search engines have difficulty accessing, like JavaScript, images, or Flash, see our recommendations for making that content accessible to search engines and users without cloaking.
If a site gets hacked, it’s not uncommon for the hacker to use
cloaking to make the hack harder for the site owner to detect. Read
more about hacked sites.
Google also penalizes sites for a number of reasons: partly to keep its search results relevant, partly to avoid sending users to sites that are painful to use because of heavy interstitial advertising, and probably, less transparently, to make Google's own ads look more appealing than more intrusive alternatives.
In short, it's a bad idea: your site will get caught, and it will suffer as a consequence.
That said, you should be able to filter the content based on the User-Agent. Most well-behaved bots will advertise that they are bots, but not all.
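For illustration only, here is a minimal sketch of that kind of User-Agent check in Python. The `looks_like_declared_bot` helper and the list of crawler tokens are just examples, not an exhaustive list, and how you read the header depends on your framework. Keep in mind that serving different page content off the back of this check is exactly what cloaking is.

import re

# A few User-Agent substrings that well-known crawlers actually send.
# This list is illustrative, not complete.
KNOWN_BOT_PATTERN = re.compile(
    r"googlebot|bingbot|slurp|duckduckbot|baiduspider|yandex",
    re.IGNORECASE,
)

def looks_like_declared_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string advertises itself as a crawler."""
    return bool(user_agent) and bool(KNOWN_BOT_PATTERN.search(user_agent))

# Usage: pass in the raw User-Agent header value.
print(looks_like_declared_bot(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
))  # True
print(looks_like_declared_bot(
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
))  # False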
Unless you have an explicit list of IP addresses to serve different content to, you won't easily be able to catch the bots that impersonate users without resorting to underhanded techniques.
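The one kind of impersonation you can reliably detect is a bot pretending to be a major crawler: Google documents a reverse-then-forward DNS check for verifying Googlebot. A sketch of that check in Python is below; the example IP is illustrative, and it tells you nothing about bots that present a normal browser User-Agent.

import socket

def is_verified_googlebot(ip: str) -> bool:
    """Reverse-then-forward DNS check (the method Google documents for
    verifying Googlebot). A failure means the caller is not really
    Googlebot, whatever its User-Agent claims."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        return socket.gethostbyname(hostname) == ip  # forward lookup must match
    except (socket.herror, socket.gaierror):
        return False

# Example call (illustrative IP from a published Googlebot range):
# is_verified_googlebot("66.249.66.1")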
This makes me ask:
The client asked about not showing ad banners to bots, because the
company is losing money as a result.
Exactly how are they losing money as a result? If it's 'lost profits', then it isn't really losing money: the bots would never have responded to the ads anyway.
If it's bandwidth, then the cost is minimal compared to the loss you will suffer if you serve content differently to bots than to humans and get caught.
If it's that the bots are re-serving your content to your users with the ads filtered out, then you need to outright block those bots somehow, or otherwise get them to prove they are human before continuing; a CAPTCHA of sorts would be best.
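As one possible shape for that, here is a minimal server-side verification sketch, assuming reCAPTCHA v2 and the Python `requests` library; the secret key is a placeholder you would get from the CAPTCHA provider, and the client token is whatever the browser widget posts back to you.

import requests

RECAPTCHA_SECRET = "your-secret-key"  # placeholder issued by the CAPTCHA provider

def passed_captcha(client_token: str, client_ip: str) -> bool:
    """Server-side verification of a reCAPTCHA v2 token posted by the browser."""
    resp = requests.post(
        "https://www.google.com/recaptcha/api/siteverify",
        data={
            "secret": RECAPTCHA_SECRET,
            "response": client_token,
            "remoteip": client_ip,  # optional field per Google's docs
        },
        timeout=5,
    )
    return resp.json().get("success", False)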
If it's a simple reporting issue, most bots will generally report that they are bots, and Google Analytics should be able to filter them out with some tweaking; the ones that don't can't be easily distinguished anyway.