There's a way to exclude complete page(s) from Google's indexing. But is there a way to exclude specific part(s) of a web page from Google's crawling? For example, could I exclude the side-bar, which usually contains unrelated content?
2 Answers
You can move the part of the page that you want to hide from Googlebot into a separate file, include that file with an iframe, and then block indexing of the included file via robots.txt.
Add an iframe to include the side-bar in your page:
<iframe src="sidebar.asp" width="100%" height="300">
</iframe>
Then add these rules to the robots.txt file to block the spider from the included file:
User-agent: *
Disallow: /sidebar.asp
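A quick way to sanity-check the rule is Python's standard `urllib.robotparser`; this sketch assumes the included file lives at the site root, so the blocked path is `/sidebar.asp` (the `example.com` URLs are placeholders):

```python
from urllib import robotparser

# Simulate the robots.txt rules above (normally fetched from the site root).
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /sidebar.asp",
])

# The sidebar file is blocked for all crawlers; the main page is not.
print(rp.can_fetch("Googlebot", "http://example.com/sidebar.asp"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))   # True
```

Note that `Disallow` matches URL path prefixes, so the path should start with a slash; a bare `sidebar.asp` would not match the request.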

Rinzi
- This is generally a good mechanism, but may have downsides for normal users. – Jason Jan 06 '10 at 17:40
- This does look good; however, my sidebar is dynamic and hard to separate out. – bryantsai Jan 07 '10 at 04:10
- It looks like, unless Google explicitly supports this (as it does for AdSense), this is the only way ... – bryantsai Jan 11 '10 at 12:56
If you're doing this for AdSense, here's an article on how to exclude content from the scraper. If you don't want Google to follow links, you can give them a rel="nofollow" attribute. Otherwise, I'm afraid you may be out of luck here.
Something else you could do, but I wouldn't necessarily recommend doing, is detecting the user agent before rendering your page, and if it's a spider or bot, not showing the portions of your page you want to exclude.
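A minimal sketch of that last approach in Python; the `is_search_bot` helper and its substring list are illustrative only, and note that serving crawlers different content from what users see (cloaking) can get a site penalized:

```python
# Illustrative sketch: decide server-side whether the requester looks like
# a crawler before rendering the page. The substring list is hypothetical
# and far from exhaustive.
KNOWN_BOT_SUBSTRINGS = ("googlebot", "bingbot", "slurp")

def is_search_bot(user_agent):
    """Return True if the User-Agent string matches a known crawler."""
    ua = (user_agent or "").lower()
    return any(bot in ua for bot in KNOWN_BOT_SUBSTRINGS)

# Render the sidebar only for ordinary visitors.
ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
show_sidebar = not is_search_bot(ua)  # False for Googlebot
```

User-agent strings are also trivially spoofed, which is another reason this mechanism is fragile.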

Jason
- Hence the "but I wouldn't necessarily recommend doing" part of my statement :) – Jason Jan 06 '10 at 17:39