
My website uses AJAX.

I've got a user list page which lists users in an AJAX table (with paging, extra information, and so on).

The URL of this page is: /user-list

The user list is built by AJAX. When a visitor clicks on a user, they are redirected to a page whose URL is: /member/memberName

So AJAX is used here to generate content, not to manage navigation (with the # character).

I want to detect bots so that all pages can be indexed.

So, for regular visitors I want to display an AJAX table with paging and nice AJAX effects (more info, etc.), and when I detect a bot I want to display all users (without paging) with a link to each member page, like this:

<a href="/member/john">John</a><a href="/member/bob">Bob</a>...

Do you think I could be blacklisted for this technique? If so, could you please suggest an alternative that keeps these clean URLs and does not require redeveloping the user list page without AJAX?

Jerome Cance

2 Answers


Google supports a specification for making AJAX crawlable:

http://code.google.com/web/ajaxcrawling/docs/specification.html

I did an experiment and it works:

http://seo-website-designer.com/SEO-Ajax-Google-Solution

As this is a Google specification, you won't get penalised (unless you abuse it).

That said, only Google supports it at the moment (AFAIK).
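
In short, the crawler fetches your AJAX pages with an _escaped_fragment_ query parameter (and a <meta name="fragment" content="!"> tag lets you opt in pages whose URLs contain no #!), and your server answers with a static HTML snapshot. A rough sketch of the server side, assuming a Java servlet (the class name, script path, and snapshot markup are placeholders):

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical servlet mapped to /user-list
public class CrawlableUserListServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        response.setContentType("text/html;charset=UTF-8");
        PrintWriter out = response.getWriter();
        // Google rewrites crawlable AJAX URLs to ?_escaped_fragment_=... before fetching
        if (request.getParameter("_escaped_fragment_") != null) {
            // Serve a static HTML snapshot of what the AJAX code would have rendered
            out.println(renderHtmlSnapshot());
        } else {
            // Serve the normal AJAX-driven page, including the opt-in meta tag
            out.println("<html><head><meta name=\"fragment\" content=\"!\">"
                    + "<script src=\"/app/app.nocache.js\"></script></head>"
                    + "<body><div id=\"user-table\"></div></body></html>");
        }
    }

    // Placeholder: build the same markup the AJAX table would show, as plain HTML
    private String renderHtmlSnapshot() {
        return "<html><body><a href=\"/member/john\">John</a>"
                + "<a href=\"/member/bob\">Bob</a></body></html>";
    }
}

The point is that the snapshot is served to anything that asks with that parameter and mirrors what users see, which is why the scheme is not treated as cloaking.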

Also, I believe following the concept of Progressive Enhancement is a better approach. That is, create a working HTML website, then use JavaScript to enhance it.

Tony McCreath
  • As I said in my post, I don't want to use the # character because this is GWT-specific. I know it is certainly better to use progressive enhancement, but I use GWT and all content is generated via JavaScript. I can't spend time on an HTML version with pagination and the rest. The goal here is to create a page without design and with minimal content, just enough to let search engines index it. – Jerome Cance Feb 15 '11 at 16:03

Maybe use the <a href=""></a> URLs with an onclick to trigger your AJAX scripting? For example:

<a href="/some/url" onclick="YourFancyFunction();return false;">Some URL</a>

I don't think Google would punish you for this: you primarily use JavaScript, but you provide a fallback for their bot, so your site doesn't get any less accessible.

EDIT: OK, I misunderstood. Then my guess would be that you basically have two options (a sketch of the second follows below):
1. Write a separate part of your site where bots end up, or
2. Rewrite your current site so it always serves a 'full' page, with an option to return only, say, the content div. Then your JavaScript can fetch just the content, while bots always get a complete page.
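
For the second option, something along these lines could work, as a minimal sketch assuming a Java servlet and the X-Requested-With header that many AJAX libraries send (you may have to set it yourself; all names and markup here are placeholders):

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical servlet mapped to /user-list
public class UserListPageServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        response.setContentType("text/html;charset=UTF-8");
        PrintWriter out = response.getWriter();
        String content = renderUserListHtml(); // the part both humans and bots need

        // Many AJAX libraries send this header; set it yourself if yours does not
        boolean ajaxRequest = "XMLHttpRequest".equals(request.getHeader("X-Requested-With"));
        if (ajaxRequest) {
            // The script only needs the content div, so return just that fragment
            out.println(content);
        } else {
            // Bots and direct visits get a complete page wrapped around the same content
            out.println("<html><head><title>User list</title></head><body>"
                    + "<div id=\"content\">" + content + "</div></body></html>");
        }
    }

    // Placeholder for the markup your AJAX table currently builds on the client
    private String renderUserListHtml() {
        return "<a href=\"/member/john\">John</a><a href=\"/member/bob\">Bob</a>";
    }
}

The design point is that bots and your script share the same markup for the content, so nothing has to be written twice.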

Zsub
  • Thanks for the answer, but I can't do that because, as I said in my post, the AJAX version generates the content. So I must generate different content for bots. – Jerome Cance Feb 15 '11 at 13:22
  • If you can generate content for bots, then you can generate it for everybody and then progressively enhance. Build on stuff that works. http://icant.co.uk/articles/pragmatic-progressive-enhancement/#build – Quentin Feb 15 '11 at 13:38
  • I know that is certainly better, but I use GWT and all content is generated via JavaScript. I can't spend time on an HTML version with pagination and the rest. The goal here is to create a page without design and with minimal content, just enough to let search engines index it. So if my approach won't get me blacklisted, it does the trick, and that is really why I'm asking this question. – Jerome Cance Feb 15 '11 at 16:02