
I am building a single-page, JavaScript-powered website. All the data needed for every page is echoed by PHP as JSON objects on my home page. I then initialize pages using custom plugins made for each page, which dynamically build the DOM from the relevant JSON data that I pass to the plugin, so I don't have any AJAX requests. Links on my website are in the format #!about, #!home, etc., and currently the plugins' init methods are called on hashchange. What should I do to make these pages crawlable by Google's bots, and how can I set different title and description meta tags for each of these pages?
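The hashchange-driven setup described above, extended to also swap the title and the description meta tag per page, could be sketched roughly like this (the page names and metadata below are hypothetical; in the real site they would come from the JSON objects echoed by PHP):

```javascript
// Hypothetical per-page metadata; in practice this would be part of the
// JSON data echoed by PHP on the home page.
var pageMeta = {
  home:  { title: 'Home - MySite',  description: 'Welcome to MySite.' },
  about: { title: 'About - MySite', description: 'Who we are.' }
};

// Extract the page id from a "#!about"-style hash; fall back to "home"
// for an empty or unknown hash.
function pageFromHash(hash) {
  var m = /^#!(.+)$/.exec(hash || '');
  return (m && pageMeta[m[1]]) ? m[1] : 'home';
}

// Browser-only wiring: on hashchange, update the title and the
// description meta tag before initializing the page plugin.
if (typeof window !== 'undefined') {
  window.addEventListener('hashchange', function () {
    var page = pageFromHash(window.location.hash);
    document.title = pageMeta[page].title;
    var meta = document.querySelector('meta[name="description"]');
    if (meta) meta.setAttribute('content', pageMeta[page].description);
    // ...then call the relevant plugin's init method here.
  });
}
```

Note that title/meta changes made this way are only visible to clients that execute JavaScript; a crawler that doesn't run scripts still needs a server-side answer.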

I've tried various things I found in Google's docs and on many other websites. I changed links from #mylink to #!mylink, so Google should interpret that as a request carrying the _escaped_fragment_ GET variable, and then I tried to add this chunk of PHP code:

if (isset($_GET['_escaped_fragment_'])) {
    $fragment = $_GET['_escaped_fragment_'];
    // Redirect the crawler to an HTML snapshot of the requested page.
    header("Location: Project.php?id=" . urlencode($fragment), true, 301);
    exit;
}

where Project.php is an HTML snapshot with the relevant information that I want crawled — basically just the core content. But as far as I can see, nothing happens... :( After all, is there a way to achieve this without AJAX requests?
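For reference, the rewrite Google's crawler applies in this scheme is deterministic: a "pretty" #! URL is turned into an "ugly" URL carrying an _escaped_fragment_ query parameter, which the server can then detect. A sketch of that mapping, in JavaScript purely for illustration:

```javascript
// Google's AJAX-crawling scheme rewrites "pretty" #! URLs into "ugly"
// URLs with an _escaped_fragment_ query parameter, which the server
// answers with an HTML snapshot instead of the JavaScript app.
function toEscapedFragmentUrl(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url;            // no hashbang: nothing to rewrite
  var base = url.slice(0, i);
  var fragment = url.slice(i + 2);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```

So a crawler seeing http://example.com/#!about requests http://example.com/?_escaped_fragment_=about, which is what the PHP check above is meant to catch.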

  • Why do you build the page dynamically on the client? It's just making it harder for Google's bots and for those who have JavaScript disabled. – beatgammit Nov 26 '12 at 00:52
  • Because I have some fancy transitions between pages to maximize user experience... the kind you have probably seen on awwwards.com and other showcase websites... – hjuster Nov 26 '12 at 00:53
  • "maximise user experience" sounds like a ... euphemism. – John Carter Nov 26 '12 at 01:07
  • I don't see how your comment is helpful? :( If you think that kind of effect is exaggerated, you should talk to my art director... – hjuster Nov 26 '12 at 01:10
  • "I have some fancy transitions between pages to maximize user experience" - I hope you have a way for the user to disable them – HenchHacker Nov 26 '12 at 01:18
  • Mate, where have you seen that kind of option?! On which website? This is what I am going for and what I am trying to achieve: http://ff0000.com/ . Actually, I have already built it, and now I just want Google to crawl my pages... – hjuster Nov 26 '12 at 01:25

2 Answers


Google has actually published guidance on how to make AJAX applications crawlable - who better to tell you how?

https://developers.google.com/webmasters/ajax-crawling/


Alternative Guide

If you find that hard to follow, try this one on SitePoint that runs you through how it's done: http://www.sitepoint.com/google-crawl-index-ajax-applications/
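In outline, the scheme those guides describe is: serve the normal JavaScript page for #! URLs, and when a request instead carries _escaped_fragment_, return a static HTML snapshot whose content, title, and meta description need no JavaScript to read. A hedged sketch of the snapshot-building side (the page data and function names are hypothetical):

```javascript
// Hypothetical snapshot data keyed by the fragment that arrives in
// _escaped_fragment_; a real site would pull this from its page data.
var snapshots = {
  about: { title: 'About - MySite', description: 'Who we are.', body: '<h1>About</h1>' }
};

// Build a plain-HTML snapshot for one page, including the per-page
// title and description meta tag that crawlers will index.
function buildSnapshot(fragment) {
  var page = snapshots[fragment];
  if (!page) return null; // unknown fragment: caller should send a 404
  return '<!DOCTYPE html><html><head>' +
         '<title>' + page.title + '</title>' +
         '<meta name="description" content="' + page.description + '">' +
         '</head><body>' + page.body + '</body></html>';
}
```

The same idea works in any server language — the Project.php snapshot from the question is exactly this, written in PHP.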

HenchHacker
  • That documentation is really poor - no real/live examples. Maybe I'm not smart enough, but I couldn't work out an actual way of doing this. – hjuster Nov 26 '12 at 00:58
  • Try that new link (that points to sitepoint) and see what you think? – HenchHacker Nov 26 '12 at 01:00
  • No offence, but you are likely going to have a hard time getting Google to index your site if you don't follow their suggestions and what others are recommending (even though that's the same as Google, lol). The fact that you've seen the same advice on a million pages before rather suggests that it "is the way to do it" - or at least the way if you want Google to play nice with your site. – HenchHacker Nov 26 '12 at 01:16

Well, the only way is to build an XML sitemap with a link to each page and submit it via Google Webmaster Tools.
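Assuming the #! URL format from the question (and using example.com as a placeholder domain), such a sitemap might look like the sketch below; Google's AJAX-crawling documentation at the time suggested listing the "pretty" #! URLs rather than the _escaped_fragment_ ones:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/#!home</loc></url>
  <url><loc>http://example.com/#!about</loc></url>
</urlset>
```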

Andrew
  • I've read somewhere that _escaped_fragment_ can do the job? Can that be applied in my case, and if so, how? If not... you suggest manually creating a sitemap with links like e.g. http://mysite.com#!about, right? – hjuster Nov 26 '12 at 00:50