I am building a single-page, JavaScript-powered website. All the data needed for every page is echoed by PHP as JSON objects on my home page. I then initialize each page with a custom plugin written for it; the plugin dynamically builds the DOM from the relevant JSON data I pass to it, so I don't make any AJAX requests. Links on my website use the format #!about, #!home, etc., and currently the plugins' init methods are called on hashchange. What should I do to make these pages crawlable by Google's bots, and how can I set a different title and meta description tag for each of these pages?
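For reference, here is roughly what my hashchange setup looks like (the plugin and pageData names are simplified placeholders, not my real code); the last two statements are how I currently imagine updating the title and description per page, though I don't know whether crawlers would ever see them:

// pageData is the JSON object my PHP echoes on the home page
var plugins = {
    home:  { init: function (data) { /* build the home DOM */ } },
    about: { init: function (data) { /* build the about DOM */ } }
};

window.addEventListener('hashchange', function () {
    var page = location.hash.replace(/^#!?/, ''); // "#!about" -> "about"
    if (!plugins[page] || !pageData[page]) return;
    plugins[page].init(pageData[page]);
    // what I imagine doing for per-page titles and descriptions:
    document.title = pageData[page].title;
    var meta = document.querySelector('meta[name="description"]');
    if (meta) meta.setAttribute('content', pageData[page].description);
});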
I've tried various things I found in Google's docs and on many other websites. I changed my links from #mylink to #!mylink, so Google should interpret that and request the page with an _escaped_fragment_ GET variable instead (e.g. /#!about should be fetched by the crawler as /?_escaped_fragment_=about). Then I tried adding this chunk of PHP code:
if (isset($_GET['_escaped_fragment_'])) {
    $fragment = $_GET['_escaped_fragment_'];
    // redirect the crawler permanently to the static snapshot for this page
    header('Location: Project.php?id=' . urlencode($fragment), true, 301);
    exit;
}
where Project.php is an HTML snapshot with the relevant information I want crawled, basically just the core content. But as far as I can see, nothing happens... :( So, after all this, is there a way to achieve this without AJAX requests?