So I have created a single page web app using Firebase where each page has content dynamically loaded from the Firebase Database.
Search engines, however, only see blank pages without the dynamic content. I created a Firebase Function to pre-render each page for SEO purposes, which has worked great.
The issue is that this has noticeably hurt the user experience: there is an extra delay while the function runs, followed by a flash of unstyled content (FOUC) when the dynamic content is loaded with the rest of the JS.
Is it possible to trigger the pre-rendering function only for Googlebot (and other known crawlers/bots), so that normal users get the standard website experience and bots get a pre-rendered HTML page?
Thanks
Edit:
const functions = require('firebase-functions');

exports.helloWorld = functions.https.onRequest((request, response) => {
  // request is an Express request; the user agent is in the headers
  console.log(request.get('user-agent'));
  response.send('done');
});
The user agent expected is:
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/59.0.3071.115 Safari/537.36"
However, it arrives with a snippet appended to it:
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/59.0.3071.115 Safari/537.36 AppEngine-Google; (+http://code.google.com/appengine; appid: s~gcf-http-proxy)"
I have tried several plugins that detect bots, but each of them reports everything as a bot because of the AppEngine-Google suffix.
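What I'm now considering is stripping the appended AppEngine-Google suffix before running any bot detection, then matching the cleaned string against a list of known crawler tokens. A rough sketch of that idea (the regex list and the `isCrawler` name are my own placeholders, not from any of the plugins I tried):

```javascript
// Placeholder crawler list for illustration only; a real deployment
// would use a maintained list or a detection library.
const BOT_PATTERN = /Googlebot|Bingbot|DuckDuckBot|Baiduspider|YandexBot/i;

function isCrawler(userAgent) {
  // Remove the "AppEngine-Google; (+...)" suffix that the proxy appends
  // to every request, since that is what fools the generic detectors.
  const cleaned = (userAgent || '').replace(/\s*AppEngine-Google.*$/i, '');
  return BOT_PATTERN.test(cleaned);
}
```

The same cleaned string could instead be handed to one of the bot-detection plugins, so their full crawler lists still apply.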