
I am trying to create a dynamic rendering example with the "Prerender.cloud" service. I took a prerendered version of my URL.

I could post the code structure here, but it is not the problem.

a) I purified the CSS.
b) I deleted all unnecessary code and resources.
c) I changed the resource order/organisation for better speed.
d) I optimised the images.
e) I reduced the request size.
f) I reduced the page size.

I improved the page speed by roughly 3 seconds and decreased the size by 200%.

But the main problem is here:

I have a rootdomain.com/example-dynamic-rendering page. This is the original page, and I want to serve it to regular visitors (clients).

I also have my prerendered version, and I wonder how I can serve this static HTML page to search engine user-agents from the same URL.

Do you have any idea or code for this task?

Please help.

1 Answer


You can use Puppeteer to load that dynamic page/app, grab its rendered HTML, and save the content to an HTML file. Then, when the Googlebot crawler visits your site, you can point it to this HTML file via the robots.txt file.

You can run this Puppeteer script whenever you want; for example, you could use cron to run it automatically.

Something like this:

const puppeteer = require('puppeteer')
const CronJob = require('cron').CronJob
const fs = require('fs-extra')

const crontask = '0 */1 * * *' // this will run the script every hour
const urlDynamic = 'https://www.example.com' // change this to your dynamic URL
const staticFile = 'statichtml.html'

;(async () => {

    const browser = await puppeteer.launch({
        headless: true,
        devtools: false
    })

    const [page] = await browser.pages()

    // Load the dynamic page, read its rendered HTML, and save it to a file.
    const autorun = async () => {
        await page.goto(urlDynamic, { waitUntil: 'networkidle2', timeout: 0 })
        const html = await page.content()
        await fs.writeFile(staticFile, html)
    }

    // Regenerate the static snapshot on the cron schedule.
    const job = new CronJob(crontask, async () => {
        await autorun()
    })
    job.start()

})()
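
To actually serve that saved file only to search-engine user-agents on the same URL, you could put a small user-agent check in front of the route. This is just a rough sketch, assuming an Express server sits in front of your app; the bot regex, route path, port, and statichtml.html file name are placeholders to adapt to your setup:

const express = require('express')
const path = require('path')

const app = express()

// Rough list of common crawler user-agents -- adjust to your needs.
const botPattern = /googlebot|bingbot|yandex|baiduspider|duckduckbot/i

app.get('/example-dynamic-rendering', (req, res, next) => {
    const userAgent = req.headers['user-agent'] || ''
    if (botPattern.test(userAgent)) {
        // Crawler: return the prerendered snapshot saved by the cron script.
        return res.sendFile(path.join(__dirname, 'statichtml.html'))
    }
    // Regular visitor: fall through to whatever normally renders the page.
    next()
})

app.listen(3000)

If you go this route, keep the prerendered snapshot equivalent to what regular visitors see, so it stays within Google's dynamic rendering guidelines rather than looking like cloaking.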
Edi Imanto
  • Thank you for the answer, but I didn't get the part "Then, when the Googlebot crawler visits your site, you can point it to this HTML file via the robots.txt file." The robots.txt file is for telling search engine bots what to do. I am trying to serve the static HTML only to search engine user-agents, not to clients. Clients will use the normal version of the page. – Koray Tuğberk GÜBÜR Dec 12 '19 at 23:49
  • The Googlebot crawler is a search engine bot. I'm confused: which one are you trying to serve this static HTML to, the Google bots or the client? If the Google bots, then you can add the full URL of statichtml.html to the list of allowed URLs in robots.txt. I'm sorry if my words aren't clear. – Edi Imanto Dec 13 '19 at 16:23
  • Let me ask this way: for a while now I have been researching dynamic rendering, and I am trying to implement it by myself. I am not a developer, but I still believe I can do it. I have created a sample static HTML web page. I want to serve this example only to search engine user-agents while serving the normal web page to the client at the same time. But on a WordPress website, how can I make this happen? – Koray Tuğberk GÜBÜR Dec 13 '19 at 23:47
  • WordPress has a caching solution for almost every need in this dynamic CMS. Are you trying to produce your own WordPress plugin? – Edi Imanto Dec 14 '19 at 02:39
  • No brother, I am simply trying to serve different content on the same URL to different user-agents. I don't know how to do it; it is simple to understand, but thank you for trying. – Koray Tuğberk GÜBÜR Dec 14 '19 at 20:39
  • You should use a browser detector to handle this. If your backend CMS is WordPress, then you can look for a solution among WordPress plugins. There are lots of plugins. – Edi Imanto Dec 16 '19 at 09:35
  • It is not something that can be done with plugins, but thank you for the help, Edi. Is there anyone else who can help? – Koray Tuğberk GÜBÜR Dec 16 '19 at 14:55