
I am working on a website with 500 000 products, and we are trying ISR to keep the pages up to date. Attributes like price can be SEO-sensitive, so it would be nice if we weren't serving yesterday's prices.

Here are the possible solutions I see:

A) Pretend it's fine with a revalidate timeout of a few hours and hope that Google won't penalise us for too many product prices not matching the product feeds we send them. (Unpopular products won't get updated at all; maybe crawler traffic can help us there?)
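For context, option A is just the stock time-based revalidate. A minimal sketch, assuming the pages router (the four-hour window and the fetchProduct helper are made up for illustration):

```tsx
// pages/products/[slug].tsx — minimal sketch of option A (plain time-based ISR).
// The 4-hour window and the fetchProduct() helper are placeholders, not our real setup.
import type { GetStaticPaths, GetStaticProps } from 'next';

type Product = { slug: string; name: string; price: number };

export const getStaticPaths: GetStaticPaths = async () => ({
  paths: [],            // build nothing up front
  fallback: 'blocking', // the first request renders and caches each page
});

export const getStaticProps: GetStaticProps<{ product: Product }> = async ({ params }) => {
  const product = await fetchProduct(params!.slug as string);
  if (!product) return { notFound: true, revalidate: 60 };
  return {
    props: { product },
    revalidate: 60 * 60 * 4, // stale after ~4h, but only refreshed when somebody actually visits
  };
};

export default function ProductPage({ product }: { product: Product }) {
  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.price.toFixed(2)} EUR</p>
    </main>
  );
}

// Placeholder for a call to our catalogue service.
async function fetchProduct(slug: string): Promise<Product | null> {
  return { slug, name: slug, price: 9.99 };
}
```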

B) Correct the price on the client side. (We can't ensure which price the bot will see.)
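To make B concrete, I mean roughly this: render whatever price was baked into the static page, then overwrite it in the browser (the /api/price/[sku] endpoint here is hypothetical):

```tsx
// components/LivePrice.tsx — sketch of option B: keep the statically rendered price,
// then overwrite it on the client. The /api/price/[sku] endpoint is hypothetical.
import { useEffect, useState } from 'react';

export function LivePrice({ sku, staticPrice }: { sku: string; staticPrice: number }) {
  const [price, setPrice] = useState(staticPrice);

  useEffect(() => {
    let cancelled = false;
    fetch(`/api/price/${sku}`)
      .then((res) => (res.ok ? res.json() : null))
      .then((data) => {
        if (!cancelled && typeof data?.price === 'number') setPrice(data.price);
      })
      .catch(() => {
        // keep the static price if the live lookup fails
      });
    return () => {
      cancelled = true;
    };
  }, [sku]);

  // Googlebot does run JS, but nothing guarantees it waits for this fetch —
  // which is exactly the "which price does the bot see" problem.
  return <span>{price.toFixed(2)} EUR</span>;
}
```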

C) Redeploy and somehow mass-rebuild all pages nightly. (Are we even talking about ISR at this point?)

D) Use on-demand revalidation. (There's no way Vercel's lambdas can handle rebuilding the whole site at once.)
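For D, the obvious shape is an API route that our pricing pipeline hits whenever a price changes. A minimal sketch, assuming a shared secret and a /products/&lt;slug&gt; path scheme (both invented here):

```ts
// pages/api/revalidate.ts — sketch of option D (on-demand revalidation).
// REVALIDATE_SECRET and the /products/<slug> path convention are assumptions.
import type { NextApiRequest, NextApiResponse } from 'next';

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.query.secret !== process.env.REVALIDATE_SECRET) {
    return res.status(401).json({ message: 'Invalid token' });
  }

  const slug = req.query.slug;
  if (typeof slug !== 'string') {
    return res.status(400).json({ message: 'Missing slug' });
  }

  try {
    // Re-renders exactly one product page. Calling this for the whole catalogue
    // at once is where the lambda-capacity worry comes in.
    await res.revalidate(`/products/${slug}`);
    return res.json({ revalidated: true, slug });
  } catch {
    // If revalidation throws, the last good version keeps being served.
    return res.status(500).json({ message: 'Error revalidating' });
  }
}
```

One call per changed SKU is manageable; the day a supplier feed reprices half the catalogue, it isn't.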

Frequent deployments would also throw away everything that's already cached, and mass cache warming would be very much against the spirit of ISR. I guess we shouldn't DoS our own website just to keep a numeric value up to date.
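By mass cache warming I mean something like the script below, walking every product URL after a deploy with a bit of throttling (the base URL, slug list and concurrency are all invented):

```ts
// scripts/warm-cache.ts — the "mass cache warming" idea, just to show the scale of it.
// BASE_URL, the slug list and the concurrency are invented for illustration.
const BASE_URL = 'https://shop.example.com';
const CONCURRENCY = 5;

async function warm(slugs: string[]): Promise<void> {
  let next = 0;
  const worker = async () => {
    while (next < slugs.length) {
      const slug = slugs[next++];
      try {
        await fetch(`${BASE_URL}/products/${slug}`); // populate the ISR cache
      } catch {
        // best effort; skip failures
      }
    }
  };
  await Promise.all(Array.from({ length: CONCURRENCY }, worker));
}

// warm(allProductSlugs); // ~500k requests — hours of hammering our own origin
```

Even at that pace, 500 000 requests means hours of hitting our own origin for mostly unchanged data.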

All of these solutions sound hacky to me, and I am wondering if I am overthinking this and being too cautious. Can anybody share their experience running large-scale e-commerce websites with Next.js and ISR? I am not looking for a specific prescription, since everybody's use case varies slightly in the details, just general advice that I won't find in the docs.

Ivan
