First-time question-asker on Stack Overflow.
I am working on a React SPA and need to optimize it for SEO. To that end, I've integrated react-helmet to dynamically update the page metadata based on which high-level component is being viewed. Unfortunately, this isn't consistently sufficient for Google's or Facebook's crawlers. After some research, I learned that react-snapshot was a good option: it crawls the SPA and generates static HTML files to be served initially while the JS bundle loads, which should result in crawlers seeing the appropriate metadata for each route.
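For context, the react-helmet usage looks roughly like this (a minimal sketch; the component name and meta values are placeholders, not my real code):

import React from 'react';
import { Helmet } from 'react-helmet';

// Each high-level route component declares its own metadata;
// react-helmet merges these tags into the document <head>.
function ProductPage() {
  return (
    <div>
      <Helmet>
        <title>Product | Example Store</title>
        <meta name="description" content="Details about this product." />
        <meta property="og:title" content="Product | Example Store" />
      </Helmet>
      {/* ...page content... */}
    </div>
  );
}

export default ProductPage;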
I've integrated react-snapshot as the documentation suggests, and by running the
npm run build
command, all of the static files are generated as expected. However, when I start the app locally (port 3000) for testing, it looks like the static files are not being served to the browser: viewing the page source shows that only the default index.html is coming through. I don't know what I'm missing or doing wrong. Any advice would be appreciated!
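For reference, the integration follows the react-snapshot README: the build script in package.json becomes "react-scripts build && react-snapshot", and the entry point swaps ReactDOM.render for react-snapshot's render (App here is just a placeholder for the root component):

// src/index.js
import React from 'react';
import { render } from 'react-snapshot';
import App from './App';

// react-snapshot's render behaves like ReactDOM.render in the browser,
// and also lets the post-build crawl pre-render each route to static HTML.
render(<App />, document.getElementById('root'));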
UPDATE: So after two days of head -> wall, I realized that create-react-app's dev server (npm start) doesn't serve the build output, which is where the static pages live. To check whether the files are being served correctly, it is best to globally install the serve package and use it to temporarily host the built app on your local machine:
npm install -g serve
Once it's installed, build the application as outlined in the react-snapshot documentation, and then serve the built version of your app with:
serve -s build
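Putting it all together (the port serve picks varies by version, so use whatever local address it prints):

npm run build      # react-scripts build && react-snapshot writes static HTML into build/
serve -s build     # hosts build/ locally; -s rewrites not-found paths to index.html

Opening a route in the browser and checking view-source should now show the pre-rendered markup and metadata instead of the empty index.html shell.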