I have a largish Node.js-based web app, with server and client components working together. I currently deploy the app by using `git pull` to take my latest production branch from the server repository. A git post-commit hook then runs an `npm install` and rebuilds the server's `.env` file, and PM2 monitors the various processes (3 web servers), using a change in the `.env` file as the trigger to restart them.
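For reference, the hook currently does roughly this (a sketch only - the real script is longer, and the paths and env-rebuild script name are placeholders):

```bash
#!/bin/sh
# post-commit hook on the server repository (sketch only; paths and
# the env-rebuild script name are placeholders)
cd /home/pi/myapp || exit 1

# install/update dependencies into the top-level node_modules
npm install

# regenerate the server's .env file; PM2 watches this file and
# restarts the three web server processes when it changes
node scripts/build-env.js > server/.env
```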
`node_modules` is at the top level of the project, with separate server and client subdirectories. Since this is served over HTTP/2 on a fast LAN, I don't bother bundling the client files with webpack or the like, although I do use Rollup on lit-element and lit-html to sort out the bare import statements (they are not relative or absolute) embedded in them.
I've just been reading that I should really have been doing an `npm ci` for my Node dependencies, but the documentation for that says it blows away the `node_modules` directory and starts again (whereas `npm install` doesn't). Since this is all running on a Raspberry Pi, that is not instantaneous.
I am not sure a temporary loss of `node_modules` should affect a running app too much - after all, I believe the modules will all have been cached in memory - but they might not have been, and there is also the possibility that one of the servers falls over and PM2 restarts it while the directory is missing, so I am wondering...
So what is best practice here? Is it possible, for instance, to copy `package.json` and `package-lock.json` to a special `build` subdirectory, build the `node_modules` directory there, and then move it back into place once built? Or is there a better way?