12

I have a Node.js project (actually a Firebase project) where I keep the code on Google Drive. (I could use, for example, Dropbox instead; the important thing is that the code files are mirrored.)

Now I want to develop this project on another computer too. I do not understand how to handle the node_modules directory. It currently contains a stunning 15,000 files. Should I exclude it from mirroring?

Or is this just a bad setup?

Leo
    Why aren't you using some sort of standard source control for your project? Cloud Storage really isn't the best solution for managing your code. – Doug Stevenson May 18 '18 at 15:26
    Thanks @DougStevenson, but I have practical reasons for doing it this way for now. I will use a source control system when I publish the project, but my way of programming in the beginning is rather much "eh, shit, I need to refactor again". – Leo May 19 '18 at 00:28
    I have an issue like this, even using source control. The reason is that I may have in-progress work on one machine, then go to another machine and want to pick up on files outside of source control (such as IntelliJ configuration folder) and to have the in-progress work synced. One key with this is keeping the IDE, the OS and other tangential software (such as Node) on (roughly) the same versions. – ryanm Jul 30 '20 at 16:21

6 Answers

4

Since the node_modules folder is created by the npm install command, you can exclude the folder entirely when uploading to your Google Drive folder.

When fetching the files on the other computer, simply run the command again to install all the dependencies into the node_modules folder.
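As a sketch of that workflow (the paths and the use of rsync are my own assumptions, not part of this answer; substitute your actual project and mounted Drive folder):

```shell
# Minimal demo of the exclude-and-reinstall workflow, using throwaway
# directories in place of the real project and Drive folders.
PROJECT=$(mktemp -d)            # stands in for your local project
DRIVE=$(mktemp -d)              # stands in for the mounted Drive folder

mkdir -p "$PROJECT/node_modules"
echo '{}' > "$PROJECT/package.json"
touch "$PROJECT/node_modules/dep.js"

# Mirror everything except node_modules:
rsync -a --exclude 'node_modules' "$PROJECT/" "$DRIVE/project/"

# On the other computer you would then run `npm install` inside the
# mirrored folder to rebuild node_modules from package.json.
ls "$DRIVE/project"             # package.json is there, node_modules is not
```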

Dan Tr R
  • One trick is to manually add the node_modules folder and exclude it from syncing before running npm install, so it never gets tens of thousands of files synced – Eric Majerus Aug 10 '18 at 19:48
  • Unfortunately, the trick by @EricMajerus doesn't work on `OneDrive`. In that case, the local folder is simply removed :( – Daniel Danielecki Apr 08 '21 at 10:41
    @DanielDanielecki it works for me, sort of.. After `OneDrive` removes the folder, I manually re-create it, and then I run my `npm init` or whatever. The folder gets populated but never syncs to the cloud. Note that `OneDrive` will complain and show an error. Unfortunately, after a few years of doing this, I have 40+ of these errors showing, and I think that's causing `OneDrive` to take ages to sync other files/folders. – Mike Willis Mar 30 '22 at 16:02
3

This always bothered me with Dropbox, because the selective sync is a hassle: you have to select every single folder you want to exclude. There were many requests for a .dropboxignore file, but Dropbox wouldn't listen. I don't know of any cloud storage provider that offers this feature.

EDIT: Dropbox seems to be working on such a feature, which is currently in beta: https://help.dropbox.com/en-us/files-folders/restore-delete/ignored-files

Anyway: using git is always good, but if you are working on several devices and you don't want to commit and pull/push your work every time you switch devices, you can store your git repository inside your Dropbox. But then the node_modules folder is still being synced for hours, for nothing.

To solve this problem I wrote a small shell function that moves node_modules outside of the Dropbox folder and then creates a symlink to it. That way Dropbox only syncs the link file, not the contents.

DROPNODE=/usr/local/node_modules_dropbox

# Create the dump folder outside of Dropbox if it does not exist yet.
init_dropnode() {
  if [ ! -d "$DROPNODE" ]; then
    mkdir -p "$DROPNODE"
  fi
}

# Move node_modules out of the current project and symlink it back in.
dropnode() {
  bp="$DROPNODE/$(basename "$PWD")"
  p="$bp/node_modules"
  if [ -L "./node_modules" ]; then
    # Already a symlink: remove it and make sure the target exists.
    rm node_modules
    mkdir -p "$p"
  elif [ -d "./node_modules" ]; then
    # A real folder: move it out of the synced tree.
    mkdir -p "$bp"
    mv node_modules "$p"
  else
    mkdir -p "$p"
  fi
  ln -s "$p" node_modules
}

I put these in my .bash_profile, so when I'm in a project folder I just run dropnode and the hassle is kept outside of my Dropbox.

Disclaimer: this only works on Linux/macOS, and you need to set the same path for the node_modules dump folder on every computer. And if you are mounting code into a Docker container, this will not work, because symlinks are ignored there.
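For completeness, here is a hypothetical inverse (not part of the original answer) that moves node_modules back into the project and removes the symlink, e.g. before mounting the folder into Docker:

```shell
# Use the same dump folder as dropnode (path is an assumption).
DROPNODE="${DROPNODE:-/usr/local/node_modules_dropbox}"

# Hypothetical inverse of dropnode: replace the symlink with the real
# folder again, so the project is self-contained once more.
undropnode() {
  p="$DROPNODE/$(basename "$PWD")/node_modules"
  if [ -L ./node_modules ] && [ -d "$p" ]; then
    rm node_modules
    mv "$p" node_modules
  fi
}
```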

Martin Cup
2

Now that Dropbox allows files/directories to be ignored from the terminal, this is quite easy to do.

I built a Node CLI, Dropbox Ignore node_modules, to do this very easily.

Trent
1

It really is a bad setup. It is advisable to use a version control system (VCS) to manage source code.

A few hosting services for version-controlled code are:

  • GitHub — free hosting for open source, where everyone can view and clone the source code in your repository, with options for private repositories
  • Bitbucket — private repositories only

When publishing code to a VCS / Drive / Dropbox, don't include the node_modules folder. All the modules can be reinstalled with npm install. But make sure that all the necessary modules are listed in the package.json dependencies.
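For instance, a minimal package.json might look like this (the project name, dependency, and version are just placeholders, not from the question):

```json
{
  "name": "my-firebase-project",
  "version": "1.0.0",
  "dependencies": {
    "firebase-admin": "^11.0.0"
  }
}
```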

For example: User A adds the source to the VCS/Drive/Dropbox without node_modules. User B downloads the source and executes the command npm install. Once all the modules are installed, the node_modules folder is created automatically on User B's machine.
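If you do go the VCS route, keeping node_modules out of a Git repository is a one-line .gitignore entry. A sketch, assuming a fresh project ("my-project" is a placeholder name):

```shell
# Sketch: keep node_modules out of a Git repository entirely.
cd "$(mktemp -d)"
git init -q my-project
cd my-project
echo 'node_modules/' > .gitignore
mkdir -p node_modules

# git check-ignore exits with status 0 when the path is ignored:
git check-ignore -q node_modules && echo "node_modules is ignored by Git"
```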

Ashok JayaPrakash
  • Thanks, this was what I needed. I did not know about Bitbucket. And `npm install` seems to be what I want at the moment. – Leo May 19 '18 at 00:25
    I don't think this is a bad setup at all. If you have two personal computers (like me, a desktop and a laptop) there are times when I leave my desktop with uncommitted work that I want to continue with on my laptop. _After_ I'm finished, then I commit and push to Bitbucket. – Eric Majerus Aug 10 '18 at 19:45
    Version control has nothing to do with utilizing local/cloud backups for data on your workspaces. This is like responding "you shouldn't use RAID, use GitHub instead", or "you shouldn't use M.2 solid state drives in your computer, use GitHub instead". Non sequitur. – Marcel Besixdouze Jul 16 '20 at 19:08
1

Sadly, I know your pain and I haven't found any solution so far for Google Drive and OneDrive, my favorite choices.

I have project folders with a lot of subfolders, like source images, design, etc. I want these folders synchronized to OneDrive, but subfolders like node_modules are also synchronized, and I can't find a way to avoid this.

I use Git for source code, but I change computers often (desktop and notebook) and it is a real pain to rely only on Git for this. I would have to keep cloning projects every time, and sometimes I only want to do some training/practice and don't want to upload it to GitHub.

Maybe a third-party sync tool like Insync can help. I'll use their trial period to check.

R. Marques
  • Thanks. I have fallen back to Git. I wanted to avoid setting up a server for that, but I found a solution. – Leo Mar 13 '19 at 18:40
    You can try a solution like InSync. I'm using it in my Linux Notebook and my Windows Desktop. Like in Git, you can configure the Insync software to ignore folders like `node_modules`. You can select to not upload, not download or both. Sadly it's not a free solution, but it works great at least. – R. Marques Oct 28 '19 at 16:54
0

I know this question is old but it's what I found when I looked for guidance on this. Using the Dropbox feature How to set a Dropbox file or folder to be ignored mentioned above by two responders, I wrote this one-liner to traverse my entire Dropbox and mark all .git, node_modules, and packages folders to be ignored by Dropbox. Just change the parameters between the parentheses to whatever you want to be ignored. The -o means "or" and has to be included between match patterns if you want more than one. Here's what I used on my machine:

find ~/Dropbox \( -name 'packages' -o -name 'node_modules' -o -name '.git' \) -print0 | xargs -0 xattr -w com.dropbox.ignored 1

This recurses through every folder in your dropbox (assuming it's at ~/Dropbox, adjust if not) and uses xargs to call xattr -w com.dropbox.ignored 1 on each result, as instructed on Dropbox's support page. To undo it, keep the find command as-is and replace -w com.dropbox.ignored 1 with -d com.dropbox.ignored, e.g.

find ~/Dropbox \( -name 'packages' -o -name 'node_modules' -o -name '.git' \) -print0 | xargs -0 xattr -d com.dropbox.ignored

Caveats:

  1. Works on 100% of the one machine I've tried it on (macOS Monterey)! Caveat emptor!
  2. If you add a new git project you'll need to run this again, as the script only marks the actual files/folders it finds to be ignored, it doesn't do anything for any future folders that would match the find statement.
  3. I don't imagine there's any harm in marking folders as ignored multiple times, but maybe there is? If so, that would be relevant for point #2 above.

I had 9.3GB worth of files in my Dropbox folder and after running this, Dropbox reports I am only using 5.1GB! Since my free account maxed out at 8.6GB (referral bonuses) this solved my out of space errors and gave me a huge margin of remaining space! Hope this helps others.

UPDATE 2022-08-08: This failed for me on a path containing spaces, since by default xargs views any whitespace as a delimiter. Based on this answer elsewhere I added -print0 and -0 in the commands above. -print0 tells find to use null as the delimiter instead of a newline and -0 tells xargs to only use null as delimiter instead of all whitespace.

Joel P.