I have very large files stored on a mapped network drive on my Windows 10 computer. I can open a Jupyter notebook and run calculations on these files by providing the full path to each file, but afterwards I have to transfer any files created or saved back to the mapped network drive so everything stays in one place.

To avoid that step, I tried opening the Anaconda command prompt, cd-ing into the mapped network drive, and then typing jupyter notebook to launch the notebook from the directory of interest. Does launching the notebook this way cause the calculations to take longer to run? It seems slower, but I am curious whether that is actually the case or just my perception, since the calculations are performed on files located on the mapped network drive regardless of where the Jupyter notebook itself is located.
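For context, this is roughly the workflow I mean (a minimal sketch; the drive letter Z:, the paths, and the use of pandas are placeholders, not my actual files):

```python
import pandas as pd

# Hypothetical paths on the mapped network drive (Z: is a placeholder).
input_path = r"Z:\data\large_input.csv"
output_path = r"Z:\data\results.csv"

# Read the large file directly from the network drive via its full path.
df = pd.read_csv(input_path)

# Do the calculation in memory (placeholder for the real analysis).
summary = df.describe()

# Write the result straight back to the network drive, so nothing has to
# be transferred afterwards, regardless of where the notebook was launched.
summary.to_csv(output_path)
```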
Reading and writing the files will be slower. But the script execution itself just uses memory, not the drive, so it shouldn't be affected significantly. – Barmar Oct 06 '22 at 22:31
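One way to check this yourself (a rough sketch, with hypothetical paths) is to time the file read separately from the in-memory computation; only the read and write times should differ between a local copy and the mapped network drive:

```python
import time
import pandas as pd

network_path = r"Z:\data\large_input.csv"   # hypothetical mapped-drive path
local_path = r"C:\temp\large_input.csv"     # hypothetical local copy of the same file

for label, path in [("network", network_path), ("local", local_path)]:
    t0 = time.perf_counter()
    df = pd.read_csv(path)                  # disk/network I/O
    t1 = time.perf_counter()
    result = df.mean(numeric_only=True)     # pure in-memory computation
    t2 = time.perf_counter()
    print(f"{label}: read {t1 - t0:.2f}s, compute {t2 - t1:.2f}s")
```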