The issue
I run PyCharm Professional 2023.1 on Windows with miniconda.
I need to write a combination of small(ish) scripts and notebooks, and ideally the code should live on a network drive. In some cases I'll use a local git repository for version control; in other cases the scripts are so banal that's not even necessary. This banal task, which worked very well with previous versions, is de facto impossible with PyCharm 2023.1, because it becomes so slow it's unusable. E.g.:
- if I create an empty project, before even adding any new files to it, PyCharm behaves like software from the '80s: I click on a menu and have to wait 5-6 seconds before anything happens.
- I fire up a new Jupyter notebook and, after waiting for the notebook to start, even running a banal cell like "5+5" can take... 12 seconds!
My questions are: does this happen to other people, too? Have you found better solutions than those I have found below?
My (partial) solutions
- Use Spyder for small scripts and notebooks. Where the extra functionality of PyCharm comes in handy:
- Create a local project on the C drive, then set up the deployment options so as to sync files with the network drive. This means you can no longer use relative references to import data from the network drive. If you remove the exclusion for ".git" from the deployment options, PyCharm's deployment tool will also sync a local git repository to the network drive.
- Create a git repository on the network drive, and use PyCharm to clone from that repository.
- If you have access to some kind of centralised git repository or equivalent, you can obviously sync between repository, local drive and network drive, but this may add a level of unnecessary complication if the scripts are very small and banal.
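The bare-repository workflow from the last two bullets can be sketched as follows. This is a minimal sketch: the temp directories stand in for the network share and the local disk, so substitute your own paths (e.g. Z:\clients\acme\project.git and a folder on C:).

```shell
SHARE="$(mktemp -d)"                  # stand-in for the network drive
NET_REPO="$SHARE/project.git"
LOCAL_DIR="$(mktemp -d)/project"      # stand-in for a folder on the local C drive

# 1. Create a bare repository on the network drive (done once)
git init --bare "$NET_REPO"

# 2. Clone it to the fast local disk and do all the work there
git clone "$NET_REPO" "$LOCAL_DIR"

# 3. Commit locally, then push finished work back to the network copy
cd "$LOCAL_DIR"
echo "print(5 + 5)" > analysis.py
git add analysis.py
git -c user.name=me -c user.email=me@example.com commit -m "first script"
git push origin HEAD
```

This keeps PyCharm's indexer pointed at the local clone only; the network drive just stores the bare repository.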
Why I think it's a bug
I am well aware that large projects should not be stored on network drives (see "Why I need it" further down), but this is not that.
I understand that scanning a large code base on a network drive is inefficient, and I understand that IDEs like Spyder are less sophisticated, do less background scanning and will therefore work better with network drives.
But taking 12 seconds to run a notebook cell with "5+5" is laughable; there is no justification for it, and something very wrong must be going on in the background.
Not to mention that previous versions of PyCharm worked perfectly well on the very same network drives, so something must have changed with the updates.
How I have tried to debug it
I reached the conclusion that the issue is having the files on the network drive, because:
- I created a new conda environment with just python and spyder
- The new environment gives me the same issues if I create a project on a network drive
- If instead I create a project on the local C drive, both environments work perfectly well
I have also:
- disabled "Synchronize external changes" in Settings -> System Settings
- turned Power Save Mode on
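One way to quantify the drive difference is to time raw file-system calls on each location, since an IDE's background indexing boils down to many such round-trips. A minimal sketch, where the paths in the commented lines are placeholders for your own local folder and network share:

```python
import os
import tempfile
import time
from pathlib import Path

def stat_latency_ms(root: Path, n: int = 200) -> float:
    """Average time in ms to stat a file n times -- a rough proxy for the
    per-file round-trips that background indexing performs."""
    probe = root / "latency_probe.txt"
    probe.write_text("x")
    start = time.perf_counter()
    for _ in range(n):
        os.stat(probe)
    elapsed_ms = (time.perf_counter() - start) / n * 1000
    probe.unlink()
    return elapsed_ms

# Placeholder paths; substitute your own locations, e.g.:
# print(f"local:   {stat_latency_ms(Path('C:/temp')):.4f} ms per stat")
# print(f"network: {stat_latency_ms(Path('Z:/clients/acme/project')):.4f} ms per stat")
print(f"local: {stat_latency_ms(Path(tempfile.mkdtemp())):.4f} ms per stat")
```

A large gap between the two numbers would at least confirm that raw filesystem latency, multiplied across thousands of background operations, is the bottleneck.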
Why I need it
I am well aware that large projects should be stored in git repositories etc. But here we are not talking about production-critical applications. The use case is something like this:
- the work network drive has folders by clients and projects
- as part of the work on a specific project for a specific client, we receive some data from the client. We need to perform some light exploratory analyses on that data, and it is useful to have the scripts and the output in a subfolder of the project, all together.