My problem is that every time I start a new project, I need to install an IPython kernel into that project's virtual environment. This is a problem because the kernel and all of its dependencies end up in every project's requirements.txt, even though they are only needed for development and not for the project itself. For example, pushing a simple one-page website to Heroku should only require flask, gunicorn and their dependencies. But because I'm developing with Jupyter, it also ends up "requiring" all the packages that come with the IPython kernel:
backcall, colorama, parso, jedi, decorator, pickleshare, six, ipython-genutils,
traitlets, wcwidth, prompt-toolkit, pygments, ipython, tornado, jupyter-core, pyzmq,
python-dateutil, jupyter-client, ipykernel
The problem is not virtual environments as such. I can change the environment in Atom, and if I run my script from a terminal, it works just fine. But when I try to run it directly in the Jupyter notebook, the system-level kernel doesn't see the packages installed in the virtual environment.
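A quick way to see the mismatch (a minimal diagnostic; the paths are only illustrative): run these two lines once as a script from the activated terminal and once as a cell through Hydrogen, and they print two different interpreters.
import sys
# From the activated terminal this prints the virtualenv's python
# (something like ...\myproject-a1b2c3\Scripts\python.exe); from the
# Hydrogen cell it prints the system-level python the kernel runs under.
print(sys.executable)
print(sys.prefix)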
Here's how to recreate the problem. First, create a virtual environment. I use pipenv, but also tried virtualenvwrapper with the same results. Then, switch to the environment. I tried two methods and both worked:
- Launch Atom normally and use the atom-python-virtualenv package to select my virtual environment (requires virtualenvwrapper-win and some configuration).
- Using pipenv, run pipenv run atom . in Windows cmd. (Some Atom packages might not find their dependencies with that solution. I fixed that by editing their respective options to always point to my system's Python.)
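Either way, a quick sanity check of what the launched Atom process actually inherited (the exact values are just illustrative) looks like this:
import os
# After launching Atom from inside the environment (e.g. via pipenv run atom .),
# the process should have inherited the environment's variables: VIRTUAL_ENV
# should point at the project's env and its Scripts directory should be on PATH.
print(os.environ.get("VIRTUAL_ENV"))
print(os.environ.get("PATH", "").split(os.pathsep)[:3])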
With both methods, I can confirm that I'm in the right environment using a terminal in Atom (I used platformio-ide-terminal), then install flask:
pip list # shows pip, setuptools and wheel
pip install flask
Finally, I launch the IPython kernel. Hydrogen detects the kernel installed at the system level, which is what I want. However, if I try import flask inside the .py script, I get a ModuleNotFoundError.
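Checking from inside the Hydrogen cell shows why (a minimal diagnostic using only the standard library): the kernel is the system interpreter, so the virtual environment's site-packages directory is simply not on its sys.path.
import sys
# Run through Hydrogen: sys.executable is the system-level python, and none
# of the site-packages entries on sys.path point into the project's virtual
# environment, which is why "import flask" raises ModuleNotFoundError.
print(sys.executable)
print([p for p in sys.path if "site-packages" in p.lower()])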
What I would like is for the kernel to detect the environment I'm in, and see the packages of that environment. In other words:
- Install Jupyter, the kernels and any development packages in my system environment only
- Launch Atom/Hydrogen. Because they are installed at the system level, they should be able to see Jupyter and my kernels at all times.
- Launch the kernel and have it see the packages in the system environment.
- Kill the kernel
- Switch to a new environment that doesn't have Jupyter or any kernels installed
- Launch the kernel, which detects the new environment and sees only the packages in that new environment.
I was thinking of sending parameters to the kernel on launch to indicate which Python folder to use, but it looks for its initialization functions in the folder where it is located.
Symlinks maybe?
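To make the first idea more concrete, here is roughly what I imagine "detecting the environment" could look like, sketched as an IPython startup file placed in the system profile's startup directory. The file name, the reliance on the VIRTUAL_ENV variable and the Windows Lib\site-packages layout are all assumptions on my part, not something the kernel does by itself:
# ~/.ipython/profile_default/startup/00-activate-venv.py  (hypothetical name)
# If a virtual environment was active when Atom/Hydrogen was launched, let the
# system-level kernel see that environment's packages by putting its
# site-packages directory at the front of sys.path.
import os
import sys

venv = os.environ.get("VIRTUAL_ENV")
if venv:
    site_packages = os.path.join(venv, "Lib", "site-packages")  # Windows layout
    if os.path.isdir(site_packages) and site_packages not in sys.path:
        sys.path.insert(0, site_packages)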