
My problem is that every time I start a new project, I need to install an IPython kernel into that project's virtual environment. This is a problem because the kernel and all of its dependencies end up in every project's requirements.txt, even though they are only needed for development and not for the project itself. For example, pushing a simple one-page website to Heroku should only require Flask, gunicorn and their dependencies. But because I develop in Jupyter, it also ends up "requiring" all the packages that come with the IPython kernel:

backcall, colorama, parso, jedi, decorator, pickleshare, six, ipython-genutils, 
traitlets, wcwidth, prompt-toolkit, pygments, ipython, tornado, jupyter-core, pyzmq, 
python-dateutil, jupyter-client, ipykernel

The problem is not exactly with virtual environments. I can switch environments in Atom, and if I run my script from a terminal it works just fine. But when I try to run it directly in the Jupyter notebook, the system-level kernel doesn't see the packages in the virtual environment.
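For what it's worth, running something like this in a Hydrogen cell shows exactly which interpreter and environment the kernel is using, and where (or whether) it finds Flask:

    # Run inside a Hydrogen/Jupyter cell to see which Python the kernel uses
    import sys

    print(sys.executable)   # interpreter the kernel is running on
    print(sys.prefix)       # environment root the kernel sees

    try:
        import flask
        print(flask.__file__)            # where flask was loaded from
    except ModuleNotFoundError:
        print("flask is not visible to this kernel")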

Here's how to recreate the problem. First, create a virtual environment. I use pipenv, but I also tried virtualenvwrapper with the same results. Then, switch to that environment. I tried two methods and both worked:

  1. Launch Atom normally and use the atom-python-virtualenv package to select my virtual environment (this requires virtualenvwrapper-win and some configuration).
  2. Using pipenv, run `pipenv run atom .` in the Windows cmd. (Some Atom packages might not find their dependencies with that solution; I fixed that by editing their respective settings to always point to my system's Python.)

With both methods, I can confirm that I'm in the right environment using a terminal inside Atom (I used platformio-ide-terminal), and then install Flask:

    pip list           # shows only pip, setuptools and wheel
    pip install flask

Finally, I launch the IPython kernel. Hydrogen detects the kernel installed at the system level, which is what I want. However, if I try `import flask` inside the .py script, I get a `ModuleNotFoundError`.
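Just to illustrate what I mean by "seeing" the environment: presumably the import would work if I pointed the running system kernel at the environment's site-packages by hand, roughly like this (the path is only an example of where pipenv puts environments on Windows, not my real one, and the interpreter versions would have to match):

    # Rough workaround sketch, not what I actually want to do long-term:
    # add the virtual environment's site-packages to the system kernel's path.
    import site

    venv_site_packages = r"C:\Users\me\.virtualenvs\myproject-AbCdEf12\Lib\site-packages"
    site.addsitedir(venv_site_packages)   # also processes any .pth files there

    import flask   # should now resolve if the Python versions are compatible
    print(flask.__version__)

But doing this by hand in every notebook defeats the purpose, which is why I'd like the kernel itself to pick up the environment.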

What I would like is for the kernel to detect the environment I'm in and see the packages of that environment. In other words:

  1. Install Jupyter, the kernels and any development packages in my system environment only
  2. Launch Atom/Hydrogen. Because they are installed at the system level, they should be able to see Jupyter and my kernels at all times.
  3. Launch the kernel and have it see the packages in the system environment.
  4. Kill the kernel
  5. Switch to a new environment that doesn't have Jupyter or any kernels installed
  6. Launch the kernel, which detects the new environment and sees only the packages in that new environment (see the sketch after this list).
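To make steps 3 and 6 concrete, here is a rough sketch of the behaviour I'm after (not something I know to be the right approach): a startup file placed once in the system-level IPython profile, e.g. ~/.ipython/profile_default/startup/00-venv.py (the name is arbitrary), that checks the VIRTUAL_ENV variable that pipenv/virtualenv set and adds that environment's site-packages to the kernel's path:

    # Hypothetical ~/.ipython/profile_default/startup/00-venv.py
    # Runs every time the system-level IPython kernel starts. If a virtual
    # environment is active (pipenv/virtualenv export VIRTUAL_ENV), add its
    # site-packages so the single system kernel sees that environment's packages.
    import os
    import site
    import sys
    from pathlib import Path

    venv = os.environ.get("VIRTUAL_ENV")
    if venv:
        if os.name == "nt":
            sp = Path(venv) / "Lib" / "site-packages"
        else:
            sp = Path(venv) / "lib" / f"python{sys.version_info[0]}.{sys.version_info[1]}" / "site-packages"
        if sp.is_dir():
            site.addsitedir(str(sp))

This assumes Atom (and therefore the kernel it spawns) inherits VIRTUAL_ENV, which should be the case when Atom is started with `pipenv run atom .`.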

I was thinking of passing parameters to the kernel on launch to indicate which Python folder to use, but it looks for its initialization functions in the folder where it is installed.

Symlinks maybe?

  • Does [this](https://github.com/bhrutledge/jupyter-venv) solve your issue? You can have a global Jupyter and set up a kernel inside each of your environments. It's not exactly your use case, but it seems pretty close to me – yorodm Jan 24 '19 at 15:34
  • That is actually what I'm trying to avoid. The kernel and all its dependencies get installed in the environment (like in my Heroku example). – Lokdal Jan 24 '19 at 15:38
  • 1
    So you want a to run a kernel from **outside** a virtualenv and access packages that are **inside** such virtualenv? – yorodm Jan 24 '19 at 15:40
  • Let me rephrase what you said to be sure we're saying the same thing. While inside environment B, I want to launch the kernel in environment A such that this kernel detects the packages of environment B. In other words, I don't understand why I would need multiple kernels. – Lokdal Jan 24 '19 at 15:45
  • @yorodm Is there something obvious I'm missing from your question? Are you trying to imply it is impossible? If so, could you explain? – Lokdal Jan 24 '19 at 17:20
  • Your question is fine, some of it may have gotten lost in translation for me. I don't like to call things impossible per se, but it's sure hard though :D – yorodm Jan 24 '19 at 17:26
  • This is actually a really great question. More modern virtual environments like mamba support the idea of "appending to" or "decorating" a common "shared" environment (that isn't `base`) with a few additional packages. See [here](https://mamba.readthedocs.io/en/latest/user_guide/concepts.html#activation) for more – stephenjfox Jan 25 '23 at 16:32
