I created a new virtual environment with this command:
python3 -m venv test_v4
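For reference, the full sequence I ran, with activation following creation so that pip and python resolve to the binaries inside the environment:

```shell
# Create the environment, then activate it so that `pip` and `python`
# resolve to the binaries inside test_v4/bin.
python3 -m venv test_v4
source test_v4/bin/activate
```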
I checked in the activate file that this path is correct:
/Users/user/PycharmProjects/test-job/test_v4
I then checked that the correct pip is in fact being used:
/Users/user/PycharmProjects/test-job/test_v4/bin/pip
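As an extra check (generic, not specific to this setup), the active environment can also be confirmed from inside Python: in an activated venv, sys.prefix points at the venv directory while sys.base_prefix points at the interpreter the venv was created from.

```python
import sys

# In an activated venv these two differ: sys.prefix is the venv directory,
# sys.base_prefix is the base interpreter the venv was created from.
print("sys.prefix:     ", sys.prefix)
print("sys.base_prefix:", sys.base_prefix)
print("inside a venv:", sys.prefix != sys.base_prefix)
```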
But when I run pip install with multiple requirements files, I get this error:
ERROR: Command errored out with exit status 255:
command: /Users/user/.pyenv/versions/3.7.10/bin/python3.7 -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/cs/mqdnkvqx3ml4dglkn7q6_tp80000gp/T/pip-resolver-0honk05v/databricks-connect/setup.py'"'"'; __file__='"'"'/private/var/folders/cs/mqdnkvqx3ml4dglkn7q6_tp80000gp/T/pip-resolver-0honk05v/databricks-connect/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /private/var/folders/cs/mqdnkvqx3ml4dglkn7q6_tp80000gp/T/pip-pip-egg-info-pxswo4sk
cwd: /private/var/folders/cs/mqdnkvqx3ml4dglkn7q6_tp80000gp/T/pip-resolver-0honk05v/databricks-connect/
Complete output (1 lines):
Found conflicting `pyspark` installation at /Users/user/.pyenv/versions/3.7.10/lib/python3.7/site-packages/pyspark. Please uninstall this with `pip uninstall pyspark` before installing databricks-connect.
Just by looking at the error message, I have a question: is pip install reaching into the global site-packages of 3.7.10? I was expecting pip to store files under this folder structure:
/Users/user/.pyenv/versions/test_v4/lib/python3.7/site-packages/pyspark
or at least
/Users/srinivaspachari/.pyenv/versions/3.7.10/test_v4/lib/python3.7/site-packages/pyspark
After all, isolation is the whole point of creating a virtual environment. Is my understanding correct? Also, if anyone can tell me how to fix this issue, that would be great.
I even tried
pip uninstall pyspark
but I get this message:
WARNING: Skipping pyspark as it is not installed.
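Based on the location in the error output, the conflicting pyspark seems to live under the pyenv 3.7.10 interpreter rather than under the venv, which would explain why the venv's pip finds nothing to uninstall. A sketch of what I assume would target the right site-packages (path copied from the error message above; I have not verified this is the fix):

```shell
# Path taken from the error message; the conflicting pyspark apparently
# lives under the pyenv 3.7.10 interpreter, not under the test_v4 venv.
BASE_PY=/Users/user/.pyenv/versions/3.7.10/bin/python3.7

# Run pip via that interpreter so the uninstall targets its site-packages
# instead of the (empty) venv's.
if [ -x "$BASE_PY" ]; then
  "$BASE_PY" -m pip uninstall -y pyspark
else
  echo "interpreter not found at $BASE_PY"
fi
```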