
Background

I am wrangling some legacy code into shape.

I use PDM to manage dependencies, which places all dependency packages in a __pypackages__ folder directly under the repository root. PDM also uses the relatively new pyproject.toml package config file.

I am trying to adopt pre-commit Git hooks so that I can have automated checks for formatting and style before trying to commit, merge, and/or create PRs.

I am asking pre-commit to use only a few Python tools for now: pylint and black.
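My exact config is not reproduced here, but the hook output further down implies a .pre-commit-config.yaml roughly like the following (the `rev` values are placeholders, not the pins I actually use):

```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.0.1  # placeholder; pin to a real tag
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-added-large-files
  - repo: https://github.com/psf/black
    rev: 21.10b0  # placeholder
    hooks:
      - id: black
  - repo: https://github.com/PyCQA/pylint
    rev: v2.11.1  # placeholder
    hooks:
      - id: pylint
```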

Issue

Most of that toolset works great together. However, pylint cannot find any of the modules that are stored in the __pypackages__ folder. Most of what I have read suggests that I alter my $PYTHONPATH to find the modules.

That approach seems outdated, and I am not sure how to do it robustly across the team: I can alter the Git hooks, but $PYTHONPATH may differ for each engineer, so that would only work on my machine.

I would like to add something to the pyproject.toml file so that pylint can find the modules, but I am not sure what to write so that it works generically across the whole team. Something like

[tools.pylint]
pypackages = "./__pypackages__"

Any ideas how I can do this?

Details

I am not sure more details are needed, but here they are:

My actions:

> pre-commit run --all-files # The --all-files flag is just to allow me to test without a commit
Trim Trailing Whitespace.................................................Passed
Fix End of Files.........................................................Passed
Check Yaml...........................................(no files to check)Skipped
Check for added large files..............................................Passed
black....................................................................Passed
pylint...................................................................Failed
- hook id: pylint
- exit code: 30

************* Module testfile
testfile.py:18:0: E0401: Unable to import 'boto3' (import-error)

boto3 is in the __pypackages__ folder mentioned above. None of the modules there can be imported; I limited the output for clarity.

I can pdm run ... everything correctly, and VS Code sees the modules fine. But pylint cannot find them because it does not look inside the __pypackages__ folder.

Larry Cai
Mike Williamson
  • I don't know a lot about PDM, but did you try installing pylint with pdm and then using a system hook for pylint? Your pre-commit conf is not shown, and it's important here, as pylint needs to be in the same env as your dependencies. – Pierre.Sassoulas Nov 16 '21 at 17:11
  • It's possible to write Python code such that pylint *can't* handle it, because pylint doesn't actually *run* it so any secret changes you've made to `sys.path` in a module don't happen. Pylint does however try to handle normal import cases by simulating what normal Python will do with normal `import` directives in normal situations. It's when you step too far outside these norms that this stuff fails. That might not be the problem, but it's worth considering. – torek Nov 16 '21 at 19:50
  • That makes sense, regarding `sys.path`. But I guess what is frustrating is that the `__pypackages__` folder *is* inside the root repo and *is* part of accepted Python package structures. So pylint should know to treat the root path `__pypackages__` as a location to check for imports, the same way it checks any folder with a `__init__.py` file in it. I guess I just have to wait for pylint to catch up to PEP 582. – Mike Williamson Nov 17 '21 at 22:25
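Pierre.Sassoulas's suggestion above could be sketched as a `local` hook that runs pylint through PDM, so pylint shares the environment that contains __pypackages__. This is an untested sketch and assumes pylint has been installed with `pdm add pylint`:

```yaml
repos:
  - repo: local
    hooks:
      - id: pylint
        name: pylint (via pdm)
        entry: pdm run pylint   # run pylint inside the PDM-managed environment
        language: system        # use the system/project environment, not an isolated one
        types: [python]         # only lint Python files
```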

1 Answer


You can get around this by updating the PYTHONPATH environment variable used by the VS Code Python extension: create a file named .env in your workspace (project folder) and add an entry such as:

PYTHONPATH=D:/commonScripts

Note: Relative paths are also supported. Further info on .env files can be found in the VS Code docs: https://code.visualstudio.com/docs/python/environments#_environment-variable-definitions-file
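If you would rather keep the configuration in pyproject.toml, as the question asks, note that pylint (2.5+) can read its settings from [tool.pylint.*] tables there, and its init-hook option runs arbitrary Python before linting, which can extend sys.path. A sketch, assuming PDM's usual __pypackages__/<major>.<minor>/lib layout (the exact subdirectory depends on your interpreter version):

```toml
[tool.pylint.MASTER]
# Prepend the PEP 582 __pypackages__ lib directory for the running
# interpreter so pylint can resolve imports such as boto3.
init-hook = "import sys, os; sys.path.insert(0, os.path.join('__pypackages__', '{}.{}'.format(sys.version_info.major, sys.version_info.minor), 'lib'))"
```

Because the path is built from sys.version_info at lint time, this should work unchanged on each engineer's machine, provided pre-commit is run from the repository root.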

Frost Ming