I have a production PC that won't have a decent Internet connection (let's say the bare 'apt-get install ....' at best, and definitely no pip3).
So I do my everyday development on a similar Linux OS/environment, but I want to push the result to the final machine in the most portable way possible, ideally standalone (i.e. nothing extra to install).
My Python environment was set up with Poetry: Python 3 and half a dozen packages/libs/modules (snmp, hexdump, tftp, coloredlogs, pcap, etc.), with an effort to keep everything PEP 8 clean. In a few weeks I would like to copy (rsync) my work to the production machine without having to replay the whole package-and-dependency installation there.
Did I just describe Docker? It sounds like overkill for what I want to achieve.
I was hoping Poetry would take care of this, but even Poetry itself seems to be installable only via pip.
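For what it's worth, this is the first half of what I imagined Poetry doing for me; a sketch only, assuming poetry export is available on my dev box (on recent Poetry versions it seems to require the export plugin):

```bash
# On the dev machine: export the Poetry-locked dependency tree
# to a plain requirements file that any pip can read.
poetry export -f requirements.txt --output requirements.txt
```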
Should I list and download my packages and dependencies by hand, then install them manually, as in the sketch below? (But the pip3 installation on that machine must be so old that I'm afraid I'll hit a waterfall of dependency conflicts.)
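If I go the manual route, here is the kind of workflow I have in mind; a sketch only, with made-up host and paths, and assuming the dev and production machines are similar enough that the downloaded wheels match the target's Python version and platform:

```bash
# Still on the dev machine: download every pinned package
# (plus transitive dependencies) into a local directory.
pip3 download -r requirements.txt -d ./vendor

# Ship the project and the vendored packages to production
# (host and paths are hypothetical).
rsync -av ./myproject/ ./vendor user@prod:/opt/myproject/

# On the production machine: install strictly from the local
# directory, never touching the network.
pip3 install --no-index --find-links ./vendor -r requirements.txt
```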
Did I google so badly that I missed the best practice for such a common scenario?
I am eager to read your best hints!