
I want to manage dependencies for a Python package that is deployed across multiple architectures/OSes, without needing to compile any requirements from source. This is especially important as more development happens on Mac M1 (arm64) machines.

Is there a tool or dependency manager that can identify a set of pre-built package versions supporting a given list of architectures, i.e. both the development and target deployment environments?

A related question about python-poetry for development on x86_64 and deployment on armv7l concluded this is not possible.

A related blog post describes managing multiple conda installations for different architectures; that is not ideal, since I would rather manage a single set of dependencies.
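One manual workaround (an assumed workflow, not a dedicated tool; the platform tags and Python version below are placeholders to adapt) is to run pip's resolver once per target platform while forbidding sdists. If every invocation succeeds, pre-built wheels exist for all targets:

```shell
# Resolve the pinned requirements for each target platform, wheels only.
# --only-binary=:all: is required when --platform is given, and it makes
# pip fail loudly for any package that would need a source build.
pip download -r requirements.txt \
    --only-binary=:all: \
    --platform manylinux2014_x86_64 \
    --python-version 3.11 \
    --dest /tmp/wheels-x86_64

pip download -r requirements.txt \
    --only-binary=:all: \
    --platform macosx_11_0_arm64 \
    --python-version 3.11 \
    --dest /tmp/wheels-arm64
```

This checks each platform independently rather than solving for a single version set that satisfies all of them at once, so it may still pick different versions per platform unless the requirements are fully pinned.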

  • Not sure, but maybe you need something more heavyweight like a Docker container or virtual machine – Chris_Rands Feb 10 '23 at 10:12
  • There is no cross-compilation in the Python world. You need to run compilers and Python packaging tools on different architectures (processor type + OS + Python version) to prepare binary wheels for corresponding platforms. Qemu, VirtualBox, Docker+qemu, etc. Example: https://github.com/CheetahTemplate3/cheetah3/releases — I provide a number of binary wheels for Cheetah3; I compile them [on the corresponding platforms](https://github.com/CheetahTemplate3/cheetah3/blob/074be1bb8caabf577a7dccc69eef817e32f2f305/.github/workflows/test-publish.yaml#L15-L16). – phd Feb 10 '23 at 12:23
  • @Chris_Rands do you mean a multi-platform image? – BadgerBadgerBadger Feb 10 '23 at 16:05
  • @phd I'm not looking for cross-compilation, instead something that finds compatible pre-built modules that wouldn't require compilation on any of a few specified architectures. – BadgerBadgerBadger Feb 10 '23 at 16:06
  • I am not aware of any tool. But if you use poetry, the lock file will list for each library all the available wheels. You can then check if wheels are available for all the architecture you are targeting. But it will take some custom string parsing. – 0x26res Feb 11 '23 at 12:21

0 Answers