
I tried to install llama with pip:

pip install llama

But I got:

Collecting llama
  Using cached llama-0.1.1.tar.gz (387 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [21 lines of output]
      Traceback (most recent call last):
        File "D:\python\python311\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in <module>
          main()
        File "D:\python\python311\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "D:\python\python311\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\spark\AppData\Local\Temp\pip-build-env-87x4skmg\overlay\Lib\site-packages\setuptools\build_meta.py", line 341, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\spark\AppData\Local\Temp\pip-build-env-87x4skmg\overlay\Lib\site-packages\setuptools\build_meta.py", line 323, in _get_build_requires
          self.run_setup()
        File "C:\Users\spark\AppData\Local\Temp\pip-build-env-87x4skmg\overlay\Lib\site-packages\setuptools\build_meta.py", line 488, in run_setup
          self).run_setup(setup_script=setup_script)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\spark\AppData\Local\Temp\pip-build-env-87x4skmg\overlay\Lib\site-packages\setuptools\build_meta.py", line 338, in run_setup
          exec(code, locals())
        File "<string>", line 6, in <module>
      NameError: name 'execfile' is not defined
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

It looks like the main problem is that 'execfile' is not available:

NameError: name 'execfile' is not defined

I also tried installing with pycodestyle, but the error is still there.

linusz
1 Answer

execfile was removed in Python 3, so the package's setup.py appears to be Python 2-only. On the PyPI page https://pypi.org/project/llama/ the only supported version listed is Python 2.7 (see the bottom of the left column).
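For context, this is roughly how Python 2 code that calls execfile is usually rewritten for Python 3 (a minimal sketch; 'some_script.py' is just a placeholder, not the actual file this package loads):

# Python 2 code like this raises NameError on Python 3:
#     execfile('some_script.py')
# The usual Python 3 equivalent reads the file and exec()s it instead:
with open('some_script.py') as f:
    exec(compile(f.read(), 'some_script.py', 'exec'))

That rewrite would have to happen in the package itself, though; as a user you can't fix it from the pip command line.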

Which package do you actually want to install? There are many llama packages on PyPI: https://pypi.org/search/?q=llama

phd