
I want to install transformers==3.4.0 on a Linux system, but I get this error:

this version of Cargo is older than the `2021` edition, and only supports `2015` and `2018` editions.
      error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module -- --crate-type cdylib` failed with code 101

Here is the full output I got from pip install transformers==3.4.0:

Collecting transformers==3.4.0
  Using cached transformers-3.4.0-py3-none-any.whl (1.3 MB)
Requirement already satisfied: regex!=2019.12.17 in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from transformers==3.4.0) (2023.3.23)
Requirement already satisfied: packaging in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from transformers==3.4.0) (23.0)
Requirement already satisfied: sacremoses in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from transformers==3.4.0) (0.0.43)
Requirement already satisfied: protobuf in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from transformers==3.4.0) (3.20.3)
Requirement already satisfied: filelock in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from transformers==3.4.0) (3.10.2)
Requirement already satisfied: requests in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from transformers==3.4.0) (2.28.2)
Requirement already satisfied: tqdm>=4.27 in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from transformers==3.4.0) (4.65.0)
Collecting tokenizers==0.9.2
  Using cached tokenizers-0.9.2.tar.gz (170 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: sentencepiece!=0.1.92 in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from transformers==3.4.0) (0.1.95)
Requirement already satisfied: numpy in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from transformers==3.4.0) (1.19.2)
Requirement already satisfied: idna<4,>=2.5 in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from requests->transformers==3.4.0) (3.4)
Requirement already satisfied: charset-normalizer<4,>=2 in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from requests->transformers==3.4.0) (3.1.0)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from requests->transformers==3.4.0) (1.26.15)
Requirement already satisfied: certifi>=2017.4.17 in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from requests->transformers==3.4.0) (2022.12.7)
Requirement already satisfied: joblib in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from sacremoses->transformers==3.4.0) (1.2.0)
Requirement already satisfied: six in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from sacremoses->transformers==3.4.0) (1.16.0)
Requirement already satisfied: click in /home1/xzhu9839/fairseq/seq_env/lib/python3.9/site-packages (from sacremoses->transformers==3.4.0) (8.1.3)
Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for tokenizers (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [56 lines of output]
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build/lib.linux-x86_64-cpython-39
      creating build/lib.linux-x86_64-cpython-39/tokenizers
      copying py_src/tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers
      creating build/lib.linux-x86_64-cpython-39/tokenizers/models
      copying py_src/tokenizers/models/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/models
      creating build/lib.linux-x86_64-cpython-39/tokenizers/decoders
      copying py_src/tokenizers/decoders/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/decoders
      creating build/lib.linux-x86_64-cpython-39/tokenizers/normalizers
      copying py_src/tokenizers/normalizers/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/normalizers
      creating build/lib.linux-x86_64-cpython-39/tokenizers/pre_tokenizers
      copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/pre_tokenizers
      creating build/lib.linux-x86_64-cpython-39/tokenizers/processors
      copying py_src/tokenizers/processors/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/processors
      creating build/lib.linux-x86_64-cpython-39/tokenizers/trainers
      copying py_src/tokenizers/trainers/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/trainers
      creating build/lib.linux-x86_64-cpython-39/tokenizers/implementations
      copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.linux-x86_64-cpython-39/tokenizers/implementations
      copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-x86_64-cpython-39/tokenizers/implementations
      copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.linux-x86_64-cpython-39/tokenizers/implementations
      copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-x86_64-cpython-39/tokenizers/implementations
      copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-x86_64-cpython-39/tokenizers/implementations
      copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.linux-x86_64-cpython-39/tokenizers/implementations
      copying py_src/tokenizers/implementations/__init__.py -> build/lib.linux-x86_64-cpython-39/tokenizers/implementations
      copying py_src/tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers
      copying py_src/tokenizers/models/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers/models
      copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers/decoders
      copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers/normalizers
      copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers/pre_tokenizers
      copying py_src/tokenizers/processors/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers/processors
      copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.linux-x86_64-cpython-39/tokenizers/trainers
      running build_ext
      running build_rust
      cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module -- --crate-type cdylib
      warning: unused manifest key: target.x86_64-apple-darwin.rustflags
          Updating crates.io index
          Updating git repository `https://github.com/pyo3/rust-numpy/`
          Updating git repository `https://github.com/n1t0/rayon-cond`
       Downloading crates ...
      error: failed to download `once_cell v1.17.1`

      Caused by:
        unable to get packages from source

      Caused by:
        failed to parse manifest at `/home1/xzhu9839/.cargo/registry/src/github.com-1ecc6299db9ec823/once_cell-1.17.1/Cargo.toml`

      Caused by:
        failed to parse the `edition` key

      Caused by:
        this version of Cargo is older than the `2021` edition, and only supports `2015` and `2018` editions.
      error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module -- --crate-type cdylib` failed with code 101
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects

Here are the results of cargo -vV and rustc -vV:

cargo 1.68.1 (115f34552 2023-02-26)
release: 1.68.1
commit-hash: 115f34552518a2f9b96d740192addbac1271e7e6
commit-date: 2023-02-26
host: x86_64-unknown-linux-gnu
libgit2: 1.5.0 (sys:0.16.0 vendored)
libcurl: 7.86.0-DEV (sys:0.4.59+curl-7.86.0 vendored ssl:OpenSSL/1.1.1q)
os: CentOS 7.0.0 [64-bit]

and
rustc 1.68.1 (8460ca823 2023-03-20)
binary: rustc
commit-hash: 8460ca823e8367a30dda430efda790588b8c84d3
commit-date: 2023-03-20
host: x86_64-unknown-linux-gnu
release: 1.68.1
LLVM version: 15.0.6
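
Since cargo 1.68.1 fully supports the 2021 edition, the error suggests the build subprocess is invoking some other, older cargo than the one shown above. A minimal Python sketch (using throwaway fake binaries, purely illustrative) of how an older cargo that comes earlier on PATH shadows a newer one:

```python
import os
import shutil
import tempfile

# Create two fake `cargo` executables with different versions and show that
# PATH lookup always resolves to whichever directory comes first.
old_dir = tempfile.mkdtemp()  # stands in for e.g. /usr/bin (a distro-packaged cargo)
new_dir = tempfile.mkdtemp()  # stands in for ~/.cargo/bin (the rustup cargo)
for d, ver in ((old_dir, "1.47.0"), (new_dir, "1.68.1")):
    p = os.path.join(d, "cargo")
    with open(p, "w") as f:
        f.write(f"#!/bin/sh\necho cargo {ver}\n")
    os.chmod(p, 0o755)

# Old directory first: the stale cargo shadows the new one.
assert shutil.which("cargo", path=os.pathsep.join([old_dir, new_dir])).startswith(old_dir)
# New directory first (what `export PATH="$HOME/.cargo/bin:$PATH"` achieves).
assert shutil.which("cargo", path=os.pathsep.join([new_dir, old_dir])).startswith(new_dir)
print("PATH order decides which cargo a build subprocess runs")
```

If that is what is happening here, prepending ~/.cargo/bin to PATH before running pip (or removing a distro-packaged cargo) should let the tokenizers build see 1.68.1.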

I am using gcc==8.3.0 and python==3.6.8.

I downgraded Rust to 1.58.1 and then 1.56.1, and it did not work. The same problem occurred with Python 3.9.12.
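
One way to check whether pip's build subprocess sees the same cargo as the interactive shell is to resolve it from inside Python itself. A small diagnostic sketch (it only assumes cargo may or may not be on the interpreter's PATH):

```python
import shutil
import subprocess

def cargo_seen_by_python():
    """Return the cargo binary (and its version) that a subprocess
    spawned from this Python interpreter would actually run."""
    path = shutil.which("cargo")
    if path is None:
        return "no cargo on PATH for this Python process"
    result = subprocess.run([path, "--version"], capture_output=True, text=True)
    return f"{path} -> {result.stdout.strip()}"

print(cargo_seen_by_python())
```

Running this inside the same virtualenv used for pip install, and comparing the result with which cargo in the shell, would confirm or rule out the shadowing hypothesis.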

    Well cargo 1.68.1 definitely supports 2021, so I think the one python is calling is not the same as the one you're calling. – drewtato Mar 24 '23 at 21:56

0 Answers