
Company firewall seems to prevent me from just using

model = AutoModel.from_pretrained("sentence-transformers/bert-base-nli-stsb-mean-tokens")

so I need to download this model locally and then read it into Python. I couldn't find a direct AWS link; it seems to typically take this form, but it did not work:

https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-nli-stsb-mean-tokens-pytorch_model.bin

I tried the similar questions/solutions here, but they did not work: since I can't run the first line to download the pretrained model in Python, I need an external solution.

AxW
You can download the files [here](https://huggingface.co/sentence-transformers/bert-base-nli-stsb-mean-tokens/tree/main) – cronoik Apr 13 '21 at 18:03

2 Answers


You could try this, for example:

from transformers import AutoTokenizer, TFAutoModel

# On a machine with internet access, download the model once
tokenizer = AutoTokenizer.from_pretrained('bert-base-multilingual-uncased')
bert = TFAutoModel.from_pretrained('bert-base-multilingual-uncased')

# then write both to plain folders that you can copy anywhere
tokenizer.save_pretrained("./models/tokenizer/")
bert.save_pretrained("./models/bert/")
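To complete the round trip, a minimal sketch of the loading side, assuming the two folders above were copied to the firewalled machine (the folder names match the save_pretrained calls; the helper function is illustrative, not part of the answer):

```python
from pathlib import Path

# Paths mirroring the save_pretrained calls above (assumption:
# the folders were copied as-is to the offline machine).
TOKENIZER_DIR = Path("./models/tokenizer/")
MODEL_DIR = Path("./models/bert/")

def load_offline():
    # A local path makes from_pretrained read from disk instead of
    # contacting the Hugging Face Hub, so no network access is needed.
    from transformers import AutoTokenizer, TFAutoModel
    tokenizer = AutoTokenizer.from_pretrained(str(TOKENIZER_DIR))
    bert = TFAutoModel.from_pretrained(str(MODEL_DIR))
    return tokenizer, bert
```
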
Oweys

Once a model has been used, it is stored in the .cache folder, which you can copy to wherever you want and then load with SentenceTransformer(path/to/your/folder/with/model). I am using Linux, so the .cache folder is at /home/username/.cache/torch/sentence-transformers.
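A short sketch of that workflow, assuming an older sentence-transformers release on Linux (newer versions cache under ~/.cache/huggingface instead; the helper function name is illustrative):

```python
import os

# Default cache location on Linux for older sentence-transformers
# releases, as described above.
CACHE_DIR = os.path.join(
    os.path.expanduser("~"), ".cache", "torch", "sentence-transformers"
)

def load_from_copied_folder(path):
    # Passing a folder path instead of a model name makes
    # SentenceTransformer load the model from disk, with no download.
    from sentence_transformers import SentenceTransformer
    return SentenceTransformer(path)

print(CACHE_DIR)  # copy the model subfolder from here to the target machine
```
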

Tedo Vrbanec