I'm trying to make Hugging Face's transformers library use a model that I have downloaded and that is not in the Hugging Face model repository.

Where does transformers look for models? Is there an equivalent of the $PATH environment variable for transformers models?

Research

This Hugging Face issue discusses manually downloading models.

This issue suggests that you can work around the question of where Hugging Face looks for models by passing a path as the argument to `from_pretrained` (`model = BertModel.from_pretrained('path/to/your/directory')`).
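A minimal sketch of that workaround. The directory path is hypothetical; roughly speaking, `from_pretrained` treats its argument as a local directory if one exists at that path, and otherwise as a Hub repo id, which the helper below illustrates:

```python
import os

# Hypothetical local directory containing config.json and the model weights.
MODEL_DIR = "path/to/your/directory"

def resolve_model_source(name_or_path: str) -> str:
    """Illustrative approximation: from_pretrained accepts either a Hub
    repo id or a local directory, checking the filesystem first."""
    return "local" if os.path.isdir(name_or_path) else "hub"

# Usage sketch (assumes transformers is installed; nothing is downloaded
# when the argument is an existing local directory):
# from transformers import BertModel
# model = BertModel.from_pretrained(MODEL_DIR)
```

So there is no `$PATH`-style search list to configure: you either let the library fetch from the Hub (and cache the result), or you point it directly at a directory on disk.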

Att Righ
    you can actually specify a local path when loading with `.from_pretrained()`. Is there any hard requirement that you would natively look in a specific folder, or what does not work with that solution? – dennlinger Dec 09 '21 at 12:38
  • I think that works, yes. I just sort of assumed that there was a "local" namespace for models as well as an internet one, but it does seem to be the case that local models work with paths. – Att Righ Dec 09 '21 at 12:43

0 Answers