
I have tried this but it's not working for me. I am using this Git repo. I am building a desktop app and don't want users to have to download the models; I want to ship the models with the build. I know the transformers library looks for models in cache/torch/transformers and downloads them if they are not there. I also know you can pass a cache_dir parameter to from_pretrained. This is what I am trying:

cache = os.path.join(os.path.abspath(os.getcwd()), 'Transformation/Annotators/New Sentiment Analysis/transformers')
os.environ['TRANSFORMERS_CACHE'] = cache

if args.model_name_or_path is None:
    args.model_name_or_path = 'barissayil/bert-sentiment-analysis-sst'
# Configuration for the desired transformer model
config = AutoConfig.from_pretrained(args.model_name_or_path, cache_dir=cache)

I have tried the solution from the above-mentioned question, and I have also tried cache_dir. The transformers folder is in the same directory as analyze.py, and the whole repo along with the transformers folder is in the New Sentiment Analysis directory.


1 Answer


You haven't actually shown the code that isn't working, but I assume you did something like the following:

# transformers is imported BEFORE the environment variable is set
from transformers import AutoConfig

import os
os.environ['TRANSFORMERS_CACHE'] = '/blabla/cache/'

config = AutoConfig.from_pretrained('barissayil/bert-sentiment-analysis-sst')

# Check whether the custom cache directory was created
os.path.isdir('/blabla/cache/')

Output:

False

This will not create a new default location for caching, because you imported the transformers library before you set the environment variable (I have edited your linked question to make this clearer). The proper way to change the default caching directory is to set the environment variable before importing the transformers library:

import os
# Set the environment variable BEFORE importing transformers
os.environ['TRANSFORMERS_CACHE'] = '/blabla/cache/'

from transformers import AutoConfig

config = AutoConfig.from_pretrained('barissayil/bert-sentiment-analysis-sst')

# Now the custom cache directory exists
os.path.isdir('/blabla/cache/')

Output:

True
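
Since your goal is to ship the models with your desktop build rather than have users download anything, one alternative worth considering is to skip the cache entirely: download the model once at build time with save_pretrained, bundle the resulting directory with the app, and load from that local path at runtime. Below is a minimal sketch of this idea; the bundled_models directory name and the use of AutoModelForSequenceClassification are my assumptions for illustration (your linked repo may construct the model differently):

import os

from transformers import AutoConfig, AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical location of the bundled model inside the shipped app
MODEL_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'bundled_models', 'bert-sentiment-analysis-sst')

def download_model():
    # Build step: run once with network access, then ship MODEL_DIR with the app
    tokenizer = AutoTokenizer.from_pretrained('barissayil/bert-sentiment-analysis-sst')
    model = AutoModelForSequenceClassification.from_pretrained('barissayil/bert-sentiment-analysis-sst')
    tokenizer.save_pretrained(MODEL_DIR)
    model.save_pretrained(MODEL_DIR)

def load_model():
    # Runtime: from_pretrained accepts a local directory, so nothing is downloaded
    config = AutoConfig.from_pretrained(MODEL_DIR)
    tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_DIR, config=config)
    return tokenizer, model

With this approach the TRANSFORMERS_CACHE import-order pitfall no longer matters, because the library is never asked to resolve anything through its cache.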