
I am using Kaggle code to download the GPT-2 language model.

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

device = "cuda" if torch.cuda.is_available() else "cpu"
model_name = "gpt2-xl"
tokenizer = AutoTokenizer.from_pretrained(model_name)

I intend to download the gpt2-xl model from the Hugging Face Hub, but the last line raised LocalEntryNotFoundError. The details are below.

LocalEntryNotFoundError Traceback (most recent call last)

/opt/conda/lib/python3.7/site-packages/transformers/utils/hub.py in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, use_auth_token, revision, local_files_only, subfolder, user_agent, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash)
    419                 use_auth_token=use_auth_token,
--> 420                 local_files_only=local_files_only,
    421             )

OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like gpt2-xl is not the path to a directory containing a file named config.json.

Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.

It seems that Kaggle code does not connect to the Hugging Face Hub. Why does this happen, and how can I fix this error?
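As a quick diagnostic, you can check from inside the notebook whether the Hub is reachable at all before blaming the model id. This is a minimal stdlib-only sketch (the helper name `hub_reachable` is my own, not part of transformers); its result could then be used to decide whether to pass `local_files_only=True` to `from_pretrained`:

```python
import socket

def hub_reachable(host="huggingface.co", port=443, timeout=3.0):
    """Return True if a TCP connection to the Hugging Face Hub can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # DNS failure, connection refused, or timeout: no internet access
        return False

# Hypothetical usage in the notebook:
# tokenizer = AutoTokenizer.from_pretrained(
#     model_name, local_files_only=not hub_reachable())
```

If this returns False on Kaggle, the problem is the notebook's internet setting, not the model name.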

agongji
1 Answer


See: kaggle could not download resnet50 pretrained model

That answer really helped. After I verified my account with my phone number, the notebook got internet access.
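If you cannot (or do not want to) enable internet access, an alternative is to attach the model files as a Kaggle dataset and load them from a local directory, since `from_pretrained` also accepts a path. A sketch, assuming a hypothetical dataset mounted at `/kaggle/input/gpt2-xl` containing `config.json` and the tokenizer/weight files:

```python
from pathlib import Path

def pick_model_source(local_dir, hub_id="gpt2-xl"):
    """Prefer a local copy (works offline) over the Hub id."""
    local_dir = Path(local_dir)
    if (local_dir / "config.json").is_file():
        return str(local_dir)  # from_pretrained accepts a directory path
    return hub_id              # fall back to downloading from the Hub

# Hypothetical usage:
# tokenizer = AutoTokenizer.from_pretrained(
#     pick_model_source("/kaggle/input/gpt2-xl"))
```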
