
Every time I run GPT-2, I receive this message. Is there a way I can get it to go away?

Some weights of GPT2LMHeadModel were not initialized from the model checkpoint at gpt2 and are newly initialized: ['h.0.attn.masked_bias', 'h.1.attn.masked_bias', 'h.2.attn.masked_bias', 'h.3.attn.masked_bias', 'h.4.attn.masked_bias', 'h.5.attn.masked_bias', 'h.6.attn.masked_bias', 'h.7.attn.masked_bias', 'h.8.attn.masked_bias', 'h.9.attn.masked_bias', 'h.10.attn.masked_bias', 'h.11.attn.masked_bias', 'lm_head.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Johnny
  • Please provide your [minimal reproducible example](https://stackoverflow.com/help/minimal-reproducible-example) – adrtam Aug 28 '20 at 01:01

1 Answer


Yes, you need to change the log level before you import anything from the transformers library:

import logging

# Configure the root logger to show only ERROR and above *before*
# transformers is imported, so the library's warnings are filtered out.
logging.basicConfig(level='ERROR')

from transformers import GPT2LMHeadModel

model = GPT2LMHeadModel.from_pretrained('gpt2')
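
On more recent releases of transformers the library also ships its own verbosity helpers, so the warning can be silenced without touching Python's root logger. A minimal sketch, assuming the installed version provides transformers.utils.logging with the set_verbosity_error() helper:

from transformers import GPT2LMHeadModel
from transformers.utils import logging as hf_logging

# Lower the library's own verbosity to ERROR so the
# "Some weights ... were not initialized" warning is not printed.
hf_logging.set_verbosity_error()

model = GPT2LMHeadModel.from_pretrained('gpt2')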
cronoik