I'm training GPT-2 with a custom `encoder.json` and a custom `vocab.bpe` file. However, when I generate text with GPT-2, the output token ids exceed the range of my new encoding, so my encoder cannot decode them. How can I make GPT-2 work with my custom vocabulary?
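What I suspect is a mismatch between the size of my custom vocabulary and the model's `n_vocab` (the size of the output logits layer): if `n_vocab` is larger than my `encoder.json`, sampling can return ids my encoder has no entry for. Below is a minimal sketch of the check and the workaround I have in mind; the paths and the `sample_valid` helper are placeholders of my own, not part of the official gpt-2 code.

```python
import json
import numpy as np

# Hypothetical paths -- adjust to wherever the custom files actually live.
ENCODER_PATH = "models/custom/encoder.json"
HPARAMS_PATH = "models/custom/hparams.json"

# 1) Check that the model's n_vocab matches the custom vocabulary size.
#    If n_vocab is larger, the softmax can put probability mass on ids
#    that the custom encoder cannot decode.
with open(ENCODER_PATH) as f:
    encoder = json.load(f)          # maps token string -> id
with open(HPARAMS_PATH) as f:
    hparams = json.load(f)

vocab_size = len(encoder)
print("encoder entries:", vocab_size, "| model n_vocab:", hparams["n_vocab"])
assert hparams["n_vocab"] == vocab_size, (
    "n_vocab in hparams.json should equal len(encoder.json), "
    "otherwise the model can emit out-of-range ids."
)

# 2) Fallback idea: mask invalid logits at sampling time so only ids
#    the custom encoder knows about can ever be drawn.
def sample_valid(logits: np.ndarray, vocab_size: int) -> int:
    logits = logits.astype(np.float64).copy()
    logits[vocab_size:] = -np.inf            # forbid out-of-range ids
    probs = np.exp(logits - logits.max())    # stable softmax
    probs /= probs.sum()
    return int(np.random.choice(len(probs), p=probs))
```

Is matching `n_vocab` to the new vocabulary (and retraining with that setting) the right way to go, or is masking the logits at sampling time the expected workaround?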
Could you please tell us how you solved it? I am trying to make a new `encoder.json` and `vocab.bpe` file for another language too, and I couldn't find anything useful yet. Thank you. – shamiul97 May 19 '20 at 09:38