
When trying to create this tokenizer I am getting the error below, but I don't understand why it won't accept the unk_token. Any ideas?

tokenizer = tokenizers.SentencePieceUnigramTokenizer(unk_token="", eos_token="", pad_token="")

----> 1 tokenizer = tokenizers.SentencePieceUnigramTokenizer(unk_token="", eos_token="", pad_token="")

TypeError: __init__() got an unexpected keyword argument 'unk_token'
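In case it helps, here is a minimal sketch of what I am trying to end up with. The <unk>/</s>/<pad> values are my assumption, since the angle-bracketed strings seem to have been stripped from the snippet above. From my reading of the implementation, the constructor only accepts vocab, replacement and add_prefix_space, with special tokens passed at training time instead:

from tokenizers import SentencePieceUnigramTokenizer

# The constructor takes no special-token keyword arguments
# (which is what raises the TypeError above); it only accepts
# vocab, replacement and add_prefix_space.
tokenizer = SentencePieceUnigramTokenizer()

# Hypothetical toy corpus, just to make the sketch self-contained.
corpus = [
    "this is an example sentence",
    "this is another example sentence",
    "tokenizers builds its vocabulary from raw text",
]

# Special tokens are supplied when training rather than in __init__.
# The <unk>/</s>/<pad> values are assumed (T5-style placeholders).
tokenizer.train_from_iterator(
    corpus,
    vocab_size=60,
    special_tokens=["<unk>", "</s>", "<pad>"],
)

print(tokenizer.encode("this is an example").tokens)

This is only a sketch of the API shape as I understand it, not necessarily the right fix.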

Antoine23
