Given a transformer model on Hugging Face, how do I find its maximum input sequence length?
For example, here I want to truncate to the model's maximum length: `tokenizer(examples["text"], padding="max_length", truncation=True)`

How do I find the value of `max_length` for a given model?
I need to know because I am trying to resolve this warning: "Asking to pad to max_length but no maximum length is provided and the model has no predefined maximum length. Default to no padding."
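For context, here is a sketch of the kind of check I have in mind (the helper name and the threshold are my own invention). As I understand it, `tokenizer.model_max_length` is set to a huge sentinel value (around 1e30) when no maximum is defined, so simply reading the attribute is not enough; is something like this the right approach?

```python
# Hypothetical helper: decide on a usable max_length from
# tokenizer.model_max_length, which transformers sets to an
# absurdly large sentinel when the model has no predefined limit.
def effective_max_length(model_max_length: int, fallback: int = 512) -> int:
    # Treat implausibly large values as "no predefined maximum"
    # and fall back to a length I choose myself.
    if model_max_length > 100_000:
        return fallback
    return model_max_length

# With a real tokenizer I would call it like this:
#   tok = AutoTokenizer.from_pretrained("bert-base-uncased")
#   max_len = effective_max_length(tok.model_max_length)
print(effective_max_length(512))        # a model with a real limit
print(effective_max_length(int(1e30)))  # the "no limit" sentinel case
```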