I'm training a Longformer model and getting many of these messages during training (a stripped-down sketch of the kind of call that triggers them follows the log excerpt):

Initializing global attention on multiple choice...
Input ids are automatically padded from 412 to 512 to be a multiple of `config.attention_window`: 512
Initializing global attention on multiple choice...
Input ids are automatically padded from 448 to 512 to be a multiple of `config.attention_window`: 512
...
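
For context, here is a minimal sketch of the kind of forward pass that produces both messages. The checkpoint name and the toy prompt/choices are illustrative only, not my actual data or training code:

import torch
from transformers import LongformerForMultipleChoice, LongformerTokenizerFast

tokenizer = LongformerTokenizerFast.from_pretrained("allenai/longformer-base-4096")
model = LongformerForMultipleChoice.from_pretrained("allenai/longformer-base-4096")

prompt = "Some context paragraph."
choices = ["first candidate answer", "second candidate answer"]

# Each (prompt, choice) pair is well under 512 tokens, so the model pads the
# input ids up to a multiple of config.attention_window and logs about it.
encoded = tokenizer([prompt] * len(choices), choices, return_tensors="pt", padding=True)
batch = {k: v.unsqueeze(0) for k, v in encoded.items()}  # (batch, num_choices, seq_len)

# No global_attention_mask is passed, so the model logs
# "Initializing global attention on multiple choice..." and builds one itself.
with torch.no_grad():
    outputs = model(**batch)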

I found this source code for the model:

if global_attention_mask is None and input_ids is not None:
    logger.info("Initializing global attention on multiple choice...")

And also:

if padding_len > 0:
    logger.info(
        f"Input ids are automatically padded from {seq_len} to {seq_len + padding_len} to be a multiple of "
        f"`config.attention_window`: {attention_window}"
    )
Both of these messages are emitted through the library's logger. I followed this Hugging Face link on suppressing logging by using:

from transformers import logging as hf_logging 
hf_logging.set_verbosity_error() 

But this didn't seem to work. I'd add the code I'm using, but it's exceptionally long; if there's a specific question about it I'd be happy to answer or share particular snippets. I'm mainly following this tutorial, just with the Longformer architecture.
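
Is raising the level on the specific module logger through the standard logging library the right way to silence these instead? Something like the following (the module path is just my guess from where the snippets above live, so it may be wrong):

import logging

# Guessing the logger name from the source file that emits the messages;
# transformers module loggers are children of the top-level "transformers" logger.
logging.getLogger("transformers.models.longformer.modeling_longformer").setLevel(logging.ERROR)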
