
I have a lot of issues with the tqdm progress bar in PyTorch Lightning:

  • when I run trainings in a terminal, the progress bars overwrite themselves. At the end of a training epoch, a validation progress bar is printed under the training bar, but when that finishes, the progress bar of the next training epoch is printed over the one from the previous epoch. Hence it's not possible to see the losses from previous epochs.
INFO:root:  Name    Type Params
0   l1  Linear    7 K
Epoch 2:  56%|████████████▊          | 2093/3750 [00:05<00:03, 525.47batch/s, batch_nb=1874, loss=0.714, training_loss=0.4, v_nb=51]
  • the progress bar wobbles from left to right, caused by changes in the number of digits after the decimal point of some losses.
  • when running in PyCharm, the validation progress bar is not printed; instead, the output looks like this:
INFO:root:  Name    Type Params
0   l1  Linear    7 K
Epoch 1:  50%|█████     | 1875/3750 [00:05<00:05, 322.34batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  50%|█████     | 1879/3750 [00:05<00:05, 319.41batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  52%|█████▏    | 1942/3750 [00:05<00:04, 374.05batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  53%|█████▎    | 2005/3750 [00:05<00:04, 425.01batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  55%|█████▌    | 2068/3750 [00:05<00:03, 470.56batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  57%|█████▋    | 2131/3750 [00:05<00:03, 507.69batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  59%|█████▊    | 2194/3750 [00:06<00:02, 538.19batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  60%|██████    | 2257/3750 [00:06<00:02, 561.20batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  62%|██████▏   | 2320/3750 [00:06<00:02, 579.22batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  64%|██████▎   | 2383/3750 [00:06<00:02, 591.58batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  65%|██████▌   | 2445/3750 [00:06<00:02, 599.77batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  67%|██████▋   | 2507/3750 [00:06<00:02, 605.00batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  69%|██████▊   | 2569/3750 [00:06<00:01, 607.04batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1:  70%|███████   | 2633/3750 [00:06<00:01, 613.98batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]

I would like to know if these issues can be solved, or else how I can disable the progress bar and instead just print some log details to the screen.

Shai
Marc Dumon

5 Answers


F.Y.I.: show_progress_bar=False has been deprecated since version 0.7.2, but you can use progress_bar_refresh_rate=0 instead.


Update:

progress_bar_refresh_rate has been deprecated in v1.5 and will be removed in v1.7. To disable the progress bar, set enable_progress_bar to False. From the docs:

progress_bar_refresh_rate: How often to refresh progress bar (in steps). Value ``0`` disables progress bar.
    Ignored when a custom progress bar is passed to :paramref:`~Trainer.callbacks`. Default: None, means
    a suitable value will be chosen based on the environment (terminal, Google COLAB, etc.).

    .. deprecated:: v1.5
        ``progress_bar_refresh_rate`` has been deprecated in v1.5 and will be removed in v1.7.
        Please pass :class:`~pytorch_lightning.callbacks.progress.TQDMProgressBar` with ``refresh_rate``
        directly to the Trainer's ``callbacks`` argument instead. To disable the progress bar,
        pass ``enable_progress_bar = False`` to the Trainer.

enable_progress_bar: Whether to enable the progress bar by default.
sralvins

Pass show_progress_bar=False to the Trainer.

Davide Fiocco

I would like to know if these issues can be solved, or else how I can disable the progress bar and instead just print some log details to the screen.

As far as I know, this problem is still unresolved. The PL team says it is a "TQDM-related thing" and they can't do anything about it. You may want to read this issue.

My temporary fix is:

from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import ProgressBar
from tqdm import tqdm

class LitProgressBar(ProgressBar):

    def init_validation_tqdm(self):
        # Return a disabled tqdm bar so no validation bar is drawn
        bar = tqdm(disable=True)
        return bar

bar = LitProgressBar()
trainer = Trainer(callbacks=[bar])

This method simply disables the validation progress bar while keeping the training bar intact [refer to 1 and 2]. Note that using progress_bar_refresh_rate=0 will disable updates of all progress bars.

Further solution (update, 2021-07-22):

According to this answer, tqdm seems to glitch only in the PyCharm console, so a possible fix is to adjust a PyCharm setting. Fortunately, I found this answer:

  1. Go to "Edit Configurations". Click on the run/debug configuration that is being used. There should be an option "Emulate terminal in output console". Check that.

  2. Along with the position argument, also set the leave argument. The code should look like this. I have added ncols so that the progress bar doesn't take up the whole console.

import time
from tqdm import tqdm

for i in tqdm(range(5), position=0, desc="i", leave=False, colour='green', ncols=80):
    for j in tqdm(range(10), position=1, desc="j", leave=False, colour='red', ncols=80):
        time.sleep(0.5)

When the code is now run, the output of the console is as shown below.

i:  20%|████████▍                                 | 1/5 [00:05<00:20,  5.10s/it] 
j:  60%|████████████████████████▌            | 6/10 [00:03<00:02,  1.95it/s] 

Following the above two steps, we can display the progress bar in PyCharm in the normal way. To apply step 2 in PyTorch Lightning, we need to override the functions init_train_tqdm(), init_validation_tqdm(), and init_test_tqdm() to change ncols. Some code like this (hope it helps):

import sys

from pytorch_lightning.callbacks import ProgressBar
from tqdm import tqdm

class LitProgressBar(ProgressBar):

    def init_train_tqdm(self) -> tqdm:
        """Override this to customize the tqdm bar for training."""
        bar = tqdm(
            desc='Training',
            initial=self.train_batch_idx,
            position=(2 * self.process_position),
            disable=self.is_disabled,
            leave=True,
            dynamic_ncols=False,  # these two lines are only for PyCharm
            ncols=100,
            file=sys.stdout,
            smoothing=0,
        )
        return bar
        return bar

    def init_validation_tqdm(self) -> tqdm:
        """Override this to customize the tqdm bar for validation."""
        # The main progress bar doesn't exist in `trainer.validate()`
        has_main_bar = self.main_progress_bar is not None
        bar = tqdm(
            desc='Validating',
            position=(2 * self.process_position + has_main_bar),
            disable=self.is_disabled,
            leave=False,
            dynamic_ncols=False,
            ncols=100,
            file=sys.stdout
        )
        return bar

    def init_test_tqdm(self) -> tqdm:
        """Override this to customize the tqdm bar for testing."""
        bar = tqdm(
            desc="Testing",
            position=(2 * self.process_position),
            disable=self.is_disabled,
            leave=True,
            dynamic_ncols=False,
            ncols=100,
            file=sys.stdout
        )
        return bar

If it does not work for you, please update PyTorch Lightning to the latest version.

lollows
  • Recently, I noticed that [progress_bar_refresh_rate](https://pytorch-lightning.readthedocs.io/en/latest/common/trainer.html#init) will be ignored when a custom progress bar is passed to callbacks in `Trainer`. So if you want to change the refresh rate of the progress bar, please use `bar = LitProgressBar(refresh_rate=your_refresh_rate)` instead of `bar = LitProgressBar()` – lollows Mar 23 '21 at 02:25
  • Are you using `ProgressBar` instead of tqdm? ```class LitProgressBar(ProgressBar)```. I get ```NameError: name 'ProgressBar' is not defined``` – Sameer Nov 30 '21 at 23:04

To disable the progress bar, pass enable_progress_bar=False to the Trainer.


If you would like to turn off the progress bar, you can configure this in the Trainer. Setting the progress_bar_refresh_rate parameter to 0 disables the progress bar; however, this setting is ignored if you pass your own progress bar as a callback. Note that pl here is the PyTorch Lightning module (import pytorch_lightning as pl), which may differ from your import style.

trainer = pl.Trainer(..., progress_bar_refresh_rate=0)