
I am trying to debug this error:

OutOfMemoryError: CUDA out of memory. Tried to allocate 1.12 GiB (GPU 0; 47.54 GiB total capacity; 382.34 MiB already allocated; 64.00 MiB free; 384.00 MiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
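For reference, this is how I understand the knob the message points at would be set (a sketch only; the `512` value is an assumption of mine, not something recommended anywhere):

```shell
# Illustrative only: cap the allocator's split size to reduce fragmentation.
# 512 is an example value, not a tested recommendation.
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512
```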

While reading this, I don't understand what "384.00 MiB reserved in total by PyTorch" means. Does this mean PyTorch is only using 384 MiB? What does it mean that only 384 MiB is reserved?

It says that I have 47.54 GiB of total capacity, 382.34 MiB is allocated, and 384.00 MiB is reserved. Where did the rest of the memory go?
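To make my confusion concrete, here is a small sketch of how I'm reading the numbers (the interpretation in the comments is my assumption; the `torch.cuda` calls at the end are guarded so this runs even without a GPU):

```python
# Figures quoted in the error message (GPU 0):
total_capacity_mib = 47.54 * 1024   # "47.54 GiB total capacity"
allocated_mib = 382.34              # "382.34 MiB already allocated"
reserved_mib = 384.00               # "384.00 MiB reserved in total by PyTorch"

# My assumption: "reserved" is what PyTorch's caching allocator has
# claimed from the driver, and "allocated" is the subset currently
# backing live tensors, so reserved >= allocated.
cached_but_free_mib = reserved_mib - allocated_mib
print(f"held in cache but unused: {cached_but_free_mib:.2f} MiB")

# Which would mean the rest of the 47.54 GiB is not held by this
# PyTorch process at all:
outside_pytorch_mib = total_capacity_mib - reserved_mib
print(f"not reserved by this process: {outside_pytorch_mib:.2f} MiB")

# On a live system the same counters can be read directly:
try:
    import torch
    if torch.cuda.is_available():
        print(f"allocated: {torch.cuda.memory_allocated(0) / 2**20:.2f} MiB")
        print(f"reserved:  {torch.cuda.memory_reserved(0) / 2**20:.2f} MiB")
except ImportError:
    pass
```

Is that reading correct, and if so, what is occupying the remaining ~48 GiB?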

  • Does this answer your question? [CUDA OOM - But the numbers don't add upp?](https://stackoverflow.com/questions/70074789/cuda-oom-but-the-numbers-dont-add-upp) – ihdv May 30 '23 at 04:50

0 Answers