I'm trying to figure out the minimum GPU memory requirement for my application. Running nvidia-smi
in Colab as described here gives a maximum value for memory.used
of 4910 MiB, so I presume a 4 GB GPU is not enough, correct?
Also, after execution finishes (so no process is using the GPU anymore), nvidia-smi still reports that same value.
Does this mean that memory.used reports the maximum value reached (which I would then use as a lower bound for my requirements), or is it due to PyTorch's caching policy?
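For reference, this is roughly what I would run to check the peak from PyTorch's own side (just a sketch; as far as I understand, torch.cuda.max_memory_allocated and torch.cuda.max_memory_reserved are the relevant counters, and the import guard is only there so the snippet doesn't crash on a machine without torch):

```python
try:
    import torch
    HAVE_TORCH = True
except ImportError:  # torch not installed on this machine
    HAVE_TORCH = False

def report_peak_gpu_memory():
    """Return (peak_allocated_mib, peak_reserved_mib), or None without CUDA."""
    if not HAVE_TORCH or not torch.cuda.is_available():
        return None
    # Peak memory actually handed out to tensors by the caching allocator.
    peak_alloc = torch.cuda.max_memory_allocated() / 2**20
    # Peak memory reserved from the driver by the caching allocator; this is
    # closer to what nvidia-smi counts under memory.used (plus the CUDA context).
    peak_reserved = torch.cuda.max_memory_reserved() / 2**20
    print(f"peak allocated: {peak_alloc:.0f} MiB")
    print(f"peak reserved:  {peak_reserved:.0f} MiB")
    return peak_alloc, peak_reserved
```

Would comparing these numbers against the memory.used value from nvidia-smi be the right way to separate what my model really needs from what the caching allocator is holding on to?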
How to interpret memory.used in nvidia-smi for PyTorch in order to estimate minimum GPU requirements
rok