Is there a way to get the memory usage of a CUDA context, rather than having to use cudaMemGetInfo, which only reports device-wide free/total memory (see the sketch of my current approach below)? Or at least a way to find out how much memory is occupied by the current application?
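For reference, this is a minimal sketch of what I am doing now with the CUDA runtime API. The numbers are for the whole device and include every process using it, which is exactly the limitation I want to get around:

```c++
// Sketch: cudaMemGetInfo reports device-wide free/total bytes,
// not what the current context/process is using.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t free_bytes = 0, total_bytes = 0;
    cudaError_t err = cudaMemGetInfo(&free_bytes, &total_bytes);
    if (err != cudaSuccess) {
        std::printf("cudaMemGetInfo failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    // These figures cover all contexts/processes on the device,
    // so they do not isolate this application's usage.
    std::printf("free: %zu MiB, total: %zu MiB\n",
                free_bytes >> 20, total_bytes >> 20);
    return 0;
}
```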
- In short: [no](https://devtalk.nvidia.com/default/topic/1044191/determine-memory-cuda-context-memory-usage/?offset=5) – IGarFieldI Nov 12 '19 at 08:12
- nvidia-smi provides this information (per-process memory usage) for Tesla and Quadro GPUs, which means it should be possible to retrieve it using NVML. – Robert Crovella Nov 12 '19 at 14:09
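Following the NVML route suggested in the comment above, here is a minimal sketch that queries per-process GPU memory and picks out the current process by PID. It assumes device index 0 and a Tesla/Quadro GPU where per-process accounting is exposed (on GeForce boards `usedGpuMemory` may be unavailable); link against `-lnvidia-ml`:

```c++
// Sketch: query per-process GPU memory via NVML and report our own PID's usage.
#include <cstdio>
#include <vector>
#include <unistd.h>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(0, &dev) != NVML_SUCCESS) {  // device 0 assumed
        nvmlShutdown();
        return 1;
    }

    // First call with count = 0 returns the number of processes to allocate for.
    unsigned int count = 0;
    nvmlDeviceGetComputeRunningProcesses(dev, &count, nullptr);

    std::vector<nvmlProcessInfo_t> procs(count);
    if (count > 0 &&
        nvmlDeviceGetComputeRunningProcesses(dev, &count, procs.data()) == NVML_SUCCESS) {
        for (unsigned int i = 0; i < count; ++i) {
            if (procs[i].pid == static_cast<unsigned int>(getpid())) {
                // usedGpuMemory is in bytes; may be unsupported on consumer GPUs.
                std::printf("this process uses %llu MiB\n",
                            procs[i].usedGpuMemory >> 20);
            }
        }
    }

    nvmlShutdown();
    return 0;
}
```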