
Is there a way to get the CUDA context's memory usage, rather than having to use cudaMemGetInfo, which only reports global information for a device? Or at least a way to find out how much memory is occupied by the current application?
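
For reference, the device-wide query mentioned above looks roughly like this (a minimal sketch; cudaMemGetInfo only returns free/total bytes for the whole device, not per-context usage):

```c
// Minimal sketch: device-wide memory query with the CUDA runtime API.
// Build with something like: nvcc meminfo.cu -o meminfo
#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    size_t free_bytes = 0, total_bytes = 0;
    cudaError_t err = cudaMemGetInfo(&free_bytes, &total_bytes);
    if (err != cudaSuccess) {
        fprintf(stderr, "cudaMemGetInfo failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    // These numbers describe the whole device, not the current context/process.
    printf("free: %zu bytes, total: %zu bytes\n", free_bytes, total_bytes);
    return 0;
}
```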

asked by Pittie
  • In short: [no](https://devtalk.nvidia.com/default/topic/1044191/determine-memory-cuda-context-memory-usage/?offset=5) – IGarFieldI Nov 12 '19 at 08:12
  • 2
    nvidia-smi provides this information (per-process memory usage) for Tesla and Quadro GPUs. Which means it should be possible to retrieve it using NVML. – Robert Crovella Nov 12 '19 at 14:09

1 Answer


It seems this is not possible (see the link in the comments). However, retrieving per-process memory usage is still a good alternative, and as Robert Crovella pointed out, it can be obtained using NVML, specifically via the nvmlDeviceGetComputeRunningProcesses function.
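
A minimal sketch of such a query with the NVML C API (assuming device index 0 and an arbitrary upper bound of 64 processes; untested):

```c
// Minimal sketch: list compute processes and their GPU memory use via NVML.
// Build with something like: gcc nvml_mem.c -o nvml_mem -lnvidia-ml
#include <stdio.h>
#include <nvml.h>

int main(void)
{
    nvmlReturn_t ret = nvmlInit();
    if (ret != NVML_SUCCESS) {
        fprintf(stderr, "nvmlInit failed: %s\n", nvmlErrorString(ret));
        return 1;
    }

    nvmlDevice_t device;
    ret = nvmlDeviceGetHandleByIndex(0, &device);   // device 0; adjust as needed
    if (ret != NVML_SUCCESS) {
        fprintf(stderr, "nvmlDeviceGetHandleByIndex failed: %s\n", nvmlErrorString(ret));
        nvmlShutdown();
        return 1;
    }

    nvmlProcessInfo_t infos[64];                    // arbitrary upper bound for this sketch
    unsigned int count = 64;
    ret = nvmlDeviceGetComputeRunningProcesses(device, &count, infos);
    if (ret == NVML_SUCCESS) {
        for (unsigned int i = 0; i < count; ++i) {
            // usedGpuMemory is in bytes; note it may be reported as unavailable
            // on GPUs/driver modes that don't expose per-process accounting.
            printf("pid %u uses %llu bytes\n", infos[i].pid,
                   (unsigned long long)infos[i].usedGpuMemory);
        }
    } else {
        fprintf(stderr, "nvmlDeviceGetComputeRunningProcesses failed: %s\n",
                nvmlErrorString(ret));
    }

    nvmlShutdown();
    return 0;
}
```

To find the current application's own usage, you would match the entries against your process ID (e.g. from getpid()).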

answered by Pittie