
I'm running an Ubuntu GPU instance on AWS EC2. I'm not sure whether my application is using GPU acceleration. How do I check GPU usage on an AWS GPU instance?

aplrh

2 Answers

  • Use: nvidia-smi -h to see the options.

  • Display info arguments:

Display only selected information: MEMORY,
                                    UTILIZATION, ECC, TEMPERATURE, POWER, CLOCK,
                                    COMPUTE, PIDS, PERFORMANCE, SUPPORTED_CLOCKS,
                                    PAGE_RETIREMENT, ACCOUNTING, ENCODER STATS 
  • Example: nvidia-smi --id=0 --loop=5 --query --display=UTILIZATION

    • --id=0 the index of the GPU. Use nvidia-smi --list-gpus to get the list of GPUs
    • --query display GPU or unit information
    • --loop=5 repeat the query every 5 seconds
    • --display=UTILIZATION display only utilization
  • The output is something like:

==============NVSMI LOG==============

Timestamp                           : Thu Apr 11 03:48:37 2019
Driver Version                      : 384.183
CUDA Version                        : 9.0

Attached GPUs                       : 1
GPU 00000000:00:1E.0
    Utilization
        Gpu                         : 9 %
        Memory                      : 11 %
        Encoder                     : 0 %
        Decoder                     : 0 %
    GPU Utilization Samples
        Duration                    : 18446744073709.22 sec
        Number of Samples           : 99
        Max                         : 10 %
        Min                         : 0 %
        Avg                         : 0 %
    Memory Utilization Samples
        Duration                    : 18446744073709.22 sec
        Number of Samples           : 99
        Max                         : 14 %
        Min                         : 0 %
        Avg                         : 0 %
    ENC Utilization Samples
        Duration                    : 18446744073709.22 sec
        Number of Samples           : 99
        Max                         : 0 %
        Min                         : 0 %
        Avg                         : 0 %
    DEC Utilization Samples
        Duration                    : 18446744073709.22 sec
        Number of Samples           : 99
        Max                         : 0 %
        Min                         : 0 %
        Avg                         : 0 %

You can also log the output to a file (--filename=) and, for per-field queries, get CSV output (--query-gpu=... --format=csv).
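If you want to consume the utilization numbers programmatically, the CSV query form is easiest to parse. Below is a minimal Python sketch that parses output of the form produced by `nvidia-smi --query-gpu=utilization.gpu,utilization.memory --format=csv,noheader,nounits` (those are real nvidia-smi query fields; the sample string stands in for live output, since running the command itself assumes a GPU host):

```python
import csv
import io

def parse_gpu_utilization(csv_text):
    """Return a list of (gpu_util_pct, mem_util_pct) tuples, one per GPU.

    Expects CSV rows like "9, 11" as emitted by
    nvidia-smi --query-gpu=utilization.gpu,utilization.memory \
               --format=csv,noheader,nounits
    """
    rows = csv.reader(io.StringIO(csv_text.strip()))
    return [(int(gpu.strip()), int(mem.strip())) for gpu, mem in rows]

# Illustrative sample for a single-GPU instance (values mirror the log above).
sample = "9, 11\n"
print(parse_gpu_utilization(sample))  # [(9, 11)]
```

In a real monitoring script you would feed this function the stdout of the nvidia-smi command (e.g. via `subprocess.run(..., capture_output=True)`).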

Tung Le

Is this NVIDIA gear? If so, try nvidia-smi -i 3 -l -q -d to see GPU and memory utilization statistics (among other info). Note that this only works with 1) old NVIDIA drivers (18X.XX) or 2) NVIDIA Tesla GPUs.

lashgar
the-wabbit