
I started GPU computing with mxnetR on Windows 10.

My simple question is whether mx.mlp with mx.gpu uses multiple cores on the GPU. It seems not...

As a test, I also wrote a simple mx.mlp program with doParallel, but it does not seem to run on multiple cores; GPU usage increased on only one core.

Please give me your ideas on how to use multiple GPU cores to get the most value out of GPU computing with mx.mlp and mx.gpu.

ohtant
  • The point of using the GPU is that you don't need to use multiple _CPU_ cores. – Hong Ooi Sep 06 '17 at 05:39
  • Thank you for your reply. So you mean that mx.mlp with the GPU uses only one GPU engine for the calculation, rather than multiple CPUs? I said "core", but that was wrong; what I meant is engine, not core. – ohtant Sep 06 '17 at 06:21

1 Answer


When running mxnet on a GPU, mxnet will use many GPU cores simultaneously by determining which math operations can be run in parallel. You don't need to parallelize anything yourself (e.g. with doParallel); that only spreads work across CPU cores.
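A minimal sketch of what that looks like in practice, assuming the mxnet R package is installed with GPU support (the toy data here is hypothetical; parameter names follow the mx.mlp API):

```r
library(mxnet)

# Toy training data for illustration only
train.x <- matrix(runif(1000 * 10), ncol = 10)
train.y <- sample(0:1, 1000, replace = TRUE)

# device = mx.gpu() places the network on the GPU; mxnet's engine then
# schedules independent math operations across the GPU's cores on its own
model <- mx.mlp(train.x, train.y,
                hidden_node = 64,
                out_node = 2,
                out_activation = "softmax",
                num.round = 10,
                array.batch.size = 100,
                learning.rate = 0.1,
                device = mx.gpu())
```

Note that a single call like this is enough; wrapping it in a doParallel loop would only launch redundant CPU-side workers, not extra GPU parallelism.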

A simple metric to reassure yourself that you're getting value-for-money from the GPU is to use the nvidia-smi command to watch GPU utilization.
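For example, while the training call above is running, you can poll the utilization figure in another terminal (requires the NVIDIA driver to be installed):

```shell
# Refresh GPU utilization and memory stats every second
nvidia-smi -l 1
```

If the "GPU-Util" column stays high during training, the device is being kept busy regardless of how many CPU cores appear active.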