I wanted to speed up my CNN facial recognition code by using the Nvidia GPU instead of the CPU, so I found and installed a CUDA-enabled dlib build following these instructions.
The installation went well, so I checked whether dlib was using CUDA in my Python environment:
Python 3.6.9 (default, Jul 17 2020, 12:50:27)
[GCC 8.4.0] on linux
>>> import dlib
>>> dlib.DLIB_USE_CUDA
True
>>> print(dlib.cuda.get_num_devices())
1
Since everything looked fine, I ran my code again, but there was no improvement; after monitoring the GPU, it is still not used at all. So I tried the following command:
>>> print(dlib.cuda.get_device())
And it returns:
0
I'm not sure what this output means. After a lot of research, I still cannot figure out why dlib doesn't use my GPU. Has anyone faced the same issue before?
My workspace is a Jetson AGX Xavier (JetPack 4.4) running Ubuntu, with CUDA version 10.2.89.
PS: I also use the TensorFlow and Keras libraries; both are installed to work with CUDA.