I'm not a LightGBM expert, so it might be better to wait for someone to chime in. But from what I've been able to find, LightGBM does not currently support GPU training through its Dask interface.
See https://github.com/microsoft/LightGBM/issues/4761#issuecomment-956358341:

> Right now the dask interface doesn't directly support distributed training using GPU, you can subscribe to #3776 if you're interested in that. Are you getting any warnings about this? I think it probably isn't using the GPU at all.
>
> Furthermore, if your data fits in a single machine then it's probably best not using distributed training at all. The dask interface is there to help you train a model on data that doesn't fit on a single machine by having partitions of the data on different machines which communicate with each other, which adds some overhead compared to single-node training.
And https://github.com/microsoft/LightGBM/issues/3776:

> The Dask interface in https://github.com/microsoft/LightGBM/blob/706f2af7badc26f6ec68729469ec6ec79a66d802/python-package/lightgbm/dask.py currently only supports CPU-based training.
In any case, if you have only one GPU, Dask shouldn't be of much help: you can train directly on that GPU with the regular (non-Dask) LightGBM API and skip the distributed-training overhead entirely.