
I want to train an OpenNLP model using the CLI on a GPU server that I access remotely. I'm familiar with using the GPU when training PyTorch models, but I realized I'm not sure how this works with OpenNLP, given that it is written in Java. Will OpenNLP make use of the GPU if I train on one?

Specifically, I am thinking of this familiar snippet we use when training PyTorch models:

    # use the first GPU if CUDA is available, otherwise fall back to CPU
    if torch.cuda.is_available():
        dev = "cuda:0"
    else:
        dev = "cpu"
Can anyone shed some light on how this works in the Java OpenNLP library? Is there an equivalent to this snippet somewhere?

I am also using this Docker image to run the CLI on my remote GPU server: https://hub.docker.com/r/casetext/opennlp/dockerfile

I believe I also need to modify the Dockerfile to be able to use the GPU, but I was wondering whether I first need to do anything to the OpenNLP code itself, regardless of my Docker setup.

tayloor
  • If PyTorch supports the features you need, it's now possible to use its C++ API from Java with the JavaCPP Presets for PyTorch: https://github.com/bytedeco/javacpp-presets/tree/master/pytorch – Samuel Audet Apr 16 '21 at 01:45

1 Answer


Apache OpenNLP does not support training on a GPU; all of its built-in trainers run on the CPU only. There is no equivalent of torch.cuda.is_available() or device selection anywhere in the OpenNLP API or CLI, so there is nothing to change in the code or the Dockerfile to enable GPU training.
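For reference, a typical CLI training run looks like the sketch below, with no device flag of any kind; the file names (train.txt, en-ner-person.bin) are placeholders for your own training data and output model:

    # train a name-finder model with the OpenNLP CLI (runs on CPU only;
    # there is no GPU-related option to pass)
    opennlp TokenNameFinderTrainer \
      -model en-ner-person.bin \
      -lang en \
      -data train.txt \
      -encoding UTF-8

If training speed is the concern, the practical levers are CPU cores and trainer parameters (iterations, cutoff) rather than a GPU.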

jzonthemtn