I am working with the DeepLearning4j library. I am running everything on an HPC cluster, and I build a fat jar that I submit with spark-submit. I am using version 1.0.0-M1.1. Everything was fine with the CPU backend, but when I switched to the GPU (CUDA) backend, I got this error:
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.nd4j.jita.concurrency.CudaAffinityManager.getNumberOfDevices(CudaAffinityManager.java:136)
at org.nd4j.jita.constant.ConstantProtector.purgeProtector(ConstantProtector.java:60)
at org.nd4j.jita.constant.ConstantProtector.<init>(ConstantProtector.java:53)
at org.nd4j.jita.constant.ConstantProtector.<clinit>(ConstantProtector.java:41)
at org.nd4j.jita.constant.ProtectedCudaConstantHandler.<clinit>(ProtectedCudaConstantHandler.java:69)
at org.nd4j.jita.constant.CudaConstantHandler.<clinit>(CudaConstantHandler.java:38)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.nd4j.common.config.ND4JClassLoading.loadClassByName(ND4JClassLoading.java:62)
at org.nd4j.common.config.ND4JClassLoading.loadClassByName(ND4JClassLoading.java:56)
at org.nd4j.linalg.factory.Nd4j.initWithBackend(Nd4j.java:5152)
at org.nd4j.linalg.factory.Nd4j.initContext(Nd4j.java:5093)
at org.nd4j.linalg.factory.Nd4j.<clinit>(Nd4j.java:270)
at org.datavec.image.loader.NativeImageLoader.transformImage(NativeImageLoader.java:670)
at org.datavec.image.loader.NativeImageLoader.asMatrix(NativeImageLoader.java:593)
at org.datavec.image.loader.NativeImageLoader.asMatrix(NativeImageLoader.java:281)
at org.datavec.image.loader.NativeImageLoader.asMatrix(NativeImageLoader.java:256)
at org.datavec.image.loader.NativeImageLoader.asMatrix(NativeImageLoader.java:250)
at org.datavec.image.recordreader.BaseImageRecordReader.next(BaseImageRecordReader.java:247)
at org.datavec.image.recordreader.BaseImageRecordReader.nextRecord(BaseImageRecordReader.java:511)
at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.initializeUnderlying(RecordReaderDataSetIterator.java:194)
at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.next(RecordReaderDataSetIterator.java:341)
at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.next(RecordReaderDataSetIterator.java:421)
at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.next(RecordReaderDataSetIterator.java:53)
at com.examples.DeepLearningOnSpark.imageNet_image.streaming.NetworkRetrainingMain.entryPoint(NetworkRetrainingMain.java:55)
at com.examples.DeepLearningOnSpark.imageNet_image.streaming.NetworkRetrainingMain.main(NetworkRetrainingMain.java:31)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: ND4J is probably missing dependencies. For more information, please refer to: https://deeplearning4j.konduit.ai/nd4j/backend
at org.nd4j.nativeblas.NativeOpsHolder.<init>(NativeOpsHolder.java:116)
at org.nd4j.nativeblas.NativeOpsHolder.<clinit>(NativeOpsHolder.java:37)
... 38 more
Caused by: java.lang.UnsatisfiedLinkError: no jnind4jcuda in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1122)
at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:1718)
at org.bytedeco.javacpp.Loader.load(Loader.java:1328)
at org.bytedeco.javacpp.Loader.load(Loader.java:1132)
at org.nd4j.nativeblas.Nd4jCuda.<clinit>(Nd4jCuda.java:10)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.nd4j.common.config.ND4JClassLoading.loadClassByName(ND4JClassLoading.java:62)
at org.nd4j.common.config.ND4JClassLoading.loadClassByName(ND4JClassLoading.java:56)
at org.nd4j.nativeblas.NativeOpsHolder.<init>(NativeOpsHolder.java:88)
... 39 more
Caused by: java.lang.UnsatisfiedLinkError: /home/h4/nore667e/.javacpp/cache/deepLearningSimpleOne-1.0-SNAPSHOT-jar-with-dependencies.jar/org/nd4j/nativeblas/linux-x86_64/libjnind4jcuda.so: /lib64/libm.so.6: version `GLIBC_2.23' not found (required by /home/h4/nore667e/.javacpp/cache/deepLearningSimpleOne-1.0-SNAPSHOT-jar-with-dependencies.jar/org/nd4j/nativeblas/linux-x86_64/libnd4jcuda.so)
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
at java.lang.Runtime.load0(Runtime.java:809)
at java.lang.System.load(System.java:1086)
at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:1668)
... 47 more
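For context, the call that fails (line 55 of NetworkRetrainingMain.entryPoint) is simply the first next() on a RecordReaderDataSetIterator wrapping an ImageRecordReader; that call goes through NativeImageLoader and is what first initializes ND4J. Below is a simplified sketch of that part of the code (the directory, image dimensions, batch size and class count are placeholders, not my exact values):

import java.io.File;
import org.datavec.api.io.labels.ParentPathLabelGenerator;
import org.datavec.api.split.FileSplit;
import org.datavec.image.recordreader.ImageRecordReader;
import org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;

public class ImagePipelineSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder values -- the real job reads images from storage on the cluster
        File imageDir = new File("/path/to/images");
        int height = 224, width = 224, channels = 3, batchSize = 32, numClasses = 10;

        ParentPathLabelGenerator labelMaker = new ParentPathLabelGenerator();
        ImageRecordReader recordReader = new ImageRecordReader(height, width, channels, labelMaker);
        recordReader.initialize(new FileSplit(imageDir));

        DataSetIterator iter =
                new RecordReaderDataSetIterator(recordReader, batchSize, 1, numClasses);

        // The very first next() loads an image via NativeImageLoader, which initializes
        // Nd4j -- this is where the ExceptionInInitializerError above is thrown.
        System.out.println(iter.next().getFeatures().shapeInfoToString());
    }
}

On the CPU backend this exact pipeline runs without problems.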
The relevant parts of my pom.xml are:
<properties>
    <dl4j-master.version>1.0.0-M1.1</dl4j-master.version>
    <!-- Change the nd4j.backend property to nd4j-cuda-X-platform to use CUDA GPUs -->
    <nd4j.backend>nd4j-cuda-11.2-platform</nd4j.backend>
    <java.version>1.8</java.version>
    <shadedClassifier>bin</shadedClassifier>
    <scala.binary.version>2.11</scala.binary.version>
    <maven-compiler-plugin.version>3.8.1</maven-compiler-plugin.version>
    <maven.minimum.version>3.3.1</maven.minimum.version>
    <exec-maven-plugin.version>1.4.0</exec-maven-plugin.version>
    <maven-shade-plugin.version>2.4.3</maven-shade-plugin.version>
    <jcommon.version>1.0.23</jcommon.version>
    <jfreechart.version>1.0.13</jfreechart.version>
    <logback.version>1.1.7</logback.version>
    <jcommander.version>1.27</jcommander.version>
    <spark.version>2.4.8</spark.version>
    <jackson.version>2.5.1</jackson.version>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>exec-maven-plugin</artifactId>
            <version>${exec-maven-plugin.version}</version>
            <executions>
                <execution>
                    <goals>
                        <goal>exec</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <executable>java</executable>
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>${maven-shade-plugin.version}</version>
            <configuration>
                <shadedArtifactAttached>true</shadedArtifactAttached>
                <shadedClassifierName>${shadedClassifier}</shadedClassifierName>
                <createDependencyReducedPom>true</createDependencyReducedPom>
                <filters>
                    <filter>
                        <artifact>*:*</artifact>
                        <excludes>
                            <exclude>org/datanucleus/**</exclude>
                            <exclude>META-INF/*.SF</exclude>
                            <exclude>META-INF/*.DSA</exclude>
                            <exclude>META-INF/*.RSA</exclude>
                        </excludes>
                    </filter>
                </filters>
            </configuration>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <transformers>
                            <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                <resource>reference.conf</resource>
                            </transformer>
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer"/>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
        <!-- Added to enable jar creation using mvn command -->
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-assembly-plugin</artifactId>
            <version>3.3.0</version>
            <configuration>
                <archive>
                    <manifest>
                        <mainClass>fully.qualified.MainClass</mainClass>
                    </manifest>
                </archive>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
            <executions>
                <execution>
                    <id>make-assembly</id>
                    <!-- bind to the packaging phase -->
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.5.1</version>
            <configuration>
                <source>${java.version}</source>
                <target>${java.version}</target>
            </configuration>
        </plugin>
    </plugins>
</build>
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.bytedeco</groupId>
        <artifactId>cuda-platform-redist</artifactId>
        <version>11.2-8.1-1.5.5</version>
    </dependency>
    <dependency>
        <groupId>org.nd4j</groupId>
        <artifactId>nd4j-cuda-11.2-platform</artifactId>
        <version>1.0.0-M1.1</version>
    </dependency>
    <dependency>
        <groupId>org.datavec</groupId>
        <artifactId>datavec-spark_${scala.binary.version}</artifactId>
        <version>${dl4j-master.version}</version>
    </dependency>
    <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>dl4j-spark_${scala.binary.version}</artifactId>
        <version>${dl4j-master.version}</version>
    </dependency>
    <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>dl4j-spark-parameterserver_${scala.binary.version}</artifactId>
        <version>${dl4j-master.version}</version>
    </dependency>
    <dependency>
        <groupId>com.beust</groupId>
        <artifactId>jcommander</artifactId>
        <version>${jcommander.version}</version>
    </dependency>
    <!-- Used for patent classification example -->
    <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-nlp</artifactId>
        <version>${dl4j-master.version}</version>
    </dependency>
    <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-zoo</artifactId>
        <version>${dl4j-master.version}</version>
    </dependency>
    <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-core</artifactId>
        <version>1.0.0-M1.1</version>
    </dependency>
    <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-cuda-11.2</artifactId>
        <version>1.0.0-M1.1</version>
    </dependency>
    <dependency>
        <groupId>org.jsoup</groupId>
        <artifactId>jsoup</artifactId>
        <version>1.10.2</version>
    </dependency>
</dependencies>
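Since the trace shows the failure happening inside ND4J's static initializer (Nd4j.<clinit>), I would expect the same UnsatisfiedLinkError from a minimal class that does nothing but force ND4J to load the CUDA backend, independent of Spark and DataVec. A sketch of such a check (not my actual code):

import org.nd4j.linalg.factory.Nd4j;

public class CudaBackendCheck {
    public static void main(String[] args) {
        // Any call into Nd4j triggers backend discovery and native library loading,
        // i.e. the same code path as Nd4j.<clinit> in the stack trace above.
        System.out.println("Backend: " + Nd4j.getBackend().getClass().getName());
        System.out.println("CUDA devices visible to ND4J: "
                + Nd4j.getAffinityManager().getNumberOfDevices());
    }
}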
The modules that I am loading are:
1) modenv/scs5 (S)
2) Maven/3.6.3
3) Java/1.8.0_161-OpenJDK
4) BigDataFrameworkConfigure/0.0.2
5) Spark/3.0.1-Hadoop-2.7-Java-1.8-Python-3.7.4-GCCcore-8.3.0
6) GCCcore/10.2.0
7) zlib/1.2.11-GCCcore-10.2.0
8) binutils/2.35-GCCcore-10.2.0
9) GCC/10.2.0
10) CUDAcore/11.1.1
11) CUDA/11.1.1-GCC-10.2.0
12) gcccuda/2020b
13) numactl/2.0.13-GCCcore-10.2.0
14) XZ/5.2.5-GCCcore-10.2.0
15) libxml2/2.9.10-GCCcore-10.2.0
16) libpciaccess/0.16-GCCcore-10.2.0
17) hwloc/2.2.0-GCCcore-10.2.0
18) libevent/2.1.12-GCCcore-10.2.0
19) Check/0.15.2-GCCcore-10.2.0
20) GDRCopy/2.1-GCCcore-10.2.0-CUDA-11.1.1
21) UCX/1.9.0-GCCcore-10.2.0-CUDA-11.1.1
22) libfabric/1.11.0-GCCcore-10.2.0
23) PMIx/3.1.5-GCCcore-10.2.0
24) OpenMPI/4.0.5-gcccuda-2020b
25) OpenBLAS/0.3.12-GCC-10.2.0
26) gompic/2020b
27) FFTW/3.3.8-gompic-2020b
28) ScaLAPACK/2.1.0-gompic-2020b
29) fosscuda/2020b
30) cuDNN/8.0.4.30-CUDA-11.1.1
31) NCCL/2.8.3-GCCcore-10.2.0-CUDA-11.1.1
32) bzip2/1.0.8-GCCcore-10.2.0
33) ncurses/6.2-GCCcore-10.2.0
34) libreadline/8.0-GCCcore-10.2.0
35) Tcl/8.6.10-GCCcore-10.2.0
36) SQLite/3.33.0-GCCcore-10.2.0
37) GMP/6.2.0-GCCcore-10.2.0
38) libffi/3.3-GCCcore-10.2.0
39) Python/3.8.6-GCCcore-10.2.0
40) pybind11/2.6.0-GCCcore-10.2.0
41) SciPy-bundle/2020.11-fosscuda-2020b
42) Szip/2.1.1-GCCcore-10.2.0
43) HDF5/1.10.7-gompic-2020b
44) cURL/7.72.0-GCCcore-10.2.0
45) double-conversion/3.1.5-GCCcore-10.2.0
46) flatbuffers/1.12.0-GCCcore-10.2.0
47) giflib/5.2.1-GCCcore-10.2.0
48) ICU/67.1-GCCcore-10.2.0
49) JsonCpp/1.9.4-GCCcore-10.2.0
50) NASM/2.15.05-GCCcore-10.2.0
51) libjpeg-turbo/2.0.5-GCCcore-10.2.0
52) LMDB/0.9.24-GCCcore-10.2.0
53) nsync/1.24.0-GCCcore-10.2.0
54) PCRE/8.44-GCCcore-10.2.0
55) protobuf/3.14.0-GCCcore-10.2.0
56) protobuf-python/3.14.0-GCCcore-10.2.0
57) flatbuffers-python/1.12-GCCcore-10.2.0
58) typing-extensions/3.7.4.3-GCCcore-10.2.0
59) libpng/1.6.37-GCCcore-10.2.0
60) snappy/1.1.8-GCCcore-10.2.0
61) TensorFlow/2.4.1-fosscuda-2020b
Does anyone have any idea how to solve this? Thank you!