
I am following an IBM demo notebook. I created a new notebook in a new DSX instance that I opened in Bluemix. DSX seems to have combined my existing DSX instance with the new one.

When I try to connect to the dashDB instance, I get an error that the JDBC DB2Driver class is not found. It suggests adding the jars to the /usr/local/... path. How can I add the correct DB2 driver jar (db2jcc4.jar) to a new Bluemix instance of DSX? Do I also need db2jcc_license_cu.jar?

  • Can you share the link to the demo notebook and indicate which kernel you are using? – ptitzler Jun 02 '17 at 15:34
  • https://apsportal.ibm.com/analytics/notebooks/b12aa9a7-3957-46d0-883f-5fc0ed300179/view?access_token=359e55b101b22e4d9936d84f7948aea1c6d5fd956b4955937132a93116582ed0 kernel is python 2 with spark 2.0 – JABrooks Jun 02 '17 at 16:01
  • @JABrooks please hide your dash credentials from notebook – charles gomes Jun 02 '17 at 21:14

1 Answer


Option 1: First, the default installation of ibmdbpy lives under /usr/local/..., and you cannot add the db2jcc jar there. Uninstalling the preinstalled ibmdbpy and then reinstalling it with --user places it in the user's (Spark tenant's) .local directory, where the driver jar can be dropped in alongside it.

!pip install --user lazy
!pip install --user jaydebeapi
!pip uninstall --yes ibmdbpy
!pip install ibmdbpy --user --ignore-installed --no-deps
!wget -O $HOME/.local/lib/python2.7/site-packages/ibmdbpy/db2jcc4.jar https://ibm.box.com/shared/static/lmhzyeslp1rqns04ue8dnhz2x7fb6nkc.zip

This worked. Ref: https://github.com/ibmdbanalytics/ibmdbpy-notebooks/blob/master/ibmdbPyDemo.ipynb
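
Once the jar is in place, connecting through ibmdbpy over JDBC looks roughly like the sketch below; the hostname, credentials, and table name are placeholders, and the JDBC URL follows the standard DB2 format.

from ibmdbpy import IdaDataBase, IdaDataFrame

# Minimal sketch, assuming db2jcc4.jar was downloaded into the ibmdbpy package as shown above.
# <hostname>, <username>, <password> and SCHEMA.MYTABLE are placeholders.
jdbc_url = "jdbc:db2://<hostname>:50000/BLUDB:user=<username>;password=<password>;"

idadb = IdaDataBase(dsn=jdbc_url)               # opens the JDBC connection via jaydebeapi
ida_df = IdaDataFrame(idadb, 'SCHEMA.MYTABLE')  # lazy handle on the dashDB table
print(ida_df.head())                            # pulls a few rows back as pandas
idadb.close()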

Option 2

If you are okay with using an alternative method, there is a Python connector available on DSX: https://datascience.ibm.com/docs/content/analyze-data/python_load.html#ibm-dashdb

from ingest.Connectors import Connectors


dashDBloadOptions = {
    Connectors.DASHDB.HOST              : 'hostname',
    Connectors.DASHDB.DATABASE          : 'BLUDB',
    Connectors.DASHDB.USERNAME          : 'username',
    Connectors.DASHDB.PASSWORD          : 'XXXXX',
    Connectors.DASHDB.SOURCE_TABLE_NAME : 'schema.MYTABLE'}


dashdbDF = sqlContext.read.format("com.ibm.spark.discover").options(**dashDBloadOptions).load()
dashdbDF.printSchema()
dashdbDF.show()

This gives you a Spark DataFrame, if that is what you are interested in.
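
From there you can work with it like any other Spark DataFrame; a short sketch (the view name and row limits are placeholders):

# Minimal sketch: downstream use of the loaded Spark DataFrame.
dashdbDF.createOrReplaceTempView("mytable")    # register it for Spark SQL
sqlContext.sql("SELECT * FROM mytable LIMIT 10").show()

pandas_df = dashdbDF.limit(1000).toPandas()    # pull a sample into pandas
print(pandas_df.describe())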

Thanks, Charles.

charles gomes
  • I want to use ibmdbpy. I downloaded the notebook you referenced, then created a new notebook from this file, and it failed with the same error. /usr/local/src/conda3_runtime.v9/4.1.1/lib/python3.5/site-packages/jpype/_jclass.py in JClass(name) 53 jc = _jpype.findClass(name) 54 if jc is None: ---> 55 raise _RUNTIMEEXCEPTION.PYEXC("Class %s not found" % name) 56 57 return _getClassFor(jc) java.lang.ExceptionPyRaisable: java.lang.Exception: Class com.ibm.db2.jcc.DB2Driver not found – JABrooks Jun 14 '17 at 12:39