
I am trying to write data to DB2 via pyspark and want better error messages when a write fails. I know I can reach the Java exception like this:

from py4j.protocol import Py4JJavaError

try:
    # properties must be a dict of connection options, not a string
    data_frame.write.jdbc('jdbc...', table='some_table',
                          properties={'user': '...', 'password': '...'})
except Py4JJavaError as err:
    print(err.java_exception)

But this returns something along the lines of

com.ibm.db2.jcc.am.BatchUpdateException...
Use getNextException() to retrieve the exceptions for specific batched elements.

Is there a way to use getNextException() via pyspark to get the details of the error?
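Since py4j proxies method calls straight through to the JVM object, `getNextException()` can be called directly on `err.java_exception` when the underlying object is a `java.sql.SQLException` (DB2's `BatchUpdateException` is one). A minimal sketch of walking the chain; the helper name and the stub class are mine, shown so the pattern can run without a JVM, and are not part of pyspark or py4j:

```python
def iter_sql_exceptions(java_exc):
    """Follow a SQLException chain by calling getNextException() repeatedly."""
    while java_exc is not None:
        yield java_exc
        java_exc = java_exc.getNextException()

# Against a real Py4JJavaError you would use it roughly as:
#   except Py4JJavaError as err:
#       for exc in iter_sql_exceptions(err.java_exception):
#           print(exc.getMessage())

# Hypothetical stub standing in for the py4j proxy, for illustration only:
class FakeSqlException:
    def __init__(self, message, next_exc=None):
        self._message = message
        self._next = next_exc

    def getMessage(self):
        return self._message

    def getNextException(self):
        return self._next

chain = FakeSqlException("first failure", FakeSqlException("underlying detail"))
messages = [exc.getMessage() for exc in iter_sql_exceptions(chain)]
```

The same loop works on the real proxy object because py4j forwards unknown attribute calls (`getMessage`, `getNextException`) to the Java side.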

  • It worked for me with `str(err)` which gives the full stacktrace including the causing exception ([doc string](https://github.com/bartdag/py4j/blob/a05c40b7e89e48b5a57d989b26b5f38759480783/py4j-python/src/py4j/protocol.py#L449)). – Suzana Jul 15 '19 at 15:04

0 Answers