I am trying to write data to DB2 via PySpark and want better error messages when a write fails. I know I can reach the underlying Java exception like this:
from py4j.protocol import Py4JJavaError

try:
    data_frame.write.jdbc('jdbc...', table='some_table',
                          properties=jdbc_properties)  # a dict of connection properties
except Py4JJavaError as err:
    print(err.java_exception)
But this only prints something along the lines of:
com.ibm.db2.jcc.am.BatchUpdateException...
Use getNextException() to retrieve the exceptions for specific batched elements.
Is there a way to call getNextException() from PySpark to get the details for each failed batch element?
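What I have in mind is a small helper that walks the exception chain. This is an untested sketch on my side, assuming py4j proxies Java methods such as getNextException() directly on err.java_exception; iter_sql_exceptions is my own name, not an existing API:

```python
def iter_sql_exceptions(exc):
    """Yield each exception in a JDBC SQLException chain.

    Assumes exc behaves like a py4j-proxied java.sql.SQLException,
    i.e. getNextException() can be called as a normal method.
    """
    seen = set()
    while exc is not None and id(exc) not in seen:
        seen.add(id(exc))  # guard against a cyclic chain
        yield exc
        getter = getattr(exc, "getNextException", None)
        exc = getter() if getter is not None else None

# Intended usage (cannot run without a live Spark/DB2 connection):
#
# try:
#     data_frame.write.jdbc('jdbc...', table='some_table',
#                           properties=jdbc_properties)
# except Py4JJavaError as err:
#     for e in iter_sql_exceptions(err.java_exception):
#         print(e.getMessage())
```

Is this roughly the right approach, or does py4j not expose getNextException() that way?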