
I'm using pySpark in the Spark service in Bluemix to transform my data and then write it to dashDB, also in Bluemix. But when I try to load the data I receive the following error:

Py4JJavaError: An error occurred while calling o111.jdbc. : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 4.0 failed 10 times, most recent failure: Lost task 0.9 in stage 4.0 (TID 23, yp-spark-dal09-env5-0045): com.ibm.db2.jcc.am.BatchUpdateException: [jcc][t4][102][10040][4.19.26] Batch failure. The batch was submitted, but at least one exception occurred on an individual member of the batch. Use getNextException() to retrieve the exceptions for specific batched elements. ERRORCODE=-4229, SQLSTATE=null

I already tried creating a new dashDB instance, but I received the same error. Then I created another notebook, but it didn't work either. I also tried stopping all kernels and running just one kernel at a time, with no luck.

In every notebook where I try to write data from a DataFrame I receive the same error, but in some cases the data is loaded anyway and in others it isn't.

This is the code that I'm using to write:

from pyspark.sql import functions as Func

for num in range(0, 22):
    (shared_df_first
        .select(
            Func.col('Customer').alias("CUSTOMER"),
            Func.col('Customer Environment').alias("CUST_ENV"),
            Func.col('Device').alias("DEVICE"),
            Func.col('User ID').alias("USER_ID"),
            Func.col('Date').alias("DATE"),
            Func.col('Time').alias("TIME"),
            Func.col('Requester').alias("REQUESTER"),
            Func.col('Manager').alias("MANAGER"),
            Func.col('Manager Mail').alias("MANAGER_MAIL"),
            Func.col('Ticket').alias("TICKET"),
            Func.col('Request').alias("REQUEST"),
            Func.col('Teste').alias("TESTE"),
            Func.col('Approver USERID').alias("APPROVER_USERID"),
            Func.col('Approver Name').alias("APPROVER_NAME"),
            Func.col('Period in hours').alias("PERIOD"),
            Func.col('Business Need').alias("BUSINESS_NEED"),
            Func.col('Password Periodic Changable').alias("PASSWORD_PERIODIC_CHANGABLE"),
            Func.col('Is Pre Approved?').alias("IS_PRE_APPROVED"),
            Func.col('Has Personal User ID?').alias("HAS_PERSONAL_USER_ID"),
            Func.col('Check in way').alias("CHECK_IN_WAY"),
            Func.col('SLA').alias("SLA"),
            Func.col('Invalid Business Need').alias("BUSINESS_NEED_INVALID")
        )
        .write
        .jdbc("jdbc:db2://bluemix05.bluforcloud.com:50000/BLUDB",
              "DASH014638.WATSON_UAT_DEV", "append", propertiesDBDash))
    print(num + 1)



The code in the other notebooks follows the same pattern:

df = ds_clean.toDF(["account_id", "customer", "device_name", "device_os", "user_id",
                    "user_id_type", "creation_date", "last_logon", "password_is_never_expires",
                    "responsible", "privileges", "user_id_status"])
propertiesDBDash = {
    "user": "dash014638",
    "password": "pwd"}
df.write.mode("append").jdbc("jdbc:db2://bluemix05.bluforcloud.com:50000/BLUDB",
                             "DASH014638.DORMANT_PROD", properties=propertiesDBDash)

df = ds_clean.toDF(["REQUEST_NUMBER", "TYPE_TICKET", "SOLUTIONER", "CUSTOMER", "DELIVERY",
                    "OPEN_DATE", "OPEN_TIME", "CLOSE_DATE", "CLOSE_TIME", "SERVICE", "DEVICE",
                    "PLATFORM", "REQUESTER", "REQUESTER_MANAGER_MAIL", "SLA", "ELAPSED_TIME",
                    "SLA_STATUS", "URGENCY", "ACTION", "REQUEST_STATUS"])

propertiesDBDash = {
    "user": "dash014638",
    "password": "pwd"}
df.write.mode("append").jdbc("jdbc:db2://bluemix05.bluforcloud.com:50000/BLUDB",
                             "DASH014638.WATSON_REQUEST_NEW", properties=propertiesDBDash)

1 Answer


It looks like Bluemix doesn't surface all of the errors returned by dashDB. The real problem is that I was trying to insert values longer than the column sizes defined in the dashDB table.
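One way to catch this before the batched write fails is to compare each string value's length against the target column sizes. Below is a minimal plain-Python sketch of that check; the `column_limits` map, `find_oversized` helper, and sample rows are hypothetical, and the real sizes would come from the dashDB table's DDL:

```python
# Hypothetical column sizes; in practice, take these from the dashDB table definition.
column_limits = {"CUSTOMER": 10, "MANAGER_MAIL": 20}

def find_oversized(rows, limits):
    """Return (row_index, column, value_length) for every value longer than its column size."""
    violations = []
    for i, row in enumerate(rows):
        for col, limit in limits.items():
            value = row.get(col)
            if value is not None and len(value) > limit:
                violations.append((i, col, len(value)))
    return violations

# Sample rows standing in for the DataFrame contents.
sample_rows = [
    {"CUSTOMER": "ACME", "MANAGER_MAIL": "a@example.com"},
    {"CUSTOMER": "A customer name that is too long", "MANAGER_MAIL": "b@example.com"},
]
print(find_oversized(sample_rows, column_limits))  # flags the second row's CUSTOMER value
```

The same idea can be applied directly on the DataFrame with `Func.length` on each string column (e.g. filtering rows where `Func.length(Func.col('Customer')) > 10`) before calling `write.jdbc`, so oversized rows are found without round-tripping through the batch failure.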