
I am trying to save a Spark DataFrame into HBase using Phoenix (via SparkR). I can read from the database, but I am running into errors when trying to write to it.
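For reference, the snippets below assume a SparkR session with the Phoenix client jar on the driver and executor classpaths, set up roughly like this (the jar path is an assumption and differs per installation):

library(SparkR)

# Hypothetical jar location; adjust to the actual Phoenix client jar path.
phoenixJar <- "/usr/hdp/current/phoenix-client/phoenix-client.jar"

sparkR.session(
  appName = "phoenix-write-test",
  sparkConfig = list(
    spark.driver.extraClassPath   = phoenixJar,
    spark.executor.extraClassPath = phoenixJar
  )
)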

Reading data from test1 works:

df_output <- read.jdbc(
  url = "jdbc:phoenix:azaupalphdoop02.ah.loc,azaupalphdoop03.ah.loc,azaupalphdoop04.ah.loc:2181:/hbase-unsecure",
  tableName = "test1",
  driver = "org.apache.phoenix.jdbc.PhoenixDriver"
)



showDF(df_output)

+--------+-----+
|MEMBERID|MK_56|
+--------+-----+
|       1|    5|
+--------+-----+

printSchema(df_output)

root
 |-- MEMBERID: integer (nullable = false)
 |-- MK_56: integer (nullable = true)

write.jdbc(
  df_output,
  url = "jdbc:phoenix:azaupalphdoop02.ah.loc,azaupalphdoop03.ah.loc,azaupalphdoop04.ah.loc:2181:/hbase-unsecure",
  tableName = "test1",
  mode = "append"
)

This fails with the following error:

ERROR 601 (42P00): Syntax error. Encountered "INSERT" at line 1, column 1.
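From what I can tell, Phoenix's grammar has no INSERT statement, only UPSERT, while Spark's generic JDBC writer emits INSERT statements, which would explain the syntax error. A possible alternative (a sketch I have not verified end-to-end; it assumes the phoenix-spark connector jar is on the classpath) is writing through the phoenix-spark data source, which issues UPSERTs; per its docs it only accepts mode = "overwrite", which upserts rather than truncating:

write.df(
  df_output,
  source = "org.apache.phoenix.spark",
  mode = "overwrite",   # the connector only supports overwrite; it upserts, it does not truncate
  table = "test1",
  zkUrl = "azaupalphdoop02.ah.loc,azaupalphdoop03.ah.loc,azaupalphdoop04.ah.loc:2181:/hbase-unsecure"
)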

I also tried writing to a new table, test122, letting Spark create it:

write.jdbc(
  df_output,
  url = "jdbc:phoenix:azaupalphdoop02.ah.loc,azaupalphdoop03.ah.loc,azaupalphdoop04.ah.loc:2181:/hbase-unsecure",
  tableName = "test122"
)

Error:

java.sql.SQLException: ERROR 517 (42895): Invalid not null constraint on non primary key column. columnName=TEST122.MEMBERID

Finally I tried mode = "overwrite" (per the docs: drop the existing table and create a new table with the given data frame):

write.jdbc(
  df_output,
  url = "jdbc:phoenix:azaupalphdoop02.ah.loc,azaupalphdoop03.ah.loc,azaupalphdoop04.ah.loc:2181:/hbase-unsecure",
  tableName = "test",
  mode = "overwrite"
)

This drops the table successfully but fails to create the new one:

java.sql.SQLException: ERROR 517 (42895): Invalid not null constraint on non primary key column. columnName=TEST122.MEMBERID

I assume there is some issue with the schema, but I have not been able to find a solution.
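If that schema hypothesis is right, the 517 error would be because MEMBERID is read back as nullable = false (presumably because it is the Phoenix primary key), so the CREATE TABLE statement Spark generates attaches NOT NULL to it without declaring any primary key, which Phoenix rejects. One workaround sketch (untested; it assumes the RJDBC package and a hypothetical jar path) is to pre-create the table through the Phoenix driver with the primary key declared explicitly, then append:

library(RJDBC)

# Hypothetical jar path; adjust to the actual Phoenix client jar.
drv <- JDBC("org.apache.phoenix.jdbc.PhoenixDriver",
            "/usr/hdp/current/phoenix-client/phoenix-client.jar")
conn <- dbConnect(drv,
                  "jdbc:phoenix:azaupalphdoop02.ah.loc,azaupalphdoop03.ah.loc,azaupalphdoop04.ah.loc:2181:/hbase-unsecure")

# Phoenix only allows NOT NULL on primary-key columns, so declare
# MEMBERID as the primary key up front.
dbSendUpdate(conn, "CREATE TABLE IF NOT EXISTS test122 (
                      MEMBERID INTEGER NOT NULL PRIMARY KEY,
                      MK_56 INTEGER)")
dbDisconnect(conn)

Even with the table pre-created, write.jdbc with mode = "append" would presumably still hit the INSERT-vs-UPSERT mismatch above, so the phoenix-spark route may be the more promising one.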

