I am new to Spark. I am trying to write a PySpark DataFrame to a MySQL database, and I am getting a NullPointerException.
Is this in any way related to the Avro format issue mentioned in Spark giving Null Pointer Exception while performing jdbc save?
Please let me know how to resolve this error.
Code:
from pyspark.sql import SparkSession

# Build the session against the standalone master and register the MySQL JDBC connector jar
spark = SparkSession.builder \
    .master("spark://*******:7077") \
    .appName("amazon-insights") \
    .config("spark.executor.memory", "6gb") \
    .config("spark.jars", "/usr/local/spark/jars/mysql-connector-java-5.1.45-bin.jar") \
    .getOrCreate()

# Append the reviews DataFrame to the "reviews" table over JDBC
reviews.write.jdbc(
    url="jdbc:mysql:://**host***:3306/sparkjobs",
    table="reviews",
    mode="append",
    properties={"driver": "com.mysql.jdbc.Driver", "user": "****", "password": "****"})
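For context, reviews is a DataFrame built earlier in the job. A stripped-down stand-in, which I would expect to go through the same write path, would look something like this (the sample rows and column names here are only illustrative placeholders, not my real schema):

# Illustrative stand-in for the real reviews DataFrame; rows and columns are placeholders
sample = spark.createDataFrame(
    [("B000123", 5, "Great product"), ("B000456", 2, "Not as described")],
    ["asin", "rating", "review_text"])
sample.write.jdbc(
    url="jdbc:mysql:://**host***:3306/sparkjobs",
    table="reviews",
    mode="append",
    properties={"driver": "com.mysql.jdbc.Driver", "user": "****", "password": "****"})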
Error:
py4j.protocol.Py4JJavaError: An error occurred while calling o545.jdbc.
: java.lang.NullPointerException
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:99)
    at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)