I am trying to parse JSON data, and I wrote a custom schema for it. Whether I parse the data with the schema or without it, I get the following error:
Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve '`queryResults`.`searchResponse`.`response`.`docs`.`transactions`['code']' due to data type mismatch: argument 2 requires integral type, however, ''code'' is of string type.;;
Here is my sample data:
{
  "queryResults": {
    "searchResponse": {
      "response": {
        "docs": [{
          "transactions": [{
            "recordDate": "2010-02-02 00:00:00",
            "code": "PGM/",
            "description": "Recordation of Patent Grant Mailed"
          }, {
            "recordDate": "2010-01-13 00:00:00",
            "code": "WPIR",
            "description": "Issue Notification Mailed"
          }, {
            "recordDate": "2009-12-17 00:00:00",
            "code": "R1021",
            "description": "Receipt into Pubs"
          }]
        }]
      }
    }
  }
}
Here is my schema:
val schema = StructType(List(
  StructField("queryResults", StructType(List(
    StructField("searchResponse", StructType(List(
      StructField("response", StructType(List(
        StructField("docs", ArrayType(StructType(List(
          StructField("appCustNumber", StringType, nullable = true),
          StructField("transactions", ArrayType(StructType(List(
            StructField("code", StringType, nullable = true),
            StructField("description", StringType, nullable = true),
            StructField("recordDate", StringType, nullable = true)
          ))))
        ))))
      )))
    )))
  )))
))
Here is how I am trying to fetch the data:
val dff = sqlContext.read.schema(schema).json("file location")
dff.select("queryResults.searchResponse.response.docs.transactions.code").show()
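Since docs is itself an array, I suspect transactions.code resolves to an array of arrays, which may be why Spark tries to treat 'code' as an index. I am unsure whether the nesting needs to be flattened first; this is a sketch of what I have been experimenting with (assuming Spark 2.x and the DataFrame API):

```scala
import org.apache.spark.sql.functions.{col, explode}

// "docs" is an array of structs, and "transactions" is an array inside each doc,
// so each level gets its own explode before "code" can be selected directly.
val docsDf = dff.select(explode(col("queryResults.searchResponse.response.docs")).as("doc"))
val txDf   = docsDf.select(explode(col("doc.transactions")).as("tx"))
txDf.select(col("tx.code")).show()
```

Is this double explode the right approach, or is there a way to select the nested field directly?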
Thanks in Advance.