I'm sending JSON data from Apache Spark / Databricks to an API. The API expects the data in the following JSON format:
Sample:
{
    "CtcID": 1,
    "LastName": "sample string 2",
    "CpyID": 3,
    "HonTitle": "sample string 4",
    "PositionCode": 1,
    "PositionFreeText": "sample string 6",
    "CreateDate": "2021-04-21T08:50:56.8602571+01:00",
    "ModifDate": "2021-04-21T08:50:56.8602571+01:00",
    "ModifiedBy": 1,
    "SourceID": "sample string 9",
    "OriginID": "sample string 10",
    "DoNotExport": true,
    "ParentEmailAddress": "sample string 13",
    "SupInfo": [
        {
            "FieldName": "sample string 1",
            "DATA_TYPE": "sample string 2",
            "IS_NULLABLE": "sample string 3",
            "FieldContent": "sample string 4"
        },
        {
            "FieldName": "sample string 1",
            "DATA_TYPE": "sample string 2",
            "IS_NULLABLE": "sample string 3",
            "FieldContent": "sample string 4"
        }
    ],
The data I'm currently sending is in the following newline-delimited JSON format:
{"Last_name":"Finnigan","First_name":"Michael","Email":"MichaelF@email.com"}
{"Last_name":"Phillips","First_name":"Austin","Email":"PhillipsA@email.com"}
{"Last_name":"Collins","First_name":"Colin","Email":"ColinCollins@email.com"}
{"Last_name":"Finnigan","First_name":"Judy","Email":"Judy@email.com"}
{"Last_name":"Jones","First_name":"Julie","Email":"Julie@email.com"}
{"Last_name":"Smith","First_name":"Barry","Email":"Barry@email.com"}
{"Last_name":"Kane","First_name":"Harry","Email":"Harry@email.com"}
{"Last_name":"Smith","First_name":"John","Email":"John@email.com"}
{"Last_name":"Colins","First_name":"Ruby","Email":"RubySmith@email.com"}
{"Last_name":"Tests","First_name":"Smoke","Email":"a.n.other@pret.com"}
The code in Apache Spark is as follows:

import json
import requests

url = 'https://enimuozygj4jqx.m.pipedream.net'
# files is a Spark DataFrame, not a plain Python object
files = spark.read.json("abfss://azurestorageaccount.dfs.core.windows.net/PostContact.json")
r = requests.post(url, data=json.dumps(files))
print(r.status_code)
When I execute the code I get the following error:
TypeError: Object of type DataFrame is not JSON serializable
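As I understand it, the error arises because `json.dumps` only accepts plain Python types (dicts, lists, strings, numbers, booleans, None), not a Spark DataFrame. A minimal sketch of the distinction, using plain dicts with the same field names as my data rather than the actual DataFrame (in PySpark the rows could presumably be converted first, e.g. with `[row.asDict() for row in files.collect()]` or `files.toJSON().collect()`):

```python
import json

# json.dumps only understands plain Python types; anything else
# (such as a Spark DataFrame) raises the TypeError seen above.
try:
    json.dumps(object())
except TypeError as exc:
    print(exc)  # Object of type object is not JSON serializable

# A list of plain dicts, matching the records shown above, serializes fine.
records = [
    {"Last_name": "Finnigan", "First_name": "Michael", "Email": "MichaelF@email.com"},
    {"Last_name": "Phillips", "First_name": "Austin", "Email": "PhillipsA@email.com"},
]
payload = json.dumps(records)
print(payload)
```

So the question is how to get the DataFrame's rows into a serializable shape that matches the API's expected format before posting.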