I have a nested, dynamic JSON document and I need to load its data into five Hive tables. The tables have some complex data types, and a single JSON document is used to populate all of them. I flattened the entire JSON dynamically, but that isn't enough: some of the target columns need to stay nested. How do I insert a nested PySpark DataFrame column into Hive?
Specifically, I can't figure out how to insert a nested DataFrame column into a Hive nested column that has the same structure.
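To make the question concrete, here is a minimal sketch of what I mean. The table name `db.customers`, the `address_*` columns, and the struct layout are all placeholders, not my real schema; my actual JSON and tables are much larger. I rebuild a struct column from the flattened columns and then try to insert into the existing Hive table:

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = (
    SparkSession.builder
    .appName("nested-json-to-hive")
    .enableHiveSupport()
    .getOrCreate()
)

# Roughly what my flattened frame looks like after exploding the JSON
# (address_street / address_city originally lived under a nested "address" object).
flat_df = spark.createDataFrame(
    [(1, "Alice", "221B Baker St", "London")],
    ["id", "name", "address_street", "address_city"],
)

# Hypothetical target table, created ahead of time in Hive as something like:
#   CREATE TABLE db.customers (
#     id INT,
#     name STRING,
#     address STRUCT<street: STRING, city: STRING>
#   )

# Rebuild the nested column so the DataFrame schema lines up with the table's STRUCT column.
nested_df = flat_df.select(
    "id",
    "name",
    F.struct(
        F.col("address_street").alias("street"),
        F.col("address_city").alias("city"),
    ).alias("address"),
)

# insertInto resolves columns by position, so the DataFrame schema must match the table schema.
nested_df.write.mode("append").insertInto("db.customers")
```

Is re-assembling the struct with `F.struct()` before the write the right approach here, or is there a better way to map a nested DataFrame column onto a Hive complex-type column?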