I need to create a Delta Lake file containing more than 150 KPIs. Because of the 150 calculations, we ended up building roughly 60 intermediate DataFrames, which are then joined into one final DataFrame. That final DataFrame has only about 60k records, but when we write it out as a Delta file the job fails with the error below.
"The Spark driver has stopped unexpectedly and is restarting. Your notebook will be automatically reattached"
Our cluster configuration is fairly generous: 144 GB of memory and 20 cores.
Any suggestions on how to overcome this issue? Thanks in advance.