This is a challenge specific to Spark SQL, and I'm unable to apply the two highlighted answers.

I'm writing complex data-processing logic in Spark SQL. Here is the process I follow:
- Define a case class for a table with all of its attributes.
- Register it as a table.
- Use SQLContext to query that table.
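For concreteness, the steps above look roughly like this (the class and column names are invented; the Spark-specific calls are shown as comments because they need a live SparkContext/SQLContext):

```scala
// Hypothetical table with a handful of columns; my real table has 50.
case class Sale(id: Long, product: String, quantity: Int, price: Double)

object Example extends App {
  val sales = Seq(Sale(1L, "book", 2, 9.99), Sale(2L, "pen", 10, 0.99))

  // With a SQLContext in scope, the remaining steps would be roughly:
  //   import sqlContext.implicits._
  //   val df = sc.parallelize(sales).toDF()
  //   df.registerTempTable("sales")   // step 2: register as a table
  //   sqlContext.sql("SELECT product, SUM(quantity) FROM sales GROUP BY product")
  println(sales.map(_.product).mkString(","))
}
```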
I'm encountering an issue because Scala allows only 22 parameters in a case class, whereas my table has 50 columns. The only approach I can think of is to break the dataset into pieces of at most 22 fields each and combine them at the end, but that does not look like a clean approach. Is there a better way to handle this?
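To show what I mean by "break and combine": the 50 columns could be grouped into smaller case classes (each under the 22-parameter limit) and nested inside a wrapper. The field names here are made up, and I'm assuming Spark's schema inference handles nested case classes as struct columns:

```scala
// Each group stays well under the 22-parameter limit of Scala 2.10.
case class CustomerInfo(id: Long, name: String, email: String)
case class OrderInfo(orderId: Long, total: Double, status: String)

// Wrapper combining the groups back into one record.
case class Record(customer: CustomerInfo, order: OrderInfo)

object SplitDemo extends App {
  val r = Record(
    CustomerInfo(1L, "Ada", "ada@example.com"),
    OrderInfo(42L, 99.5, "SHIPPED"))

  // If registered as a table, nested fields would presumably be
  // addressed in SQL as customer.name, order.total, and so on.
  println(r.customer.name + " " + r.order.status)
}
```

This works, but splitting one logical table into arbitrary groups just to satisfy the compiler is what feels unclean to me.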