Is there a way in Scalding to write to a SQL table that has more than 22 columns? The problem I am facing is as follows. I have a table with 28 columns, each row of which I am representing with a case class, something like:
case class ResultsRow(field1: String, field2: Int, ... field28: String)
I am at the last stage, where I have a TypedPipe[ResultsRow] that I need to serialize to the DB. However, neither the JDBCSource provided by Scalding nor the one from parallelai seems to support case classes; both take only tuples as input. I'd like to do something of the sort:
val results: TypedPipe[ResultsRow] = getResultRows
val dbOutput: JDBCSource = JDBCSource(...)
results.write(dbOutput)
and I can't do
results
.map { row => (row.field1, row.field2, ..., row.field28) }
.write(dbOutput)
because Scala does not let you define a tuple with more than 22 fields.
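To make the limit concrete: the Scala 2 standard library only defines the TupleN classes up to Tuple22, so a 23-element tuple literal is rejected at compile time:

// Compiles: Tuple22 is the largest TupleN in the Scala 2 standard library.
val ok = (1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11,
          12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22)

// Does not compile:
// "error: too many elements for tuple: 23, allowed: 22"
// val tooMany = (1, 2, ..., 22, 23)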
Note: The fix prescribed in the Scalding FAQ doesn't work here, because it addresses reading more than 22 fields, not serializing them to a database.
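One direction I have been looking at: my understanding is that Scalding converts the values in a typed pipe to Cascading Tuples through a TupleSetter, and Cascading's Tuple class itself is not limited to 22 entries. A hand-rolled setter might look roughly like the sketch below (the field names are the placeholders from my case class above, and I don't know whether JDBCSource would actually pick this up; that is exactly what I can't figure out):

import cascading.tuple.{ Tuple => CTuple }
import com.twitter.scalding.TupleSetter

// Sketch: build the Cascading Tuple field by field, so no Scala TupleN
// is ever involved and the 22-field limit does not apply.
// I'm not sure this is the intended extension point for JDBC sinks.
implicit val resultsRowSetter: TupleSetter[ResultsRow] = new TupleSetter[ResultsRow] {
  override val arity: Int = 28

  override def apply(row: ResultsRow): CTuple = {
    val t = new CTuple()
    t.add(row.field1)          // String goes in directly
    t.add(Int.box(row.field2)) // box primitives for Cascading
    // ... one t.add(...) per remaining column ...
    t.add(row.field28)
    t
  }
}

Is something along these lines viable, or is there another supported way to write a 28-column row?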