I am using ScalikeJDBC to fetch a large table, convert the rows to JSON, and call a web service with 50 JSON objects (rows) at a time. This is my code:
import scalikejdbc._

val jsons = DB.readOnly { implicit session =>
  sql"SELECT * FROM bigtable".map { row =>
    // build a JSON object for each row
  }.toList().apply()
}
jsons.grouped(50).foreach { batch =>
  // send 50 objects at once to an HTTP server
}
This works, but unfortunately the intermediate list is huge and consumes a lot of memory. I am looking for a way to iterate over the result set lazily, similar to foreach, except that I want to iterate over batches of 50 rows. Is that possible with ScalikeJDBC?
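For illustration, the foreach style I have in mind looks roughly like this (just a sketch; if I read the docs correctly, recent ScalikeJDBC versions also offer fetchSize to hint the JDBC driver to stream rows instead of buffering them):

import scalikejdbc._

DB.readOnly { implicit session =>
  sql"SELECT * FROM bigtable".fetchSize(1000).foreach { row =>
    // each row is handled one at a time; no intermediate list is built
  }
}

What is missing here is a way to get the rows in groups of 50 rather than one by one.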
I solved the memory issue by filling and clearing a mutable list instead of using grouped, but I am still looking for a better solution.
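Concretely, my workaround looks roughly like this (a sketch; buildJson and sendBatch are hypothetical placeholders for the JSON construction and HTTP call above):

import scalikejdbc._
import scala.collection.mutable.ListBuffer

DB.readOnly { implicit session =>
  val buffer = ListBuffer.empty[String]           // JSON objects, e.g. as strings
  sql"SELECT * FROM bigtable".foreach { row =>
    buffer += buildJson(row)                      // hypothetical row-to-JSON helper
    if (buffer.size == 50) {
      sendBatch(buffer.toList)                    // hypothetical HTTP call
      buffer.clear()
    }
  }
  if (buffer.nonEmpty) sendBatch(buffer.toList)   // flush the final partial batch
}

This keeps at most 50 rows in memory at a time, but the manual buffer bookkeeping feels clumsy compared to grouped(50).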