I am rather new to Spark, and I wonder what the best practice is when using Spark Streaming with Cassandra.
Usually, when performing IO in Scala, it is good practice to execute it inside a `Future`.
However, much of the spark-cassandra-connector API seems to operate synchronously. For example: `saveToCassandra` (in `com.datastax.spark.connector.RDDFunctions`).
Is there a good reason why those functions are not async? Should I wrap them with a `Future`?
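
For context, the kind of wrapping I have in mind looks roughly like this (just a sketch; it assumes a running `SparkContext`, the connector on the classpath, and a hypothetical `my_keyspace.my_table` table):

```scala
import scala.concurrent.{ExecutionContext, Future}

import com.datastax.spark.connector._
import org.apache.spark.SparkContext

object AsyncSaveSketch {
  // Wrap the blocking saveToCassandra call in a Future so the calling
  // thread is free while the Spark job runs. The job itself still
  // executes synchronously on the cluster; only the driver-side call
  // is made non-blocking.
  def saveAsync(sc: SparkContext)(implicit ec: ExecutionContext): Future[Unit] =
    Future {
      val rdd = sc.parallelize(Seq((1, "a"), (2, "b")))
      // "my_keyspace", "my_table", and the column names are placeholders.
      rdd.saveToCassandra("my_keyspace", "my_table", SomeColumns("id", "value"))
    }
}
```

Is this a reasonable pattern, or does it just hide the blocking call without any real benefit?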