I want to read 5+ million events from an Azure SQL DB table and bulk insert them into Cassandra. The table has 2 columns. I see there is a SQL component available for reading from Azure SQL DB: https://camel.apache.org/components/3.7.x/sql-component.html
Question: Consuming from Azure SQL DB
- Is there a better way to read all the rows than storing them in a map, given that there are 5M records?
- Is it possible to read the rows in batches? (A rough sketch of what I have in mind follows this list.)
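For reference, this is roughly the consuming side I had in mind (Camel 3.7 Java DSL). The table/column names and the `seda:rows` queue are just placeholders, and I'm not sure this is the idiomatic approach; my understanding is that `outputType=StreamList` plus a streaming split avoids holding all 5M rows in memory at once:

```java
import org.apache.camel.builder.RouteBuilder;

public class SqlReadRoute extends RouteBuilder {
    @Override
    public void configure() {
        // assumes a javax.sql.DataSource has already been configured on the SQL component
        from("timer:load?repeatCount=1")
            // StreamList makes the SQL producer return an iterator instead of a fully
            // materialised List, so all 5M rows are never in memory at the same time
            .to("sql:SELECT event_id, payload FROM events?outputType=StreamList")
            // stream the iterator row by row; each row arrives as a Map<String, Object>
            .split(body()).streaming()
                // hand each row to the Cassandra side (sketched further down)
                .to("seda:rows")
            .end();
    }
}
```

The SQL consumer's `maxMessagesPerPoll` option looks like another way to limit how much is read per poll, but for a one-shot export the producer endpoint with `StreamList` seemed simpler to me.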
There is a CQL component available for Cassandra: https://camel.apache.org/components/3.7.x/cql-component.html
Question: Producing to Cassandra
- Can we INSERT in batches? (Again, a sketch of what I'm considering follows.)
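This is what I was picturing for the Cassandra side, picking the rows up from the `seda:rows` queue in the sketch above. Note the batching here is my own: I don't know whether the cql endpoint batches a List body by itself, so I'm aggregating rows and building an unlogged `BatchStatement` with the DataStax 4.x driver (which camel-cassandraql 3.7 is based on) in a processor. Keyspace, table, column names and the batch size of 500 are placeholders:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import com.datastax.oss.driver.api.core.CqlSession;
import com.datastax.oss.driver.api.core.cql.BatchStatement;
import com.datastax.oss.driver.api.core.cql.BatchStatementBuilder;
import com.datastax.oss.driver.api.core.cql.BatchType;
import com.datastax.oss.driver.api.core.cql.PreparedStatement;

import org.apache.camel.AggregationStrategy;
import org.apache.camel.builder.RouteBuilder;

public class CassandraWriteRoute extends RouteBuilder {

    // assumes contact points / local datacenter come from the driver's application.conf
    private final CqlSession session = CqlSession.builder().build();
    private final PreparedStatement insert =
        session.prepare("INSERT INTO mykeyspace.events (event_id, payload) VALUES (?, ?)");

    // collects the individual row maps coming off the splitter back into a List
    private final AggregationStrategy toList = (oldEx, newEx) -> {
        Map<?, ?> row = newEx.getIn().getBody(Map.class);
        if (oldEx == null) {
            List<Map<?, ?>> rows = new ArrayList<>();
            rows.add(row);
            newEx.getIn().setBody(rows);
            return newEx;
        }
        oldEx.getIn().getBody(List.class).add(row);
        return oldEx;
    };

    @Override
    public void configure() {
        from("seda:rows")
            // group rows into chunks of 500 before writing to Cassandra
            .aggregate(constant(true), toList)
                .completionSize(500).completionTimeout(2000)
            .process(exchange -> {
                List<?> rows = exchange.getIn().getBody(List.class);
                BatchStatementBuilder batch = BatchStatement.builder(BatchType.UNLOGGED);
                for (Object o : rows) {
                    Map<?, ?> row = (Map<?, ?>) o;
                    batch.addStatement(insert.bind(row.get("event_id"), row.get("payload")));
                }
                // one unlogged batch per 500 rows instead of 5M single INSERTs
                session.execute(batch.build());
            })
            .end();
    }
}
```

I realise large unlogged batches can be an anti-pattern in Cassandra, so I'd also appreciate guidance on a sensible batch size, or whether to drop batching and just use single async inserts.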
Can I use Camel for this use case?