I am migrating data from Cassandra nodes running on EC2 to DataStax Astra (Premium account) using the DSBulk utility.
Command used:
dsbulk load -url folder_created_during_unload -header true -k keyspace -t table -b "secure-connect-file.zip" -u username -p password
This command gives an error after a few seconds. On checking the documentation, I found that I can add --executor.maxPerSecond to the command to throttle the load rate.
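With that setting added, the full command looks like this (15000 is the throttle value I ended up using; the folder name and credentials are placeholders):

dsbulk load -url folder_created_during_unload -header true -k keyspace -t table -b "secure-connect-file.zip" -u username -p password --executor.maxPerSecond 15000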
After adding this, the load command executed without any error. But if I set --executor.maxPerSecond to a value over 15,000, the load command starts giving the error again.
Now, if a table has over 100M entries and only 15,000 entries are migrated every second, a single table needs at least 100,000,000 ÷ 15,000 ≈ 6,700 seconds, i.e. close to two hours, and the complete database would take several days to migrate.
I want to understand what is causing this error and whether there is a way to load the data at a higher speed.