Delta Lake tables follow the "write once, read many" (WORM) concept: the underlying Parquet data files are immutable. This makes sense and is the approach most other data warehouse products take as well. However, it causes write amplification: every time I update an existing record, the data file(s) containing that record are copied and then rewritten with the change. So inserting or updating one record at a time is clearly not a good option.
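As a sketch of the workaround I have in mind (this helper is hypothetical and framework-independent, not part of Delta Lake or Spark): accumulate incoming records into micro-batches so that each write touches many rows per file rewrite instead of one.

```python
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

def batched(records: Iterable[T], batch_size: int) -> Iterator[List[T]]:
    """Group incoming records into fixed-size batches, so each
    write/MERGE into the table covers many rows at once instead of
    triggering one file rewrite per record."""
    batch: List[T] = []
    for record in records:
        batch.append(record)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

# Example: 10 records in batches of 4 -> batch sizes 4, 4, 2
sizes = [len(b) for b in batched(range(10), 4)]
print(sizes)  # [4, 4, 2]
```

Each yielded batch would then be written in a single append or MERGE operation; the open question is how large `batch_size` should be.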
So my question is: what is the recommended batch size for loading Delta Lake tables?