We have a case where we need to handle more than 3M records in a flat file. We want to read 3k records at a time and then pass them to the writer; is there a better way to process that? Besides that, we have an additional constraint, since our flat file format looks like this:
userid1, transaction1
userid1, transaction2
userid1, transaction3
userid2, transaction1
userid2, transaction3
One user may have many transactions, and the item reader can split one user's transactions across different threads. We want all the transactions for a given user to be processed in the same thread, so our idea was to dynamically change the chunk size. Is there a better way to do that?
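One common alternative to changing the chunk size is to make the reader emit one aggregated item per user instead of one item per record: the reader peeks at the next record and keeps pulling while it belongs to the same user. Then the chunk boundary can never split a user, regardless of chunk size. In Spring Batch this is typically done by wrapping your `FlatFileItemReader` in a `SingleItemPeekableItemReaderAdapter`. The sketch below shows the same peek-and-group logic in plain Java (no Spring dependencies) so the idea is easy to see; the `Record` type and class name are illustrative, not from any framework:

```java
import java.util.*;

// Sketch: group consecutive flat-file records by userId using a one-item
// look-ahead, the same idea as Spring Batch's SingleItemPeekableItemReaderAdapter.
public class UserGroupingReader {
    // One "userId, transaction" line, as parsed from the flat file.
    record Record(String userId, String transaction) {}

    private final Iterator<Record> delegate;
    private Record peeked; // look-ahead buffer holding the next unconsumed record

    public UserGroupingReader(Iterator<Record> delegate) {
        this.delegate = delegate;
    }

    // Consume the next record (from the peek buffer first), or null at end.
    private Record next() {
        if (peeked != null) { Record r = peeked; peeked = null; return r; }
        return delegate.hasNext() ? delegate.next() : null;
    }

    // Look at the next record without consuming it.
    private Record peek() {
        if (peeked == null && delegate.hasNext()) peeked = delegate.next();
        return peeked;
    }

    // Return ALL transactions for the next user as one item, or null when done.
    public List<Record> read() {
        Record first = next();
        if (first == null) return null;
        List<Record> group = new ArrayList<>();
        group.add(first);
        // Keep pulling while the upcoming record belongs to the same user.
        while (peek() != null && peek().userId().equals(first.userId())) {
            group.add(next());
        }
        return group;
    }

    public static void main(String[] args) {
        List<Record> input = List.of(
            new Record("userid1", "transaction1"),
            new Record("userid1", "transaction2"),
            new Record("userid1", "transaction3"),
            new Record("userid2", "transaction1"),
            new Record("userid2", "transaction3"));
        UserGroupingReader reader = new UserGroupingReader(input.iterator());
        List<Record> group;
        while ((group = reader.read()) != null) {
            System.out.println(group.get(0).userId()
                + " -> " + group.size() + " transactions");
        }
    }
}
```

With this shape, a chunk size of 3k means "3k users per chunk" rather than "3k records", and each user's transactions arrive at the processor/writer as a single unit, so no dynamic chunk-size changes are needed. Note this grouping assumes the file is sorted by user, as in the sample above.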