I am in the process of writing a simple Java program that reads the contents of a directory and prints out each file's name and last modified time.
The issue I foresee is that the vault I am reading is pretty huge, and in some cases a single directory can contain well over 20,000 files. Using the File API,
`file.listFiles()`
would in turn create 20,000 `File` objects. My concern is that this could slow down the process and maybe bloat memory as well.
Is there a way to batch this, i.e. tell Java to scan the directory 50 files at a time, or at least iterate one file at a time instead of loading all the objects into memory at once?
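For context, I did come across `java.nio.file.Files.newDirectoryStream`, which, if I understand correctly, iterates directory entries lazily rather than building the whole array up front. A rough sketch of what I'm imagining (the class name `DirScan` and the `scan` helper are just placeholders I made up):

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.FileTime;
import java.util.function.Consumer;

public class DirScan {
    // DirectoryStream hands back one entry at a time, so only a single
    // Path object needs to be live per iteration -- the 20,000 entries
    // are never all materialized in memory at once.
    static void scan(Path dir, Consumer<String> sink) throws IOException {
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir)) {
            for (Path entry : stream) {
                FileTime modified = Files.getLastModifiedTime(entry);
                sink.accept(entry.getFileName() + "\t" + modified);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Paths.get(args.length > 0 ? args[0] : ".");
        scan(dir, System.out::println);
    }
}
```

Would this actually avoid the memory problem, or does the stream still load everything up front under the hood?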