We are trying to develop a production-ready application using Spring Batch and Spring Cloud Data Flow (SCDF) server. We ran into an issue when users submit many files to the SCDF server at once; take, for example, 50 files, each with 100 records to process. Since SCDF is deployed on Kubernetes, it launches one pod per file. The K8s namespace has a resource quota of cpu=10 and memory=20GB, and we have configured each task pod with cpu.limit=500m and memory.limit=1GB.
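For context, the per-pod sizing above is set through the Kubernetes platform account's deployer properties, along these lines (a sketch following the SCDF Kubernetes platform account convention; verify the exact property names against your SCDF version):

```properties
# Per-pod resources for task pods launched via the "default" platform account
spring.cloud.dataflow.task.platform.kubernetes.accounts.default.limits.cpu=500m
spring.cloud.dataflow.task.platform.kubernetes.accounts.default.limits.memory=1024Mi
```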
So at most 20 pods can run at a time (10 CPU ÷ 500m per pod = 20, and 20GB ÷ 1GB per pod = 20), and any launch requests beyond that fail because the SCDF server cannot schedule more pods within the namespace quota.
What options do we have to avoid these failures? Is there a queuing mechanism in SCDF that can hold the requests and process them later, or do we need a front-end component that accepts the file requests and forwards them to the SCDF server only when it can spawn more pods?
I have set the maximum-concurrent-tasks property to limit the number of pods launched by the SCDF server:
spring.cloud.dataflow.task.platform.kubernetes.accounts.default.maximum-concurrent-tasks=100
But this does not help with processing the remaining file requests: once the limit is reached, new launch requests are rejected rather than queued.
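To make the front-end option concrete, below is a rough sketch of what we have in mind, assuming SCDF's documented REST endpoints (POST /tasks/executions to launch a task, GET /tasks/executions/current for the running count and configured maximum). The server URL, task name, argument name, and response field names are all assumptions to verify against your SCDF version:

```java
import java.util.ArrayDeque;
import java.util.Map;
import java.util.Queue;

import org.springframework.web.client.RestTemplate;

// Hypothetical front-end gate: buffers incoming file requests in memory and
// forwards a launch to the SCDF server only while capacity remains.
public class TaskLaunchGate {

    private final RestTemplate rest = new RestTemplate();
    private final String scdf = "http://scdf-server:9393"; // placeholder URL
    private final Queue<String> pendingFiles = new ArrayDeque<>();

    public synchronized void submit(String filePath) {
        pendingFiles.add(filePath);
    }

    // Run periodically (e.g. from a @Scheduled method): drain the queue
    // while the SCDF platform reports spare capacity.
    public synchronized void drain() {
        while (!pendingFiles.isEmpty() && hasCapacity()) {
            String file = pendingFiles.poll();
            // POST /tasks/executions launches a registered task definition;
            // "file-ingest-task" and the argument name are placeholders.
            rest.postForObject(
                    scdf + "/tasks/executions?name={task}&arguments={args}",
                    null, Long.class,
                    "file-ingest-task", "--input.file=" + file);
        }
    }

    @SuppressWarnings("unchecked")
    private boolean hasCapacity() {
        // GET /tasks/executions/current reports, per platform account, the
        // number of running executions and the maximum-concurrent-tasks limit.
        // The field names below are the assumed response shape.
        Map<String, Object>[] platforms =
                rest.getForObject(scdf + "/tasks/executions/current", Map[].class);
        if (platforms == null) {
            return false;
        }
        for (Map<String, Object> p : platforms) {
            int running = ((Number) p.get("runningExecutionCount")).intValue();
            int max = ((Number) p.get("maximumTaskExecutions")).intValue();
            if (running >= max) {
                return false;
            }
        }
        return true;
    }
}
```

The idea is that the queue absorbs the burst of 50 files, and launches only proceed while the reported running count stays below the configured maximum, so the namespace quota is never exceeded. Is there something built into SCDF that already does this, or is a custom component like the above the expected approach?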