So, I have a question. Suppose there are k processors and k*n jobs. Each processor must do exactly n jobs, and the time taken depends on both the processor and the job. In the table below, the column Time k gives the time taken by processor k for each job.
The task is to find the minimum total time in which all the jobs can be done. For k=2, I found that a greedy approach works: compute T2-T1 for each job, which is the net loss of time incurred by executing that job on processor 2. Sort the jobs by this value in non-decreasing order, allocate the first n jobs to processor 2, and the rest to processor 1. However, I am unable to extend this to higher values of k. Is there a general algorithm for this problem? I would appreciate pointers to relevant papers, or guidance on the general direction. An exact solution is preferable, but a good approximation would also work.
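For reference, here is a short sketch of the k=2 greedy described above. The function name and the `times` input format (a list of `(t1, t2)` pairs, one per job) are my own assumptions, not anything fixed by the problem statement:

```python
def assign_two_processors(times):
    """Greedy for k=2: times[j] = (t1, t2), the time job j takes
    on processor 1 and processor 2 respectively. len(times) = 2*n."""
    n = len(times) // 2  # each processor must take exactly n jobs
    # Sort job indices by the net loss T2 - T1 of running on processor 2.
    order = sorted(range(len(times)), key=lambda j: times[j][1] - times[j][0])
    proc2 = set(order[:n])  # smallest net loss -> give to processor 2
    proc1 = [j for j in range(len(times)) if j not in proc2]
    total = (sum(times[j][0] for j in proc1)
             + sum(times[j][1] for j in proc2))
    return proc1, sorted(proc2), total

# Small example: jobs 0 and 2 are cheaper on processor 2.
print(assign_two_processors([(3, 1), (2, 5), (4, 2), (6, 6)]))
# -> ([1, 3], [0, 2], 11)
```

The intuition is that the total equals the sum of all T1 values plus the sum of (T2 - T1) over jobs sent to processor 2, so picking the n jobs with the smallest differences minimizes the total.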
Edit: The machines run one after another, not simultaneously. Thus, the completion time is not the maximum of the individual machines' times (as it would be if they ran in parallel); rather, it is the sum of their times.