Let's say we're given N jobs and K workers to do those jobs. For some jobs we need 2 employees, while for others we need just one. Also, not every worker can do every job: for example, worker 1 can do jobs 1, 2 and 5, but not jobs 3 and 4. Finally, if we hire worker 1 to do job 1, then we want him to also do jobs 2 and 5, since we've already paid him; in other words, a hired worker does every job he is able to do.
For example, let's say we have 5 jobs and 6 workers. For jobs 1, 2 and 4 we need 2 men, while for jobs 3 and 5 we need just one. Here's the list of the jobs every worker can do and the wage he requires:
Worker 1 can do jobs 1,3,5 and he requires 1000 dollars.
Worker 2 can do jobs 1,5 and he requires 2000 dollars.
Worker 3 can do jobs 1,2 and he requires 1500 dollars.
Worker 4 can do jobs 2,4 and he requires 2500 dollars.
Worker 5 can do jobs 4,5 and he requires 1500 dollars.
Worker 6 can do jobs 3,5 and he requires 1000 dollars.
After a little calculation and logical thinking we can conclude that we have to hire workers 1, 3, 4 and 5: job 4 can only be done by workers 4 and 5, so both must be hired; job 2 then forces us to hire worker 3 as well; and the cheapest way to give job 1 its second man is worker 1, who also covers jobs 3 and 5. So the minimum wage we need to pay is 1000 + 1500 + 2500 + 1500 = 6500 dollars.
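For what it's worth, here's a small brute-force check I used to confirm the example above (the variable names are just my own encoding of the data; it tries every subset of workers, so it's only feasible for tiny inputs and is clearly not the efficient algorithm I'm asking about):

    from itertools import combinations

    wages = [1000, 2000, 1500, 2500, 1500, 1000]                   # wage of worker i
    can_do = [{1, 3, 5}, {1, 5}, {1, 2}, {2, 4}, {4, 5}, {3, 5}]   # jobs worker i can do
    demand = {1: 2, 2: 2, 3: 1, 4: 2, 5: 1}                        # men needed per job

    best_cost, best_set = None, None
    for r in range(1, len(wages) + 1):
        for hired in combinations(range(len(wages)), r):
            # A hired worker does every job he is able to do.
            covered = {j: sum(j in can_do[w] for w in hired) for j in demand}
            if all(covered[j] >= demand[j] for j in demand):
                cost = sum(wages[w] for w in hired)
                if best_cost is None or cost < best_cost:
                    best_cost, best_set = cost, hired

    print(best_cost, [w + 1 for w in best_set])   # prints 6500 and workers [1, 3, 4, 5]

It confirms the 6500 answer, but it's exponential in the number of workers.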
But how can we find an efficient algorithm that will output that amount? This somehow reminds me of the Hungarian Algorithm, but all those additional constraints make it impossible for me to apply it.
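In case a formal statement of what I'm after helps (the symbols x_i, c_i and r_j are just my own notation): let x_i = 1 if worker i is hired and 0 otherwise, let c_i be his wage, and let r_j be the number of men job j needs. Then I want to

    minimize    \sum_i c_i x_i
    subject to  \sum_{i : worker i can do job j} x_i >= r_j    for every job j,
                x_i \in {0, 1}.

The Hungarian Algorithm assigns each worker to exactly one job, which seems to be why I can't map this problem onto it.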