I have N tasks where the i'th task takes A[i] time to process. The tasks are independent of each other and can be scheduled at any time on any of the P processors. A task can run on only one processor, and a processor can process any number of tasks, but only one task at a time; once begun, a task must run to completion.
I want to minimize the total time it takes to complete all the tasks.
I am implementing this using a min-heap, i.e.
- Sort the tasks in descending order of processing time
- Create a min-heap of size P initialized to 0
- For each task i, pull the minimum from the heap, add the task time A[i] to it, and push it back onto the heap
The time to complete all the tasks is then the maximum value in the heap. This has worked on my inputs so far, and I want to verify its correctness.
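In Python, the idea looks roughly like this (a simplified sketch, not my exact code; `makespan_greedy` is just an illustrative name):

```python
import heapq

def makespan_greedy(tasks, p):
    # Longest task first.
    tasks = sorted(tasks, reverse=True)
    # Min-heap of per-processor loads, one zero per processor.
    loads = [0] * p
    heapq.heapify(loads)
    for t in tasks:
        # Give the task to the currently least-loaded processor.
        least = heapq.heappop(loads)
        heapq.heappush(loads, least + t)
    # Everything is done when the busiest processor finishes.
    return max(loads)
```

For example, makespan_greedy([4, 3, 2, 1], 2) returns 5.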
Do you think this breaks for any inputs? I believe I am doing something like greedy number partitioning.
CodePudding user response:
This is a polynomial-time algorithm for a problem that includes NP-complete problems as special cases (for example, with P=2, deciding whether all tasks can finish by time sum(A)/2 is exactly the partition problem). Therefore you should expect it not to always find the optimal schedule.
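To make the P=2 connection concrete: if you had a scheduler that always returned the optimal makespan, you could use it to decide partition, since the tasks split into two equal-sum halves exactly when the optimal two-processor schedule finishes at half the total. A sketch (here `exact_makespan` is a hypothetical exact scheduler, not your greedy):

```python
def can_split_evenly(tasks, exact_makespan):
    # Partition via a (hypothetical) exact 2-processor scheduler:
    # an equal split exists iff the optimal makespan is sum(tasks) / 2.
    total = sum(tasks)
    return total % 2 == 0 and exact_makespan(tasks, 2) * 2 == total
```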
A small case where your algorithm breaks is weights 3, 3, 2, 2, 2 with P=2. Your algorithm builds the schedule like this (longest task first, each one onto the least-loaded processor):

3 3 2 2 2
3 | 3
3,2 | 3
3,2 | 3,2
3,2,2 | 3,2

and will take 7. The better solution you don't find is:

3,3 | 2,2,2

which will complete in 6.
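If you want to check this yourself, here is a rough, self-contained comparison of the greedy result against a brute-force optimum (the brute force is exponential and only meant for tiny inputs; the names are just for illustration):

```python
import heapq
from itertools import product

def greedy_makespan(tasks, p):
    # Your approach: longest task first onto the least-loaded processor.
    loads = [0] * p
    heapq.heapify(loads)
    for t in sorted(tasks, reverse=True):
        heapq.heappush(loads, heapq.heappop(loads) + t)
    return max(loads)

def brute_force_makespan(tasks, p):
    # Try every assignment of tasks to processors and keep the best.
    best = sum(tasks)
    for assignment in product(range(p), repeat=len(tasks)):
        loads = [0] * p
        for t, proc in zip(tasks, assignment):
            loads[proc] += t
        best = min(best, max(loads))
    return best

print(greedy_makespan([3, 3, 2, 2, 2], 2))       # 7
print(brute_force_makespan([3, 3, 2, 2, 2], 2))  # 6
```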