(Guest post by Santosh Thapa)
Prof. Atri started the lecture with a recap of the weighted interval scheduling problem. He discussed that all jobs are sorted by their finish times, all values in the input are non-negative, and the output is a schedule of non-conflicting jobs; the goal is a schedule of maximum total value.
- Input: $n$ jobs, the $i$th job given by $(s_i, f_i, v_i)$, which stand for its start time, finish time, and value respectively, sorted according to finish time (so $f_1 \le f_2 \le \dots \le f_n$).
- Output: A schedule $S \subseteq \{1, \dots, n\}$ where for every $i \ne j \in S$, jobs $i$ and $j$ are not conflicting.
- Goal: maximize $\sum_{j \in S} v_j$.
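To make the setup concrete, here is a small sketch in Python (the instance and the helper name `p` are my own, not from the lecture). For each job $j$ it computes the largest index $i < j$ whose job does not conflict with job $j$, i.e. finishes no later than job $j$ starts; this quantity is usually written $p(j)$ and is used by the recurrence below.

```python
# Hypothetical instance: each job is (start, finish, value), sorted by finish time.
jobs = [(0, 3, 2), (1, 4, 4), (3, 5, 4), (4, 7, 7)]

def p(j, jobs):
    """Largest index i < j (jobs are 1-indexed) whose job finishes
    no later than job j starts; 0 if no such job exists."""
    s_j = jobs[j - 1][0]
    for i in range(j - 1, 0, -1):
        if jobs[i - 1][1] <= s_j:
            return i
    return 0

print([p(j, jobs) for j in range(1, len(jobs) + 1)])  # [0, 0, 1, 2]
```

Since the jobs are sorted by finish time, $p(j)$ could also be found by binary search, but a linear scan keeps the sketch simple.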
Recall we proved last time that $OPT(j) = \max(v_j + OPT(p(j)),\ OPT(j-1))$, where $OPT(j)$ denotes the value of an optimal schedule for the first $j$ jobs and $p(j)$ is the largest index $i < j$ such that job $i$ does not conflict with job $j$ (with $p(j) = 0$ if no such job exists).
There are two options for job $j$: either it is in an optimal schedule or it is not. And, given the $OPT$ values, checking which option holds for every job can be done in linear time.
Question: How can we figure out if job $j$ is in an optimal solution or not (given the values $OPT(0), \dots, OPT(n)$)?
Answer: Once we know the values of all optimal solutions, we can compare $v_j + OPT(p(j))$ to $OPT(j-1)$, and if the first term is greater then $j$ is in an optimal schedule.
The professor mentioned that dynamic programming is similar to divide and conquer: both use recursion, but dynamic programming uses a different kind of recursion. In other words, dynamic programming is smarter about solving recursive subproblems.
To compute $OPT(j)$, he discussed the following recursive algorithm, which uses the property of optimality above:
- If $j = 0$, return $0$
- Return $\max(v_j + OPT(p(j)),\ OPT(j-1))$
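A minimal sketch of this recursive algorithm in Python (the instance and the 1-indexing convention are my own, not from the lecture):

```python
# Hypothetical instance: (start, finish, value) tuples, sorted by finish time.
jobs = [(0, 3, 2), (1, 4, 4), (3, 5, 4), (4, 7, 7)]

def p(j):
    # Largest i < j whose finish time is at most job j's start time; 0 if none.
    for i in range(j - 1, 0, -1):
        if jobs[i - 1][1] <= jobs[j - 1][0]:
            return i
    return 0

def opt(j):
    # Direct translation of the two-line algorithm above.
    if j == 0:
        return 0
    v_j = jobs[j - 1][2]
    return max(v_j + opt(p(j)), opt(j - 1))

print(opt(len(jobs)))  # 11: take jobs 2 and 4, with values 4 and 7
```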
To prove this algorithm correct, he proved the following lemma.
Lemma: For every $0 \le j \le n$, $OPT(j)$ returns the value of an optimal schedule for the first $j$ jobs.
Proof Idea: Induction on $j$.
Base Case: If $j = 0$, the algorithm returns $0$, the value of the empty schedule.
Inductive Hypothesis: For every $i < j$, $OPT(i)$ returns the value of an optimal schedule for the first $i$ jobs.
Inductive Step: The return value is $\max(v_j + OPT(p(j)),\ OPT(j-1))$. By the IH, $OPT(p(j))$ and $OPT(j-1)$ are correct (note that $p(j) < j$ and $j - 1 < j$), so the proof follows from the recursive formula for $OPT(j)$.
Running time: Exponential
Example: Here each call to $OPT(j)$ can trigger further recursive calls on the same subproblems, so the recursion tree grows exponentially. But an important aspect of this kind of algorithm is that re-computation can be avoided: by storing each computed value, a repeated subproblem can be answered very quickly. Even though this takes more memory, it helps solve the problem much faster.
At the end of the lecture, he wrote the memoized algorithm. We will need a global array $M[0..n]$, with every entry initialized to $\emptyset$. Then $OPT(j)$ is computed as follows:
- If $j = 0$, return $0$
- If $M[j] \ne \emptyset$, return $M[j]$
- Set $M[j] = \max(v_j + OPT(p(j)),\ OPT(j-1))$ and return $M[j]$
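A sketch of the memoized algorithm in Python (instance and names are mine; `None` plays the role of $\emptyset$). Each of the $n + 1$ subproblems is solved only once, so the recursion does linear work overall. The second function recovers the schedule itself using the comparison described earlier: include job $j$ exactly when $v_j + OPT(p(j))$ wins.

```python
jobs = [(0, 3, 2), (1, 4, 4), (3, 5, 4), (4, 7, 7)]  # hypothetical instance
n = len(jobs)
M = [None] * (n + 1)  # the "global array" from the lecture

def p(j):
    # Largest i < j whose finish time is at most job j's start time; 0 if none.
    for i in range(j - 1, 0, -1):
        if jobs[i - 1][1] <= jobs[j - 1][0]:
            return i
    return 0

def opt(j):
    if j == 0:
        return 0
    if M[j] is not None:     # already computed: answer in O(1)
        return M[j]
    M[j] = max(jobs[j - 1][2] + opt(p(j)), opt(j - 1))
    return M[j]

def schedule(j):
    # Walk back from j, including job j whenever taking it is at least as good.
    if j == 0:
        return []
    if jobs[j - 1][2] + opt(p(j)) >= opt(j - 1):
        return schedule(p(j)) + [j]
    return schedule(j - 1)

print(opt(n), schedule(n))  # 11 [2, 4]
```

The trade-off the lecture mentioned is visible here: the array $M$ costs $O(n)$ extra memory, but it turns an exponential-time recursion into a linear-time one.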
End of the lecture; to be continued on Dec 4th.