Posted by: atri | December 2, 2009

## Lect 35: Dynamic Programming

(Guest post by Shivam Kumar)

The class started with a recap of the techniques we have learned so far for designing algorithms: the greedy strategy can reduce an exponential runtime to a polynomial one, while divide and conquer can reduce a larger polynomial runtime to a smaller one (e.g., $O(n^2)$ to $O(n\log n)$).

Then the professor introduced a new algorithm design technique called Dynamic Programming. Dynamic programming is similar to divide and conquer in that both use recursion, but dynamic programming uses a different kind of recursion. In other words, dynamic programming is smarter about solving recursive subproblems.

End of Semester Blues

(Figure: the example instance of jobs with their start times, finish times, and values.)

If we use the greedy strategy on this instance, then we choose jobs $(1), (3)$, and $(4)$ for a total value of $10$ [whereas the schedule that maximizes the value is picking $(5)$ to get a value of $30$. –Atri]
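To make the gap concrete, here is a small sketch on a hypothetical instance (the original figure is not reproduced here; the job data below is made up to match the values $10$ and $30$): the earliest-finish-time greedy, which ignores values, picks the three short jobs while the single long, valuable job is optimal on its own.

```python
def greedy_by_finish(jobs):
    """Earliest-finish-time greedy on (start, finish, value) triples.
    Ignores values entirely, just like in unweighted interval scheduling."""
    chosen, last_finish = [], float("-inf")
    for s, f, v in sorted(jobs, key=lambda job: job[1]):
        if s >= last_finish:  # job does not conflict with the last chosen one
            chosen.append((s, f, v))
            last_finish = f
    return sum(v for _, _, v in chosen)

# Hypothetical instance: three short jobs worth 5 + 3 + 2 = 10,
# and one long job worth 30 that conflicts with all of them.
jobs = [(0, 2, 5), (2, 4, 3), (4, 6, 2), (0, 6, 30)]
print(greedy_by_finish(jobs))  # 10, but the best schedule has value 30
```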

Formal Definition of the problem:

• Input: $n$ jobs, where the $i$th job is a triple $(s_i, f_i, v_i)$ of its start time, finish time, and value.
• Output: A schedule $S\subseteq \{1,\dots,n\}$ where for $i\neq j \in S$, $i$ and $j$ are not conflicting.
• Goal: max $\sum_{i\in S} v_i$.

If $v_i = 1$ for all $i$, then this is exactly the interval scheduling problem.

Assume: All jobs are sorted by finish time, i.e., $f_1\le f_2 \le \cdots \le f_n$. This sorting can be done in $O(n\log n)$ time.

"Previous non-conflicting job" ($p(j)$):

• $p(j)$ = largest $i < j$ such that job $i$ does not conflict with job $j$ (i.e., $f_i \le s_j$).
• $p(j) = 0$ if no such $i$ exists

Exercise: compute all the $p$ values in linear time (after sorting).
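The linear-time computation is left as the exercise, but here is a simpler sketch that computes all $p$ values with binary search ($O(n\log n)$ overall, which is dominated by the sorting anyway). It assumes jobs are already sorted by finish time, as above.

```python
import bisect

def compute_p(jobs):
    """jobs: list of (start, finish, value), sorted by finish time.
    Returns p where p[j-1] is the largest 1-based index i with f_i <= s_j,
    or 0 if no such job exists."""
    finishes = [f for (_, f, _) in jobs]
    # Number of jobs finishing by time s_j is exactly the largest
    # valid 1-based index i with f_i <= s_j (finish times are sorted).
    return [bisect.bisect_right(finishes, s) for (s, _, _) in jobs]

# Example: job 3 starts at time 3, and only job 1 finishes by then.
print(compute_p([(0, 3, 2), (1, 4, 3), (3, 5, 4)]))  # [0, 0, 1]
```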

Optimal Schedule for $\{1,\dots,j\}$ will be denoted by $\mathcal{O}_j$ and its value by $OPT(j)$.

Properties of $\mathcal{O}_n$:

Case 1: $n \in \mathcal{O}_n$

Question: What can you say about jobs $p(n)+1,\dots,n-1$ wrt $\mathcal{O}_n$ ?
Answer: None of them will be in $\mathcal{O}_n$ by the definition of $p(n)$.

Claim 1: If  $n\in \mathcal{O}_n$, then $\mathcal{O}_n = (\mathcal{O}_{p(n)}, n)$.

Note that if job $p(n)$ does not conflict with job $n$, then no earlier job does either.
Recall: jobs are sorted by finish time. This implies that for all $i \le p(n)$, job $i$ does not conflict with job $n$, because $f_i\le f_{p(n)} \le s_n$.

Proof of Claim 1: If not, then $\mathcal{O}_n = (\mathcal{O}', n)$ where the value of $\mathcal{O}'$ is less than the value of $\mathcal{O}_{p(n)}$. Let $\mathcal{O}''=(\mathcal{O}_{p(n)},n)$. Now, note that

1. $\mathcal{O}''$ is a valid schedule (by the observation above), and
2. the value of $\mathcal{O}''$ is greater than the value of $\mathcal{O}_n$.

This contradicts the fact that $\mathcal{O}_n$ is optimal.

Case 2: $n \not\in \mathcal{O}_n$

Claim 2: If $n\not\in\mathcal{O}_n$, then $\mathcal{O}_n=\mathcal{O}_{n-1}$.

Proof idea: Since $n\not\in\mathcal{O}_n$, the schedule $\mathcal{O}_n$ only uses jobs from $1,\dots,n-1$, and $\mathcal{O}_{n-1}$ is by definition the optimal schedule for those jobs.

For the more general case, let $1\le j\le n$ and consider the problem on jobs  $1,\dots, j$. Then by the same argument as we used to prove Claims 1 and 2, we have:

• If $j\in \mathcal{O}_j$, then $\mathcal{O}_j = (\mathcal{O}_{p(j)}, j)$, i.e., $OPT(j)=v_j+OPT(p(j))$.
• If $j\not\in \mathcal{O}_j$, then  $\mathcal{O}_j = \mathcal{O}_{j-1}$, i.e. $OPT(j)=OPT(j-1)$.

Thus, we have shown that

$OPT(j) = \max\{v_j + OPT(p(j)), OPT(j-1)\}$, with the convention $OPT(0)=0$.