Posted by: atri | September 21, 2010

## Clarification on Q 3 on HW 2

One of you asked what it means for an algorithm to have a running time of $T(n)$. I’ll talk about this in the lecture tomorrow, but I thought it would be good to give some clarification now rather than later.

We say an algorithm has running time of $T(n)$ if for every $n\ge 1$ and every input of size $n$, the algorithm runs for at most $T(n)$ steps. In other words, $T(n)$ is the maximum number of steps an algorithm can spend on any input of size $n$.
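To make the "maximum over all inputs of size $n$" idea concrete, here is a small sketch (not part of the homework) that counts steps for linear search, where a "step" is taken to be one loop iteration. A true $T(n)$ maximizes over *all* inputs of size $n$; the sketch only samples a few representative ones.

```python
def linear_search_steps(arr, target):
    """Return the number of loop iterations linear search performs."""
    steps = 0
    for x in arr:
        steps += 1
        if x == target:
            break
    return steps

def worst_case_steps(n):
    """Estimate T(n): the maximum step count over a few inputs of size n."""
    inputs = [
        (list(range(n)), 0),      # target first: 1 step
        (list(range(n)), n - 1),  # target last: n steps
        (list(range(n)), -1),     # target absent: n steps (the worst case)
    ]
    return max(linear_search_steps(arr, t) for arr, t in inputs)
```

For linear search the "target absent" input forces a full scan, so the estimate `worst_case_steps(n)` returns exactly $n$.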

In the above, $T(n)$ is an exact count. Let us see what this means when we talk about $T(n)$ in the asymptotic sense:

• If we say $T(n)$ is $O(f(n))$, then (by the definition of Big-Oh and $T(n)$ above) this means that for some absolute constant $c>0$ and every large enough $n$, **every** input of size $n$ makes the algorithm run for at most $c\cdot f(n)$ steps.
• If we say $T(n)$ is $\Omega(g(n))$, then this means that for some absolute constant $\epsilon>0$ and every large enough $n$, **some** input of size $n$ makes the algorithm run for at least $\epsilon\cdot g(n)$ steps.
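For a concrete algorithm such as linear search (where $T(n)=n$ if we count loop iterations as steps), the "every input" vs. "some input" distinction can be checked numerically. This is an illustrative sketch, not part of the homework; it takes $c=\epsilon=1$ as the witnessing constants.

```python
def linear_search_steps(arr, target):
    """Return the number of loop iterations linear search performs."""
    steps = 0
    for x in arr:
        steps += 1
        if x == target:
            break
    return steps

c, eps = 1, 1
for n in range(1, 50):
    arr = list(range(n))
    # Big-Oh: EVERY input of size n runs in at most c*n steps.
    assert all(linear_search_steps(arr, t) <= c * n for t in range(-1, n))
    # Big-Omega: SOME input of size n (here, target absent) runs for
    # at least eps*n steps.
    assert linear_search_steps(arr, -1) >= eps * n
```

Note the asymmetry: the upper bound must hold over all inputs, while a single bad input per size $n$ suffices for the lower bound.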