- Consider the following simple algorithm:

      Input:  N
      Output: i, j

      step 1: i = i + 2;
      step 2: j = i + 3;
      step 3: for (k = 0; k < N; k++) j = j + 1;
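
- As a concrete check, here is a runnable C version (a minimal sketch: the slide leaves i and j uninitialized, so we assume both start at 0, and N = 10 is just an illustrative input):

      #include <stdio.h>

      int main(void) {
          int N = 10;        /* assumed problem size, for illustration only */
          int i = 0, j = 0;  /* assumed starting values; the slide leaves them unset */

          i = i + 2;                    /* step 1: constant time */
          j = i + 3;                    /* step 2: constant time */
          for (int k = 0; k < N; k++)   /* step 3: runs N times */
              j = j + 1;

          printf("i = %d, j = %d\n", i, j);  /* prints i = 2, j = 15 */
          return 0;
      }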

- Worst-case behavior:

- Steps 1 and 2 are O(1); step 3 is O(N).

- Therefore the algorithm is O(1 + 1 + N) = O(N).
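
- To see where the 1 + 1 + N comes from, we can count the operations the algorithm performs. In this sketch, count_ops is a hypothetical helper (not from the slide) that tallies one operation for each of steps 1 and 2 and one per iteration of step 3, so it always returns N + 2:

      #include <stdio.h>

      /* count_ops: hypothetical helper that tallies one operation for
         step 1, one for step 2, and one per loop iteration in step 3. */
      long count_ops(long N) {
          long ops = 0;
          ops++;                       /* step 1 */
          ops++;                       /* step 2 */
          for (long k = 0; k < N; k++)
              ops++;                   /* step 3: N iterations */
          return ops;
      }

      int main(void) {
          for (long N = 10; N <= 100000; N *= 10)
              printf("N = %6ld -> %ld operations\n", N, count_ops(N));
          return 0;
      }

  For large N, the total of N + 2 is dominated by the N term, which is why the analysis reports O(N).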

- We don’t include the 1’s because they do not grow with respect to the problem size.

- In general, we lump together all things that take constant time.
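
- For example (a hypothetical routine, not from the slide), a function with four constant-time statements and one loop over N costs O(4 + N); the lumped constant is then dropped, leaving O(N):

      /* Hypothetical example: four constant-time statements plus one
         O(N) loop. Total cost is O(4 + N) = O(N): the constant-time
         work is lumped into a single O(1) term and then dropped. */
      long tally(long N) {
          long total = 0;            /* constant time */
          long step = 1;             /* constant time */
          total = total + step;      /* constant time */
          step = step * 2;           /* constant time */
          for (long k = 0; k < N; k++)
              total = total + step;  /* N iterations */
          return total;
      }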