Time Complexity of an algorithm/code - Part 2.pptx
The time complexity of an algorithm/code is not the actual wall-clock time required to execute the code; it is measured by the number of times each statement executes as a function of the input size.
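The idea above can be sketched directly: instead of timing the code, count how many times the loop body runs. The function name here is illustrative, not from the slides.

```python
# Illustrative sketch: measure work by counting statement executions,
# not elapsed time. The counter increments once per loop-body run.
def count_executions(n):
    count = 0
    for i in range(n):   # the body below executes n times
        count += 1       # one "statement execution" per iteration
    return count

print(count_executions(10))  # 10 executions for input size 10
```

For an input of size n the count is exactly n, which is what the asymptotic analysis below formalizes.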
Asymptotic Complexity
The 5N + 3 time bound is said to "grow asymptotically" like N.
This gives us an approximation of the complexity of the algorithm.
It ignores many (machine-dependent) details and concentrates on the bigger picture.
Big Oh Notation
To denote an asymptotic upper bound, we use O-notation.
For a given function g(n), we denote by O(g(n)) (pronounced "big-oh of g of n") the set of functions:
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c · g(n) for all n ≥ n0 }
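The definition can be checked numerically for the slides' running example f(n) = 5n + 3 with g(n) = n. The witness constants c = 6 and n0 = 3 are an assumption chosen here, not given in the slides.

```python
# Sketch: verify the Big-O definition for f(n) = 5n + 3, g(n) = n.
# Assumed witnesses (not from the slides): c = 6, n0 = 3, so that
# 0 <= f(n) <= c * g(n) holds for all n >= n0 (checked up to 999).
def f(n):
    return 5 * n + 3

c, n0 = 6, 3
assert all(0 <= f(n) <= c * n for n in range(n0, 1000))
print("5n + 3 is O(n) with c = 6, n0 = 3")
```

Any larger c (or n0) also works; the definition only requires that some such pair exists.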
Omega Notation
To denote an asymptotic lower bound, we use Ω-notation.
For a given function g(n), we denote by Ω(g(n)) (pronounced "big-omega of g of n") the set of functions:
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c · g(n) ≤ f(n) for all n ≥ n0 }
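The same numeric check works for the lower bound with f(n) = 5n + 3 and g(n) = n. The witnesses c = 5, n0 = 1 are again an assumption made for this sketch.

```python
# Sketch: verify the Omega definition for f(n) = 5n + 3, g(n) = n.
# Assumed witnesses (not from the slides): c = 5, n0 = 1, so that
# 0 <= c * g(n) <= f(n) holds for all n >= n0 (checked up to 999).
def f(n):
    return 5 * n + 3

c, n0 = 5, 1
assert all(0 <= c * n <= f(n) for n in range(n0, 1000))
print("5n + 3 is Omega(n) with c = 5, n0 = 1")
```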
Theta Notation
To denote an asymptotic tight bound, we use Θ-notation.
For a given function g(n), we denote by Θ(g(n)) (pronounced "big-theta of g of n") the set of functions:
Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that 0 ≤ c1 · g(n) ≤ f(n) ≤ c2 · g(n) for all n ≥ n0 }
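Combining the upper- and lower-bound checks gives the tight bound for f(n) = 5n + 3 with g(n) = n. The witnesses c1 = 5, c2 = 6, n0 = 3 are assumptions chosen for this sketch.

```python
# Sketch: verify the Theta definition for f(n) = 5n + 3, g(n) = n.
# Assumed witnesses (not from the slides): c1 = 5, c2 = 6, n0 = 3,
# giving 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
def f(n):
    return 5 * n + 3

c1, c2, n0 = 5, 6, 3
assert all(0 <= c1 * n <= f(n) <= c2 * n for n in range(n0, 1000))
print("5n + 3 is Theta(n)")
```

Since 5n + 3 is both O(n) and Ω(n), it is Θ(n): the bound is tight.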
Example: Comparing Functions
Which function is better: 10n² or n³?
[Plot: 10n² vs n³ for n = 1 to 15, y-axis 0 to 4000]
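The plot's crossover point can be found by a direct scan. Algebraically, n³ > 10n² exactly when n > 10, and the one-liner below confirms it.

```python
# Sketch: find the first n where n^3 overtakes 10 n^2.
# Beyond this point, the lower-order function 10 n^2 is the better
# (smaller) of the two.
crossover = next(n for n in range(1, 100) if n**3 > 10 * n**2)
print(crossover)  # 11
```

So for small inputs n³ is actually cheaper, but from n = 11 onward 10n² wins, which is the point of the next slide.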
Comparing Functions
As inputs get larger, any algorithm of a smaller order will be more efficient than an algorithm of a larger order.
Big-Oh Notation
Even though it is correct to say "7n − 3 is O(n³)", a better statement is "7n − 3 is O(n)"; that is, one should make the approximation as tight as possible.
Simple Rule:
Drop lower-order terms and constant factors
7n − 3 is O(n)
8n² log n + 5n² + n is O(n² log n)
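The rule about dropping lower-order terms can be illustrated numerically: for f(n) = 8n² log n + 5n² + n, the ratio f(n) / (n² log n) settles toward the constant 8 as n grows, showing that the n² log n term dominates.

```python
import math

# Sketch: the ratio f(n) / (n^2 log n) approaches 8 as n grows,
# so the lower-order terms 5n^2 and n become negligible.
def f(n):
    return 8 * n**2 * math.log(n) + 5 * n**2 + n

for n in (10, 1_000, 1_000_000):
    print(n, f(n) / (n**2 * math.log(n)))
```

The printed ratios shrink toward 8, which is exactly the constant factor that Big-O notation also discards.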