Time Complexity
Part 2
Instructors
Farhat Jalil
 Slide courtesy
Asymptotic Complexity
 The 5N+3 time bound is said to "grow asymptotically"
like N
 This gives us an approximation of the complexity of
the algorithm
 Ignores lots of (machine dependent) details,
concentrate on the bigger picture
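 Illustration (added here, not from the original slides): a loop of the kind that might produce a 5N+3-style count; the exact constants depend on what we choose to count, which is exactly why we ignore them:

    def sum_list(a):
        total = 0                  # 1 step
        i = 0                      # 1 step
        while i < len(a):          # condition checked N + 1 times
            total = total + a[i]   # N times
            i = i + 1              # N times
        return total               # 1 step
    # Under this simple count the function costs about 3N + 4 steps; a finer
    # count (e.g., charging separately for the indexing) gives something like
    # 5N + 3. Either way the count grows like N.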
Comparing Functions: Asymptotic Notation
 Big Oh Notation: Upper bound
 Omega Notation: Lower bound
 Theta Notation: Tight bound (both upper and lower)
Big Oh Notation
 To denote asymptotic upper bound, we use O-
notation.
 For a given function g(n), we denote
by O(g(n)) (pronounced “big-oh of g of n”) the set of
functions:
 O(g(n)) = { f(n) : there exist positive constants c and n0 such
that 0 ≤ f(n) ≤ c ∗ g(n) for all n ≥ n0 }
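 A small worked example (illustrative, not from the original slides):
f(n) = 5n + 3 is O(n), because 0 ≤ 5n + 3 ≤ 8 ∗ n for all n ≥ 1
(take g(n) = n, c = 8, n0 = 1; the 3 is absorbed because 3 ≤ 3n once n ≥ 1).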
Omega Notation
 To denote asymptotic lower bound, we use Ω-
notation.
 For a given function g(n), we denote
by Ω(g(n)) (pronounced “big-omega of g of n”) the set
of functions:
 Ω(g(n)) = { f(n) : there exist positive constants c and n0 such
that 0 ≤ c ∗ g(n) ≤ f(n) for all n ≥ n0 }
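 Continuing the same illustrative example: 5n + 3 is Ω(n), because
0 ≤ 5 ∗ n ≤ 5n + 3 for all n ≥ 1 (take c = 5, n0 = 1).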
Theta Notation
 To denote asymptotic tight bound, we use Θ-notation.
 For a given function g(n), we denote
by Θ(g(n)) (pronounced “big-theta of g of n”) the set of
functions:
 Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0
such that 0 ≤ c1 ∗ g(n) ≤ f(n) ≤ c2 ∗ g(n) for all n ≥ n0 }
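 Combining the two illustrative bounds above: 5 ∗ n ≤ 5n + 3 ≤ 8 ∗ n
for all n ≥ 1, so 5n + 3 is Θ(n) with c1 = 5, c2 = 8, n0 = 1.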
Example : Comparing Functions
 Which function is better?
10n^2 vs. n^3
[Chart: 10n^2 and n^3 plotted for n = 1 to 15 (y-axis 0 to 4000); the
curves cross at n = 10, after which n^3 is the larger of the two]
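 A short sketch (mine, not part of the slides) that reproduces the numbers behind the chart and shows where the curves cross:

    # Compare 10*n^2 and n^3 for n = 1..15, as in the chart above.
    for n in range(1, 16):
        f = 10 * n**2
        g = n**3
        note = "  <-- n^3 is now larger" if g > f else ""
        print(f"n={n:2d}   10n^2={f:5d}   n^3={g:5d}{note}")
    # The two functions are equal at n = 10 (both 1000); beyond that,
    # n^3 is larger, even though it starts out smaller.

 So for large inputs 10n^2 is the smaller of the two, which is what the asymptotic comparison predicts.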
Comparing Functions
 As inputs get larger, an algorithm of a smaller order will eventually
be more efficient than an algorithm of a larger order, regardless of
the constant factors involved
Big-Oh Notation
 Even though it is correct to say “7n - 3 is O(n^3)”, a better
statement is “7n - 3 is O(n)”, that is, one should make
the approximation as tight as possible
 Simple Rule:
Drop lower-order terms and constant factors
7n - 3 is O(n)
8n^2 log n + 5n^2 + n is O(n^2 log n)
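 A quick numeric check (my own sketch, not from the slides) of the rule: the ratio of 8n^2 log n + 5n^2 + n to its leading term n^2 log n settles toward the constant 8, which is why the lower-order terms can be dropped:

    import math

    # Ratio of the full expression to its dominant term n^2 * log n.
    for n in [10, 100, 10**4, 10**6]:
        full = 8 * n**2 * math.log(n) + 5 * n**2 + n
        lead = n**2 * math.log(n)
        print(f"n = {n:>8}   ratio = {full / lead:.3f}")
    # The ratio drifts toward 8 as n grows: 5n^2 and n become negligible
    # next to n^2 log n, so the whole expression is O(n^2 log n).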
