Big Oh and other notations
Introduction
Classifying functions by their asymptotic growth
Theta, Little oh, Little omega
Big Oh, Big Omega
Rules to manipulate Big-Oh expressions
Typical growth rates
Introduction
The work done by an algorithm, i.e. its complexity, is determined by the number of basic operations necessary to solve the problem.
The number of basic operations depends on the size of the input.
Basic Operations – Example 1
Problem: Find x in an array
Operation: Comparison of x with an entry in the array
Size of input: The number of elements in the array
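A minimal sketch of this example (not from the original slides): a linear search that counts each comparison of x with an array entry as one basic operation, so the count grows with the size of the input.

```python
# Example 1 as code: find x in an array, counting comparisons.
def linear_search(arr, x):
    comparisons = 0
    for i, entry in enumerate(arr):
        comparisons += 1          # one basic operation
        if entry == x:
            return i, comparisons
    return -1, comparisons        # x absent: N comparisons for N elements

index, ops = linear_search([4, 8, 15, 16, 23, 42], 23)
```

In the worst case (x not present), the number of comparisons equals the number of elements N.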
Basic Operations – Example 2
Problem: Sort an array of numbers
Operation: Comparison of two array entries plus moving elements in the array
Size of input: The number of elements in the array
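A sketch of this example (an illustration, not the slides' own code), using selection sort and counting both kinds of basic operations named above: comparisons of two array entries and moves of elements.

```python
# Example 2 as code: sort an array, counting comparisons and moves.
def selection_sort(arr):
    a = list(arr)
    comparisons = moves = 0
    n = len(a)
    for i in range(n - 1):
        smallest = i
        for j in range(i + 1, n):
            comparisons += 1              # compare two array entries
            if a[j] < a[smallest]:
                smallest = j
        if smallest != i:
            a[i], a[smallest] = a[smallest], a[i]
            moves += 1                    # one swap counted as one move
    return a, comparisons, moves
```

For N elements this always performs N(N-1)/2 comparisons, whatever the input order.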
The Task
Determine how the number of operations depends on the size of the input.
N - size of input
F(N) - number of operations
Asymptotic Growth of Functions
Asymptotic growth : The rate of growth of a function
Given a particular differentiable function f(n), all other differentiable functions fall into three classes:
growing with the same rate
growing faster
growing slower
Asymptotic Growth of Functions
The rate of growth is determined by the highest order term of the function
Examples
n^3 + n^2 grows faster than n^2 + 1000, n log(n), ...
n^3 + n^2 has the same rate of growth as n^3, n^3 + n, ...
n^3 + n^2 grows slower than n^4 + n, n^3 log(n), ...
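A quick numeric illustration (not a proof, and not part of the original slides) of the three classes above: watch the ratio f(n)/g(n) as n grows, for f(n) = n^3 + n^2 against one function from each class.

```python
# Ratios reveal the rate of growth of f(n) = n**3 + n**2.
f = lambda n: n**3 + n**2

for n in (10, 100, 1000):
    print(n,
          f(n) / (n**2 + 1000),   # grows without bound: f grows faster
          f(n) / n**3,            # approaches 1: same rate of growth
          f(n) / (n**4 + n))      # approaches 0: f grows slower
```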
Theta: Same rate of growth
f(n) = Θ(g(n)) if f(n) and g(n) are of the same order
Examples: 10n^3 + n^2 = Θ(n^3) = Θ(5n^3 + n) = Θ(n^3 + log(n))
Note: the coefficients can be disregarded
Little oh: Lower rate of growth
f(n) = o(g(n)) if f(n) has a lower rate of growth, i.e. f(n) is of lower order than g(n)
Examples:
10n^3 + n^2 = o(n^4)
10n^3 + n^2 = o(5n^5 + n)
10n^3 + n^2 = o(n^6 + log(n))
Little omega: Higher rate of growth
f(n) = ω(g(n)) if f(n) has a higher rate of growth, i.e. f(n) is of higher order than g(n)
Examples:
10n^3 + n^2 = ω(n^2)
10n^3 + n^2 = ω(5n^2 + n)
10n^3 + n^2 = ω(n + log(n))
Little oh and Little omega
if g(n) = o( f(n) )
then f(n) = ω( g(n) )
Examples: Compare n and n^2
n = o(n^2)    n^2 = ω(n)
The Big-Oh notation
f(n) = O(g(n)) if f(n) grows at the same rate as or slower than g(n),
i.e. f(n) = Θ(g(n)) or f(n) = o(g(n))
Example
n + 5 = Θ(n)
n + 5 = O(n)
The closest estimation: n + 5 = Θ(n)
The general practice is to use the Big-Oh notation: n + 5 = O(n)
The Big-Omega notation
The inverse of Big-Oh is Ω.
If g(n) = O(f(n)), then f(n) = Ω(g(n)).
Example: n^2 = O(n^3), n^3 = Ω(n^2)
f(n) grows faster than or at the same rate as g(n): f(n) = Ω(g(n))
Rules to manipulate Big-Oh expressions
Rule 1:
a. If T1(N) = O(f(N)) and T2(N) = O(g(N)),
then T1(N) + T2(N) = max( O(f(N)), O(g(N)) )
b. If T1(N) = O( f(N) ) and T2(N) = O( g(N) ) then
T1(N) * T2(N) = O( f(N)* g(N) )
Rule 2: If T(N) is a polynomial of degree k,
then T(N) = Θ(N^k), and so T(N) = O(N^k)
Rule 3: (log(N))^k = O(N) for any constant k
Examples
O(n^2) + O(n) = O(n^2)  (we disregard any lower-order term)
O(n log(n)) + O(n) = O(n log(n))
O(n^2) + O(n log(n)) = O(n^2)
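A numeric illustration (an added sketch, not from the slides) of why lower-order terms are disregarded: for large n the n^2 term dwarfs the n term, so n^2 + n behaves like n^2.

```python
# The share of n**2 in n**2 + n tends to 1 as n grows.
for n in (10, 1000, 100000):
    total = n**2 + n
    print(n, total, n**2 / total)   # the ratio approaches 1
```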
Examples
O(n^2) * O(n) = O(n^3)
O(n log(n)) * O(n) = O(n^2 log(n))
O(n^2 + n + 4) = O(n^2)
(log(n))^5 = O(n)
Typical Growth Rates
C         constant, we write O(1)
log N     logarithmic
log^2 N   log-squared
N         linear
N log N
N^2       quadratic
N^3       cubic
2^N       exponential
N!        factorial
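To make these growth rates concrete, a small added table (an illustration, not part of the slides) evaluates each function for a few input sizes; 2^N already dominates everything else by N = 30.

```python
import math

# Tabulate the typical growth rates for small N.
print(f"{'N':>4} {'log N':>8} {'N log N':>10} {'N^2':>8} {'N^3':>8} {'2^N':>12}")
for n in (10, 20, 30):
    print(f"{n:>4} {math.log2(n):>8.2f} {n * math.log2(n):>10.1f} "
          f"{n**2:>8} {n**3:>8} {2**n:>12}")
```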
Problems
N^2 = O(N^2)   true
2N = O(N^2)    true
N = O(N^2)     true
N^2 = O(N)     false
2N = O(N)      true
N = O(N)       true
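The answers above can be sanity-checked numerically: f = O(g) means f(n)/g(n) stays bounded as n grows. The helper below (a hypothetical name, and a rough heuristic rather than a proof) tests whether the ratio blows up over a few sample sizes.

```python
# Heuristic Big-Oh check: does f(n)/g(n) stay bounded as n grows?
def looks_big_oh(f, g, ns=(10, 100, 1000, 10000)):
    ratios = [f(n) / g(n) for n in ns]
    return ratios[-1] <= 2 * ratios[0]   # ratio not blowing up -> likely O

print(looks_big_oh(lambda n: n**2, lambda n: n**2))   # True:  N^2 = O(N^2)
print(looks_big_oh(lambda n: 2 * n, lambda n: n**2))  # True:  2N  = O(N^2)
print(looks_big_oh(lambda n: n**2, lambda n: n))      # False: N^2 is not O(N)
```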