Algorithm Course
Dr Aref Rashad
February 2013
Course Objectives
• Algorithm Definition and Development
• Algorithm Complexity
• Asymptotic Analysis of Algorithms
• Classification of Algorithms
• Techniques for Algorithm Development
• Development and Evaluation of Algorithms for basic processes such as Sorting, Searching, etc.
Algorithms and Programs
• Algorithm: a method or a process followed to solve a problem (a recipe).
• An algorithm takes the input to a problem (function) and transforms it to the output: a mapping of input to output.
• A problem can have many algorithms that may differ dramatically in concept, speed, and space requirements.
[Diagram: a Problem is solved by an Algorithm, which a Computer executes to transform Input into Output]
Algorithm Properties
• An algorithm possesses the following properties:
  - It must be correct.
  - It must be composed of a series of concrete steps.
  - There can be no ambiguity as to which step will be performed next.
  - It must be composed of a finite number of steps.
  - It must terminate.
• A computer program is an instance, or concrete representation, of an algorithm in some programming language.
Example Polynomial Evaluation
Suppose that exponentiation is carried out using multiplications. Two ways to evaluate the polynomial
p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6
are:

Brute force method: p(x) = 4*x*x*x*x + 7*x*x*x - 2*x*x + 3*x + 6
Horner's method: p(x) = (((4x + 7)x - 2)x + 3)x + 6
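Both strategies are easy to compare in runnable form; a minimal Python sketch (function names are mine, coefficients are the slide's example):

```python
def brute_force(coeffs, x):
    """Evaluate sum of coeffs[i] * x**i using repeated multiplication."""
    total = 0
    for i, a in enumerate(coeffs):
        term = a
        for _ in range(i):   # i multiplications build the x**i factor
            term *= x
        total += term
    return total

def horner(coeffs, x):
    """Evaluate the same polynomial with one multiply-add per coefficient."""
    result = 0
    for a in reversed(coeffs):  # highest-degree coefficient first
        result = result * x + a
    return result

# a0 + a1*x + a2*x^2 + a3*x^3 + a4*x^4 for the slide's p(x)
coeffs = [6, 3, -2, 7, 4]
print(brute_force(coeffs, 2), horner(coeffs, 2))  # both print 124
```

Both return the same value; the difference the slides care about is the number of multiplications performed.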
Algorithm Specification
Pseudocode Conventions (English like Statements)
- Comments are included.
- Data types are not explicitly declared.
- Logical operators (and, or, not) can be used.
- Relational operators (<, >, =, …, etc.) can be used.
- Arrays can be used, e.g. A(i,j).
- Looping statements are employed: for, while, and repeat-until.
- Conditional statements can be used: if-then.
Translating a Problem into an Algorithm

Problem: Array Sum
A Solution: For an array A with length n, sum all the array elements into a new integer.

An initial Algorithm:
1. Initialize an integer Sum to zero.
2. For all array values, increase Sum by the array value.
3. Go to step 2.

Pseudocode:
Algorithm ArraySum(a, n)
  Sum = 0
  For i = 1 to n do
    Sum = Sum + a(i)
  return Sum
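The ArraySum pseudocode translates almost line for line into Python (a sketch; the 1-based pseudocode indexing becomes Python's 0-based iteration):

```python
def array_sum(a):
    """Sum the elements of array a, mirroring Algorithm ArraySum(a, n)."""
    total = 0              # Sum = 0
    for x in a:            # For i = 1 to n do
        total = total + x  #   Sum = Sum + a(i)
    return total           # return Sum

print(array_sum([1, 2, 3, 4]))  # 10
```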
Translating a Problem into an Algorithm

Problem: Integers Multiplication
A Solution: Given any two integers A and B, we can say that multiplying A times B involves adding B to itself A times.

An initial Algorithm:
1. Initialize integer C to zero.
2. If A is zero, we're done and C contains the result. Otherwise, proceed to step 3.
3. Add the value of B to C.
4. Decrement A.
5. Go to step 2.

Pseudocode:
Function Multiply(Integer A, Integer B)
  Integer C = 0
  While A is greater than 0
    C = C + B
    A = A - 1
  End
  Return C
End
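A runnable sketch of the Multiply pseudocode (assuming, as the algorithm does, that A is a non-negative integer):

```python
def multiply(a, b):
    """Multiply two integers by repeated addition, following
    Function Multiply(A, B); assumes a >= 0."""
    c = 0
    while a > 0:
        c = c + b
        a = a - 1
    return c

print(multiply(6, 7))  # 42
```

Note the running time is proportional to A, which is exactly the kind of cost the following slides learn to count.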
Translating a Problem into an Algorithm
Problem: Sort a collection of n >= 1 elements of arbitrary type.

A Solution: From those elements that are currently unsorted, find the smallest and place it next in the sorted list.

An initial Algorithm:
For i = 1 to n do
  Examine a(i) to a(n) and suppose the smallest element is at a(j)
  Interchange a(i) and a(j)

Pseudocode:
SelectionSort(a, n)
  For i = 1 to n do
    j = i
    For k = i+1 to n do
      if a(k) < a(j) then j = k
    t = a(i); a(i) = a(j); a(j) = t
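The SelectionSort pseudocode in runnable form (a Python sketch with 0-based indices):

```python
def selection_sort(a):
    """In-place selection sort, mirroring SelectionSort(a, n)."""
    n = len(a)
    for i in range(n):
        j = i                      # assume a[i] is the smallest so far
        for k in range(i + 1, n):  # scan the unsorted suffix
            if a[k] < a[j]:
                j = k
        a[i], a[j] = a[j], a[i]    # interchange a(i) and a(j)
    return a

print(selection_sort([5, 2, 9, 1]))  # [1, 2, 5, 9]
```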
Analysis of algorithms
The theoretical study of computer-program performance and resource usage
What's more important than performance?
Modularity, Correctness, Maintainability, Functionality, Robustness, User-friendliness, Programmer time, Simplicity, Extensibility, Reliability.
Functionality The degree to which the software satisfies stated needs (suitability accuracy inter-operability compliance and security)
Reliability The amount of time that the software is available for use (maturity fault tolerance and recoverability)
Usability The degree to which the software is easy to use (understandability learnability and operability)
Analysis of algorithms
Efficiency The degree to which the software makes optimal use of system resources (time behavior resource behavior)
Maintainability The ease with which repair may be made to the software (analyzability changeability stability and testability)
Portability The ease with which the software can be transposed from one environment to another (adaptability replaceability)
Analysis of algorithms
Why study algorithms and performance?

• Algorithms help us to understand scalability.
• Performance often draws the line between what is feasible and what is impossible.
• Algorithmic mathematics provides a language for talking about program behavior.
• Performance is the currency of computing.
• They help us choose between different algorithms to solve a problem.
Comparing Algorithms
Empirical Approach
• (1) Implement each candidate: that could be lots of work, and is also error-prone.
• (2) Run it: on which inputs? What test data?
• (3) Time it: on what machines? Which OS?
How to solve "which algorithm" problems without machines or test data?
Analytical Approach
[Diagram: a Problem is solved by Algorithm 1, Algorithm 2, …, Algorithm n; efficiency concerns how the cost grows with the amount of data]
[Diagram: the efficiencies of Algorithm 1, Algorithm 2, …, Algorithm n are compared]

Computational Complexity: a measure of the degree of difficulty of an algorithm.
Computational Complexity: how much effort? How costly?
There are various ways of measuring. Our concern: the efficiency criteria of time and space.
Focus: how can we measure time and space?
Space Complexity
A measure of an algorithm's memory requirements during runtime:
• Data structures
• Temporary variables

Algorithm ArraySum(a, n)
  Sum = 0
  For i = 1 to n do
    Sum = Sum + a(i)
  return Sum

The space needed is the sum of:
- A fixed part, independent of the characteristics of the inputs and outputs (instruction space, space for variables and constants, etc.)
- A variable part, dependent on the problem instances.
We focus on estimating the variable part, which depends on the number and magnitude of the inputs and outputs.

Time complexity is the sum of compile time and run time; compile time doesn't depend on the I/O characteristics.
Time Complexity
• The running time depends on the input.
• Parameterize the running time by the size of the input.
• Generally, we seek upper bounds on the running time, because everybody likes a guarantee.

Time Complexity: how to measure?
Focus only on run time.
- Exact formula: summation of the time needed for all operations (e.g. add, subtract, multiply, …): an impossible task.
- Experimental approach: type, compile, and run on a specific machine; times may differ on a multiuser system.
- Comments: 0 steps.
- Assignment: 1 step.
- Loops: the number of steps is accounted in the control part.

First method: introduce a new variable, Count, into the program.
Second method: build a table listing the total number of steps contributed by each statement.
Time Complexity
Approximate Approach
Count program steps (a step is a program segment whose cost is independent of the problem characteristics) and determine the number of steps.
First method: introduce a new variable, Count, into the program.

Algorithm ArraySum(a, n)
  Sum = 0
  For i = 1 to n do
    Sum = Sum + a(i)
  return Sum

With counting added:

Algorithm ArraySum(a, n)
  Count = 0
  Sum = 0; Count = Count + 1
  For i = 1 to n do
    Count = Count + 1                    … for the for statement
    Sum = Sum + a(i); Count = Count + 1  … for the assignment
  Count = Count + 1                      … for the last test of the for loop
  Count = Count + 1                      … for the return
  return Sum

Total number of steps: 2n + 3
Second method: build a table listing the total number of steps contributed by each statement. Determine the number of steps per execution (s/e) of each statement and the total number of times (i.e. frequency) each statement is executed.

Algorithm ArraySum(a, n)        s/e   frequency   total steps
  Sum = 0                        1        1            1
  For i = 1 to n do              1      n + 1        n + 1
    Sum = Sum + a(i)             1        n            n
  return Sum                     1        1            1
                                              Total: 2n + 3

T(n) = 2n + 3
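The 2n + 3 count can be reproduced by instrumenting the loop exactly as the first method describes (a sketch; the helper name is mine):

```python
def array_sum_counted(a):
    """Return (sum, step count) for ArraySum, counting assignments,
    loop tests, and the return as one step each."""
    count = 0
    total = 0; count += 1              # Sum = 0
    for x in a:
        count += 1                     # for-loop control step
        total = total + x; count += 1  # assignment inside the loop
    count += 1                         # final (failing) loop test
    count += 1                         # return
    return total, count

n = 5
_, steps = array_sum_counted(list(range(n)))
print(steps)  # 2n + 3 = 13
```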
Example Polynomial Evaluation
Two ways to evaluate the polynomial
p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6

Brute force method: p(x) = 4*x*x*x*x + 7*x*x*x - 2*x*x + 3*x + 6
Horner's method: p(x) = (((4x + 7)x - 2)x + 3)x + 6

General form of a polynomial:
p(x) = a1 + a2 x + … + a(n+1) x^n,
where the leading coefficient a(n+1) is non-zero for all n >= 0.
Example: Polynomial Evaluation

Pseudocode: Brute force method

1. Input a1, …, a(n+1) and x
2. Initialize poly = a1
3. for i = 1, …, n
     poly = poly + a(i+1) * x * x * … * x   (i factors of x)
   end of for loop
4. Output poly

T(n) = n^2/2 + n/2
Algorithms Course Dr Aref Rashad
February 2013 27
Example: Polynomial Evaluation

Pseudocode: Horner's method

1. Input a1, …, a(n+1) and x
2. Initialize poly = a(n+1)
3. for i = 1, …, n
     poly = poly * x + a(n-i+1)
   end of for loop
4. Output poly

T(n) = 2n
Time complexity does not depend solely on the number of inputs and outputs. There are three kinds of step count: best case, worst case, and average case.

Example: sequential search for K in an array of n integers. Begin at the first element of the array and look at each element in turn until K is found.

Best case: the first position of the array has K.
Worst case: the last position of the array has K.
Average case: n/2.
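A runnable sketch of the sequential search being analyzed (function name is mine):

```python
def sequential_search(a, key):
    """Return the index of key in a, or -1 if absent. Comparisons:
    1 in the best case, n in the worst, about n/2 on average."""
    for i, x in enumerate(a):
        if x == key:
            return i
    return -1

a = [4, 8, 15, 16, 23, 42]
print(sequential_search(a, 4), sequential_search(a, 42))  # 0 5
```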
Time Complexity: best case, worst case, and average case
The best case: normally we are not interested in the best case, because it is too optimistic and is not a fair characterization of the algorithm's running time. It is useful in some rare cases where the best case has a high probability of occurring.

The average case: often we prefer to know the average-case running time. The average case reveals the typical behavior of the algorithm on inputs of size n. Average-case estimation is not always possible.
The worst case: useful in many real-time applications; the algorithm must perform at least that well. It might not be a representative measure of the behavior of the algorithm on inputs of size n.

Note: if we know enough about the distribution of our input, we prefer average-case analysis. If we do not know the distribution, then we must resort to worst-case analysis.
Growth Rate of the Complexity Time Function

f(n) = n^2 + 100n + log10(n) + 1000
Upper bound Concept
Worst-case time:
• It depends on the speed of our computer:
  - relative speed (on the same machine)
  - absolute speed (on different machines)

BIG IDEA:
• Ignore machine-dependent constants.
• Look at the growth of T(n) as n → ∞.
Time Complexity
"Asymptotic Analysis": n large
Algorithm 1: T(n) = 2n + 3, g(n) = 3n  →  O(n)
Algorithm 2: T(n) = 4n + 8, g(n) = 5n  →  O(n)
Algorithm 3: T(n) = n^2 + 8, g(n) = n^2  →  O(n^2)

Upper bound on the number of steps for n large; other growth classes include O(n^3) and O(log n).
"Asymptotic Analysis"

Growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz.
Time Complexity
Algorithm Matters
• Big-Oh notation indicates an upper bound: how bad things can get (perhaps things are not nearly that bad).
• We want the lowest possible upper bound.
• Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
• Example: for linear search, n^2 is an upper bound, but n is the tightest upper bound.
The "Big-Oh" Notation
Complexity Analysis
Calculating Running Time T(n)
• Use the source/pseudo code.
• Ignore constants.
• Ignore lower-order terms.
• Explicitly assume either:
  - the average case (harder to do), or
  - the worst case (easier).
bull Most analysis uses the worst case
Linear-time Loop
for x = 1 to n
  constant-time operation

Note: constant-time means independent of the input size.

Total steps: Σ (x=1 to n) 1 = n
2-Nested Loops: Quadratic

for x = 0 to n-1
  for y = 0 to n-1
    constant-time operation

Total steps: Σ (x=0 to n-1) Σ (y=0 to n-1) 1 = Σ (x=0 to n-1) n = n · n = n^2
3-Nested Loops: Cubic

for x = 0 to n-1
  for y = 0 to n-1
    for z = 0 to n-1
      constant-time operation

f(n) = n^3 … the number of nested loops determines the exponent.
Add independent loops
for x = 0 to n-1
  constant-time op

for y = 0 to n-1
  for z = 0 to n-1
    constant-time op

for w = 0 to n-1
  constant-time op

f(n) = n + n^2 + n = n^2 + 2n
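A quick empirical check of the rule f(n) = n + n^2 + n for the independent loops above (a sketch; the counter stands in for the constant-time operation):

```python
def count_ops(n):
    """Count constant-time ops for two linear loops plus one
    doubly nested loop, as in the slide's example."""
    ops = 0
    for x in range(n):          # n ops
        ops += 1
    for y in range(n):          # n^2 ops
        for z in range(n):
            ops += 1
    for w in range(n):          # n ops
        ops += 1
    return ops

n = 10
print(count_ops(n), n**2 + 2*n)  # both 120
```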
Non-trivial loops

for x = 1 to n
  for y = 1 to x
    constant-time operation

Total steps: Σ (x=1 to n) Σ (y=1 to x) 1 = Σ (x=1 to n) x = n(n+1)/2 = n^2/2 + n/2 ≈ n^2/2

for z = 1 to n
  for y = 1 to z
    for x = 1 to y
      constant-time op

Total steps: Σ (z=1 to n) Σ (y=1 to z) Σ (x=1 to y) 1 = (n^3 + 3n^2 + 2n)/6 ≈ n^3/6
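The two triangular counts can be verified the same way (a sketch; function names are mine):

```python
def triangular_ops(n):
    """Ops in the doubly nested loop whose inner bound is x."""
    ops = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            ops += 1
    return ops

def tetrahedral_ops(n):
    """Ops in the triply nested loop with shrinking bounds."""
    ops = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                ops += 1
    return ops

n = 6
print(triangular_ops(n), n * (n + 1) // 2)              # both 21
print(tetrahedral_ops(n), n * (n + 1) * (n + 2) // 6)   # both 56
```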
E.g. 3x^3 + 5x^2 - 9 = O(x^3). This doesn't mean
"3x^3 + 5x^2 - 9 equals the function O(x^3)".
It actually means
"3x^3 + 5x^2 - 9 is dominated by x^3",
read as "3x^3 + 5x^2 - 9 is big-Oh of x^3".

The "Big-Oh" Notation

In fact, 3x^3 + 5x^2 - 9 is smaller than 5x^3 for large enough values of x.
O-notation (upper bounds)

f(n) is O(g(n)) if f grows at most as fast as g. So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above), and c·g(n) is an approximation to f(n), bounding it from above.
Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d); i.e., drop lower-order terms and constant factors.
Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Example 1: If T(n) = 3n^2, then T(n) is in O(n^2).
Example 2: T(n) = c1 n^2 + c2 n in the average case.
  c1 n^2 + c2 n <= c1 n^2 + c2 n^2 <= (c1 + c2) n^2 for all n > 1,
  so T(n) <= c n^2 for c = c1 + c2 and n0 = 1.
  Therefore T(n) is in O(n^2) by the definition.
Example 3: T(n) = c. We say this is in O(1).
Example: 7n - 2 is O(n).
By definition, we need to find
• a real constant c > 0, and
• an integer constant n0 >= 1,
such that 7n - 2 <= c·n for every integer n >= n0.
A possible choice is c = 7, n0 = 1: 7n - 2 <= 7n for n >= 1.
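The hunt for witnesses c and n0 can be sanity-checked numerically; checking finitely many n is evidence, not a proof (helper name is mine):

```python
def witnesses_big_oh(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for all integers n0 <= n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 7n - 2 is O(n) with the witnesses c = 7, n0 = 1 from the slide.
print(witnesses_big_oh(lambda n: 7 * n - 2, lambda n: n, c=7, n0=1))  # True

# No constant makes n^2 <= c * n hold for all large n.
print(witnesses_big_oh(lambda n: n * n, lambda n: n, c=100, n0=1))    # False
```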
More examples:
• 20n^3 + 10n log n + 5 is O(n^3): 20n^3 + 10n log n + 5 <= 35n^3 for n >= 1.
• 3 log n + log log n is O(log n): 3 log n + log log n <= 4 log n for n >= 2.
• 2^100 is O(1): 2^100 <= 2^100 · 1 for n >= 1.
• 5/n is O(1/n): 5/n <= 5 · (1/n) for n >= 1.
Set definition of O-notation
Ω-notation (lower bounds). Meaning: for all data sets big enough (i.e. n > n0), the algorithm always executes in at least c·g(n) steps.
T(n) = c1 n^2 + c2 n.
c1 n^2 + c2 n >= c1 n^2 for all n > 1, so T(n) >= c n^2 for c = c1 and n0 = 1.
Therefore T(n) is in Ω(n^2) by the definition.
We want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, Ω
Intuition for Asymptotic Notation
Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
The "Big-Oh" Notation
Quadratic Time O(N2)
An algorithm runs in O(N^2) if the number of operations required to perform a function is proportional to the square of the number of items being processed. Example: a group handshake.
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time will only increase by a factor of log(1,000,000) = 6.
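A classic O(log N) algorithm, added here as an illustration (the slide names no example), is binary search, which halves the remaining range at every step:

```python
def binary_search(a, key):
    """Return an index of key in sorted list a, or -1 if absent.
    Each iteration halves the search range: O(log N) comparisons."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == key:
            return mid
        if a[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```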
Factorial Time O(N!): it is far worse than even O(N^2) and O(N^3). It's fairly unusual to encounter functions with this kind of behavior.
The "Big-Oh" Notation
Practical Considerations

There is no such big difference in running time between Θ1(n) and Θ2(n log n):
• Θ1(10000) = 10000
• Θ2(10000) = 10000 · log10(10000) = 40000

There is an enormous difference between Θ1(n^2) and Θ2(n log n):
• Θ1(10000) = 100,000,000
• Θ2(10000) = 10000 · log10(10000) = 40000
Remarks:
• Most statements in a program do not have much effect on the running time of that program.
• There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
• Focus your attention on the parts of the program that have the most impact.
• The greatest time and space improvements come from a better data structure or algorithm.
• "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."

Remarks:
• When tuning code, it is important to gather good timing statistics.
• Be careful not to use tricks that make the program unreadable.
• Make use of compiler optimizations.
• Check that your optimizations really improve the program.
An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate. Instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions

For f(n) = n^a and g(n) = n^b:
  f(n) = O(g(n))  ⇔  a <= b
  f(n) = Ω(g(n))  ⇔  a >= b
  f(n) = Θ(g(n))  ⇔  a = b
  f(n) = o(g(n))  ⇔  a < b
  f(n) = ω(g(n))  ⇔  a > b

Limits (as n → ∞):
  lim f(n)/g(n) = 0        ⇒  f(n) ∈ o(g(n))
  lim f(n)/g(n) < ∞        ⇒  f(n) ∈ O(g(n))
  0 < lim f(n)/g(n) < ∞    ⇒  f(n) ∈ Θ(g(n))
  lim f(n)/g(n) > 0        ⇒  f(n) ∈ Ω(g(n))
  lim f(n)/g(n) = ∞        ⇒  f(n) ∈ ω(g(n))
  lim f(n)/g(n) undefined  ⇒  can't say
Logarithms

x = log_b(a) is the exponent for which a = b^x.
Natural log: ln a = log_e(a)
Binary log: lg a = log_2(a)
lg^2(a) = (lg a)^2
lg lg a = lg(lg a)

Useful identities:
  a = b^(log_b a)
  log_c(ab) = log_c(a) + log_c(b)
  log_b(a^n) = n log_b(a)
  log_b(a) = log_c(a) / log_c(b)
  log_b(1/a) = -log_b(a)
  log_b(a) = 1 / log_a(b)
  a^(log_b c) = c^(log_b a)
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10(n) · log2(10) = log2(n). The base of a logarithm is therefore not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n · 3^n.
Examples: express the functions in A in asymptotic notation using the functions in B.

A = 5n^2 + 100n,  B = 3n^2 + 2:
  A ∈ Θ(n^2) and n^2 ∈ Θ(B), so A ∈ Θ(B).

A = log3(n^2),  B = log2(n^3):
  Using log_b(a) = log_c(a)/log_c(b): A = 2 lg(n)/lg(3) and B = 3 lg(n), so A ∈ Θ(B).

A = n^(lg 4),  B = 3^(lg n):
  Using a^(log_b c) = c^(log_b a): B = 3^(lg n) = n^(lg 3), and A/B = n^(lg(4/3)) → ∞ as n → ∞, so A ∈ ω(B).

A = lg^2(n),  B = n^(1/2):
  lim (n→∞) lg^a(n)/n^b = 0 (here a = 2 and b = 1/2), so A ∈ o(B).
Math Basics: Summations

Constant series: for integers a and b with a <= b,
  Σ (i=a to b) 1 = b - a + 1

Linear series (arithmetic series): for n >= 0,
  Σ (i=1 to n) i = 1 + 2 + … + n = n(n+1)/2

Quadratic series: for n >= 0,
  Σ (i=1 to n) i^2 = 1 + 4 + … + n^2 = n(n+1)(2n+1)/6

Cubic series: for n >= 0,
  Σ (i=1 to n) i^3 = 1 + 8 + … + n^3 = n^2(n+1)^2/4

Geometric series: for real x ≠ 1,
  Σ (k=0 to n) x^k = 1 + x + x^2 + … + x^n = (x^(n+1) - 1)/(x - 1)
For |x| < 1:
  Σ (k=0 to ∞) x^k = 1/(1 - x)
Math Basics (continued)

Linear-geometric series: for n >= 0 and real c ≠ 1,
  Σ (i=1 to n) i·c^i = (n·c^(n+2) - (n+1)·c^(n+1) + c) / (c - 1)^2

Harmonic series: the nth harmonic number, for n ∈ I+,
  H_n = Σ (k=1 to n) 1/k = 1 + 1/2 + 1/3 + … + 1/n = ln(n) + O(1)

Telescoping series:
  Σ (k=1 to n) (a_k - a_(k-1)) = a_n - a_0

Differentiating series: for |x| < 1,
  Σ (k=0 to ∞) k·x^k = x/(1 - x)^2
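These closed forms are easy to sanity-check numerically; a quick sketch:

```python
import math

n = 10
assert sum(range(1, n + 1)) == n * (n + 1) // 2                               # linear
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6  # quadratic
assert sum(i ** 3 for i in range(1, n + 1)) == (n * (n + 1) // 2) ** 2        # cubic
x = 3
assert sum(x ** k for k in range(n + 1)) == (x ** (n + 1) - 1) // (x - 1)     # geometric
h_n = sum(1 / k for k in range(1, n + 1))
assert abs(h_n - math.log(n)) < 1                                             # H_n = ln n + O(1)
print("all summation formulas agree")
```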
Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires Θ(n log n). Examples: sorting, searching.
Practical Considerations
Remarks
• Comparative timing of programs is a difficult business:
  - experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  - bias towards a program
  - unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster?

T(n)        n      n'      Change               n'/n
10n        1000   10000    n' = 10n             10
20n         500    5000    n' = 10n             10
5n log n    250    1842    √10·n < n' < 10n     7.37
2n^2         70     223    n' = √10·n           3.16
2^n          13      16    n' = n + 3           -----
Implications of Dominance

• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

n^a dominates n^b if a > b, since lim (n→∞) n^b / n^a = n^(b-a) → 0.
n^a + o(n^a) doesn't dominate n^a, since lim (n→∞) n^a / (n^a + o(n^a)) → 1.
Course Objectives
bull Algorithm Definition and Developmentbull Algorithm Complexitybull Asymptotic Analysis of algorithmsbull Classification of Algorithmsbull Techniques for Algorithm Developmentsbull Development and Evaluation of Algorithms for basic
processes as Sorting Searching hellip etc
February 2013 2Algorithms Course Dr Aref Rashad
February 2013 3Algorithms Course Dr Aref Rashad
Algorithms and Programs
bull Algorithm a method or a process followed to solve a problemndash A recipe
bull An algorithm takes the input to a problem (function) and transforms it to the outputndash A mapping of input to output
bull A problem can have many algorithms that may differ dramatically in concept speed and space requirements
February 2013 4
Problem
Algorithm
Computer OutputInput
Algorithms Course Dr Aref Rashad
Algorithm Properties
bull An algorithm possesses the following propertiesndash It must be correctndash It must be composed of a series of concrete stepsndash There can be no ambiguity as to which step will be
performed nextndash It must be composed of a finite number of stepsndash It must terminate
bull A computer program is an instance or concrete representation for an algorithm in some programming language
February 2013 5Algorithms Course Dr Aref Rashad
Example Polynomial Evaluation
Suppose that exponentiation is carried out using multiplications Two ways to evaluate the polynomialp(x) = 4x4+ 7x3ndash2x2+ 3x + 6are
Brute force methodp(x) = 4xxxx + 7xxx ndash2xx + 3x + 6
Hornerrsquos methodp(x) = (((4x + 7) x ndash2) x + 3) x + 6
February 2013 6Algorithms Course Dr Aref Rashad
Algorithm Specification
Pseudocode Conventions (English like Statements)
- Comments included - Data types are not explicitly declared- Logical operators and or and not can be used- Relational operators can be used ltgt=hellipet- Arrays can be used eg A(ij)- Looping statements are employed for while and
repeat-until- Conditional statements can be used If then
February 2013 7Algorithms Course Dr Aref Rashad
1 Initialize an integer Sum to zero2 For all array values Increase Sum by array value3 Go to step 2
February 2013 8Algorithms Course Dr Aref Rashad
Pesudo CodeAlgorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Problem Array Sum
A Solution For an array A with length n sum all the array elements in a new integer
An initial Algorithm
Translating a Problem into an Algorithm
1 Initialize integer C to zero2 If A is zero wersquore done and C Contains the result Otherwise proceed to step 33 Add the value of B to C4 Decrement A5 Go to step 2
Pseudocode Function Multiply(Integer A Integer B)Integer C = 0While A is greater than 0C = C + BA = A - 1EndReturn CEnd
February 2013 9Algorithms Course Dr Aref Rashad
Translating a Problem into an AlgorithmProblem Integers Multiplication
A Solution Given any two integers A and B we can say that multiplying A times B involves adding B to itself A times An initial Algorithm
Translating a Problem into an Algorithm
February 2013 10
Problem Sort a collection of ngt=1 elements of arbitrary type
A Solution From those elements that are currently unsorted find the smallest and place it next in the sorted listAn initial AlgorithmFor i=1 to n do Examine a(i) to a(n) and suppose the smallest element is at a(j) Interchange a(i) and a(j)
Pseudocode SelectionSort (an) For i=1 to n do j=I For k=i+1 to n do if ( a(k) lt a(j) ) then j=k t=a(i) a(i)= a(j) a(j)=t
Algorithms Course Dr Aref Rashad
Analysis of algorithms
The theoretical study of computer-program performance and resource usage
Whatrsquos more important than performanceModularity CorrectnessMaintainability FunctionalityRobustness user-friendlinessProgrammer time SimplicityExtensibility Reliability
February 2013 11Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 12
Functionality The degree to which the software satisfies stated needs (suitability accuracy inter-operability compliance and security)
Reliability The amount of time that the software is available for use (maturity fault tolerance and recoverability)
Usability The degree to which the software is easy to use (understandability learnability and operability)
Analysis of algorithms
February 2013 Algorithms Course Dr Aref Rashad 13
Efficiency The degree to which the software makes optimal use of system resources (time behavior resource behavior)
Maintainability The ease with which repair may be made to the software (analyzability changeability stability and testability)
Portability The ease with which the software can be transposed from one environment to another (adaptability replaceability)
Analysis of algorithms
February 2013 14
Why study algorithms and performance
Algorithms help us to understand scalability
Performance often draws the line between what is feasible and what is impossible
Algorithmic mathematics provides a language for talking about program behavior
Performance is the currency of computing
Help to choose between different Algorithms to solve a problem
Algorithms Course Dr Aref Rashad
Comparing Algorithms
Empirical Approach
bull (1) Implement each candidate ndash That could be lots of work ndash also error-prone
bull (2) Run it ndash Which inputs Test data
bull (3) Time itndash What machines OS
February 2013 15
How to solve ldquowhich algorithmrdquo problems without machines or test data
Algorithms Course Dr Aref Rashad
Analytical Approach
February 2013 16
Problem
Algorithm 1
Algorithm 2hellip
Algorithm n
solve Efficiency
Grow proportionally with the amount of data
Algorithms Course Dr Aref Rashad
February 2013 17
Algorithm 1
Algorithm 2hellip
Algorithm n
Efficiency
Compare
Computational Complexity
Measure the degree of difficulty of an algorithm
Analytical Approach
Algorithms Course Dr Aref Rashad
February 2013 18
Computational Complexity
How much effort
How costly
Various ways of measuring
Our concern efficiency criteria of time and space
FocusHow can we measure time and space
Analytical Approach
Algorithms Course Dr Aref Rashad
Space Complexity
Measure of an algorithmrsquos memory requirements during runtime
bull Data structures bull Temporary variables
February 2013 19
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Space needed equal to the sum of- Fixed part independent of the Characteristics of Inputs and
Outputs (Instruction space space for variables and constants hellipetc
- Variable part dependent on the problem instances
Algorithms Course Dr Aref Rashad
Focus on estimating Variable part which depends on number and magnitude of Inputs and Outputs
The Sum of Compile time and Run timeCompile time doesnrsquot depend on IO Characteristics
February 2013 20
Time Complexity
bull The running time depends on the input
bull Parameterize the running time by the size of the input
bull Generally we seek upper bounds on the running time because everybody likes a guarantee
Algorithms Course Dr Aref Rashad
Focus only on Run time
Exact Formula Summation of time needed for all operations (eg add subtract multiplyhellip) helliphelliphelliphelliphelliphelliphellip Impossible task
Experimental Approach Type compile and run on a specific machinehellip Time may differ on multiuser system
February 2013 21
Time Complexity How to measure
Algorithms Course Dr Aref Rashad
Comments ……… 0 steps
Assignment ……… 1 step
Loops ………………… number of steps charged to the loop-control part

First method: introduce a new variable, Count, into the program.
Second method: build a table listing the total number of steps contributed by each statement.
Time Complexity
Approximate approach: count program steps (a step is a segment of work independent of the problem characteristics) and determine the total number of steps.
Time Complexity. First method: introduce a new variable, Count, into the program.

Algorithm ArraySum(a, n)
    Sum = 0
    For i = 1 to n do
        Sum = Sum + a(i)
    return Sum

With counting added:

Algorithm ArraySum(a, n)
    Count = 0
    Sum = 0;  Count = Count + 1
    For i = 1 to n do
        Count = Count + 1  ……… for the For statement
        Sum = Sum + a(i);  Count = Count + 1  ……… for the assignment
    Count = Count + 1  ……… for the last test of the For
    Count = Count + 1  ……… for the return
    return Sum

Total number of steps: 2n + 3
Second method: build a table listing the total number of steps contributed by each statement. Determine the number of steps per execution (s/e) of each statement and the total number of times (i.e., frequency) each statement is executed.

Algorithm ArraySum(a, n)        s/e    frequency    total steps
  Sum = 0                        1         1             1
  For i = 1 to n do              1       n + 1         n + 1
    Sum = Sum + a(i)             1         n             n
  return Sum                     1         1             1
Total                                                  2n + 3

T(n) = 2n + 3
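The step-count bookkeeping above can be checked mechanically. The following Python sketch (not from the slides; the function name and instrumentation points are our own) mirrors the Count variable for the array-sum algorithm:

```python
def array_sum_steps(a):
    """Sum a list while counting steps the way the slides' Count variable does."""
    count = 0
    total = 0
    count += 1              # Sum = 0
    for x in a:
        count += 1          # the For-loop control test that enters the body
        total += x
        count += 1          # the assignment Sum = Sum + a(i)
    count += 1              # the final For-loop test that exits
    count += 1              # the return statement
    return total, count

# For n elements, count = 2n + 3, matching T(n) = 2n + 3.
print(array_sum_steps([4, 7, -2, 3]))  # (12, 11): sum is 12, and 2*4 + 3 = 11 steps
```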
Time Complexity
Example Polynomial Evaluation
Two ways to evaluate the polynomial p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6 are:

Brute force method: p(x) = 4*x*x*x*x + 7*x*x*x - 2*x*x + 3*x + 6

Horner's method: p(x) = (((4x + 7)x - 2)x + 3)x + 6
General form of a polynomial: p(x) = a1 + a2·x + ……… + a(n+1)·x^n, where a(n+1) is non-zero (n >= 0).
T(n) = n^2/2 + n/2
Example Polynomial Evaluation
Pseudocode, brute force method:

1. Input a1, …, a(n+1) and x
2. Initialize poly = a1
3. for i = 1, …, n
       poly = poly + a(i+1) · x · x · … · x   (i factors of x)
   end of for loop
4. Output poly
Example Polynomial Evaluation
Pseudocode, Horner's method:

1. Input a1, …, a(n+1) and x
2. Initialize poly = a(n+1)
3. for i = 1, …, n
       poly = poly · x + a(n-i+1)
   end of for loop
4. Output poly

T(n) = 2n
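The two step counts can be observed directly. Here is a Python sketch (function names are ours, not the slides') that evaluates the slides' polynomial both ways while counting multiplications:

```python
def poly_brute_force(coeffs, x):
    """Evaluate a1 + a2*x + ... + a(n+1)*x^n, rebuilding each power by repeated multiplication."""
    mults = 0
    poly = coeffs[0]
    for i in range(1, len(coeffs)):
        term = coeffs[i]
        for _ in range(i):          # i factors of x for the term of degree i
            term *= x
            mults += 1
        poly += term
    return poly, mults

def poly_horner(coeffs, x):
    """Evaluate the same polynomial by Horner's rule: one multiplication per step."""
    mults = 0
    poly = coeffs[-1]
    for a in reversed(coeffs[:-1]):
        poly = poly * x + a
        mults += 1
    return poly, mults

# p(x) = 6 + 3x - 2x^2 + 7x^3 + 4x^4 (the slides' polynomial), evaluated at x = 2.
coeffs = [6, 3, -2, 7, 4]
print(poly_brute_force(coeffs, 2))  # (124, 10): n(n+1)/2 = 10 multiplications for n = 4
print(poly_horner(coeffs, 2))       # (124, 4): n = 4 multiplications
```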
Example Polynomial Evaluation
Time complexity does not depend solely on the numbers of inputs and outputs. Three kinds of step counts are used: best case, worst case, and average case.

Ex: Sequential search for K in an array of n integers: begin at the first element in the array and look at each element in turn until K is found.

Best case: the first position of the array has K.
Worst case: the last position in the array has K.
Average case: n/2.
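As an illustration (a sketch of ours, not the slides' code), a sequential search that also reports how many comparisons it made shows the cases directly:

```python
def sequential_search(a, key):
    """Linear scan; returns (index, comparisons), or (-1, comparisons) if key is absent."""
    comparisons = 0
    for i, v in enumerate(a):
        comparisons += 1
        if v == key:
            return i, comparisons
    return -1, comparisons

a = list(range(1, 11))           # an array of n = 10 integers
print(sequential_search(a, 1))    # best case:  (0, 1)  -- K in the first position
print(sequential_search(a, 10))   # worst case: (9, 10) -- K in the last position
```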
Time Complexity: best case, worst case, and average case
The best case: normally we are not interested in the best case, because it is too optimistic and not a fair characterization of an algorithm's running time. It is useful in some rare cases where the best case has a high probability of occurring.

The average case: often we prefer to know the average-case running time, because it reveals the typical behavior of the algorithm on inputs of size n. Average-case estimation is not always possible, though.
The worst case: useful in many real-time applications, since the algorithm must perform at least that well. It might not be a representative measure of the behavior of the algorithm on inputs of size n.

Note: if we know enough about the distribution of our input, we prefer average-case analysis; if we do not know the distribution, then we must resort to worst-case analysis.
f(n) = n^2 + 100n + log10(n) + 1000
Growth Rate of the Complexity time function
Upper Bound Concept

Worst-case time:
• It depends on the speed of our computer
• relative speed (on the same machine)
• absolute speed (on different machines)

BIG IDEA ("Asymptotic Analysis"):
• Ignore machine-dependent constants
• Look at the growth of T(n) as n → ∞ (n large)
"Asymptotic Analysis": an upper bound on the number of steps for n large.

Algorithm 1: T(n) = 2n + 3,  g(n) = 3n   →  O(n)
Algorithm 2: T(n) = 4n + 8,  g(n) = 5n   →  O(n)
Algorithm 3: T(n) = n^2 + 8, g(n) = n^2  →  O(n^2)

Other common growth rates: O(log n), O(n^3).
"Asymptotic Analysis"
Growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz.
Time Complexity
Algorithm Matters

• Big-Oh notation indicates an upper bound.
• How bad can things get? Perhaps things are not nearly so bad; we want the lowest possible upper bound.
• Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
• Example: for linear search, n^2 is an upper bound, but n is the tightest upper bound.
The "Big-Oh" Notation
Complexity Analysis
Calculating Running Time T(n)
• Use the source/pseudo code.
• Ignore constants.
• Ignore lower-order terms.
• Explicitly assume either:
  - the average case (harder to do), or
  - the worst case (easier).
• Most analysis uses the worst case.
Linear-time Loop

for x = 1 to n
    constant-time operation

Note: constant-time means independent of the input size.

f(n) = Sum_{x=1..n} 1 = n
2-Nested Loops → Quadratic

for x = 0 to n-1
    for y = 0 to n-1
        constant-time operation

f(n) = Sum_{x=0..n-1} Sum_{y=0..n-1} 1 = Sum_{x=0..n-1} n = n · n = n^2
3-Nested Loops → Cubic

for x = 0 to n-1
    for y = 0 to n-1
        for z = 0 to n-1
            constant-time operation

f(n) = n^3 …… the number of nested loops determines the exponent
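The rule "the number of nested loops determines the exponent" can be sketched in Python (a small illustrative helper of our own):

```python
def count_ops(n, depth):
    """Count the constant-time operations run by `depth` perfectly nested loops of size n."""
    ops = 0

    def loop(level):
        nonlocal ops
        if level == 0:
            ops += 1            # the innermost constant-time operation
            return
        for _ in range(n):      # one loop per nesting level
            loop(level - 1)

    loop(depth)
    return ops

print([count_ops(5, d) for d in (1, 2, 3)])  # [5, 25, 125] -> n, n^2, n^3
```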
Add independent loops

for x = 0 to n-1
    constant-time op

for y = 0 to n-1
    for z = 0 to n-1
        constant-time op

for w = 0 to n-1
    constant-time op

f(n) = n + n^2 + n = n^2 + 2n
Non-trivial loops

for x = 1 to n
    for y = 1 to x
        constant-time operation

f(n) = Sum_{x=1..n} Sum_{y=1..x} 1 = Sum_{x=1..n} x = 1 + 2 + … + n = n(n+1)/2 ≈ n^2/2

for z = 1 to n
    for y = 1 to z
        for x = 1 to y
            constant-time op

f(n) = Sum_{z=1..n} Sum_{y=1..z} Sum_{x=1..y} 1 = (n^3 + 3n^2 + 2n)/6 ≈ n^3/6
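Both closed forms can be spot-checked by brute-force counting; this Python snippet (ours, not the slides') counts the loop bodies directly:

```python
def triangular_ops(n):
    """Operations run by: for x = 1..n, for y = 1..x."""
    return sum(1 for x in range(1, n + 1) for y in range(1, x + 1))

def tetrahedral_ops(n):
    """Operations run by the triple loop z = 1..n, y = 1..z, x = 1..y."""
    return sum(1 for z in range(1, n + 1)
                 for y in range(1, z + 1)
                 for x in range(1, y + 1))

n = 10
print(triangular_ops(n), n * (n + 1) // 2)                 # 55 55
print(tetrahedral_ops(n), (n**3 + 3 * n**2 + 2 * n) // 6)  # 220 220
```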
The "Big-Oh" Notation

E.g., 3x^3 + 5x^2 - 9 = O(x^3). This doesn't mean "3x^3 + 5x^2 - 9 equals the function O(x^3)". What it actually means is "3x^3 + 5x^2 - 9 is dominated by x^3", read as "3x^3 + 5x^2 - 9 is big-Oh of x^3".

In fact, 3x^3 + 5x^2 - 9 is smaller than 5x^3 for large enough values of x.
O-notation (upper bounds)

f(n) is O(g(n)) if f grows at most as fast as g: there exist constants c > 0 and n0 >= 1 such that f(n) <= c·g(n) for all n >= n0.

So g(n) is an asymptotic upper bound for f(n) as n increases (c·g(n) bounds f(n) from above).
Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d); i.e., drop lower-order terms and drop constant factors.

Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Example 1: If T(n) = 3n^2, then T(n) is in O(n^2).

Example 2: T(n) = c1·n^2 + c2·n in the average case.
c1·n^2 + c2·n <= c1·n^2 + c2·n^2 = (c1 + c2)·n^2 for all n > 1, so
T(n) <= c·n^2 for c = c1 + c2 and n0 = 1.
Therefore T(n) is in O(n^2) by the definition.

Example 3: T(n) = c. We say this is in O(1).
Show that 7n - 2 is O(n).

By definition we need to find:
• a real constant c > 0
• an integer constant n0 >= 1
such that 7n - 2 <= c·n for every integer n >= n0.

A possible choice is c = 7, n0 = 1: 7n - 2 <= 7n for n >= 1.
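The witness constants c and n0 can be sanity-checked numerically; the helper below is a hypothetical sketch (an empirical check over a finite range, not a proof):

```python
def check_big_o(f, g, c, n0, n_max=1000):
    """Empirically verify f(n) <= c * g(n) for every integer n0 <= n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Witnesses c = 7, n0 = 1 for the claim "7n - 2 is O(n)".
print(check_big_o(lambda n: 7 * n - 2, lambda n: n, c=7, n0=1))    # True
# No constant works for "n^2 is O(n)"; c = 100 fails once n > 100.
print(check_big_o(lambda n: n * n, lambda n: n, c=100, n0=1))      # False
```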
20n^3 + 10n·log n + 5 is O(n^3):   20n^3 + 10n·log n + 5 <= 35n^3 for n >= 1
3·log n + log log n is O(log n):   3·log n + log log n <= 4·log n for n >= 2
2^100 is O(1):                     2^100 <= 2^100 · 1 for n >= 1
5/n is O(1/n):                     5/n <= 5·(1/n) for n >= 1
Set definition of O-notation
Ω-notation (lower bounds). Meaning: for all data sets big enough (i.e., n > n0), the algorithm always executes in at least c·g(n) steps.
T(n) = c1·n^2 + c2·n.
c1·n^2 + c2·n >= c1·n^2 for all n > 1, so T(n) >= c·n^2 for c = c1 and n0 = 1.
Therefore T(n) is in Ω(n^2) by the definition.

For a lower bound we want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, Ω
Intuition for Asymptotic Notation
Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
The "Big-Oh" Notation
Quadratic Time: O(N^2)

An algorithm runs in O(N^2) if the number of operations required to perform a function is directly proportional to the square of the number of items being processed. Example: handshakes in a group, where everyone shakes hands with everyone else.

Logarithmic Time: O(log N) and O(N log N)

The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time only increases by a factor of log10(1,000,000) = 6.
Factorial Time: O(N!). It is far worse than even O(N^2) and O(N^3), and it's fairly unusual to encounter functions with this kind of behavior.
The "Big-Oh" Notation
Practical Considerations

There is no big difference in running time between Θ1(n) = n and Θ2(n·log n):
• Θ1(10000) = 10000
• Θ2(10000) = 10000 · log10(10000) = 40000

There is an enormous difference between Θ1(n^2) and Θ2(n·log n):
• Θ1(10000) = 100,000,000
• Θ2(10000) = 10000 · log10(10000) = 40000
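These numbers are easy to reproduce (a quick sketch of ours; the slides use a base-10 logarithm):

```python
import math

def growth_table(n):
    """Return (n, n*log10(n), n^2) for one problem size, as compared on the slides."""
    return n, n * math.log10(n), n ** 2

print(growth_table(10_000))  # (10000, 40000.0, 100000000)
```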
Remarks:
• Most statements in a program do not have much effect on the running time of that program.
• There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
• Focus your attention on the parts of the program that have the most impact.
• The greatest time and space improvements come from a better data structure or algorithm.
• "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."

Remarks:
• When tuning code, it is important to gather good timing statistics.
• Be careful not to use tricks that make the program unreadable.
• Make use of compiler optimizations.
• Check that your optimizations really improve the program.
An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16). So instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n·log n running time.
Remarks
Lecture 1 End
Comparison of Functions
For f(n) = n^a and g(n) = n^b:

f(n) = O(g(n))  ⇔  a <= b
f(n) = Ω(g(n))  ⇔  a >= b
f(n) = Θ(g(n))  ⇔  a = b
f(n) = o(g(n))  ⇔  a < b
f(n) = ω(g(n))  ⇔  a > b
Limits
lim_{n→∞} f(n)/g(n) = 0         ⇒  f(n) ∈ o(g(n))
lim_{n→∞} f(n)/g(n) < ∞         ⇒  f(n) ∈ O(g(n))
0 < lim_{n→∞} f(n)/g(n) < ∞     ⇒  f(n) ∈ Θ(g(n))
lim_{n→∞} f(n)/g(n) > 0         ⇒  f(n) ∈ Ω(g(n))
lim_{n→∞} f(n)/g(n) = ∞         ⇒  f(n) ∈ ω(g(n))
lim_{n→∞} f(n)/g(n) undefined   ⇒  can't say
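The limit tests can be explored numerically by sampling the ratio at growing n (an informal sketch of ours, not a substitute for actually taking the limit):

```python
import math

def ratio_trend(f, g, ns=(10, 100, 1000, 10_000)):
    """Sample f(n)/g(n) at increasing n to see where the limit is heading."""
    return [f(n) / g(n) for n in ns]

# n log n vs n^2: the ratio heads to 0, so n log n is o(n^2).
print(ratio_trend(lambda n: n * math.log(n), lambda n: n * n))
# 5n^2 + 100n vs 3n^2 + 2: the ratio heads to 5/3, so the two are Theta of each other.
print(ratio_trend(lambda n: 5 * n * n + 100 * n, lambda n: 3 * n * n + 2))
```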
Logarithms

x = log_b a is the exponent for which a = b^x.
Natural log: ln a = log_e a
Binary log: lg a = log_2 a
lg^2 a = (lg a)^2
lg lg a = lg(lg a)
Useful identities (a, b, c > 0, b ≠ 1, c ≠ 1):

a = b^(log_b a)
log_c(ab) = log_c a + log_c b
log_b(a^n) = n·log_b a
log_b a = log_c a / log_c b
log_b(1/a) = -log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10(n) · log2(10) = log2(n). So the base of a logarithm is not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n · 3^n.
Examples: express the functions in A in asymptotic notation using the functions in B.

A = 5n^2 + 100n,  B = 3n^2 + 2:
  A ∈ Θ(n^2), n^2 ∈ Θ(B), so A ∈ Θ(B).

A = log3(n^2),  B = log2(n^3):
  Using log_b a = log_c a / log_c b: A = 2·lg n / lg 3, B = 3·lg n, so A/B = 2/(3·lg 3), a constant: A ∈ Θ(B).

A = n^(lg 4),  B = 3^(lg n):
  Using a^(log b) = b^(log a): B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞: A ∈ ω(B).

A = lg^2 n,  B = n^(1/2):
  Since lim_{n→∞} (lg^a n)/n^b = 0 for constants a and b > 0 (here a = 2, b = 1/2): A ∈ o(B).
Math Basics. Constant Series: for integers a and b with a <= b,

Sum_{i=a..b} 1 = b - a + 1

Linear Series (Arithmetic Series): for n >= 0,

Sum_{i=1..n} i = 1 + 2 + … + n = n(n + 1)/2

Quadratic Series: for n >= 0,

Sum_{i=1..n} i^2 = 1 + 4 + … + n^2 = n(n + 1)(2n + 1)/6
Cubic Series: for n >= 0,

Sum_{i=1..n} i^3 = 1 + 8 + … + n^3 = n^2(n + 1)^2/4

Geometric Series: for real x ≠ 1,

Sum_{k=0..n} x^k = 1 + x + x^2 + … + x^n = (x^(n+1) - 1)/(x - 1)

For |x| < 1:

Sum_{k=0..∞} x^k = 1/(1 - x)
Math Basics
Linear-Geometric Series: for n >= 0 and real c ≠ 1,

Sum_{i=1..n} i·c^i = c + 2c^2 + … + n·c^n = (n·c^(n+2) - (n+1)·c^(n+1) + c)/(c - 1)^2

Harmonic Series: the nth harmonic number, for n ∈ I+,

H_n = 1 + 1/2 + 1/3 + … + 1/n = Sum_{k=1..n} 1/k = ln(n) + O(1)
Math Basics
Telescoping Series:

Sum_{k=1..n} (a_k - a_(k-1)) = a_n - a_0

Differentiating Series: for |x| < 1,

Sum_{k=0..∞} k·x^k = x/(1 - x)^2
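All of the closed forms above are easy to spot-check against direct summation; the following Python check (the values n = 20 and c = 3 are arbitrary test inputs of ours) passes for each identity:

```python
n, c = 20, 3

assert sum(range(1, n + 1)) == n * (n + 1) // 2                                # linear
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6   # quadratic
assert sum(i**3 for i in range(1, n + 1)) == n * n * (n + 1) ** 2 // 4         # cubic
assert sum(c**k for k in range(n + 1)) == (c ** (n + 1) - 1) // (c - 1)        # geometric
assert sum(i * c**i for i in range(1, n + 1)) == \
    (n * c ** (n + 2) - (n + 1) * c ** (n + 1) + c) // (c - 1) ** 2            # linear-geometric
print("all series identities check out")
```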
Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires Θ(n·log n). Examples:
- Sorting
- Searching
Practical Considerations
Remarks
• Comparative timing of programs is a difficult business:
  - experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  - bias towards a program
  - unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster?

T(n)       n     n'     Change              n'/n
10n        1000  10000  n' = 10n            10
20n        500   5000   n' = 10n            10
5n log n   250   1842   √10·n < n' < 10n    7.37
2n^2       70    223    n' = √10·n          3.16
2^n        13    16     n' = n + 3          -----
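The n' column can be reproduced by searching for the largest problem solvable within ten times the old time budget. This helper is a hypothetical sketch (the name and the doubling-plus-bisection approach are ours, and it returns exact integer answers, so the quadratic row comes out as 221 where the table rounds to 223):

```python
def new_problem_size(T, n, speedup=10):
    """Largest n' with T(n') <= speedup * T(n): grow an upper bound, then bisect."""
    budget = speedup * T(n)
    lo, hi = n, n
    while T(hi) <= budget:      # find an hi that exceeds the budget
        hi *= 2
    while hi - lo > 1:          # binary search with T(lo) <= budget < T(hi)
        mid = (lo + hi) // 2
        if T(mid) <= budget:
            lo = mid
        else:
            hi = mid
    return lo

print(new_problem_size(lambda n: 10 * n, 1000))   # 10000 -> n' = 10n
print(new_problem_size(lambda n: 2 * n * n, 70))  # 221   -> n' ~ sqrt(10) * n
print(new_problem_size(lambda n: 2 ** n, 13))     # 16    -> n' = n + 3
```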
Implications of Dominance

• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

n^a dominates n^b if a > b, since lim_{n→∞} n^b/n^a = n^(b-a) → 0.
n^a + o(n^a) doesn't dominate n^a, since lim_{n→∞} n^a/(n^a + o(n^a)) → 1.
Problem
Algorithm 1
Algorithm 2hellip
Algorithm n
solve Efficiency
Grow proportionally with the amount of data
Algorithms Course Dr Aref Rashad
February 2013 17
Algorithm 1
Algorithm 2hellip
Algorithm n
Efficiency
Compare
Computational Complexity
Measure the degree of difficulty of an algorithm
Analytical Approach
Algorithms Course Dr Aref Rashad
February 2013 18
Computational Complexity
How much effort
How costly
Various ways of measuring
Our concern efficiency criteria of time and space
FocusHow can we measure time and space
Analytical Approach
Algorithms Course Dr Aref Rashad
Space Complexity
Measure of an algorithmrsquos memory requirements during runtime
bull Data structures bull Temporary variables
February 2013 19
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Space needed equal to the sum of- Fixed part independent of the Characteristics of Inputs and
Outputs (Instruction space space for variables and constants hellipetc
- Variable part dependent on the problem instances
Algorithms Course Dr Aref Rashad
Focus on estimating Variable part which depends on number and magnitude of Inputs and Outputs
The Sum of Compile time and Run timeCompile time doesnrsquot depend on IO Characteristics
February 2013 20
Time Complexity
bull The running time depends on the input
bull Parameterize the running time by the size of the input
bull Generally we seek upper bounds on the running time because everybody likes a guarantee
Algorithms Course Dr Aref Rashad
Focus only on Run time
Exact Formula Summation of time needed for all operations (eg add subtract multiplyhellip) helliphelliphelliphelliphelliphelliphellip Impossible task
Experimental Approach Type compile and run on a specific machinehellip Time may differ on multiuser system
February 2013 21
Time Complexity How to measure
Algorithms Course Dr Aref Rashad
Comments helliphelliphellip 0 StepAssignment helliphelliphellip 1 StepLoops helliphelliphelliphelliphelliphelliphellip No of steps account in control part
First Method Introduce a new variable Count into the Program
Second Method Build a table to list total No of steps contributed by each statement
February 2013 22
Time Complexity
Algorithms Course Dr Aref Rashad
Approximate Approach
Count Program steps (segment independent of Problem Characteristics) Determining No of Steps
February 2013 23
Algorithm ArraySum(an) Count=-0 Sum=0 Count=Count+1 For i=1 to n do Count=Count+1helliphelliphelliphelliphelliphelliphellip for For Sum=Sum + a(i) Count=Count+1hellipfor Assignment Count=Count+1helliphelliphelliphelliphelliphelliphellip for last time of For Count=Count+1helliphelliphelliphelliphelliphelliphellip For return return SumTotal number of steps 2n+3
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Time ComplexityFirst Method Introduce a new variable Count into the Program
Algorithms Course Dr Aref Rashad
Second MethodBuild a table to list total No of steps contributed by each statementDetermine the number of steps per execution (se) of the statement and the total number of times (ie frequency) each statement is executed
Algorithm ArraySum(an) se frequency total steps Sum=0 1 1 1 For i=1 to n do 1 n+1 n+1 Sum=Sum + a(i) 1 n n return Sum 1 1 1
Total 2n+3
T(n) = 2n+3February 2013 24
Time Complexity
Algorithms Course Dr Aref Rashad
Example Polynomial Evaluation
Two ways to evaluate the polynomialp(x) = 4x4+ 7x3ndash2x2+ 3x+ 6
Brute force methodp(x) = 4xxxx + 7xxx ndash2xx + 3x + 6
Hornerrsquos methodp(x) = (((4x + 7) x ndash2) x + 3) x + 6
General form of Polynomialp(x) = a
1+ a2x1+ helliphelliphellip + an+1xn
where an is non-zero for all n gt= 0
February 2013 25Algorithms Course Dr Aref Rashad
February 2013 26
T(n) = n22 + n2
Example Polynomial Evaluation
Pseudocode Brute force method
1 Input a1hellip an+1 and x2 Initialize poly = a1
3 for i = 1hellip n poly = poly + ai+1 x x hellip x (i factors x) end of for loop4 Output poly
Algorithms Course Dr Aref Rashad
February 2013 27
Example Polynomial Evaluation
Pseudocode Hornerrsquos method
1 Input a1hellip an+1 and x2 Initialize poly = an+1
3 for i = 1hellip n poly = poly x+ an-i+1 end of for loop4 Output poly
T(n) = 2n
Algorithms Course Dr Aref Rashad
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time Complexity is not dependent solely on the number of inputs and outputs Three kinds of step accountBest case worst case and average case
Ex Sequential search for K in an array of n integers
Begin at first element in array and look at each element in turn until K is found
Best case The first position of the array has KWorst case The last position in the array has KAverage case n2
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
More Big-Oh examples:
• 20n³ + 10n log n + 5 is O(n³):   20n³ + 10n log n + 5 <= 35n³ for n >= 1
• 3 log n + log log n is O(log n): 3 log n + log log n <= 4 log n for n >= 2
• 2^100 is O(1):                   2^100 <= 2^100 · 1 for n >= 1
• 5/n is O(1/n):                   5/n <= 5 · (1/n) for n >= 1
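The witness constants claimed above can be spot-checked numerically. A small Python sketch (natural logs are used; the choice of log base changes only the constant, not the O-class):

```python
import math

# 7n - 2 <= 7n for all n >= 1  (c = 7, n0 = 1)
assert all(7 * n - 2 <= 7 * n for n in range(1, 1000))

# 20n^3 + 10n log n + 5 <= 35n^3 for all n >= 1  (c = 35, n0 = 1)
assert all(20 * n**3 + 10 * n * math.log(n) + 5 <= 35 * n**3
           for n in range(1, 1000))

# 3 log n + log log n <= 4 log n for all n >= 2  (c = 4, n0 = 2)
assert all(3 * math.log(n) + math.log(math.log(n)) <= 4 * math.log(n)
           for n in range(2, 1000))
```

A finite range does not prove the bound, of course, but it catches a wrongly chosen c or n0 immediately.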
Set definition of O-notation: O(g(n)) = { f(n) : there exist positive constants c and n0 such that f(n) <= c·g(n) for all n >= n0 }.

Ω-notation (lower bounds)

Meaning: for all data sets big enough (i.e., n > n0), the algorithm always executes in at least c·g(n) steps.

Example: T(n) = c1·n² + c2·n
c1·n² + c2·n >= c1·n² for all n > 1, so T(n) >= c·n² for c = c1 and n0 = 1.
Therefore T(n) is in Ω(n²) by the definition.
For a lower bound we want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.
Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, Ω

Intuition for Asymptotic Notation

Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time: O(1)

O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)

An algorithm runs in O(N) if the number of operations required is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
Quadratic Time: O(N²)

An algorithm runs in O(N²) if the number of operations required is proportional to the square of the number of items being processed. Example: everyone in a group shaking hands with everyone else.

Logarithmic Time: O(log N) and O(N log N)

The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time only increases by log(1,000,000) = 6 units (using base-10 logs).

Factorial Time: O(N!)

It is far worse than even O(N²) and O(N³). It's fairly unusual to encounter functions with this kind of behavior.
Practical Considerations

There is no huge difference in running time between Θ1(n) and Θ2(n log n):
• Θ1(10000) = 10,000
• Θ2(10000) = 10000 · log10 10000 = 40,000

There is an enormous difference between Θ1(n²) and Θ2(n log n):
• Θ1(10000) = 100,000,000
• Θ2(10000) = 10000 · log10 10000 = 40,000
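The comparison above is easy to reproduce. A short Python sketch (base-10 logs, matching the slide's arithmetic; the variable names are illustrative):

```python
import math

n = 10_000
theta1_linear = n                   # Θ1(n) = n
theta2_nlogn = n * math.log10(n)    # Θ2(n log n), base-10 logs as in the slides
theta1_quad = n * n                 # Θ1(n²)

assert theta1_linear == 10_000
assert theta2_nlogn == 40_000       # only 4x the linear cost
assert theta1_quad == 100_000_000   # 2,500x the n log n cost
```

The n log n curve is within a small factor of linear at this size, while the quadratic curve is already four orders of magnitude worse.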
Remarks
• Most statements in a program do not have much effect on the running time of that program.
• There is little point in halving the running time of a subroutine that accounts for only 1% of the total.
• Focus your attention on the parts of the program that have the most impact.
• The greatest time and space improvements come from a better data structure or algorithm.
• "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE"
• When tuning code, it is important to gather good timing statistics.
• Be careful not to use tricks that make the program unreadable.
• Make use of compiler optimizations.
• Check that your optimizations really improve the program.
Remarks

An algorithm with time equation T(n) = 2n² does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16). So instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions

By analogy with real numbers (f ↔ a, g ↔ b):
f(n) = O(g(n))  ≈  a <= b
f(n) = Ω(g(n))  ≈  a >= b
f(n) = Θ(g(n))  ≈  a = b
f(n) = o(g(n))  ≈  a < b
f(n) = ω(g(n))  ≈  a > b
Limits (as n → ∞)

lim f(n)/g(n) = 0        ⇒ f(n) ∈ o(g(n))
lim f(n)/g(n) < ∞        ⇒ f(n) ∈ O(g(n))
0 < lim f(n)/g(n) < ∞    ⇒ f(n) ∈ Θ(g(n))
0 < lim f(n)/g(n)        ⇒ f(n) ∈ Ω(g(n))
lim f(n)/g(n) = ∞        ⇒ f(n) ∈ ω(g(n))
lim f(n)/g(n) undefined  ⇒ can't say
Logarithms

x = log_b a is the exponent for a = b^x.
Natural log: ln a = log_e a
Binary log:  lg a = log_2 a
lg² a = (lg a)²
lg lg a = lg (lg a)

Useful identities:
a = b^(log_b a)
log_c (ab) = log_c a + log_c b
log_b (a^n) = n log_b a
log_b a = log_c a / log_c b
log_b (1/a) = − log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
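Each identity above can be checked numerically. A small Python sketch (the helper `log(base, x)` is defined here for readability; the sample values are arbitrary):

```python
import math

def log(base, x):
    # log base `base` of x, via change of base to natural logs
    return math.log(x) / math.log(base)

a, b, c, n = 5.0, 2.0, 3.0, 4
assert math.isclose(b ** log(b, a), a)                     # a = b^(log_b a)
assert math.isclose(log(c, a * b), log(c, a) + log(c, b))  # log of a product
assert math.isclose(log(b, a ** n), n * log(b, a))         # log of a power
assert math.isclose(log(b, a), log(c, a) / log(c, b))      # change of base
assert math.isclose(log(b, 1 / a), -log(b, a))             # log of a reciprocal
assert math.isclose(log(b, a), 1 / log(a, b))              # swap base and argument
assert math.isclose(a ** log(b, c), c ** log(b, a))        # exponent swap
```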
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10 n · log2 10 = log2 n.
So the base of a logarithm is not an issue in asymptotic notation.
Exponentials with different bases, however, differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n · 3^n.
Examples: express the functions in column A in asymptotic notation using the functions in column B.

A = 5n² + 100n,  B = 3n² + 2
  A ∈ Θ(n²) and n² ∈ Θ(B), so A ∈ Θ(B)

A = log3(n²),  B = log2(n³)
  Using log_b a = log_c a / log_c b: A = 2 lg n / lg 3, B = 3 lg n, so A/B = 2/(3 lg 3), a constant, and A ∈ Θ(B)

A = n^(lg 4),  B = 3^(lg n)
  Using a^(log b) = b^(log a): B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞, and A ∈ ω(B)

A = lg² n,  B = n^(1/2)
  lim (lg^a n / n^b) = 0 for any constants a, b > 0 (here a = 2, b = 1/2), so A ∈ o(B)
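The ratio arguments above can be probed numerically. A short Python sketch (the sample values of n are illustrative; a finite check only suggests, not proves, the limit):

```python
import math

lg = math.log2

# A = log3(n^2), B = log2(n^3): the ratio is the constant 2 / (3 lg 3)
for n in (10, 1000, 10**6):
    ratio = math.log(n**2, 3) / math.log2(n**3)
    assert math.isclose(ratio, 2 / (3 * lg(3)))

# A = n^(lg 4), B = 3^(lg n) = n^(lg 3): A/B = n^(lg(4/3)) grows without bound
r1 = 100 ** lg(4) / 3 ** lg(100)
r2 = (10**6) ** lg(4) / 3 ** lg(10**6)
assert r2 > r1 > 1   # the ratio keeps growing, consistent with A in omega(B)
```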
Math Basics

Constant Series: for integers a and b, a <= b:
  Σ (i = a..b) 1 = b − a + 1

Linear Series (Arithmetic Series): for n >= 0:
  Σ (i = 1..n) i = 1 + 2 + … + n = n(n+1)/2

Quadratic Series: for n >= 0:
  Σ (i = 1..n) i² = 1 + 4 + … + n² = n(n+1)(2n+1)/6

Cubic Series: for n >= 0:
  Σ (i = 1..n) i³ = 1 + 8 + … + n³ = n²(n+1)²/4

Geometric Series: for real x ≠ 1:
  Σ (k = 0..n) x^k = 1 + x + x² + … + x^n = (x^(n+1) − 1)/(x − 1)
  For |x| < 1:  Σ (k = 0..∞) x^k = 1/(1 − x)
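These closed forms are easy to verify against direct summation. A minimal Python sketch (n = 20 and x = 3 are arbitrary test values):

```python
def check(n=20):
    # Constant series with a = 3, b = n
    assert sum(1 for i in range(3, n + 1)) == n - 3 + 1
    # Linear series
    assert sum(i for i in range(1, n + 1)) == n * (n + 1) // 2
    # Quadratic series
    assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
    # Cubic series
    assert sum(i**3 for i in range(1, n + 1)) == n * n * (n + 1)**2 // 4
    # Geometric series, x = 3
    x = 3
    assert sum(x**k for k in range(n + 1)) == (x**(n + 1) - 1) // (x - 1)

check()
```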
Math Basics

Linear-Geometric Series: for n >= 0 and real c ≠ 1:
  Σ (i = 1..n) i·c^i = (n·c^(n+2) − (n+1)·c^(n+1) + c) / (c − 1)²

Harmonic Series: the nth harmonic number, for n ∈ I+:
  H_n = Σ (k = 1..n) 1/k = 1 + 1/2 + 1/3 + … + 1/n = ln n + O(1)

Telescoping Series:
  Σ (k = 1..n) (a_k − a_(k−1)) = a_n − a_0

Differentiating Series: for |x| < 1:
  Σ (k = 0..∞) k·x^k = x / (1 − x)²
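These four identities can also be sanity-checked by direct summation (truncating the infinite series at a point where the tail is negligible; the test values are arbitrary):

```python
import math

n, c = 15, 2
# Linear-geometric series
lhs = sum(i * c**i for i in range(1, n + 1))
rhs = (n * c**(n + 2) - (n + 1) * c**(n + 1) + c) // (c - 1)**2
assert lhs == rhs

# Harmonic series: H_n = ln n + O(1); the O(1) gap tends to Euler's constant
H = sum(1 / k for k in range(1, 10**6 + 1))
assert abs(H - math.log(10**6)) < 1

# Telescoping series with a_k = k^2
a = [k * k for k in range(n + 1)]
assert sum(a[k] - a[k - 1] for k in range(1, n + 1)) == a[n] - a[0]

# Differentiating series at x = 0.5: sum k x^k = x / (1 - x)^2 = 2
x = 0.5
approx = sum(k * x**k for k in range(200))
assert math.isclose(approx, x / (1 - x)**2)
```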
Practical Considerations

Many problems whose obvious solution requires Θ(n²) time also have a solution that requires Θ(n log n) time. Examples:
– Sorting
– Searching

Remarks
• Comparative timing of programs is a difficult business:
– Experimental errors from uncontrolled factors (system load, language, compiler, etc.)
– Bias towards a program
– Unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster? (n = largest problem size solvable in a fixed time on the old machine; n' = the same on the new machine)

T(n)       n      n'      Change            n'/n
10n        1,000  10,000  n' = 10n          10
20n        500    5,000   n' = 10n          10
5n log n   250    1,842   √10·n < n' < 10n  7.37
2n²        70     223     n' = √10·n        3.16
2^n        13     16      n' = n + 3        -----
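The table can be regenerated by searching for the largest n whose cost fits a budget. A Python sketch (the helper `max_n` and the budget of 10,000 steps as the "old machine's hour" are my own choices to make the numbers line up with the table; logs are base 2):

```python
import math

def max_n(T, budget):
    """Largest n with T(n) <= budget, via doubling then binary search."""
    lo, hi = 1, 2
    while T(hi) <= budget:       # find an upper bracket
        hi *= 2
    while lo < hi:               # binary search inside [lo, hi]
        mid = (lo + hi + 1) // 2
        if T(mid) <= budget:
            lo = mid
        else:
            hi = mid - 1
    return lo

budget = 10_000                  # work the old machine can do
rows = [
    ("10n",      lambda n: 10 * n),
    ("20n",      lambda n: 20 * n),
    ("5n log n", lambda n: 5 * n * math.log2(n) if n > 1 else 0),
    ("2n^2",     lambda n: 2 * n * n),
    ("2^n",      lambda n: 2 ** n),
]
for name, T in rows:
    n_old = max_n(T, budget)
    n_new = max_n(T, 10 * budget)    # machine 10 times faster
    print(f"{name:9s} n = {n_old:6d}  n' = {n_new:6d}  n'/n = {n_new / n_old:.2f}")
```

Running it reproduces the table rows: the linear algorithms gain a full factor of 10 in problem size, the quadratic one only √10, and the exponential one gains just 3 more items.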
Implications of Dominance

• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

n^a dominates n^b if a > b, since
  lim (n → ∞) n^b / n^a = n^(b−a) → 0
n^a + o(n^a) doesn't dominate n^a, since
  lim (n → ∞) n^a / (n^a + o(n^a)) → 1
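The two limits above can be illustrated numerically. A short Python sketch (a = 3, b = 2, and n as the lower-order o(n²) term are illustrative choices):

```python
# n^b / n^a -> 0 when a > b, so n^a dominates n^b
a, b = 3, 2
ratios = [n**b / n**a for n in (10, 100, 1000)]
assert ratios[0] > ratios[1] > ratios[2]   # strictly decreasing toward 0

# n^a + o(n^a) does not dominate n^a: the ratio tends to 1, not 0
lower_order = lambda n: n                  # n is o(n^2), for example
vals = [n**2 / (n**2 + lower_order(n)) for n in (10, 100, 1000)]
assert all(v < 1 for v in vals)
assert abs(vals[-1] - 1) < 0.01            # ratio -> 1 as n grows
```

This is why dropping lower-order terms is safe: they change the ratio by a factor tending to 1, not by an order of growth.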
February 2013 7Algorithms Course Dr Aref Rashad
1 Initialize an integer Sum to zero2 For all array values Increase Sum by array value3 Go to step 2
February 2013 8Algorithms Course Dr Aref Rashad
Pesudo CodeAlgorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Problem Array Sum
A Solution For an array A with length n sum all the array elements in a new integer
An initial Algorithm
Translating a Problem into an Algorithm
1 Initialize integer C to zero2 If A is zero wersquore done and C Contains the result Otherwise proceed to step 33 Add the value of B to C4 Decrement A5 Go to step 2
Pseudocode Function Multiply(Integer A Integer B)Integer C = 0While A is greater than 0C = C + BA = A - 1EndReturn CEnd
February 2013 9Algorithms Course Dr Aref Rashad
Translating a Problem into an AlgorithmProblem Integers Multiplication
A Solution Given any two integers A and B we can say that multiplying A times B involves adding B to itself A times An initial Algorithm
Translating a Problem into an Algorithm
February 2013 10
Problem Sort a collection of ngt=1 elements of arbitrary type
A Solution From those elements that are currently unsorted find the smallest and place it next in the sorted listAn initial AlgorithmFor i=1 to n do Examine a(i) to a(n) and suppose the smallest element is at a(j) Interchange a(i) and a(j)
Pseudocode SelectionSort (an) For i=1 to n do j=I For k=i+1 to n do if ( a(k) lt a(j) ) then j=k t=a(i) a(i)= a(j) a(j)=t
Algorithms Course Dr Aref Rashad
Analysis of algorithms
The theoretical study of computer-program performance and resource usage
Whatrsquos more important than performanceModularity CorrectnessMaintainability FunctionalityRobustness user-friendlinessProgrammer time SimplicityExtensibility Reliability
February 2013 11Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 12
Functionality The degree to which the software satisfies stated needs (suitability accuracy inter-operability compliance and security)
Reliability The amount of time that the software is available for use (maturity fault tolerance and recoverability)
Usability The degree to which the software is easy to use (understandability learnability and operability)
Analysis of algorithms
February 2013 Algorithms Course Dr Aref Rashad 13
Efficiency The degree to which the software makes optimal use of system resources (time behavior resource behavior)
Maintainability The ease with which repair may be made to the software (analyzability changeability stability and testability)
Portability The ease with which the software can be transposed from one environment to another (adaptability replaceability)
Analysis of algorithms
February 2013 14
Why study algorithms and performance
Algorithms help us to understand scalability
Performance often draws the line between what is feasible and what is impossible
Algorithmic mathematics provides a language for talking about program behavior
Performance is the currency of computing
Help to choose between different Algorithms to solve a problem
Algorithms Course Dr Aref Rashad
Comparing Algorithms
Empirical Approach
bull (1) Implement each candidate ndash That could be lots of work ndash also error-prone
bull (2) Run it ndash Which inputs Test data
bull (3) Time itndash What machines OS
February 2013 15
How to solve ldquowhich algorithmrdquo problems without machines or test data
Algorithms Course Dr Aref Rashad
Analytical Approach
February 2013 16
Problem
Algorithm 1
Algorithm 2hellip
Algorithm n
solve Efficiency
Grow proportionally with the amount of data
Algorithms Course Dr Aref Rashad
February 2013 17
Algorithm 1
Algorithm 2hellip
Algorithm n
Efficiency
Compare
Computational Complexity
Measure the degree of difficulty of an algorithm
Analytical Approach
Algorithms Course Dr Aref Rashad
February 2013 18
Computational Complexity
How much effort
How costly
Various ways of measuring
Our concern efficiency criteria of time and space
FocusHow can we measure time and space
Analytical Approach
Algorithms Course Dr Aref Rashad
Space Complexity
Measure of an algorithmrsquos memory requirements during runtime
bull Data structures bull Temporary variables
February 2013 19
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Space needed equal to the sum of- Fixed part independent of the Characteristics of Inputs and
Outputs (Instruction space space for variables and constants hellipetc
- Variable part dependent on the problem instances
Algorithms Course Dr Aref Rashad
Focus on estimating Variable part which depends on number and magnitude of Inputs and Outputs
The Sum of Compile time and Run timeCompile time doesnrsquot depend on IO Characteristics
February 2013 20
Time Complexity
bull The running time depends on the input
bull Parameterize the running time by the size of the input
bull Generally we seek upper bounds on the running time because everybody likes a guarantee
Algorithms Course Dr Aref Rashad
Focus only on Run time
Exact Formula Summation of time needed for all operations (eg add subtract multiplyhellip) helliphelliphelliphelliphelliphelliphellip Impossible task
Experimental Approach Type compile and run on a specific machinehellip Time may differ on multiuser system
February 2013 21
Time Complexity How to measure
Algorithms Course Dr Aref Rashad
Comments helliphelliphellip 0 StepAssignment helliphelliphellip 1 StepLoops helliphelliphelliphelliphelliphelliphellip No of steps account in control part
First Method Introduce a new variable Count into the Program
Second Method Build a table to list total No of steps contributed by each statement
February 2013 22
Time Complexity
Algorithms Course Dr Aref Rashad
Approximate Approach
Count Program steps (segment independent of Problem Characteristics) Determining No of Steps
February 2013 23
Algorithm ArraySum(an) Count=-0 Sum=0 Count=Count+1 For i=1 to n do Count=Count+1helliphelliphelliphelliphelliphelliphellip for For Sum=Sum + a(i) Count=Count+1hellipfor Assignment Count=Count+1helliphelliphelliphelliphelliphelliphellip for last time of For Count=Count+1helliphelliphelliphelliphelliphelliphellip For return return SumTotal number of steps 2n+3
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Time ComplexityFirst Method Introduce a new variable Count into the Program
Algorithms Course Dr Aref Rashad
Second MethodBuild a table to list total No of steps contributed by each statementDetermine the number of steps per execution (se) of the statement and the total number of times (ie frequency) each statement is executed
Algorithm ArraySum(an) se frequency total steps Sum=0 1 1 1 For i=1 to n do 1 n+1 n+1 Sum=Sum + a(i) 1 n n return Sum 1 1 1
Total 2n+3
T(n) = 2n+3February 2013 24
Time Complexity
Algorithms Course Dr Aref Rashad
Example Polynomial Evaluation
Two ways to evaluate the polynomialp(x) = 4x4+ 7x3ndash2x2+ 3x+ 6
Brute force methodp(x) = 4xxxx + 7xxx ndash2xx + 3x + 6
Hornerrsquos methodp(x) = (((4x + 7) x ndash2) x + 3) x + 6
General form of Polynomialp(x) = a
1+ a2x1+ helliphelliphellip + an+1xn
where an is non-zero for all n gt= 0
February 2013 25Algorithms Course Dr Aref Rashad
February 2013 26
T(n) = n22 + n2
Example Polynomial Evaluation
Pseudocode Brute force method
1 Input a1hellip an+1 and x2 Initialize poly = a1
3 for i = 1hellip n poly = poly + ai+1 x x hellip x (i factors x) end of for loop4 Output poly
Algorithms Course Dr Aref Rashad
February 2013 27
Example Polynomial Evaluation
Pseudocode Hornerrsquos method
1 Input a1hellip an+1 and x2 Initialize poly = an+1
3 for i = 1hellip n poly = poly x+ an-i+1 end of for loop4 Output poly
T(n) = 2n
Algorithms Course Dr Aref Rashad
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time Complexity is not dependent solely on the number of inputs and outputs Three kinds of step accountBest case worst case and average case
Ex Sequential search for K in an array of n integers
Begin at first element in array and look at each element in turn until K is found
Best case The first position of the array has KWorst case The last position in the array has KAverage case n2
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Example Polynomial Evaluation
Suppose that exponentiation is carried out using multiplications Two ways to evaluate the polynomialp(x) = 4x4+ 7x3ndash2x2+ 3x + 6are
Brute force methodp(x) = 4xxxx + 7xxx ndash2xx + 3x + 6
Hornerrsquos methodp(x) = (((4x + 7) x ndash2) x + 3) x + 6
February 2013 6Algorithms Course Dr Aref Rashad
Algorithm Specification
Pseudocode Conventions (English like Statements)
- Comments included - Data types are not explicitly declared- Logical operators and or and not can be used- Relational operators can be used ltgt=hellipet- Arrays can be used eg A(ij)- Looping statements are employed for while and
repeat-until- Conditional statements can be used If then
February 2013 7Algorithms Course Dr Aref Rashad
1 Initialize an integer Sum to zero2 For all array values Increase Sum by array value3 Go to step 2
February 2013 8Algorithms Course Dr Aref Rashad
Pesudo CodeAlgorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Problem Array Sum
A Solution For an array A with length n sum all the array elements in a new integer
An initial Algorithm
Translating a Problem into an Algorithm
1 Initialize integer C to zero2 If A is zero wersquore done and C Contains the result Otherwise proceed to step 33 Add the value of B to C4 Decrement A5 Go to step 2
Pseudocode Function Multiply(Integer A Integer B)Integer C = 0While A is greater than 0C = C + BA = A - 1EndReturn CEnd
February 2013 9Algorithms Course Dr Aref Rashad
Translating a Problem into an AlgorithmProblem Integers Multiplication
A Solution Given any two integers A and B we can say that multiplying A times B involves adding B to itself A times An initial Algorithm
Translating a Problem into an Algorithm
February 2013 10
Problem Sort a collection of ngt=1 elements of arbitrary type
A Solution From those elements that are currently unsorted find the smallest and place it next in the sorted listAn initial AlgorithmFor i=1 to n do Examine a(i) to a(n) and suppose the smallest element is at a(j) Interchange a(i) and a(j)
Pseudocode SelectionSort (an) For i=1 to n do j=I For k=i+1 to n do if ( a(k) lt a(j) ) then j=k t=a(i) a(i)= a(j) a(j)=t
Algorithms Course Dr Aref Rashad
Analysis of algorithms
The theoretical study of computer-program performance and resource usage
Whatrsquos more important than performanceModularity CorrectnessMaintainability FunctionalityRobustness user-friendlinessProgrammer time SimplicityExtensibility Reliability
February 2013 11Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 12
Functionality The degree to which the software satisfies stated needs (suitability accuracy inter-operability compliance and security)
Reliability The amount of time that the software is available for use (maturity fault tolerance and recoverability)
Usability The degree to which the software is easy to use (understandability learnability and operability)
Analysis of algorithms
February 2013 Algorithms Course Dr Aref Rashad 13
Efficiency The degree to which the software makes optimal use of system resources (time behavior resource behavior)
Maintainability The ease with which repair may be made to the software (analyzability changeability stability and testability)
Portability The ease with which the software can be transposed from one environment to another (adaptability replaceability)
Analysis of algorithms
February 2013 14
Why study algorithms and performance
Algorithms help us to understand scalability
Performance often draws the line between what is feasible and what is impossible
Algorithmic mathematics provides a language for talking about program behavior
Performance is the currency of computing
Help to choose between different Algorithms to solve a problem
Algorithms Course Dr Aref Rashad
Comparing Algorithms
Empirical Approach
bull (1) Implement each candidate ndash That could be lots of work ndash also error-prone
bull (2) Run it ndash Which inputs Test data
bull (3) Time itndash What machines OS
February 2013 15
How to solve ldquowhich algorithmrdquo problems without machines or test data
Algorithms Course Dr Aref Rashad
Analytical Approach
February 2013 16
Problem
Algorithm 1
Algorithm 2hellip
Algorithm n
solve Efficiency
Grow proportionally with the amount of data
Algorithms Course Dr Aref Rashad
February 2013 17
Algorithm 1
Algorithm 2hellip
Algorithm n
Efficiency
Compare
Computational Complexity
Measure the degree of difficulty of an algorithm
Analytical Approach
Algorithms Course Dr Aref Rashad
February 2013 18
Computational Complexity
How much effort
How costly
Various ways of measuring
Our concern efficiency criteria of time and space
FocusHow can we measure time and space
Analytical Approach
Algorithms Course Dr Aref Rashad
Space Complexity
Measure of an algorithmrsquos memory requirements during runtime
bull Data structures bull Temporary variables
February 2013 19
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Space needed equal to the sum of- Fixed part independent of the Characteristics of Inputs and
Outputs (Instruction space space for variables and constants hellipetc
- Variable part dependent on the problem instances
Algorithms Course Dr Aref Rashad
Focus on estimating Variable part which depends on number and magnitude of Inputs and Outputs
The Sum of Compile time and Run timeCompile time doesnrsquot depend on IO Characteristics
February 2013 20
Time Complexity
bull The running time depends on the input
bull Parameterize the running time by the size of the input
bull Generally we seek upper bounds on the running time because everybody likes a guarantee
Algorithms Course Dr Aref Rashad
Focus only on Run time
Exact Formula Summation of time needed for all operations (eg add subtract multiplyhellip) helliphelliphelliphelliphelliphelliphellip Impossible task
Experimental Approach Type compile and run on a specific machinehellip Time may differ on multiuser system
February 2013 21
Time Complexity How to measure
Algorithms Course Dr Aref Rashad
Comments helliphelliphellip 0 StepAssignment helliphelliphellip 1 StepLoops helliphelliphelliphelliphelliphelliphellip No of steps account in control part
First Method Introduce a new variable Count into the Program
Second Method Build a table to list total No of steps contributed by each statement
February 2013 22
Time Complexity
Algorithms Course Dr Aref Rashad
Approximate Approach
Count Program steps (segment independent of Problem Characteristics) Determining No of Steps
February 2013 23
Algorithm ArraySum(an) Count=-0 Sum=0 Count=Count+1 For i=1 to n do Count=Count+1helliphelliphelliphelliphelliphelliphellip for For Sum=Sum + a(i) Count=Count+1hellipfor Assignment Count=Count+1helliphelliphelliphelliphelliphelliphellip for last time of For Count=Count+1helliphelliphelliphelliphelliphelliphellip For return return SumTotal number of steps 2n+3
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Time ComplexityFirst Method Introduce a new variable Count into the Program
Algorithms Course Dr Aref Rashad
Second MethodBuild a table to list total No of steps contributed by each statementDetermine the number of steps per execution (se) of the statement and the total number of times (ie frequency) each statement is executed
Algorithm ArraySum(an) se frequency total steps Sum=0 1 1 1 For i=1 to n do 1 n+1 n+1 Sum=Sum + a(i) 1 n n return Sum 1 1 1
Total 2n+3
T(n) = 2n+3February 2013 24
Time Complexity
Algorithms Course Dr Aref Rashad
Example Polynomial Evaluation
Two ways to evaluate the polynomialp(x) = 4x4+ 7x3ndash2x2+ 3x+ 6
Brute force method: p(x) = 4*x*x*x*x + 7*x*x*x - 2*x*x + 3*x + 6
Horner's method: p(x) = (((4x + 7)x - 2)x + 3)x + 6
General form of a polynomial: p(x) = a1 + a2*x + ......... + a(n+1)*x^n,
where the leading coefficient a(n+1) is non-zero, n >= 0.
February 2013 25Algorithms Course Dr Aref Rashad
February 2013 26
T(n) = n^2/2 + n/2
Example Polynomial Evaluation
Pseudocode Brute force method
1. Input a1, ..., a(n+1) and x
2. Initialize poly = a1
3. for i = 1, ..., n:
     poly = poly + a(i+1) * x * x * ... * x   (i factors of x)
   end of for loop
4. Output poly
Algorithms Course Dr Aref Rashad
February 2013 27
Example Polynomial Evaluation
Pseudocode Hornerrsquos method
1. Input a1, ..., a(n+1) and x
2. Initialize poly = a(n+1)
3. for i = 1, ..., n:
     poly = poly * x + a(n-i+1)
   end of for loop
4. Output poly
T(n) = 2n
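The two evaluation orders can be sketched in Python (`brute_force_eval` and `horner_eval` are illustrative names, not course code); both produce the same value, but brute force performs about n^2/2 multiplications while Horner performs n.

```python
def brute_force_eval(coeffs, x):
    # coeffs = [a1, ..., a(n+1)], lowest degree first, as in the slides
    poly = coeffs[0]
    for i in range(1, len(coeffs)):
        term = coeffs[i]
        for _ in range(i):        # i explicit multiplications by x
            term *= x
        poly += term
    return poly                   # ~n^2/2 multiplications overall

def horner_eval(coeffs, x):
    poly = coeffs[-1]
    for a in reversed(coeffs[:-1]):
        poly = poly * x + a       # one multiply and one add per coefficient: 2n steps
    return poly

coeffs = [6, 3, -2, 7, 4]         # p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6
```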
Algorithms Course Dr Aref Rashad
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time complexity does not depend solely on the number of inputs and outputs. Three kinds of step counts: best case, worst case, and average case.
Ex Sequential search for K in an array of n integers
Begin at first element in array and look at each element in turn until K is found
Best case: the first position of the array has K.
Worst case: the last position in the array has K.
Average case: n/2.
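The three cases can be seen directly by counting comparisons in a small Python sketch (hypothetical helper, not course code):

```python
def sequential_search(a, key):
    """Return the number of elements examined until key is found (n if absent)."""
    for i, x in enumerate(a):
        if x == key:
            return i + 1
    return len(a)

a = [7, 2, 9, 4, 5]                  # n = 5
best = sequential_search(a, 7)       # K in the first position -> 1 comparison
worst = sequential_search(a, 5)      # K in the last position  -> n comparisons
avg = sum(sequential_search(a, x) for x in a) / len(a)   # (n + 1) / 2 = 3.0
```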
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case: Normally we are not interested in the best case, because it is too optimistic and is not a fair characterization of the algorithm's running time. It is useful only in some rare cases where the best case has a high probability of occurring.
The Average Case: Often we prefer to know the average-case running time. The average case reveals the typical behavior of the algorithm on inputs of size n. Average-case estimation is not always possible.
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case: Useful in many real-time applications; the algorithm must perform at least that well. It might not be a representative measure of the behavior of the algorithm on inputs of size n.
Note: If we know enough about the distribution of our input, we prefer average-case analysis.
If we do not know the distribution, then we must resort to worst-case analysis.
Algorithms Course Dr Aref Rashad
f(n) = n^2 + 100n + log10 n + 1000
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
Worst-case time:
• It depends on the speed of our computer
• relative speed (on the same machine)
• absolute speed (on different machines)
BIG IDEA:
• Ignore machine-dependent constants
• Look at the growth of T(n) as n → ∞
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
"Asymptotic Analysis"
n large
February 2013 34
Algorithm 1: T(n) = 2n + 3,   g(n) = 3n ...... O(n)
Algorithm 2: T(n) = 4n + 8,   g(n) = 5n ...... O(n)
Algorithm 3: T(n) = n^2 + 8,  g(n) = n^2 ..... O(n^2)
Upper bound on the number of steps as n gets large; typical growth classes include O(log n), O(n), O(n^2), O(n^3).
Algorithms Course Dr Aref Rashad
"Asymptotic Analysis"
Growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz.
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
• Big-Oh notation indicates an upper bound.
• How bad can things get? Perhaps things are never nearly that bad.
• We want the lowest possible upper bound.
• Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
• Example: for linear search, n^2 is an upper bound, but n is the tightest upper bound.
February 2013 37Algorithms Course Dr Aref Rashad
The "Big-Oh" Notation
Complexity Analysis
Calculating Running Time T(n)
• Use the source/pseudo code
• Ignore constants
• Ignore lower-order terms
• Explicitly assume either:
  - the average case (harder to do), or
  - the worst case (easier)
• Most analysis uses the worst case.
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time loop:

for x = 1 to n
  constant-time operation

Note: constant-time means independent of the input size.

Number of steps: Σ (x = 1 to n) 1 = n.
2-Nested Loops (Quadratic):

for x = 0 to n-1
  for y = 0 to n-1
    constant-time operation

Number of steps: Σ (x = 0 to n-1) Σ (y = 0 to n-1) 1 = Σ (x = 0 to n-1) n = n · n = n^2.
3-Nested Loops (Cubic):

for x = 0 to n-1
  for y = 0 to n-1
    for z = 0 to n-1
      constant-time operation

f(n) = n^3 ...... the number of nested loops determines the exponent.
February 2013 Algorithms Course Dr Aref Rashad 40
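The loop counts above can be verified empirically with a small Python sketch (`count_steps` is a hypothetical helper for illustration):

```python
def count_steps(n, depth):
    # count constant-time operations executed by `depth` nested 0..n-1 loops
    steps = 0
    if depth == 1:
        for x in range(n):
            steps += 1
    elif depth == 2:
        for x in range(n):
            for y in range(n):
                steps += 1
    else:
        for x in range(n):
            for y in range(n):
                for z in range(n):
                    steps += 1
    return steps

n = 6
# one loop -> n, two nested -> n^2, three nested -> n^3
```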
Add independent loops:

for x = 0 to n-1
  constant-time op
for y = 0 to n-1
  for z = 0 to n-1
    constant-time op
for w = 0 to n-1
  constant-time op

f(n) = n + n^2 + n = n^2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops:

for x = 1 to n
  for y = 1 to x
    constant-time operation

Number of steps: Σ (x = 1 to n) x = n(n+1)/2 = n^2/2 + n/2.

for z = 1 to n
  for y = 1 to z
    for x = 1 to y
      constant-time op

Number of steps: Σ (z = 1 to n) Σ (y = 1 to z) y = Σ (z = 1 to n) z(z+1)/2 = n(n+1)(n+2)/6 = n^3/6 + n^2/2 + n/3.
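The two closed forms for dependent loop bounds can be spot-checked in Python (hypothetical helper names for illustration):

```python
def dependent_double(n):
    # for x = 1..n, for y = 1..x: triangular count
    steps = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            steps += 1
    return steps

def dependent_triple(n):
    # for z = 1..n, for y = 1..z, for x = 1..y: tetrahedral count
    steps = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                steps += 1
    return steps

n = 10
# dependent_double(n) == n(n+1)/2, dependent_triple(n) == n(n+1)(n+2)/6
```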
E.g. 3x^3 + 5x^2 - 9 = O(x^3). This doesn't mean
"3x^3 + 5x^2 - 9 equals the function O(x^3)". What it actually means is
"3x^3 + 5x^2 - 9 is dominated by x^3", read as "3x^3 + 5x^2 - 9 is big-Oh of x^3".
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact, 3x^3 + 5x^2 - 9 is smaller than 5x^3 for large enough values of x.
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above).
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.: drop lower-order terms; drop constant factors.
Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".
Example 1: If T(n) = 3n^2, then T(n) is in O(n^2).
Example 2: T(n) = c1·n^2 + c2·n in the average case.
  c1·n^2 + c2·n <= c1·n^2 + c2·n^2 <= (c1 + c2)·n^2 for all n > 1
  T(n) <= c·n^2 for c = c1 + c2 and n0 = 1
  Therefore T(n) is in O(n^2) by the definition.
Example 3: T(n) = c. We say this is in O(1).
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
Example: 7n - 2 is O(n).
By definition we need to find:
• a real constant c > 0
• an integer constant n0 >= 1
such that 7n - 2 <= c·n for every integer n >= n0.
A possible choice is c = 7, n0 = 1: 7n - 2 <= 7n for n >= 1.
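The witness pair (c, n0) = (7, 1) can be spot-checked over a finite range in Python; this is a sanity check, not a proof, since the definition quantifies over all n >= n0.

```python
c, n0 = 7, 1
# the chosen witness works everywhere we test it...
assert all(7 * n - 2 <= c * n for n in range(n0, 10_000))
# ...while a too-small constant like c = 6 fails once n > 2
assert not all(7 * n - 2 <= 6 * n for n in range(1, 10_000))
```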
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
20n^3 + 10n·log n + 5 is O(n^3):  20n^3 + 10n·log n + 5 <= 35n^3 for n >= 1
3·log n + log log n is O(log n):  3·log n + log log n <= 4·log n for n >= 2
2^100 is O(1):  2^100 <= 2^100 · 1 for n >= 1
5/n is O(1/n):  5/n <= 5·(1/n) for n >= 1
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning: for all data sets big enough (i.e. n > n0), the algorithm always executes in more than c·g(n) steps.
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1·n^2 + c2·n
c1·n^2 + c2·n >= c1·n^2 for all n > 1, so T(n) >= c·n^2 for c = c1 and n0 = 1.
Therefore T(n) is in Ω(n^2) by the definition.
We want the greatest lower bound.
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.
Definition: An algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Θ, O, Ω
Intuition for Asymptotic Notation
Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The "Big-Oh" Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N^2) if the number of operations required to perform a function is proportional to the square of the number of items being processed. Example: a group handshake.
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time only increases by a factor of log(1,000,000) = 6 (using base-10 logs).
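Binary search is a standard instance of logarithmic behavior (it is not named on the slide, so this is an assumed illustration): when the input grows by a factor of 10^6, the worst-case probe count grows only by about log2(10^6) ≈ 20.

```python
import math

def probes(n):
    # worst-case number of probes binary search needs on n sorted items
    return math.ceil(math.log2(n + 1))

small, big = 1_000, 1_000_000_000    # the input grows by a factor of 10**6
extra = probes(big) - probes(small)  # an additive ~20 probes, not a factor of 10**6
```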
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N!): It is far worse than even O(N^2) and O(N^3). It's fairly unusual to encounter functions with this kind of behavior.
February 2013 63Algorithms Course Dr Aref Rashad
The "Big-Oh" Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical Considerations
There is no big difference in running time between Θ1(n) and Θ2(n log n):
• Θ1(10000) = 10000
• Θ2(10000) = 10000 · log10(10000) = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n^2) and Θ2(n log n):
• Θ1(10000) = 100,000,000
• Θ2(10000) = 10000 · log10(10000) = 40000
Remarks:
• Most statements in a program do not have much effect on the running time of that program.
• There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
• Focus your attention on the parts of the program that have the most impact.
February 2013 66Algorithms Course Dr Aref Rashad
• The greatest time and space improvements come from a better data structure or algorithm.
• "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE"
Remarks:
• When tuning code, it is important to gather good timing statistics.
• Be careful not to use tricks that make the program unreadable.
• Make use of compiler optimizations.
• Check that your optimizations really improve the program.
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate. Instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16). So instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
Think of comparing functions f and g like comparing two numbers a and b:
f(n) = O(g(n))  ≈  a <= b
f(n) = Ω(g(n))  ≈  a >= b
f(n) = Θ(g(n))  ≈  a = b
f(n) = o(g(n))  ≈  a < b
f(n) = ω(g(n))  ≈  a > b
Limits
As n → ∞:
lim [f(n) / g(n)] = 0        ⟹ f(n) ∈ o(g(n))
lim [f(n) / g(n)] < ∞        ⟹ f(n) ∈ O(g(n))
0 < lim [f(n) / g(n)] < ∞    ⟹ f(n) ∈ Θ(g(n))
0 < lim [f(n) / g(n)]         ⟹ f(n) ∈ Ω(g(n))
lim [f(n) / g(n)] = ∞        ⟹ f(n) ∈ ω(g(n))
lim [f(n) / g(n)] undefined   ⟹ can't say
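The limit test can be approximated numerically by evaluating the ratio f(n)/g(n) at one large n (a sanity check standing in for the limit, not a proof; `limit_ratio` is a hypothetical helper):

```python
import math

def limit_ratio(f, g, n=10**6):
    # evaluate f(n)/g(n) at a single large n as a numeric stand-in for the limit
    return f(n) / g(n)

r1 = limit_ratio(lambda n: 3 * n**3 + 5 * n**2 - 9, lambda n: n**3)
# r1 is close to 3, a positive constant -> Theta(n^3)
r2 = limit_ratio(lambda n: math.log(n), lambda n: n)
# r2 is close to 0 -> log n is o(n)
```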
Logarithms
x = log_b(a) is the exponent x for which a = b^x.
Natural log: ln a = log_e(a)
Binary log: lg a = log_2(a)
lg^2(a) = (lg a)^2
lg lg a = lg(lg a)

Useful identities:
a = b^(log_b a)
log_c(ab) = log_c a + log_c b
log_b(a^n) = n · log_b a
log_b a = log_c a / log_c b
log_b(1/a) = -log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log_10 n · log_2 10 = log_2 n.
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n · 3^n.
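Both facts are easy to check numerically with Python's math module (a quick sketch; the specific values 8, 2, 10 are arbitrary):

```python
import math

a, b, c, n = 8.0, 2.0, 10.0, 1000.0
# change of base: log_b(a) = log_c(a) / log_c(b) -- only a constant-factor change
assert math.isclose(math.log(a, b), math.log(a, c) / math.log(b, c))
assert math.isclose(math.log2(n), math.log2(10) * math.log10(n))
# different exponential bases differ by an exponential factor: 2^k = (2/3)^k * 3^k
k = 20
assert math.isclose(2.0 ** k, (2.0 / 3.0) ** k * 3.0 ** k)
```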
Examples: Express functions in A in asymptotic notation using functions in B.

A = 5n^2 + 100n,  B = 3n^2 + 2:
  A ∈ Θ(n^2) and n^2 ∈ Θ(B), so A ∈ Θ(B).

A = log_3(n^2),  B = log_2(n^3):
  log_b a = log_c a / log_c b, so A = 2(lg n)/lg 3 and B = 3 lg n; A/B = 2/(3 lg 3), a constant, so A ∈ Θ(B).

A = n^(lg 4),  B = 3^(lg n):
  a^(log b) = b^(log a), so B = 3^(lg n) = n^(lg 3); A/B = n^(lg(4/3)) → ∞ as n → ∞, so A ∈ ω(B).

A = lg^2 n,  B = n^(1/2):
  lim (lg^a n / n^b) = 0 (here a = 2 and b = 1/2), so A ∈ o(B).
Math Basics

Constant series: for integers a and b with a <= b:
  Σ (i = a to b) 1 = b - a + 1

Linear series (arithmetic series): for n >= 0:
  Σ (i = 1 to n) i = n(n+1)/2

Quadratic series: for n >= 0:
  Σ (i = 1 to n) i^2 = n(n+1)(2n+1)/6

Cubic series: for n >= 0:
  Σ (i = 1 to n) i^3 = n^2 (n+1)^2 / 4

Geometric series: for real x ≠ 1:
  Σ (k = 0 to n) x^k = (x^(n+1) - 1) / (x - 1)

For |x| < 1:
  Σ (k = 0 to ∞) x^k = 1 / (1 - x)
Math Basics (continued)

Linear-geometric series: for n >= 0 and real c ≠ 1:
  Σ (i = 1 to n) i·c^i = (n·c^(n+2) - (n+1)·c^(n+1) + c) / (c - 1)^2

Harmonic series: the nth harmonic number, for positive integer n:
  H_n = Σ (k = 1 to n) 1/k = ln(n) + O(1)

Telescoping series:
  Σ (k = 1 to n) (a_k - a_(k-1)) = a_n - a_0

Differentiating series: for |x| < 1:
  Σ (k = 0 to ∞) k·x^k = x / (1 - x)^2
Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires Θ(n log n). Examples:
- Sorting
- Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
• Comparative timing of programs is a difficult business:
  - experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  - bias towards a program
  - unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm?
What happens when we buy a computer 10 times faster?

T(n)        n      n'      Change               n'/n
10n         1000   10000   n' = 10n             10
20n         500    5000    n' = 10n             10
5n log n    250    1842    √10·n < n' < 10n     7.37
2n^2        70     223     n' = √10·n           3.16
2^n         13     16      n' = n + 3           -----
February 2013 89Algorithms Course Dr Aref Rashad
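The table's thought experiment can be reproduced by solving T(n') = 10·T(n) numerically (`new_size` is a hypothetical helper using bisection; the table's integer entries are rounded, so the sketch checks the growth factors rather than every cell):

```python
def new_size(T, n, factor=10.0):
    """Largest problem size solvable in factor * T(n) time, for monotone T."""
    budget = factor * T(n)
    lo, hi = float(n), float(n)
    while T(hi) < budget:          # find an upper bracket
        hi *= 2.0
    for _ in range(80):            # bisect T(n') = budget
        mid = (lo + hi) / 2.0
        if T(mid) < budget:
            lo = mid
        else:
            hi = mid
    return lo

# linear: n' = 10n; quadratic: n' = sqrt(10)*n; exponential: n' = n + log2(10)
```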
February 2013 90
Implications of Dominance:
• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

n^a dominates n^b if a > b, since lim (n→∞) n^b / n^a = n^(b-a) → 0.
n^a + o(n^a) doesn't dominate n^a, since lim (n→∞) n^a / (n^a + o(n^a)) → 1.
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
Algorithm Specification
Pseudocode Conventions (English like Statements)
- Comments included
- Data types are not explicitly declared
- Logical operators (and, or, not) can be used
- Relational operators can be used (<, >, =, ...)
- Arrays can be used, e.g. A(i, j)
- Looping statements are employed: for, while, and repeat-until
- Conditional statements can be used: If-then
February 2013 7Algorithms Course Dr Aref Rashad
1. Initialize an integer Sum to zero.
2. For all array values: increase Sum by the array value.
3. Go to step 2.
February 2013 8Algorithms Course Dr Aref Rashad
Problem: Array Sum
A Solution: For an array A with length n, sum all the array elements into a new integer.
Pseudo code:
Algorithm ArraySum(a, n)
  Sum = 0
  For i = 1 to n do
    Sum = Sum + a(i)
  return Sum
Translating a Problem into an Algorithm
Problem: Integer Multiplication
A Solution: Given any two integers A and B, multiplying A times B involves adding B to itself A times.
An initial algorithm:
1. Initialize integer C to zero.
2. If A is zero, we're done and C contains the result; otherwise proceed to step 3.
3. Add the value of B to C.
4. Decrement A.
5. Go to step 2.
Pseudocode:
Function Multiply(Integer A, Integer B)
  Integer C = 0
  While A is greater than 0
    C = C + B
    A = A - 1
  End
  Return C
End
February 2013 9 Algorithms Course Dr Aref Rashad
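The repeated-addition algorithm translates almost line for line into Python (a sketch for non-negative A, as the steps above assume):

```python
def multiply(a, b):
    # repeated addition, following the numbered steps above
    c = 0              # step 1: initialize C to zero
    while a > 0:       # step 2: done when A reaches zero
        c = c + b      # step 3: add B to C
        a = a - 1      # step 4: decrement A
    return c
```

Note that this takes Θ(A) additions, which is why it is only an initial algorithm.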
Translating a Problem into an Algorithm
February 2013 10
Problem: Sort a collection of n >= 1 elements of arbitrary type.
A Solution: From the elements that are currently unsorted, find the smallest and place it next in the sorted list.
An initial algorithm:
For i = 1 to n do
  Examine a(i) to a(n) and suppose the smallest element is at a(j)
  Interchange a(i) and a(j)
Pseudocode:
SelectionSort(a, n)
  For i = 1 to n do
    j = i
    For k = i + 1 to n do
      if a(k) < a(j) then j = k
    t = a(i); a(i) = a(j); a(j) = t
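The same selection sort in Python (0-based indexing instead of the pseudocode's 1-based indexing):

```python
def selection_sort(a):
    # repeatedly find the smallest unsorted element and swap it into position i
    n = len(a)
    for i in range(n):
        j = i
        for k in range(i + 1, n):
            if a[k] < a[j]:
                j = k
        a[i], a[j] = a[j], a[i]   # interchange a(i) and a(j)
    return a
```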
Algorithms Course Dr Aref Rashad
Analysis of algorithms
The theoretical study of computer-program performance and resource usage
What's more important than performance?
Modularity, correctness, maintainability, functionality, robustness, user-friendliness, programmer time, simplicity, extensibility, reliability.
February 2013 11Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 12
Functionality The degree to which the software satisfies stated needs (suitability accuracy inter-operability compliance and security)
Reliability The amount of time that the software is available for use (maturity fault tolerance and recoverability)
Usability The degree to which the software is easy to use (understandability learnability and operability)
Analysis of algorithms
February 2013 Algorithms Course Dr Aref Rashad 13
Efficiency The degree to which the software makes optimal use of system resources (time behavior resource behavior)
Maintainability The ease with which repair may be made to the software (analyzability changeability stability and testability)
Portability The ease with which the software can be transposed from one environment to another (adaptability replaceability)
Analysis of algorithms
February 2013 14
Why study algorithms and performance
Algorithms help us to understand scalability
Performance often draws the line between what is feasible and what is impossible
Algorithmic mathematics provides a language for talking about program behavior
Performance is the currency of computing
Help to choose between different Algorithms to solve a problem
Algorithms Course Dr Aref Rashad
Comparing Algorithms
Empirical Approach
• (1) Implement each candidate - that could be lots of work, and is also error-prone.
• (2) Run it - with which inputs? Test data?
• (3) Time it - on what machines, OS?
February 2013 15
How do we solve "which algorithm" problems without machines or test data?
Algorithms Course Dr Aref Rashad
Analytical Approach
February 2013 16
Problem → Algorithm 1, Algorithm 2, ..., Algorithm n → solve → Efficiency
(efficiency: how cost grows with the amount of data)
Algorithms Course Dr Aref Rashad
February 2013 17
Algorithm 1, Algorithm 2, ..., Algorithm n → compare Efficiency
Computational Complexity: a measure of the degree of difficulty of an algorithm
Analytical Approach
Algorithms Course Dr Aref Rashad
February 2013 18
Computational Complexity
How much effort? How costly?
There are various ways of measuring; our concern is the efficiency criteria of time and space.
Focus: how can we measure time and space?
Analytical Approach
Algorithms Course Dr Aref Rashad
Space Complexity
A measure of an algorithm's memory requirements during runtime:
• Data structures
• Temporary variables
February 2013 19
Algorithm ArraySum(a, n)
  Sum = 0
  For i = 1 to n do
    Sum = Sum + a(i)
  return Sum

The space needed equals the sum of:
- a fixed part, independent of the characteristics of the inputs and outputs (instruction space, space for variables and constants, etc.)
- a variable part, dependent on the particular problem instance
Algorithms Course Dr Aref Rashad
We focus on estimating the variable part, which depends on the number and magnitude of the inputs and outputs.
Time complexity is the sum of compile time and run time; compile time doesn't depend on the I/O characteristics.
February 2013 20
Time Complexity
• The running time depends on the input.
• Parameterize the running time by the size of the input.
• Generally we seek upper bounds on the running time, because everybody likes a guarantee.
Algorithms Course Dr Aref Rashad
Focus only on Run time
Exact Formula Summation of time needed for all operations (eg add subtract multiplyhellip) helliphelliphelliphelliphelliphelliphellip Impossible task
Experimental Approach Type compile and run on a specific machinehellip Time may differ on multiuser system
February 2013 21
Time Complexity How to measure
Algorithms Course Dr Aref Rashad
Comments helliphelliphellip 0 StepAssignment helliphelliphellip 1 StepLoops helliphelliphelliphelliphelliphelliphellip No of steps account in control part
First Method Introduce a new variable Count into the Program
Second Method Build a table to list total No of steps contributed by each statement
February 2013 22
Time Complexity
Algorithms Course Dr Aref Rashad
Approximate Approach
Count Program steps (segment independent of Problem Characteristics) Determining No of Steps
February 2013 23
Algorithm ArraySum(an) Count=-0 Sum=0 Count=Count+1 For i=1 to n do Count=Count+1helliphelliphelliphelliphelliphelliphellip for For Sum=Sum + a(i) Count=Count+1hellipfor Assignment Count=Count+1helliphelliphelliphelliphelliphelliphellip for last time of For Count=Count+1helliphelliphelliphelliphelliphelliphellip For return return SumTotal number of steps 2n+3
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Time ComplexityFirst Method Introduce a new variable Count into the Program
Algorithms Course Dr Aref Rashad
Second MethodBuild a table to list total No of steps contributed by each statementDetermine the number of steps per execution (se) of the statement and the total number of times (ie frequency) each statement is executed
Algorithm ArraySum(an) se frequency total steps Sum=0 1 1 1 For i=1 to n do 1 n+1 n+1 Sum=Sum + a(i) 1 n n return Sum 1 1 1
Total 2n+3
T(n) = 2n+3February 2013 24
Time Complexity
Algorithms Course Dr Aref Rashad
Example Polynomial Evaluation
Two ways to evaluate the polynomialp(x) = 4x4+ 7x3ndash2x2+ 3x+ 6
Brute force methodp(x) = 4xxxx + 7xxx ndash2xx + 3x + 6
Hornerrsquos methodp(x) = (((4x + 7) x ndash2) x + 3) x + 6
General form of Polynomialp(x) = a
1+ a2x1+ helliphelliphellip + an+1xn
where an is non-zero for all n gt= 0
February 2013 25Algorithms Course Dr Aref Rashad
February 2013 26
T(n) = n22 + n2
Example Polynomial Evaluation
Pseudocode Brute force method
1 Input a1hellip an+1 and x2 Initialize poly = a1
3 for i = 1hellip n poly = poly + ai+1 x x hellip x (i factors x) end of for loop4 Output poly
Algorithms Course Dr Aref Rashad
February 2013 27
Example Polynomial Evaluation
Pseudocode Hornerrsquos method
1 Input a1hellip an+1 and x2 Initialize poly = an+1
3 for i = 1hellip n poly = poly x+ an-i+1 end of for loop4 Output poly
T(n) = 2n
Algorithms Course Dr Aref Rashad
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time Complexity is not dependent solely on the number of inputs and outputs Three kinds of step accountBest case worst case and average case
Ex Sequential search for K in an array of n integers
Begin at first element in array and look at each element in turn until K is found
Best case The first position of the array has KWorst case The last position in the array has KAverage case n2
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical Considerations

There is no great difference in running time between Θ1(n) and Θ2(n log n):
- Θ1(10,000) = 10,000
- Θ2(10,000) = 10,000 · log10(10,000) = 40,000

There is an enormous difference between Θ1(n^2) and Θ2(n log n):
- Θ1(10,000) = 100,000,000
- Θ2(10,000) = 10,000 · log10(10,000) = 40,000

Remarks
- Most statements in a program do not have much effect on the running time of that program.
- There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
- Focus your attention on the parts of the program that have the most impact.
- The greatest time and space improvements come from a better data structure or algorithm.
- "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."

Remarks
- When tuning code, it is important to gather good timing statistics.
- Be careful not to use tricks that make the program unreadable.
- Make use of compiler optimizations.
- Check that your optimizations really improve the program.

An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a machine ten times faster as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (about 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.
Lecture 1: End

Comparison of Functions

Think of comparing f and g as comparing two numbers a and b:
f(n) = O(g(n))  ~  a <= b
f(n) = Ω(g(n))  ~  a >= b
f(n) = Θ(g(n))  ~  a = b
f(n) = o(g(n))  ~  a < b
f(n) = ω(g(n))  ~  a > b

Limits (as n → ∞):
lim f(n)/g(n) = 0        ⇒  f(n) ∈ o(g(n))
lim f(n)/g(n) < ∞        ⇒  f(n) ∈ O(g(n))
0 < lim f(n)/g(n) < ∞    ⇒  f(n) ∈ Θ(g(n))
0 < lim f(n)/g(n)        ⇒  f(n) ∈ Ω(g(n))
lim f(n)/g(n) = ∞        ⇒  f(n) ∈ ω(g(n))
lim f(n)/g(n) undefined  ⇒  can't say
Logarithms

x = log_b a is the exponent for which a = b^x.
Natural log: ln a = log_e a
Binary log: lg a = log_2 a
lg^2 a = (lg a)^2;  lg lg a = lg(lg a)

Useful identities:
a = b^(log_b a)
log_c(ab) = log_c a + log_c b
log_b(a^n) = n log_b a
log_b a = log_c a / log_c b
log_b(1/a) = -log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
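Each identity above can be spot-checked with Python's standard math module; the constants below are arbitrary test values, not anything from the slides:

```python
import math

# Numeric spot-checks of the logarithm identities (a, b, c, n are arbitrary).
a, b, c, n = 5.0, 2.0, 3.0, 4.0

assert math.isclose(b ** math.log(a, b), a)                           # a = b^(log_b a)
assert math.isclose(math.log(a * b, c), math.log(a, c) + math.log(b, c))
assert math.isclose(math.log(a ** n, b), n * math.log(a, b))          # log_b a^n = n log_b a
assert math.isclose(math.log(a, b), math.log(a, c) / math.log(b, c))  # change of base
assert math.isclose(math.log(1 / a, b), -math.log(a, b))
assert math.isclose(math.log(a, b), 1 / math.log(b, a))
assert math.isclose(a ** math.log(c, b), c ** math.log(a, b))         # a^(log_b c) = c^(log_b a)
print("all identities hold")
```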
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10(n) · log2(10) = log2(n). So the base of a logarithm is not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor, not a constant factor. Ex: 2^n = (2/3)^n · 3^n.

Examples: express the functions in column A in asymptotic notation using the functions in column B.

A               B             Answer
5n^2 + 100n     3n^2 + 2      A ∈ Θ(B): A ∈ Θ(n^2) and n^2 ∈ Θ(B)
log3(n^2)       log2(n^3)     A ∈ Θ(B): by log_b a = log_c a / log_c b, A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2/(3 lg 3), a constant
n^(lg 4)        3^(lg n)      A ∈ ω(B): by a^(log b) = b^(log a), B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞
lg^2 n          n^(1/2)       A ∈ o(B): lim (lg^a n / n^b) = 0 for any constants a, b > 0 (here a = 2, b = 1/2)
Math Basics: Series

Constant Series: for integers a and b with a <= b:
  Σ_{i=a..b} 1 = b - a + 1

Linear Series (Arithmetic Series): for n >= 0:
  Σ_{i=1..n} i = n(n+1)/2

Quadratic Series: for n >= 0:
  Σ_{i=1..n} i^2 = n(n+1)(2n+1)/6

Cubic Series: for n >= 0:
  Σ_{i=1..n} i^3 = n^2 (n+1)^2 / 4

Geometric Series: for real x ≠ 1:
  Σ_{k=0..n} x^k = (x^{n+1} - 1)/(x - 1)
For |x| < 1:
  Σ_{k=0..∞} x^k = 1/(1 - x)

Linear-Geometric Series: for n >= 0 and real c ≠ 1:
  Σ_{i=1..n} i·c^i = (n·c^{n+2} - (n+1)·c^{n+1} + c)/(c - 1)^2

Harmonic Series: the nth harmonic number, n ∈ I+:
  H_n = Σ_{k=1..n} 1/k = ln(n) + O(1)

Telescoping Series:
  Σ_{k=1..n} (a_k - a_{k-1}) = a_n - a_0

Differentiating Series: for |x| < 1:
  Σ_{k=0..∞} k·x^k = x/(1 - x)^2
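The closed forms above are easy to verify against direct summation; a quick sketch with arbitrary test values:

```python
# Spot-check the series closed forms by brute-force summation.
n, x, c = 20, 0.5, 3.0

assert sum(i for i in range(1, n + 1)) == n * (n + 1) // 2
assert sum(i**2 for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
assert sum(i**3 for i in range(1, n + 1)) == n**2 * (n + 1)**2 // 4

# Geometric series (x != 1)
assert abs(sum(x**k for k in range(n + 1)) - (x**(n + 1) - 1) / (x - 1)) < 1e-12

# Linear-geometric series (c != 1)
lhs = sum(i * c**i for i in range(1, n + 1))
rhs = (n * c**(n + 2) - (n + 1) * c**(n + 1) + c) / (c - 1)**2
assert abs(lhs - rhs) / rhs < 1e-12
print("all series identities hold")
```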
Math Basics

Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires Θ(n log n). Examples:
- Sorting
- Searching

Practical Considerations: Remarks
- Comparative timing of programs is a difficult business:
  - Experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  - Bias towards a program
  - Unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster? (n = largest problem size solvable in the fixed time budget on the old machine; n' = on the new machine)

T(n)       n      n'      Change               n'/n
10n        1,000  10,000  n' = 10n             10
20n        500    5,000   n' = 10n             10
5n log n   250    1,842   √10·n < n' < 10n     7.37
2n^2       70     223     n' = √10·n           3.16
2^n        13     16      n' = n + 3           --
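The n and n' columns can be reproduced by searching for the largest problem size whose cost fits each machine's operation budget. This sketch assumes the old machine performs 10,000 basic operations in the fixed time budget and the 10x machine performs 100,000 (illustrative figures consistent with the table; `solve_size` is a made-up helper name):

```python
import math

def solve_size(T, budget, lo=1, hi=10**5):
    """Binary-search the largest integer n with T(n) <= budget (T monotone)."""
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if T(mid) <= budget:
            lo = mid
        else:
            hi = mid - 1
    return lo

rows = [
    lambda n: 10 * n,
    lambda n: 20 * n,
    lambda n: 5 * n * math.log2(n),
    lambda n: 2 * n * n,
    lambda n: 2**n,
]
old_sizes = [solve_size(T, 10_000) for T in rows]
new_sizes = [solve_size(T, 100_000) for T in rows]
print(old_sizes)   # [1000, 500, 250, 70, 13]
print(new_sizes)
```

The computed n' values agree with the table up to rounding in the n log n row.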
Implications of Dominance
- Exponential algorithms get hopeless fast.
- Quadratic algorithms get hopeless at or before n = 1,000,000.
- O(n log n) is possible to about one billion.
- O(log n) never sweats.

n^a dominates n^b if a > b, since
  lim_{n→∞} n^b / n^a = n^(b-a) → 0.
n^a + o(n^a) doesn't dominate n^a, since
  lim_{n→∞} n^a / (n^a + o(n^a)) → 1.
Translating a Problem into an Algorithm

Problem: Array Sum
A Solution: For an array a with length n, sum all the array elements into a new integer.

An initial Algorithm:
1. Initialize an integer Sum to zero.
2. For each array value, increase Sum by that value.
3. Return Sum.

Pseudocode:
Algorithm ArraySum(a, n)
  Sum = 0
  For i = 1 to n do
    Sum = Sum + a(i)
  return Sum
Translating a Problem into an Algorithm

Problem: Integer Multiplication
A Solution: Given two integers A and B (with A non-negative), multiplying A times B amounts to adding B to itself A times.

An initial Algorithm:
1. Initialize integer C to zero.
2. If A is zero, we're done and C contains the result; otherwise proceed to step 3.
3. Add the value of B to C.
4. Decrement A.
5. Go to step 2.

Pseudocode:
Function Multiply(Integer A, Integer B)
  Integer C = 0
  While A is greater than 0
    C = C + B
    A = A - 1
  End
  Return C
End
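A direct Python transcription of Multiply (valid for non-negative A, as the repeated-addition idea assumes):

```python
def multiply(a: int, b: int) -> int:
    """Multiply a * b by repeated addition; a must be a non-negative integer."""
    c = 0
    while a > 0:
        c += b
        a -= 1
    return c

print(multiply(6, 7))   # 42
```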
Translating a Problem into an Algorithm

Problem: Sort a collection of n >= 1 elements of arbitrary type.
A Solution: From those elements that are currently unsorted, find the smallest and place it next in the sorted list.

An initial Algorithm:
For i = 1 to n do
  Examine a(i) to a(n) and suppose the smallest element is at a(j)
  Interchange a(i) and a(j)

Pseudocode:
SelectionSort(a, n)
  For i = 1 to n do
    j = i
    For k = i+1 to n do
      if a(k) < a(j) then j = k
    t = a(i); a(i) = a(j); a(j) = t
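A direct Python rendering of the selection-sort pseudocode, using 0-based indexing instead of the slides' 1-based indexing:

```python
def selection_sort(a: list) -> list:
    """In-place selection sort: repeatedly place the smallest unsorted element next."""
    n = len(a)
    for i in range(n):
        j = i                        # index of the smallest element seen so far
        for k in range(i + 1, n):
            if a[k] < a[j]:
                j = k
        a[i], a[j] = a[j], a[i]      # interchange a(i) and a(j)
    return a

print(selection_sort([29, 10, 14, 37, 13]))   # [10, 13, 14, 29, 37]
```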
Analysis of algorithms

The theoretical study of computer-program performance and resource usage.

What's more important than performance?
- Modularity, Correctness
- Maintainability, Functionality
- Robustness, User-friendliness
- Programmer time, Simplicity
- Extensibility, Reliability

Functionality: the degree to which the software satisfies stated needs (suitability, accuracy, interoperability, compliance, and security).
Reliability: the amount of time that the software is available for use (maturity, fault tolerance, and recoverability).
Usability: the degree to which the software is easy to use (understandability, learnability, and operability).
Efficiency: the degree to which the software makes optimal use of system resources (time behavior, resource behavior).
Maintainability: the ease with which repairs may be made to the software (analyzability, changeability, stability, and testability).
Portability: the ease with which the software can be transposed from one environment to another (adaptability, replaceability).
Why study algorithms and performance?
- Algorithms help us to understand scalability.
- Performance often draws the line between what is feasible and what is impossible.
- Algorithmic mathematics provides a language for talking about program behavior.
- Performance is the currency of computing.
- Analysis helps us choose between different algorithms that solve the same problem.

Comparing Algorithms: Empirical Approach
(1) Implement each candidate. That could be lots of work, and is also error-prone.
(2) Run it. On which inputs? What test data?
(3) Time it. On what machines, under which OS?

How can we answer "which algorithm?" questions without machines or test data?
Analytical Approach

A problem can be solved by many algorithms (Algorithm 1, Algorithm 2, ..., Algorithm n). To compare them we need a measure of efficiency that grows proportionally with the amount of data.

Computational Complexity: a measure of the degree of difficulty of an algorithm.
- How much effort? How costly?
- There are various ways of measuring; our concern is the efficiency criteria of time and space.

Focus: how can we measure time and space?
Space Complexity

A measure of an algorithm's memory requirements during runtime (data structures, temporary variables).

Algorithm ArraySum(a, n)
  Sum = 0
  For i = 1 to n do
    Sum = Sum + a(i)
  return Sum

The space needed is the sum of:
- A fixed part, independent of the characteristics of inputs and outputs (instruction space; space for simple variables and constants, etc.).
- A variable part, dependent on the particular problem instance.

We focus on estimating the variable part, which depends on the number and magnitude of inputs and outputs.
Time Complexity

The total time is the sum of compile time and run time. Compile time doesn't depend on the instance characteristics, so we focus only on run time.

- The running time depends on the input.
- Parameterize the running time by the size of the input.
- Generally we seek upper bounds on the running time, because everybody likes a guarantee.

How to measure?
- Exact formula: sum the time needed for all operations (add, subtract, multiply, ...). An impossible task.
- Experimental approach: type, compile, and run on a specific machine. But times may differ on a multiuser system.
- Approximate approach: count program steps (a step is a program segment whose time is independent of the problem characteristics).

Counting conventions:
- Comments: 0 steps.
- Assignment: 1 step.
- Loops: count the steps of the control part.

First method: introduce a new variable, Count, into the program.
Second method: build a table listing the total number of steps contributed by each statement.
First method, applied to ArraySum:

Algorithm ArraySum(a, n)
  Count = 0
  Sum = 0
  Count = Count + 1            ... for the assignment to Sum
  For i = 1 to n do
    Count = Count + 1          ... for the For test
    Sum = Sum + a(i)
    Count = Count + 1          ... for the assignment
  Count = Count + 1            ... for the last test of the For loop
  Count = Count + 1            ... for the return
  return Sum

Total number of steps: 2n + 3

Second method: determine the number of steps per execution (s/e) of each statement and the total number of times (frequency) each statement is executed.

Algorithm ArraySum(a, n)    s/e   frequency   total steps
  Sum = 0                    1     1           1
  For i = 1 to n do          1     n+1         n+1
    Sum = Sum + a(i)         1     n           n
  return Sum                 1     1           1

Total: T(n) = 2n + 3
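The first method translates directly into instrumented code; a sketch whose counter increments mirror the slide's Count statements:

```python
def array_sum_counted(a):
    """ArraySum instrumented with a step counter, as in the first method above."""
    count = 0
    s = 0
    count += 1              # assignment Sum = 0
    for x in a:
        count += 1          # loop-control test that enters the body
        s += x
        count += 1          # assignment Sum = Sum + a(i)
    count += 1              # final loop-control test that exits the loop
    count += 1              # return statement
    return s, count

n = 10
total, steps = array_sum_counted(list(range(1, n + 1)))
print(total, steps)   # 55 23, i.e. steps = 2n + 3 for n = 10
```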
Example: Polynomial Evaluation

Two ways to evaluate the polynomial p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6:

Brute force method:
p(x) = 4·x·x·x·x + 7·x·x·x - 2·x·x + 3·x + 6

Horner's method:
p(x) = (((4x + 7)x - 2)x + 3)x + 6

General form of a polynomial of degree n:
p(x) = a_1 + a_2·x + ... + a_{n+1}·x^n,  where a_{n+1} is non-zero and n >= 0.

Pseudocode: brute force method
1. Input a_1, ..., a_{n+1} and x
2. Initialize poly = a_1
3. for i = 1, ..., n:
     poly = poly + a_{i+1}·x·x·...·x   (i factors of x)
   end of for loop
4. Output poly

T(n) = n^2/2 + n/2
Example: Polynomial Evaluation

Pseudocode: Horner's method
1. Input a_1, ..., a_{n+1} and x
2. Initialize poly = a_{n+1}
3. for i = 1, ..., n:
     poly = poly·x + a_{n-i+1}
   end of for loop
4. Output poly

T(n) = 2n   (one multiplication and one addition per iteration)
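The two evaluation orders can be compared by counting multiplications; a sketch with illustrative function names, using the slide's polynomial:

```python
def poly_brute(coeffs, x):
    """Evaluate p(x) = c0 + c1*x + ... + cn*x^n naively; return (value, multiplications)."""
    value, mults = coeffs[0], 0
    for i in range(1, len(coeffs)):
        term = coeffs[i]
        for _ in range(i):        # i explicit factors of x
            term *= x
            mults += 1
        value += term
    return value, mults

def poly_horner(coeffs, x):
    """Evaluate the same polynomial with Horner's rule; return (value, multiplications)."""
    value, mults = coeffs[-1], 0
    for c in reversed(coeffs[:-1]):
        value = value * x + c
        mults += 1
    return value, mults

# p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6, stored lowest degree first
coeffs = [6, 3, -2, 7, 4]
print(poly_brute(coeffs, 2))    # (124, 10): n(n+1)/2 = 10 multiplications for n = 4
print(poly_horner(coeffs, 2))   # (124, 4):  n = 4 multiplications
```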
Time Complexity: Best, Worst, and Average Case

Time complexity is not dependent solely on the number of inputs and outputs. Three kinds of step counts exist: best case, worst case, and average case.

Example: sequential search for a key K in an array of n integers. Begin at the first element in the array and look at each element in turn until K is found.
- Best case: the first position of the array holds K (1 comparison).
- Worst case: the last position of the array holds K (n comparisons).
- Average case: n/2 comparisons.

The best case: normally we are not interested in the best case, because it is too optimistic and not a fair characterization of an algorithm's running time. It is useful in some rare cases where the best case has a high probability of occurring.

The average case: often we prefer to know the average-case running time, since it reveals the typical behavior of the algorithm on inputs of size n. However, average-case estimation is not always possible.

The worst case: useful in many real-time applications, where the algorithm must perform at least that well. It might not be a representative measure of the behavior of the algorithm on inputs of size n.

Note: if we know enough about the distribution of our input, we prefer the average-case analysis. If we do not know the distribution, then we must resort to worst-case analysis.
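A sketch of the sequential-search example with an explicit comparison counter, showing the best and worst cases:

```python
def sequential_search(a, key):
    """Return (index, comparisons) for key in a, or (-1, comparisons) if absent."""
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1
        if x == key:
            return i, comparisons
    return -1, comparisons

a = [7, 3, 9, 4, 1]
print(sequential_search(a, 7))   # (0, 1): best case, K in the first position
print(sequential_search(a, 1))   # (4, 5): worst case, K in the last position (n comparisons)
```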
Growth Rate of the Complexity Time Function

Example: f(n) = n^2 + 100n + log10(n) + 1000. For small n the constant term dominates; as n grows, the n^2 term comes to dominate all the others.

Upper Bound Concept

Worst-case time depends on the speed of our computer: relative speed (on the same machine) and absolute speed (on different machines).

BIG IDEA:
- Ignore machine-dependent constants.
- Look at the growth of T(n) as n → ∞.

"Asymptotic Analysis" (n large):
T(n) = 2n + 3,   g(n) = 3n    →  O(n)
T(n) = 4n + 8,   g(n) = 5n    →  O(n)
T(n) = n^2 + 8,  g(n) = n^2   →  O(n^2)

The upper bound describes the number of steps for large n; typical growth classes include O(log n), O(n), O(n^2), and O(n^3).

[Table: growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz.]
Algorithm Matters

- Big-Oh notation indicates an upper bound: how bad things can get (perhaps things are not nearly so bad).
- We want the lowest possible upper bound.
- Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
- Example: for linear search, n^2 is an upper bound, but n is the tightest upper bound.

The "Big-Oh" Notation

Complexity Analysis: Calculating Running Time T(n)
- Use the source/pseudo code.
- Ignore constants.
- Ignore lower-order terms.
- Explicitly assume either the average case (harder to do) or the worst case (easier).
- Most analysis uses the worst case.
Linear-time loop:
  for x = 1 to n:
      constant-time operation
(Note: constant-time means independent of the input size.)
  Σ_{x=1..n} 1 = n   →  O(n)

2-nested loops (quadratic):
  for x = 0 to n-1:
      for y = 0 to n-1:
          constant-time operation
  Σ_{x=0..n-1} Σ_{y=0..n-1} 1 = n·n = n^2   →  O(n^2)

3-nested loops (cubic):
  for x = 0 to n-1:
      for y = 0 to n-1:
          for z = 0 to n-1:
              constant-time operation
  f(n) = n^3. The number of nested loops determines the exponent.

Add independent loops:
  for x = 0 to n-1: constant-time op
  for y = 0 to n-1:
      for z = 0 to n-1: constant-time op
  for w = 0 to n-1: constant-time op
  f(n) = n + n^2 + n = n^2 + 2n   →  O(n^2)

Non-trivial loops:
  for x = 1 to n:
      for y = 1 to x:
          constant-time operation
  Σ_{x=1..n} Σ_{y=1..x} 1 = Σ_{x=1..n} x = n(n+1)/2   →  O(n^2)

  for z = 1 to n:
      for y = 1 to z:
          for x = 1 to y:
              constant-time op
  Σ_{z=1..n} Σ_{y=1..z} y = Σ_{z=1..n} z(z+1)/2 = n(n+1)(n+2)/6 = (n^3 + 3n^2 + 2n)/6   →  O(n^3)
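The triangular and "tetrahedral" loop counts above can be confirmed by brute-force counting:

```python
def count_triangular(n):
    """Count iterations of: for x in 1..n, for y in 1..x."""
    ops = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            ops += 1
    return ops

def count_tetrahedral(n):
    """Count iterations of: for z in 1..n, for y in 1..z, for x in 1..y."""
    ops = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                ops += 1
    return ops

n = 12
print(count_triangular(n), n * (n + 1) // 2)             # both 78
print(count_tetrahedral(n), n * (n + 1) * (n + 2) // 6)  # both 364
```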
The "Big-Oh" Notation

E.g. 3x^3 + 5x^2 - 9 = O(x^3). This doesn't mean "3x^3 + 5x^2 - 9 equals the function O(x^3)". It actually means "3x^3 + 5x^2 - 9 is dominated by x^3", and is read as "3x^3 + 5x^2 - 9 is big-Oh of x^3".

In fact, 3x^3 + 5x^2 - 9 is smaller than 5x^3 for large enough values of x.
O-notation (upper bounds)

f(n) is O(g(n)) if f grows at most as fast as g: g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above), and c·g(n) is an approximation to f(n) bounding it from above.

Big-Oh Rules
- If f(n) is a polynomial of degree d, then f(n) is O(n^d): drop lower-order terms and constant factors.
- Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".
- Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Example 1: If T(n) = 3n^2, then T(n) is in O(n^2).
Example 2: T(n) = c1·n^2 + c2·n (average case).
  c1·n^2 + c2·n <= c1·n^2 + c2·n^2 <= (c1 + c2)·n^2 for all n > 1,
  so T(n) <= c·n^2 for c = c1 + c2 and n0 = 1. Therefore T(n) is in O(n^2) by the definition.
Example 3: T(n) = c. We say this is in O(1).
Example: 7n - 2 is O(n).
By definition we need to find a real constant c > 0 and an integer constant n0 >= 1 such that 7n - 2 <= c·n for every integer n >= n0. A possible choice is c = 7 and n0 = 1: 7n - 2 <= 7n for n >= 1.

More examples:
- 20n^3 + 10n·log n + 5 is O(n^3):  20n^3 + 10n·log n + 5 <= 35n^3 for n >= 1.
- 3·log n + log log n is O(log n):  3·log n + log log n <= 4·log n for n >= 2.
- 2^100 is O(1):  2^100 <= 2^100 · 1 for n >= 1.
- 5/n is O(1/n):  5/n <= 5·(1/n) for n >= 1.
Set definition of O-notation: O(g(n)) is the set of all functions f(n) for which such constants c and n0 exist.

Ω-notation (lower bounds)

Meaning: for all data sets big enough (i.e., n > n0), the algorithm always executes in at least c·g(n) steps.

Example: T(n) = c1·n^2 + c2·n.
  c1·n^2 + c2·n >= c1·n^2 for all n > 1, so T(n) >= c·n^2 for c = c1 and n0 = 1.
  Therefore T(n) is in Ω(n^2) by the definition. (For a lower bound we want the greatest lower bound.)

Θ-notation

When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
1 Initialize integer C to zero2 If A is zero wersquore done and C Contains the result Otherwise proceed to step 33 Add the value of B to C4 Decrement A5 Go to step 2
Pseudocode Function Multiply(Integer A Integer B)Integer C = 0While A is greater than 0C = C + BA = A - 1EndReturn CEnd
February 2013 9Algorithms Course Dr Aref Rashad
Translating a Problem into an AlgorithmProblem Integers Multiplication
A Solution Given any two integers A and B we can say that multiplying A times B involves adding B to itself A times An initial Algorithm
Translating a Problem into an Algorithm
February 2013 10
Problem Sort a collection of ngt=1 elements of arbitrary type
A Solution From those elements that are currently unsorted find the smallest and place it next in the sorted listAn initial AlgorithmFor i=1 to n do Examine a(i) to a(n) and suppose the smallest element is at a(j) Interchange a(i) and a(j)
Pseudocode SelectionSort (an) For i=1 to n do j=I For k=i+1 to n do if ( a(k) lt a(j) ) then j=k t=a(i) a(i)= a(j) a(j)=t
Algorithms Course Dr Aref Rashad
Analysis of algorithms
The theoretical study of computer-program performance and resource usage
Whatrsquos more important than performanceModularity CorrectnessMaintainability FunctionalityRobustness user-friendlinessProgrammer time SimplicityExtensibility Reliability
February 2013 11Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 12
Functionality The degree to which the software satisfies stated needs (suitability accuracy inter-operability compliance and security)
Reliability The amount of time that the software is available for use (maturity fault tolerance and recoverability)
Usability The degree to which the software is easy to use (understandability learnability and operability)
Analysis of algorithms
February 2013 Algorithms Course Dr Aref Rashad 13
Efficiency The degree to which the software makes optimal use of system resources (time behavior resource behavior)
Maintainability The ease with which repair may be made to the software (analyzability changeability stability and testability)
Portability The ease with which the software can be transposed from one environment to another (adaptability replaceability)
Analysis of algorithms
February 2013 14
Why study algorithms and performance
Algorithms help us to understand scalability
Performance often draws the line between what is feasible and what is impossible
Algorithmic mathematics provides a language for talking about program behavior
Performance is the currency of computing
Help to choose between different Algorithms to solve a problem
Algorithms Course Dr Aref Rashad
Comparing Algorithms
Empirical Approach
bull (1) Implement each candidate ndash That could be lots of work ndash also error-prone
bull (2) Run it ndash Which inputs Test data
bull (3) Time itndash What machines OS
February 2013 15
How to solve ldquowhich algorithmrdquo problems without machines or test data
Algorithms Course Dr Aref Rashad
Analytical Approach
February 2013 16
Problem
Algorithm 1
Algorithm 2hellip
Algorithm n
solve Efficiency
Grow proportionally with the amount of data
Algorithms Course Dr Aref Rashad
February 2013 17
Algorithm 1
Algorithm 2hellip
Algorithm n
Efficiency
Compare
Computational Complexity
Measure the degree of difficulty of an algorithm
Analytical Approach
Algorithms Course Dr Aref Rashad
Computational Complexity
How much effort? How costly?
There are various ways of measuring; our concern is the efficiency criteria of time and space.
Focus: How can we measure time and space?
Space Complexity
A measure of an algorithm's memory requirements during runtime:
- Data structures
- Temporary variables

Algorithm ArraySum(a, n)
    Sum = 0
    For i = 1 to n do
        Sum = Sum + a(i)
    return Sum

The space needed equals the sum of:
- A fixed part, independent of the characteristics of the inputs and outputs (instruction space, space for simple variables and constants, etc.).
- A variable part, dependent on the problem instance.
We focus on estimating the variable part, which depends on the number and magnitude of the inputs and outputs.
Time Complexity
The time taken is the sum of compile time and run time. Compile time doesn't depend on the instance characteristics, so we focus only on run time.
- The running time depends on the input.
- Parameterize the running time by the size of the input.
- Generally we seek upper bounds on the running time, because everybody likes a guarantee.

Time Complexity: How to measure?
- Exact formula: summing the time needed for every operation (e.g., add, subtract, multiply, ...) is an impossible task.
- Experimental approach: type, compile, and run on a specific machine; times may differ on a multiuser system.
Approximate approach: count program steps (a step is a program segment whose execution time is independent of the problem characteristics).
- Comments: 0 steps
- Assignment: 1 step
- Loops: the number of steps is counted in the control part
Determining the number of steps:
First method: introduce a new variable, Count, into the program.
Second method: build a table listing the total number of steps contributed by each statement.
Time Complexity - First Method: introduce a new variable, Count, into the program.

Algorithm ArraySum(a, n)
    Count = 0
    Sum = 0; Count = Count + 1
    For i = 1 to n do
        Count = Count + 1 ........ for the For test
        Sum = Sum + a(i); Count = Count + 1 ........ for the assignment
    Count = Count + 1 ........ for the last For test
    Count = Count + 1 ........ for the return
    return Sum

Total number of steps: 2n + 3

Second Method: build a table listing the total number of steps contributed by each statement. Determine the number of steps per execution (s/e) of each statement and the total number of times (i.e., frequency) each statement is executed.

Statement              s/e   frequency   total steps
Sum = 0                 1        1            1
For i = 1 to n do       1      n + 1        n + 1
Sum = Sum + a(i)        1        n            n
return Sum              1        1            1
Total                                       2n + 3

T(n) = 2n + 3
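The 2n + 3 step count can be verified mechanically. Here is a small Python sketch (not from the slides; the function name is mine) that instruments the summation loop the same way the Count method does:

```python
def array_sum_counted(a):
    """Sum a list while counting steps by the slides' convention:
    1 step per assignment, 1 per loop-control test, 1 for the return."""
    count = 1          # Sum = 0 is one assignment step
    total = 0
    for x in a:
        count += 1     # one 'for' control step per iteration
        total += x
        count += 1     # Sum = Sum + a(i)
    count += 1         # the final, failing 'for' test
    count += 1         # return Sum
    return total, count

total, steps = array_sum_counted([4, 7, -2, 3, 6])
# with n = 5 elements the count matches T(n) = 2n + 3 = 13
```

Running it for several array sizes confirms the count is 2n + 3 regardless of the values being summed.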
Example: Polynomial Evaluation
Suppose that exponentiation is carried out using multiplications. Two ways to evaluate the polynomial p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6 are:
Brute force method: p(x) = 4*x*x*x*x + 7*x*x*x - 2*x*x + 3*x + 6
Horner's method: p(x) = (((4x + 7)x - 2)x + 3)x + 6
General form of a polynomial: p(x) = a1 + a2*x + ... + a(n+1)*x^n, where the leading coefficient a(n+1) is non-zero, for n >= 0.
Example: Polynomial Evaluation
Pseudocode: Brute force method
1. Input a1, ..., a(n+1) and x
2. Initialize poly = a1
3. for i = 1, ..., n
       poly = poly + a(i+1) * x * x * ... * x   (i factors of x)
   end of for loop
4. Output poly

T(n) = n^2/2 + n/2
Example: Polynomial Evaluation
Pseudocode: Horner's method
1. Input a1, ..., a(n+1) and x
2. Initialize poly = a(n+1)
3. for i = 1, ..., n
       poly = poly * x + a(n-i+1)
   end of for loop
4. Output poly

T(n) = 2n
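To make the comparison concrete, here is a Python sketch (function names are mine, not from the slides) that evaluates the example polynomial both ways while counting multiplications: the brute force method performs 1 + 2 + ... + n = n(n+1)/2 of them, Horner's method only n.

```python
def brute_force_eval(coeffs, x):
    """coeffs[i] is a(i+1), the coefficient of x**i.
    Term i is built with i explicit multiplications by x."""
    mults = 0
    poly = coeffs[0]
    for i in range(1, len(coeffs)):
        term = coeffs[i]
        for _ in range(i):      # i factors of x
            term *= x
            mults += 1
        poly += term
    return poly, mults

def horner_eval(coeffs, x):
    """One multiplication and one addition per coefficient."""
    mults = 0
    poly = coeffs[-1]
    for c in reversed(coeffs[:-1]):
        poly = poly * x + c
        mults += 1
    return poly, mults

coeffs = [6, 3, -2, 7, 4]       # 6 + 3x - 2x^2 + 7x^3 + 4x^4, n = 4
bf_val, bf_mults = brute_force_eval(coeffs, 2)
h_val, h_mults = horner_eval(coeffs, 2)
# bf_mults = 1+2+3+4 = n(n+1)/2 = 10, h_mults = n = 4, same value either way
```

Both routines return the same value; only the operation counts differ, which is exactly the point of the slide.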
Time Complexity: Best case, worst case, and average case
Time complexity does not depend solely on the number of inputs and outputs; three kinds of step counts are used: best case, worst case, and average case.
Example: sequential search for K in an array of n integers. Begin at the first element in the array and look at each element in turn until K is found.
- Best case: the first position of the array holds K.
- Worst case: the last position in the array holds K.
- Average case: n/2 positions are examined.
The best case: normally we are not interested in the best case, because it is too optimistic and is not a fair characterization of an algorithm's running time. It is useful in some rare cases where the best case has a high probability of occurring.
The average case: often we prefer to know the average-case running time, because it reveals the typical behavior of the algorithm on inputs of size n. Average-case estimation is not always possible.
The worst case: useful in many real-time applications; the algorithm must perform at least that well. It might not be a representative measure of the behavior of the algorithm on inputs of size n.
Note: if we know enough about the distribution of our input, we prefer the average-case analysis. If we do not know the distribution, then we must resort to worst-case analysis.
Growth Rate of the Complexity (Time) Function
Example: f(n) = n^2 + 100n + log10(n) + 1000
Upper Bound Concept
Worst-case time:
- It depends on the speed of our computer:
  - relative speed (on the same machine)
  - absolute speed (on different machines)
BIG IDEA:
- Ignore machine-dependent constants.
- Look at the growth of T(n) as n → ∞.
Time Complexity: "Asymptotic Analysis" (n large)
Examples: upper bound on the number of steps for large n:
- Algorithm 1: T(n) = 2n + 3, g(n) = 3n → O(n)
- Algorithm 2: T(n) = 4n + 8, g(n) = 5n → O(n)
- Algorithm 3: T(n) = n^2 + 8, g(n) = n^2 → O(n^2)
[Chart: growth-rate curves for classes such as O(log n), O(n), O(n^2), O(n^3)]
"Asymptotic Analysis": growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz.
Algorithm Matters
- Big-Oh notation indicates an upper bound: how bad things can get - though perhaps things are not nearly so bad.
- We want the lowest possible upper bound.
- Asymptotic analysis is a useful tool that helps structure our thinking about what happens when n gets large enough.
- Example: for linear search, n^2 is an upper bound, but n is the tightest upper bound.
The "Big-Oh" Notation
Complexity Analysis
Calculating running time T(n):
- Use the source/pseudo code.
- Ignore constants.
- Ignore lower-order terms.
- Explicitly assume either the average case (harder to do) or the worst case (easier).
- Most analysis uses the worst case.
Linear-time loop:

    for x = 1 to n
        constant-time operation

Note: constant-time means independent of the input size.
Total steps: sum over x = 1..n of 1 = n

2-Nested loops (quadratic):

    for x = 0 to n-1
        for y = 0 to n-1
            constant-time operation

Total steps: sum over x = 0..n-1 of (sum over y = 0..n-1 of 1) = sum over x = 0..n-1 of n = n * n = n^2
3-Nested loops (cubic):

    for x = 0 to n-1
        for y = 0 to n-1
            for z = 0 to n-1
                constant-time operation

f(n) = n^3 ...... the number of nested loops determines the exponent.

Add independent loops:

    for x = 0 to n-1
        constant-time op
    for y = 0 to n-1
        for z = 0 to n-1
            constant-time op
    for w = 0 to n-1
        constant-time op

f(n) = n + n^2 + n = n^2 + 2n
Non-trivial loops:

    for x = 1 to n
        for y = 1 to x
            constant-time operation

Total steps: sum over x = 1..n of x = n(n+1)/2 = n^2/2 + n/2, which is O(n^2).

    for z = 1 to n
        for y = 1 to z
            for x = 1 to y
                constant-time op

Total steps: n(n+1)(n+2)/6 = n^3/6 + n^2/2 + n/3, which is O(n^3).
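These closed forms are easy to sanity-check by actually running the loops and counting the inner operations. A short Python sketch (mine, not from the slides):

```python
def triangular_count(n):
    # for x = 1..n, for y = 1..x: count the inner operations
    count = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            count += 1
    return count

def tetrahedral_count(n):
    # for z = 1..n, for y = 1..z, for x = 1..y
    count = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                count += 1
    return count

for n in (1, 5, 30):
    assert triangular_count(n) == n * (n + 1) // 2
    assert tetrahedral_count(n) == n * (n + 1) * (n + 2) // 6
```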
The "Big-Oh" Notation
E.g., 3x^3 + 5x^2 - 9 = O(x^3). This doesn't mean "3x^3 + 5x^2 - 9 equals the function O(x^3)". What it actually means is "3x^3 + 5x^2 - 9 is dominated by x^3", read as "3x^3 + 5x^2 - 9 is big-Oh of x^3".
In fact, 3x^3 + 5x^2 - 9 is smaller than 5x^3 for large enough values of x.
O-notation (upper bounds)
f(n) is O(g(n)) if there exist constants c > 0 and n0 >= 1 such that f(n) <= c*g(n) for all n >= n0.
So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above): f(n) is O(g(n)) if f grows at most as fast as g, and c*g(n) is an approximation to f(n), bounding it from above.
Big-Oh Rules
- If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e., drop lower-order terms and drop constant factors.
- Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".
- Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".
Example 1: If T(n) = 3n^2, then T(n) is in O(n^2).
Example 2: T(n) = c1*n^2 + c2*n in the average case.
c1*n^2 + c2*n <= c1*n^2 + c2*n^2 = (c1 + c2)*n^2 for all n > 1,
so T(n) <= c*n^2 for c = c1 + c2 and n0 = 1.
Therefore T(n) is in O(n^2) by the definition.
Example 3: T(n) = c. We say this is in O(1).
Example: 7n - 2 is O(n).
By definition we need to find:
- a real constant c > 0
- an integer constant n0 >= 1
such that 7n - 2 <= c*n for every integer n >= n0.
A possible choice is c = 7 and n0 = 1: 7n - 2 <= 7n for n >= 1.
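Such a (c, n0) witness can be spot-checked numerically. A finite check is not a proof, but it catches a wrong constant quickly; a minimal Python sketch (the helper name is mine):

```python
def witness_holds(f, g, c, n0, upto=10_000):
    """Check f(n) <= c*g(n) for n0 <= n <= upto (a sanity check only)."""
    return all(f(n) <= c * g(n) for n in range(n0, upto + 1))

# 7n - 2 <= 7n for every n >= 1
ok = witness_holds(lambda n: 7 * n - 2, lambda n: n, c=7, n0=1)
```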
More examples:
- 20n^3 + 10n log n + 5 is O(n^3): 20n^3 + 10n log n + 5 <= 35n^3 for n >= 1.
- 3 log n + log log n is O(log n): 3 log n + log log n <= 4 log n for n >= 2.
- 2^100 is O(1): 2^100 <= 2^100 * 1 for n >= 1.
- 5/n is O(1/n): 5/n <= 5 * (1/n) for n >= 1.
Set definition of O-notation: O(g(n)) = { f(n) : there exist positive constants c and n0 such that f(n) <= c*g(n) for all n >= n0 }.

Ω-notation (lower bounds)
Meaning: for all data sets big enough (i.e., n > n0), the algorithm always executes in more than c*g(n) steps.
Example: T(n) = c1*n^2 + c2*n.
c1*n^2 + c2*n >= c1*n^2 for all n > 1, so T(n) >= c*n^2 for c = c1 and n0 = 1.
Therefore T(n) is in Ω(n^2) by the definition.
We want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.
Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, Ω
Intuition for asymptotic notation:
- Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
- Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
- Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time: O(1)
O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)
An algorithm runs in O(N) if the number of operations required is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.

Quadratic Time: O(N^2)
An algorithm runs in O(N^2) if the number of operations required is proportional to the square of the number of items being processed. Example: everyone in a group shaking hands with everyone else.

Logarithmic Time: O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time increases only by a factor of log10(1,000,000) = 6.

Factorial Time: O(N!)
It is far worse than even O(N^2) and O(N^3). It's fairly unusual to encounter functions with this kind of behavior.
Practical Considerations
There is no such big difference in running time between Θ1(n) and Θ2(n log n):
- Θ1(10,000) = 10,000
- Θ2(10,000) = 10,000 * log10(10,000) = 40,000
There is an enormous difference between Θ1(n^2) and Θ2(n log n):
- Θ1(10,000) = 100,000,000
- Θ2(10,000) = 10,000 * log10(10,000) = 40,000
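The arithmetic behind those figures (the slides use a base-10 logarithm here) can be reproduced as a quick Python check:

```python
import math

n = 10_000
linear = n                       # Θ1(n)
n_log_n = n * math.log10(n)      # Θ2(n log n), base 10 as on the slide
quadratic = n * n                # Θ1(n^2)
# linear = 10,000; n_log_n = 40,000; quadratic = 100,000,000
```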
Remarks:
- Most statements in a program do not have much effect on the running time of that program.
- There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
- Focus your attention on the parts of the program that have the most impact.
- The greatest time and space improvements come from a better data structure or algorithm: "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."
- When tuning code it is important to gather good timing statistics.
- Be careful not to use tricks that make the program unreadable.
- Make use of compiler optimizations.
- Check that your optimizations really improve the program.
Remarks:
An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16).
Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions
For f(n) = n^a and g(n) = n^b:
- f(n) = O(g(n))  ⇔  a <= b
- f(n) = Ω(g(n))  ⇔  a >= b
- f(n) = Θ(g(n))  ⇔  a = b
- f(n) = o(g(n))  ⇔  a < b
- f(n) = ω(g(n))  ⇔  a > b

Limits (as n → ∞):
- lim [f(n)/g(n)] = 0        ⇒ f(n) ∈ o(g(n))
- lim [f(n)/g(n)] < ∞        ⇒ f(n) ∈ O(g(n))
- 0 < lim [f(n)/g(n)] < ∞    ⇒ f(n) ∈ Θ(g(n))
- 0 < lim [f(n)/g(n)]        ⇒ f(n) ∈ Ω(g(n))
- lim [f(n)/g(n)] = ∞        ⇒ f(n) ∈ ω(g(n))
- lim [f(n)/g(n)] undefined  ⇒ can't say
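These limit rules can be probed numerically by watching f(n)/g(n) at growing n. A small Python sketch (illustrative only; a few sample points cannot replace a real limit argument):

```python
import math

def ratios(f, g, ns=(10, 100, 1_000, 10_000)):
    """Evaluate f(n)/g(n) at a few growing sample points."""
    return [f(n) / g(n) for n in ns]

# f(n) = 2n + 3, g(n) = n: ratio tends to 2 (0 < limit < inf), so f = Theta(g)
r_theta = ratios(lambda n: 2 * n + 3, lambda n: n)

# f(n) = ln n, g(n) = n: ratio tends to 0, so f = o(g)
r_little_o = ratios(lambda n: math.log(n), lambda n: n)
```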
Logarithms
x = log_b(a) is the exponent for which a = b^x.
Natural log: ln a = log_e(a)
Binary log: lg a = log_2(a)
lg^2 a = (lg a)^2
lg lg a = lg(lg a)

Useful identities:
- a = b^(log_b a)
- log_c(ab) = log_c a + log_c b
- log_b(a^n) = n log_b a
- log_b a = log_c a / log_c b
- log_b(1/a) = -log_b a
- log_b a = 1 / (log_a b)
- a^(log_b c) = c^(log_b a)

Logarithms and exponentials:
- If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10(n) * log2(10) = log2(n). The base of a logarithm is therefore not an issue in asymptotic notation.
- Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n * 3^n.
Examples: express the functions in A in asymptotic notation using the functions in B.

A = 5n^2 + 100n,  B = 3n^2 + 2:
A ∈ Θ(n^2) and n^2 ∈ Θ(B), so A ∈ Θ(B).

A = log3(n^2),  B = log2(n^3):
Using log_b a = log_c a / log_c b: A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2/(3 lg 3), a constant: A ∈ Θ(B).

A = n^(lg 4),  B = 3^(lg n):
Using a^(log b) = b^(log a): B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞: A ∈ ω(B).

A = lg^2 n,  B = n^(1/2):
lim (lg^a n / n^b) = 0 (here a = 2 and b = 1/2), so A ∈ o(B).
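The identity a^(log b) = b^(log a) used in the third example can be checked numerically in Python (values are floats, so we compare with a tolerance):

```python
import math

lg = math.log2

n = 1024.0
# identity: 3^(lg n) == n^(lg 3)
lhs = 3 ** lg(n)
rhs = n ** lg(3)

# and n^(lg 4) / 3^(lg n) = n^(lg(4/3)), which grows without bound
ratio = lambda m: m ** lg(4) / 3 ** lg(m)
```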
Math Basics

Constant series: for integers a and b with a <= b,
  sum_{i=a}^{b} 1 = b - a + 1

Linear (arithmetic) series: for n >= 0,
  sum_{i=1}^{n} i = n(n+1)/2

Quadratic series: for n >= 0,
  sum_{i=1}^{n} i^2 = n(n+1)(2n+1)/6

Cubic series: for n >= 0,
  sum_{i=1}^{n} i^3 = n^2 (n+1)^2 / 4

Geometric series: for real x ≠ 1,
  sum_{k=0}^{n} x^k = (x^(n+1) - 1) / (x - 1)
For |x| < 1,
  sum_{k=0}^{∞} x^k = 1 / (1 - x)

Linear-geometric series: for n >= 0 and real c ≠ 1,
  sum_{i=1}^{n} i*c^i = (n*c^(n+2) - (n+1)*c^(n+1) + c) / (c - 1)^2

Harmonic series: the nth harmonic number, for n ∈ I+,
  H_n = 1 + 1/2 + 1/3 + ... + 1/n = sum_{k=1}^{n} 1/k = ln(n) + O(1)

Telescoping series:
  sum_{k=1}^{n} (a_k - a_(k-1)) = a_n - a_0

Differentiating series: for |x| < 1,
  sum_{k=0}^{∞} k*x^k = x / (1 - x)^2
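All of these closed forms are easy to verify for small n by direct summation; a Python sketch:

```python
def verify_series(n):
    # linear, quadratic, and cubic series
    assert sum(range(1, n + 1)) == n * (n + 1) // 2
    assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
    assert sum(i ** 3 for i in range(1, n + 1)) == (n * (n + 1) // 2) ** 2
    # geometric series with x = 3
    x = 3
    assert sum(x ** k for k in range(n + 1)) == (x ** (n + 1) - 1) // (x - 1)
    # linear-geometric series with c = 2
    c = 2
    assert sum(i * c ** i for i in range(1, n + 1)) == \
        (n * c ** (n + 2) - (n + 1) * c ** (n + 1) + c) // (c - 1) ** 2

for n in (1, 2, 10, 40):
    verify_series(n)
```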
Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires Θ(n log n). Examples:
- Sorting
- Searching

Practical Considerations - Remarks:
Comparative timing of programs is a difficult business:
- Experimental errors from uncontrolled factors (system load, language, compiler, etc.)
- Bias towards a program
- Unequal code tuning
Faster Computer or Algorithm?
What happens when we buy a computer 10 times faster?

T(n)       n      n'      Change                  n'/n
10n        1,000  10,000  n' = 10n                10
20n        500    5,000   n' = 10n                10
5n log n   250    1,842   sqrt(10)n < n' < 10n    7.37
2n^2       70     223     n' = sqrt(10)n          3.16
2^n        13     16      n' = n + 3              -----
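The arithmetic behind the table can be reproduced in Python. The sketch below solves for the new problem size n' under a 10x time budget (the slide's n' column includes some rounding, so the assertions check the ratios rather than every exact table entry):

```python
import math

SPEEDUP = 10

def new_n_linear(n):
    # T(n) = a*n: the same time budget on a 10x machine solves 10n
    return SPEEDUP * n

def new_n_quadratic(n):
    # T(n) = a*n^2: the budget scales n by sqrt(10)
    return math.sqrt(SPEEDUP) * n

def new_n_exponential(n):
    # T(n) = 2^n: only log2(10) (about 3.3) more items fit in the budget
    return n + math.log2(SPEEDUP)
```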
Implications of Dominance
- Exponential algorithms get hopeless fast.
- Quadratic algorithms get hopeless at or before n = 1,000,000.
- O(n log n) is possible to about one billion.
- O(log n) never sweats.

n^a dominates n^b if a > b, since lim_{n→∞} n^b / n^a = n^(b-a) → 0.
n^a + o(n^a) doesn't dominate n^a, since lim_{n→∞} n^a / (n^a + o(n^a)) → 1.
Functionality The degree to which the software satisfies stated needs (suitability accuracy inter-operability compliance and security)
Reliability The amount of time that the software is available for use (maturity fault tolerance and recoverability)
Usability The degree to which the software is easy to use (understandability learnability and operability)
Analysis of algorithms
February 2013 Algorithms Course Dr Aref Rashad 13
Efficiency The degree to which the software makes optimal use of system resources (time behavior resource behavior)
Maintainability The ease with which repair may be made to the software (analyzability changeability stability and testability)
Portability The ease with which the software can be transposed from one environment to another (adaptability replaceability)
Analysis of algorithms
February 2013 14
Why study algorithms and performance
Algorithms help us to understand scalability
Performance often draws the line between what is feasible and what is impossible
Algorithmic mathematics provides a language for talking about program behavior
Performance is the currency of computing
Help to choose between different Algorithms to solve a problem
Algorithms Course Dr Aref Rashad
Comparing Algorithms
Empirical Approach
bull (1) Implement each candidate ndash That could be lots of work ndash also error-prone
bull (2) Run it ndash Which inputs Test data
bull (3) Time itndash What machines OS
February 2013 15
How to solve ldquowhich algorithmrdquo problems without machines or test data
Algorithms Course Dr Aref Rashad
Analytical Approach
February 2013 16
Problem
Algorithm 1
Algorithm 2hellip
Algorithm n
solve Efficiency
Grow proportionally with the amount of data
Algorithms Course Dr Aref Rashad
February 2013 17
Algorithm 1
Algorithm 2hellip
Algorithm n
Efficiency
Compare
Computational Complexity
Measure the degree of difficulty of an algorithm
Analytical Approach
Algorithms Course Dr Aref Rashad
February 2013 18
Computational Complexity
How much effort
How costly
Various ways of measuring
Our concern efficiency criteria of time and space
FocusHow can we measure time and space
Analytical Approach
Algorithms Course Dr Aref Rashad
Space Complexity
Measure of an algorithmrsquos memory requirements during runtime
bull Data structures bull Temporary variables
February 2013 19
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Space needed equal to the sum of- Fixed part independent of the Characteristics of Inputs and
Outputs (Instruction space space for variables and constants hellipetc
- Variable part dependent on the problem instances
Algorithms Course Dr Aref Rashad
Focus on estimating Variable part which depends on number and magnitude of Inputs and Outputs
The Sum of Compile time and Run timeCompile time doesnrsquot depend on IO Characteristics
February 2013 20
Time Complexity
bull The running time depends on the input
bull Parameterize the running time by the size of the input
bull Generally we seek upper bounds on the running time because everybody likes a guarantee
Algorithms Course Dr Aref Rashad
Focus only on Run time
Exact Formula Summation of time needed for all operations (eg add subtract multiplyhellip) helliphelliphelliphelliphelliphelliphellip Impossible task
Experimental Approach Type compile and run on a specific machinehellip Time may differ on multiuser system
February 2013 21
Time Complexity How to measure
Algorithms Course Dr Aref Rashad
Comments helliphelliphellip 0 StepAssignment helliphelliphellip 1 StepLoops helliphelliphelliphelliphelliphelliphellip No of steps account in control part
First Method Introduce a new variable Count into the Program
Second Method Build a table to list total No of steps contributed by each statement
February 2013 22
Time Complexity
Algorithms Course Dr Aref Rashad
Approximate Approach
Count Program steps (segment independent of Problem Characteristics) Determining No of Steps
February 2013 23
Algorithm ArraySum(an) Count=-0 Sum=0 Count=Count+1 For i=1 to n do Count=Count+1helliphelliphelliphelliphelliphelliphellip for For Sum=Sum + a(i) Count=Count+1hellipfor Assignment Count=Count+1helliphelliphelliphelliphelliphelliphellip for last time of For Count=Count+1helliphelliphelliphelliphelliphelliphellip For return return SumTotal number of steps 2n+3
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Time ComplexityFirst Method Introduce a new variable Count into the Program
Algorithms Course Dr Aref Rashad
Second MethodBuild a table to list total No of steps contributed by each statementDetermine the number of steps per execution (se) of the statement and the total number of times (ie frequency) each statement is executed
Algorithm ArraySum(an) se frequency total steps Sum=0 1 1 1 For i=1 to n do 1 n+1 n+1 Sum=Sum + a(i) 1 n n return Sum 1 1 1
Total 2n+3
T(n) = 2n+3February 2013 24
Time Complexity
Algorithms Course Dr Aref Rashad
Example Polynomial Evaluation
Two ways to evaluate the polynomialp(x) = 4x4+ 7x3ndash2x2+ 3x+ 6
Brute force methodp(x) = 4xxxx + 7xxx ndash2xx + 3x + 6
Hornerrsquos methodp(x) = (((4x + 7) x ndash2) x + 3) x + 6
General form of Polynomialp(x) = a
1+ a2x1+ helliphelliphellip + an+1xn
where an is non-zero for all n gt= 0
February 2013 25Algorithms Course Dr Aref Rashad
February 2013 26
T(n) = n22 + n2
Example Polynomial Evaluation
Pseudocode Brute force method
1 Input a1hellip an+1 and x2 Initialize poly = a1
3 for i = 1hellip n poly = poly + ai+1 x x hellip x (i factors x) end of for loop4 Output poly
Algorithms Course Dr Aref Rashad
February 2013 27
Example Polynomial Evaluation
Pseudocode Hornerrsquos method
1 Input a1hellip an+1 and x2 Initialize poly = an+1
3 for i = 1hellip n poly = poly x+ an-i+1 end of for loop4 Output poly
T(n) = 2n
Algorithms Course Dr Aref Rashad
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time Complexity is not dependent solely on the number of inputs and outputs Three kinds of step accountBest case worst case and average case
Ex Sequential search for K in an array of n integers
Begin at first element in array and look at each element in turn until K is found
Best case The first position of the array has KWorst case The last position in the array has KAverage case n2
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
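Sequential search is a concrete O(N) example (our own sketch, not from the slides): in the worst case it examines every item once.

```python
def sequential_search(items, key):
    """Return the index of key in items, or -1 if absent.

    Worst case examines every element once: O(N) comparisons.
    """
    for i, value in enumerate(items):
        if value == key:
            return i
    return -1

print(sequential_search([4, 8, 15, 16, 23, 42], 23))  # 4
print(sequential_search([4, 8, 15, 16, 23, 42], 99))  # -1
```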
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is proportional to the square of the number of items being processed. Example: everyone in a group shaking hands with everyone else
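The handshake example can be made concrete (our own sketch): every pair of people shakes hands once, and the nested loops below do N(N-1)/2 units of work, which is O(N^2).

```python
def handshakes(names):
    """Every person shakes hands with every other person exactly once.

    Two nested loops over N people -> N*(N-1)/2 pairs, i.e. O(N^2) work.
    """
    pairs = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            pairs.append((names[i], names[j]))
    return pairs

group = ["Ann", "Bob", "Cho", "Dee"]
print(len(handshakes(group)))  # 6, i.e. 4*3/2
```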
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time only increases by a factor of log10(1,000,000) = 6
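Binary search on a sorted array is a standard O(log N) example (our sketch, not from the slides): each probe halves the remaining range, so even a million items need only about 20 probes.

```python
def binary_search(sorted_items, key):
    """Return an index of key in sorted_items, or -1 if absent.

    Each probe halves the search range, so this is O(log N).
    """
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # halve the remaining range
        if sorted_items[mid] == key:
            return mid
        if sorted_items[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 7))   # 3
print(binary_search([2, 3, 5, 7, 11, 13], 4))   # -1
```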
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N!)
It is far worse than even O(N^2) and O(N^3). It's fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical Considerations
There is no big difference in running time between Θ1(n) and Θ2(n log n):
• Θ1(10000) = 10000
• Θ2(10000) = 10000 · log10(10000) = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n^2) and Θ2(n log n):
• Θ1(10000) = 100,000,000
• Θ2(10000) = 10000 · log10(10000) = 40000
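The two comparisons above can be reproduced directly; the sketch below uses Python's math.log10, matching the base-10 log used on the slide.

```python
import math

n = 10_000
linear    = n                     # Theta(n)
nlogn     = n * math.log10(n)     # Theta(n log n), base-10 as on the slide
quadratic = n * n                 # Theta(n^2)

print(linear)          # 10000
print(int(nlogn))      # 40000
print(quadratic)       # 100000000
```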
Remarks
• Most statements in a program do not have much effect on the running time of that program
• There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total
• Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
• The greatest time and space improvements come from a better data structure or algorithm
• "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE"
Remarks
• When tuning code it is important to gather good timing statistics
• Be careful not to use tricks that make the program unreadable
• Make use of compiler optimizations
• Check that your optimizations really improve the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten, the improvement is only the square root of 10 (about 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n log n running time
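The square-root effect is quick to confirm numerically (a sketch; the work budgets are illustrative, chosen so the old machine can do 10,000 units of work and the 10x machine 100,000).

```python
import math

budget = 10_000            # work units the old machine can do
faster = 10 * budget       # a machine 10 times faster

# Linear algorithm T(n) = 10n: solvable size scales by the full factor 10
print(budget // 10, "->", faster // 10)     # 1000 -> 10000

# Quadratic algorithm T(n) = 2n^2: size scales only by about sqrt(10)
old_n = math.isqrt(budget // 2)
new_n = math.isqrt(faster // 2)
print(old_n, "->", new_n)                   # 70 -> 223
print(round(new_n / old_n, 2))              # 3.19, about sqrt(10)
```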
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
The asymptotic relations between f and g behave like comparisons between two numbers a and b:
f(n) = O(g(n)) ≈ a <= b
f(n) = Ω(g(n)) ≈ a >= b
f(n) = Θ(g(n)) ≈ a = b
f(n) = o(g(n)) ≈ a < b
f(n) = ω(g(n)) ≈ a > b
Limits
lim n→∞ [f(n) / g(n)] = 0 ⇒ f(n) ∈ o(g(n))
lim n→∞ [f(n) / g(n)] < ∞ ⇒ f(n) ∈ O(g(n))
0 < lim n→∞ [f(n) / g(n)] < ∞ ⇒ f(n) ∈ Θ(g(n))
0 < lim n→∞ [f(n) / g(n)] ⇒ f(n) ∈ Ω(g(n))
lim n→∞ [f(n) / g(n)] = ∞ ⇒ f(n) ∈ ω(g(n))
lim n→∞ [f(n) / g(n)] undefined ⇒ can't say
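The limit rules can be explored numerically; the sketch below (ours, not from the slides) samples the ratio f(n)/g(n) at growing n, which suggests, but does not prove, the limiting behavior.

```python
import math

def ratio_trend(f, g, ns=(10, 100, 1_000, 10_000, 100_000)):
    """Sample f(n)/g(n) at growing n to guess its limiting behavior.

    A heuristic only: a ratio heading to 0 suggests o, to a positive
    constant suggests Theta, to infinity suggests omega; it proves nothing.
    """
    return [f(n) / g(n) for n in ns]

# lg^2 n vs sqrt(n): the ratio eventually shrinks toward 0, so lg^2 n is o(sqrt n)
print(ratio_trend(lambda n: math.log2(n) ** 2, lambda n: math.sqrt(n)))

# (5n^2 + 100n) vs (3n^2 + 2): the ratio settles near 5/3, i.e. Theta
print(ratio_trend(lambda n: 5 * n * n + 100 * n, lambda n: 3 * n * n + 2))
```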
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
Squared log: lg^2 a = (lg a)^2
Iterated log: lg lg a = lg (lg a)
Useful identities (for positive a, b, c with bases != 1):
a = b^(log_b a)
log_c (ab) = log_c a + log_c b
log_b (a^n) = n · log_b a
log_b a = log_c a / log_c b
log_b (1/a) = -log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
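These logarithm identities can be spot-checked with floating-point arithmetic (a sketch; the sample values are arbitrary, and math.isclose absorbs rounding error).

```python
import math

a, b, c, n = 8.0, 2.0, 5.0, 3.0
close = math.isclose

print(close(a, b ** math.log(a, b)))                              # a = b^(log_b a)
print(close(math.log(a * c, b), math.log(a, b) + math.log(c, b))) # log of a product
print(close(math.log(a ** n, b), n * math.log(a, b)))             # log of a power
print(close(math.log(a, b), math.log(a, 10) / math.log(b, 10)))   # change of base
print(close(math.log(1 / a, b), -math.log(a, b)))                 # log of a reciprocal
print(close(math.log(a, b), 1 / math.log(b, a)))                  # swap base and argument
print(close(a ** math.log(c, b), c ** math.log(a, b)))            # exchange exponents
```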
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10 n · log2 10 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n · 3^n
Examples: express each function A in asymptotic notation using the function B
1) A = 5n^2 + 100n, B = 3n^2 + 2: A ∈ Θ(n^2) and n^2 ∈ Θ(B), so A ∈ Θ(B)
2) A = log3(n^2), B = log2(n^3): using log_b a = log_c a / log_c b, A = 2·lg n / lg 3 and B = 3·lg n, so A/B = 2/(3·lg 3), a constant: A ∈ Θ(B)
3) A = n^(lg 4), B = 3^(lg n): using a^(log_b c) = c^(log_b a), B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞: A ∈ ω(B)
4) A = lg^2 n, B = n^(1/2): lim n→∞ (lg^a n / n^b) = 0 for any constants a, b > 0 (here a = 2 and b = 1/2): A ∈ o(B)
Math Basics
Constant Series: for integers a and b, a <= b: Σ_{i=a}^{b} 1 = b - a + 1
Linear Series (Arithmetic Series): for n >= 0: Σ_{i=1}^{n} i = 1 + 2 + … + n = n(n+1)/2
Quadratic Series: for n >= 0: Σ_{i=1}^{n} i^2 = 1 + 4 + … + n^2 = n(n+1)(2n+1)/6
Cubic Series: for n >= 0: Σ_{i=1}^{n} i^3 = 1 + 8 + … + n^3 = n^2·(n+1)^2 / 4
Geometric Series: for real x != 1: Σ_{k=0}^{n} x^k = 1 + x + x^2 + … + x^n = (x^(n+1) - 1) / (x - 1)
For |x| < 1: Σ_{k=0}^{∞} x^k = 1 / (1 - x)
Math Basics
Linear-Geometric Series: for n >= 0 and real c != 1:
Σ_{i=1}^{n} i·c^i = c + 2c^2 + … + n·c^n = (n·c^(n+2) - (n+1)·c^(n+1) + c) / (c - 1)^2
Harmonic Series: the nth harmonic number, for n ∈ I+:
H_n = 1 + 1/2 + 1/3 + … + 1/n = Σ_{k=1}^{n} 1/k = ln(n) + O(1)
Math Basics
Telescoping Series: Σ_{k=1}^{n} (a_k - a_(k-1)) = a_n - a_0
Differentiating Series: for |x| < 1: Σ_{k=0}^{∞} k·x^k = x / (1 - x)^2
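Each closed form above can be checked against a direct summation; the sketch below verifies them for one value of n (our choice of sample values).

```python
import math

n = 50

# Linear, quadratic, cubic series
assert sum(i for i in range(1, n + 1)) == n * (n + 1) // 2
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
assert sum(i ** 3 for i in range(1, n + 1)) == n * n * (n + 1) ** 2 // 4

# Geometric series, x != 1
x = 3
assert sum(x ** k for k in range(n + 1)) == (x ** (n + 1) - 1) // (x - 1)

# Linear-geometric series, c != 1
c = 2
assert sum(i * c ** i for i in range(1, n + 1)) == \
    (n * c ** (n + 2) - (n + 1) * c ** (n + 1) + c) // (c - 1) ** 2

# Harmonic series: H_n = ln(n) + O(1); the O(1) term is between 0 and 1
H = sum(1 / k for k in range(1, n + 1))
assert 0 < H - math.log(n) < 1

print("all series identities hold for n =", n)
```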
Math Basics
Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires only Θ(n log n). Examples:
- Sorting
- Searching
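The gap between Θ(n^2) and Θ(n log n) is easy to feel by timing a quadratic sort against Python's built-in O(n log n) sort (an illustrative sketch, ours; absolute times depend on the machine).

```python
import random
import time

def insertion_sort(a):
    """Simple Theta(n^2) sort (average and worst case)."""
    a = list(a)
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:   # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

data = [random.randrange(10 ** 6) for _ in range(2_000)]

t0 = time.perf_counter()
slow = insertion_sort(data)            # quadratic
t1 = time.perf_counter()
fast = sorted(data)                    # Timsort: O(n log n)
t2 = time.perf_counter()

print(slow == fast)                    # True: same answer, different cost
print(f"insertion {t1 - t0:.3f}s  vs  built-in {t2 - t1:.3f}s")
```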
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
• Comparative timing of programs is a difficult business:
- Experimental errors from uncontrolled factors (system load, language, compiler, etc)
- Bias towards a program
- Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n)     | n     | n'     | Change            | n'/n
10n      | 1,000 | 10,000 | n' = 10n          | 10
20n      | 500   | 5,000  | n' = 10n          | 10
5n log n | 250   | 1,842  | √10·n < n' < 10n  | 7.37
2n^2     | 70    | 223    | n' = √10·n        | 3.16
2^n      | 13    | 16     | n' = n + 3        | -----
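The n' column can be reproduced by searching for the largest problem size that fits the new budget (a sketch, ours; budgets of 10,000 and 100,000 work-units approximate T(n) before and after the 10x speedup, so results match the table up to rounding).

```python
import math

def max_size(T, budget):
    """Largest integer n with T(n) <= budget, found by doubling then bisection."""
    n = 1
    while T(2 * n) <= budget:
        n *= 2
    lo, hi = n, 2 * n
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if T(mid) <= budget:
            lo = mid
        else:
            hi = mid - 1
    return lo

growth = {
    "10n":      lambda n: 10 * n,
    "20n":      lambda n: 20 * n,
    "5n log n": lambda n: 5 * n * math.log2(n) if n > 1 else 0,
    "2n^2":     lambda n: 2 * n * n,
    "2^n":      lambda n: 2 ** n,
}
for name, T in growth.items():
    # budget 10,000 on the old machine, 100,000 on the 10x machine
    print(name, max_size(T, 10_000), "->", max_size(T, 100_000))
```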
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance
 Exponential algorithms get hopeless fast
 Quadratic algorithms get hopeless at or before 1,000,000
 O(n log n) is possible to about one billion
 O(log n) never sweats
n^a dominates n^b if a > b, since lim n→∞ n^b / n^a = n^(b-a) → 0
n^a + o(n^a) doesn't dominate n^a, since lim n→∞ n^a / (n^a + o(n^a)) → 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 Algorithms Course Dr Aref Rashad 12
Functionality The degree to which the software satisfies stated needs (suitability accuracy inter-operability compliance and security)
Reliability The amount of time that the software is available for use (maturity fault tolerance and recoverability)
Usability The degree to which the software is easy to use (understandability learnability and operability)
Analysis of algorithms
February 2013 Algorithms Course Dr Aref Rashad 13
Efficiency The degree to which the software makes optimal use of system resources (time behavior resource behavior)
Maintainability The ease with which repair may be made to the software (analyzability changeability stability and testability)
Portability The ease with which the software can be transposed from one environment to another (adaptability replaceability)
Analysis of algorithms
February 2013 14
Why study algorithms and performance
Algorithms help us to understand scalability
Performance often draws the line between what is feasible and what is impossible
Algorithmic mathematics provides a language for talking about program behavior
Performance is the currency of computing
Help to choose between different Algorithms to solve a problem
Algorithms Course Dr Aref Rashad
Comparing Algorithms
Empirical Approach
bull (1) Implement each candidate ndash That could be lots of work ndash also error-prone
bull (2) Run it ndash Which inputs Test data
bull (3) Time itndash What machines OS
February 2013 15
How to solve ldquowhich algorithmrdquo problems without machines or test data
Algorithms Course Dr Aref Rashad
Analytical Approach
February 2013 16
Problem
Algorithm 1
Algorithm 2hellip
Algorithm n
solve Efficiency
Grow proportionally with the amount of data
Algorithms Course Dr Aref Rashad
February 2013 17
Algorithm 1
Algorithm 2hellip
Algorithm n
Efficiency
Compare
Computational Complexity
Measure the degree of difficulty of an algorithm
Analytical Approach
Algorithms Course Dr Aref Rashad
February 2013 18
Computational Complexity
How much effort
How costly
Various ways of measuring
Our concern efficiency criteria of time and space
FocusHow can we measure time and space
Analytical Approach
Algorithms Course Dr Aref Rashad
Space Complexity
Measure of an algorithmrsquos memory requirements during runtime
bull Data structures bull Temporary variables
February 2013 19
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Space needed equal to the sum of- Fixed part independent of the Characteristics of Inputs and
Outputs (Instruction space space for variables and constants hellipetc
- Variable part dependent on the problem instances
Algorithms Course Dr Aref Rashad
Focus on estimating Variable part which depends on number and magnitude of Inputs and Outputs
Time Complexity

Running time is the sum of compile time and run time. Compile time doesn't depend on the instance characteristics, so we focus only on run time.

• The running time depends on the input.
• Parameterize the running time by the size of the input.
• Generally we seek upper bounds on the running time, because everybody likes a guarantee.

How to measure it?
Exact formula: sum the time needed for every operation (add, subtract, multiply, …) … an impossible task.
Experimental approach: type, compile, and run on a specific machine … but times may differ on a multiuser system.
Approximate approach: count program steps (a step is a program segment independent of the problem characteristics).

Comments ……… 0 steps
Assignment ……… 1 step
Loops ……… number of steps accounted in the control part

First method: introduce a new variable, Count, into the program.

Algorithm ArraySum(a, n)
  Count = 0
  Sum = 0; Count = Count + 1
  For i = 1 to n do
    Count = Count + 1 ……… for the For statement
    Sum = Sum + a(i); Count = Count + 1 ……… for the assignment
  Count = Count + 1 ……… for the last test of the For
  Count = Count + 1 ……… for the return
  return Sum

Total number of steps: 2n + 3

Second method: build a table listing the total number of steps contributed by each statement. Determine the number of steps per execution (s/e) of the statement and the total number of times (i.e., frequency) each statement is executed.

Algorithm ArraySum(a, n)    s/e   frequency   total steps
  Sum = 0                    1        1            1
  For i = 1 to n do          1       n+1          n+1
    Sum = Sum + a(i)         1        n            n
  return Sum                 1        1            1
                                    Total:       2n + 3

T(n) = 2n + 3
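The first method's instrumented count can be sketched in Python (the language choice is mine; the slides use pseudocode):

```python
def array_sum_steps(a):
    """Sum an array while counting the pseudocode 'steps' as on the slides."""
    count = 0
    s = 0
    count += 1              # Sum = 0
    for x in a:
        count += 1          # loop-control test that enters the body
        s += x
        count += 1          # the assignment Sum = Sum + a(i)
    count += 1              # final loop-control test that exits the loop
    count += 1              # return
    return s, count

total, steps = array_sum_steps([3, 1, 4, 1, 5])
print(total, steps)  # 14 13, matching T(n) = 2n + 3 for n = 5
```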
Example: Polynomial Evaluation

Two ways to evaluate the polynomial p(x) = 4x⁴ + 7x³ − 2x² + 3x + 6 are:

Brute force method: p(x) = 4·x·x·x·x + 7·x·x·x − 2·x·x + 3·x + 6
Horner's method: p(x) = (((4x + 7)·x − 2)·x + 3)·x + 6

General form of a polynomial: p(x) = a₁ + a₂x¹ + …… + a_(n+1)xⁿ, where a_(n+1) is non-zero and n ≥ 0.
Pseudocode: brute force method

1. Input a₁, …, a_(n+1) and x
2. Initialize poly = a₁
3. for i = 1, …, n:
     poly = poly + a_(i+1) · x · x · … · x   (i factors of x)
   end of for loop
4. Output poly

T(n) = n²/2 + n/2

Pseudocode: Horner's method

1. Input a₁, …, a_(n+1) and x
2. Initialize poly = a_(n+1)
3. for i = 1, …, n:
     poly = poly · x + a_(n−i+1)
   end of for loop
4. Output poly

T(n) = 2n
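As a sketch (function names are mine, not from the slides), counting multiplications in Python confirms the two operation counts:

```python
def brute_force_poly(coeffs, x):
    """coeffs[i] is a_(i+1): p(x) = a1 + a2*x + ... + a_(n+1)*x^n."""
    mults = 0
    poly = coeffs[0]
    for i in range(1, len(coeffs)):
        term = coeffs[i]
        for _ in range(i):      # i factors of x
            term *= x
            mults += 1
        poly += term
    return poly, mults

def horner_poly(coeffs, x):
    """Evaluate the same polynomial by Horner's rule."""
    mults = 0
    poly = coeffs[-1]
    for a in reversed(coeffs[:-1]):
        poly = poly * x + a
        mults += 1
    return poly, mults

# p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6, stored lowest degree first
c = [6, 3, -2, 7, 4]
print(brute_force_poly(c, 2))  # (124, 10): n(n+1)/2 = 10 multiplications
print(horner_poly(c, 2))       # (124, 4):  n = 4 multiplications
```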
Example: Polynomial Evaluation, growth comparison (chart).
Time Complexity: Best Case, Worst Case, and Average Case

Time complexity does not depend solely on the numbers of inputs and outputs. Three kinds of step counts are used: best case, worst case, and average case.

Example: sequential search for K in an array of n integers. Begin at the first element in the array and look at each element in turn until K is found.
Best case: the first position of the array holds K.
Worst case: the last position in the array holds K.
Average case: n/2 positions are examined.
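A minimal Python sketch of the sequential search, with a comparison counter added for illustration:

```python
def sequential_search(arr, key):
    """Return (index, comparisons); index is -1 if key is absent."""
    comparisons = 0
    for i, v in enumerate(arr):
        comparisons += 1
        if v == key:
            return i, comparisons
    return -1, comparisons

data = [7, 3, 9, 5, 2]
print(sequential_search(data, 7))  # (0, 1): best case, 1 comparison
print(sequential_search(data, 2))  # (4, 5): worst case, n comparisons
```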
The Best Case: Normally we are not interested in the best case, because it is too optimistic and not a fair characterization of an algorithm's running time. It is useful in the rare cases where the best case has a high probability of occurring.

The Average Case: Often we prefer to know the average-case running time, because it reveals the typical behavior of the algorithm on inputs of size n. Average-case estimation is not always possible.

The Worst Case: Useful in many real-time applications; the algorithm must perform at least that well. It might not be a representative measure of the behavior of the algorithm on inputs of size n.

Note: if we know enough about the distribution of our input, we prefer the average-case analysis; if we do not know the distribution, then we must resort to worst-case analysis.
Growth Rate of the Complexity Time Function

Example: f(n) = n² + 100n + log₁₀ n + 1000
Upper Bound Concept

Worst-case time depends on the speed of our computer:
• relative speed (on the same machine)
• absolute speed (on different machines)

BIG IDEA:
• Ignore machine-dependent constants.
• Look at the growth of T(n) as n → ∞ … "asymptotic analysis", i.e., behavior for large n.
"Asymptotic Analysis"

For large n, we bound the number of steps from above:

Algorithm 1: T(n) = 2n + 3, g(n) = 3n → O(n)
Algorithm 2: T(n) = 4n + 8, g(n) = 5n → O(n)
Algorithm 3: T(n) = n² + 8, g(n) = n² → O(n²)

Other common growth classes include O(log n) and O(n³).
Growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz.
Algorithm Matters

• Big-oh notation indicates an upper bound: how bad things can get; perhaps things are not nearly that bad. We want the lowest possible upper bound.
• Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
• Example: for linear search, n² is an upper bound, but n is the tightest upper bound.
The "Big-Oh" Notation

Complexity Analysis: Calculating Running Time T(n)
• Use the source/pseudo code.
• Ignore constants.
• Ignore lower-order terms.
• Explicitly assume either
  - the average case (harder to do), or
  - the worst case (easier).
• Most analysis uses the worst case.
Linear-time loop:

for x = 1 to n:
    constant-time operation

(Note: constant-time means independent of the input size.)

Σ_{x=1}^{n} 1 = n, so f(n) = n.
2-Nested Loops → Quadratic:

for x = 0 to n-1:
    for y = 0 to n-1:
        constant-time operation

Σ_{x=0}^{n-1} Σ_{y=0}^{n-1} 1 = Σ_{x=0}^{n-1} n = n · n = n², so f(n) = n².
3-Nested Loops → Cubic:

for x = 0 to n-1:
    for y = 0 to n-1:
        for z = 0 to n-1:
            constant-time operation

f(n) = n³ …… The number of nested loops determines the exponent.
Add independent loops:

for x = 0 to n-1:
    constant-time op
for y = 0 to n-1:
    for z = 0 to n-1:
        constant-time op
for w = 0 to n-1:
    constant-time op

f(n) = n + n² + n = n² + 2n
Non-trivial loops:

for x = 1 to n:
    for y = 1 to x:
        constant-time operation

Σ_{x=1}^{n} Σ_{y=1}^{x} 1 = Σ_{x=1}^{n} x = n(n+1)/2 = n²/2 + n/2 ≈ n²/2

for z = 1 to n:
    for y = 1 to z:
        for x = 1 to y:
            constant-time op

Σ_{z=1}^{n} Σ_{y=1}^{z} Σ_{x=1}^{y} 1 = Σ_{z=1}^{n} z(z+1)/2 = (n³ + 3n² + 2n)/6 ≈ n³/6
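The two iteration counts can be checked empirically; this Python sketch (names are mine) counts the constant-time operations directly:

```python
def double_loop_count(n):
    """Iterations of: for x = 1..n, for y = 1..x."""
    count = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            count += 1          # the constant-time operation
    return count

def triple_loop_count(n):
    """Iterations of: for z = 1..n, for y = 1..z, for x = 1..y."""
    count = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                count += 1      # the constant-time operation
    return count

n = 10
print(double_loop_count(n), n * (n + 1) // 2)            # 55 55
print(triple_loop_count(n), n * (n + 1) * (n + 2) // 6)  # 220 220
```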
E.g., 3x³ + 5x² − 9 = O(x³). This doesn't mean "3x³ + 5x² − 9 equals the function O(x³)". What it actually means is "3x³ + 5x² − 9 is dominated by x³"; read it as "3x³ + 5x² − 9 is big-Oh of x³". In fact, 3x³ + 5x² − 9 is smaller than 5x³ for large enough values of x.
O-notation (upper bounds)

f(n) is O(g(n)) if f grows at most as fast as g: there exist constants c > 0 and n₀ ≥ 1 such that f(n) ≤ c·g(n) for all n ≥ n₀. So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above), and c·g(n) is an approximation to f(n), bounding it from above.
Big-Oh Rules

If f(n) is a polynomial of degree d, then f(n) is O(nᵈ), i.e.:
• Drop lower-order terms.
• Drop constant factors.
Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n²)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Example 1: If T(n) = 3n², then T(n) is in O(n²).
Example 2: T(n) = c₁n² + c₂n in the average case.
 c₁n² + c₂n ≤ c₁n² + c₂n² ≤ (c₁ + c₂)n² for all n > 1
 T(n) ≤ cn² for c = c₁ + c₂ and n₀ = 1
Therefore T(n) is in O(n²) by the definition.
Example 3: T(n) = c. We say this is in O(1).
Example: 7n − 2 is O(n).

By definition, we need to find:
• a real constant c > 0
• an integer constant n₀ ≥ 1
such that 7n − 2 ≤ c·n for every integer n ≥ n₀.

A possible choice is c = 7 and n₀ = 1: 7n − 2 ≤ 7n for n ≥ 1.
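A quick numeric spot-check of the witness pair, assuming the choice c = 7, n₀ = 1 above (a finite check is only a sanity test, not a proof):

```python
# Check the Big-Oh witness pair (c, n0) = (7, 1) for "7n - 2 is O(n)".
c, n0 = 7, 1
ok = all(7 * n - 2 <= c * n for n in range(n0, 10_000))
print(ok)  # True
```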
More examples:

20n³ + 10n log n + 5 is O(n³): 20n³ + 10n log n + 5 ≤ 35n³ for n ≥ 1.
3 log n + log log n is O(log n): 3 log n + log log n ≤ 4 log n for n ≥ 2.
2¹⁰⁰ is O(1): 2¹⁰⁰ ≤ 2¹⁰⁰ · 1 for n ≥ 1.
5/n is O(1/n): 5/n ≤ 5 · (1/n) for n ≥ 1.
Set definition of O-notation
Ω-notation (lower bounds). Meaning: for all data sets big enough (i.e., n > n₀), the algorithm always executes in at least c·g(n) steps.

Example: T(n) = c₁n² + c₂n.
 c₁n² + c₂n ≥ c₁n² for all n > 1, so T(n) ≥ cn² for c = c₁ and n₀ = 1.
Therefore T(n) is in Ω(n²) by the definition.

For a lower bound we want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, Ω: Intuition for Asymptotic Notation

big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time: O(1)

O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)

An algorithm runs in O(N) if the number of operations required is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
Quadratic Time: O(N²)

An algorithm runs in O(N²) if the number of operations required is proportional to the square of the number of items being processed. Example: a group handshake.

Logarithmic Time: O(log N) and O(N log N)

The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time only increases by a factor of log₁₀(1,000,000) = 6.

Factorial Time: O(N!)

It is far worse than even O(N²) and O(N³). It's fairly unusual to encounter functions with this kind of behavior.
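To make the growth classes concrete, a small illustrative table generator (Python; the numbers are rough operation counts, not running times):

```python
import math

def ops(n):
    """Approximate operation counts for common growth classes at input size n."""
    return {
        "O(1)": 1,
        "O(log n)": math.log2(n),
        "O(n)": n,
        "O(n log n)": n * math.log2(n),
        "O(n^2)": n ** 2,
    }

for n in (10, 1_000, 1_000_000):
    print(n, {k: round(v) for k, v in ops(n).items()})
```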
Practical Considerations

There is no such big difference in running time between Θ₁(n) and Θ₂(n log n):
• Θ₁(10000) = 10000
• Θ₂(10000) = 10000 · log₁₀ 10000 = 40000

There is an enormous difference between Θ₁(n²) and Θ₂(n log n):
• Θ₁(10000) = 100,000,000
• Θ₂(10000) = 10000 · log₁₀ 10000 = 40000
Remarks
• Most statements in a program do not have much effect on the running time of that program.
• There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
• Focus your attention on the parts of the program that have the most impact.
• The greatest time and space improvements come from a better data structure or algorithm: "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."
• When tuning code, it is important to gather good timing statistics.
• Be careful not to use tricks that make the program unreadable.
• Make use of compiler optimizations.
• Check that your optimizations really improve the program.
Remarks

An algorithm with time equation T(n) = 2n² does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16). So instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions

Read f(n) versus g(n) like a comparison of numbers a versus b:
 f(n) = O(g(n))  ≈  a ≤ b
 f(n) = Ω(g(n))  ≈  a ≥ b
 f(n) = Θ(g(n))  ≈  a = b
 f(n) = o(g(n))  ≈  a < b
 f(n) = ω(g(n))  ≈  a > b

Limits (as n → ∞):
 lim f(n)/g(n) = 0  ⇒ f(n) ∈ o(g(n))
 lim f(n)/g(n) < ∞  ⇒ f(n) ∈ O(g(n))
 0 < lim f(n)/g(n) < ∞  ⇒ f(n) ∈ Θ(g(n))
 0 < lim f(n)/g(n)  ⇒ f(n) ∈ Ω(g(n))
 lim f(n)/g(n) = ∞  ⇒ f(n) ∈ ω(g(n))
 lim f(n)/g(n) undefined  ⇒ we can't say
Logarithms

x = log_b a is the exponent for which a = bˣ.
Natural log: ln a = log_e a. Binary log: lg a = log₂ a.
lg²a = (lg a)²; lg lg a = lg(lg a).

Useful identities:
 a = b^(log_b a)
 log_c(ab) = log_c a + log_c b
 log_b aⁿ = n log_b a
 log_b a = log_c a / log_c b
 log_b(1/a) = −log_b a
 log_b a = 1 / log_a b
 a^(log_b c) = c^(log_b a)
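Two of these identities can be spot-checked numerically (an illustration, not a proof; the values 8, 2, 10 are arbitrary):

```python
import math

a, b, c = 8.0, 2.0, 10.0
# Base-change rule: log_b a = log_c a / log_c b
base_change = math.isclose(math.log(a, b), math.log(a, c) / math.log(b, c))
# Swap rule: a^(log_b c) = c^(log_b a)
swap_rule = math.isclose(a ** math.log(c, b), c ** math.log(a, b))
print(base_change, swap_rule)  # True True
```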
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log₁₀ n · log₂ 10 = log₂ n. So the base of a logarithm is not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2ⁿ = (2/3)ⁿ · 3ⁿ.
Examples: express each function A in asymptotic notation using the function B.

A = 5n² + 100n, B = 3n² + 2:
 A ∈ Θ(n²) and n² ∈ Θ(B), so A ∈ Θ(B).

A = log₃(n²), B = log₂(n³):
 Using log_b a = log_c a / log_c b: A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2/(3 lg 3), a constant; A ∈ Θ(B).

A = n^(lg 4), B = 3^(lg n):
 Using a^(log b) = b^(log a): B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞; A ∈ ω(B).

A = lg²n, B = n^(1/2):
 lim (lgᵃn / nᵇ) = 0 (here a = 2 and b = 1/2), so A ∈ o(B).
Math Basics: Series

Constant series: for integers a and b with a ≤ b:
 Σ_{i=a}^{b} 1 = b − a + 1

Linear series (arithmetic series): for n ≥ 0:
 Σ_{i=1}^{n} i = 1 + 2 + … + n = n(n+1)/2

Quadratic series: for n ≥ 0:
 Σ_{i=1}^{n} i² = 1 + 4 + … + n² = n(n+1)(2n+1)/6
Cubic series: for n ≥ 0:
 Σ_{i=1}^{n} i³ = 1 + 8 + … + n³ = n²(n+1)²/4

Geometric series: for real x ≠ 1:
 Σ_{k=0}^{n} x^k = 1 + x + x² + … + xⁿ = (x^(n+1) − 1)/(x − 1)

For |x| < 1:
 Σ_{k=0}^{∞} x^k = 1/(1 − x)
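These closed forms are easy to sanity-check against direct summation (n = 20 and x = 3 are arbitrary choices of mine):

```python
n = 20
linear_ok = sum(range(1, n + 1)) == n * (n + 1) // 2
quad_ok = sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
cubic_ok = sum(i ** 3 for i in range(1, n + 1)) == (n * (n + 1) // 2) ** 2
x, m = 3, 10
geo_ok = sum(x ** k for k in range(m + 1)) == (x ** (m + 1) - 1) // (x - 1)
print(linear_ok, quad_ok, cubic_ok, geo_ok)  # True True True True
```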
Math Basics (continued)

Linear-geometric series: for n ≥ 0 and real c ≠ 1:
 Σ_{i=1}^{n} i·cⁱ = (n·c^(n+2) − (n+1)·c^(n+1) + c) / (c − 1)²

Harmonic series: the nth harmonic number, for n ∈ ℕ⁺:
 H_n = 1 + 1/2 + 1/3 + … + 1/n = Σ_{k=1}^{n} 1/k = ln n + O(1)

Telescoping series:
 Σ_{k=1}^{n} (a_k − a_(k−1)) = a_n − a₀

Differentiating series: for |x| < 1:
 Σ_{k=0}^{∞} k·x^k = x / (1 − x)²
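A numeric sanity check of the harmonic approximation and the differentiating series (the cutoffs and tolerance are my choices, not from the slides):

```python
import math

# H_n grows like ln n; the gap tends to the Euler-Mascheroni constant (~0.577).
n = 1000
H = sum(1 / k for k in range(1, n + 1))
harmonic_ok = 0 < H - math.log(n) < 1

# Differentiating series at x = 0.5: sum of k*x^k should approach x/(1-x)^2 = 2.
x = 0.5
diff_ok = abs(sum(k * x ** k for k in range(200)) - x / (1 - x) ** 2) < 1e-9
print(harmonic_ok, diff_ok)  # True True
```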
Many problems whose obvious solution requires Θ(n²) time also have a solution that requires only Θ(n log n) time. Examples: sorting, searching.
Practical Considerations: Remarks

• Comparative timing of programs is a difficult business:
  - experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  - bias towards a program
  - unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster?

T(n)       n      n'      Change             n'/n
10n       1,000  10,000   n' = 10n           10
20n         500   5,000   n' = 10n           10
5n log n    250   1,842   √10·n < n' < 10n   7.37
2n²          70     223   n' = √10·n         3.16
2ⁿ           13      16   n' = n + 3         -----
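The table's n' values can be recomputed by searching for the largest problem size whose cost fits the 10x-faster machine's budget; this sketch (function name mine) reproduces the trend, with small rounding differences from the table:

```python
def new_size(T, n, speedup=10):
    """Largest n' with T(n') <= speedup * T(n): what a faster machine can solve."""
    budget = speedup * T(n)
    np = n
    while T(np + 1) <= budget:
        np += 1
    return np

print(new_size(lambda n: 10 * n, 1000))    # 10000: linear scales by the full 10x
print(new_size(lambda n: 2 * n ** 2, 70))  # 221 (about 70*sqrt(10); the table rounds)
print(new_size(lambda n: 2 ** n, 13))      # 16: exponential gains only ~3 more
```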
Implications of Dominance

• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible up to about one billion.
• O(log n) never sweats.

nᵃ dominates nᵇ if a > b, since lim_{n→∞} nᵇ/nᵃ = n^(b−a) → 0.
nᵃ + o(nᵃ) doesn't dominate nᵃ, since lim_{n→∞} nᵃ/(nᵃ + o(nᵃ)) → 1.
Efficiency: the degree to which the software makes optimal use of system resources (time behavior, resource behavior).
Maintainability: the ease with which repairs may be made to the software (analyzability, changeability, stability, and testability).
Portability: the ease with which the software can be transposed from one environment to another (adaptability, replaceability).

Analysis of Algorithms

Why study algorithms and performance?
• Algorithms help us to understand scalability.
• Performance often draws the line between what is feasible and what is impossible.
• Algorithmic mathematics provides a language for talking about program behavior.
• Performance is the currency of computing.
• Analysis helps us choose between different algorithms that solve the same problem.
Algorithms Course Dr Aref Rashad
Comparing Algorithms
Empirical Approach
bull (1) Implement each candidate ndash That could be lots of work ndash also error-prone
bull (2) Run it ndash Which inputs Test data
bull (3) Time itndash What machines OS
February 2013 15
How to solve ldquowhich algorithmrdquo problems without machines or test data
Algorithms Course Dr Aref Rashad
Analytical Approach
February 2013 16
Problem
Algorithm 1
Algorithm 2hellip
Algorithm n
solve Efficiency
Grow proportionally with the amount of data
Algorithms Course Dr Aref Rashad
February 2013 17
Algorithm 1
Algorithm 2hellip
Algorithm n
Efficiency
Compare
Computational Complexity
Measure the degree of difficulty of an algorithm
Analytical Approach
Algorithms Course Dr Aref Rashad
February 2013 18
Computational Complexity
How much effort
How costly
Various ways of measuring
Our concern efficiency criteria of time and space
FocusHow can we measure time and space
Analytical Approach
Algorithms Course Dr Aref Rashad
Space Complexity
Measure of an algorithmrsquos memory requirements during runtime
bull Data structures bull Temporary variables
February 2013 19
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Space needed equal to the sum of- Fixed part independent of the Characteristics of Inputs and
Outputs (Instruction space space for variables and constants hellipetc
- Variable part dependent on the problem instances
Algorithms Course Dr Aref Rashad
Focus on estimating Variable part which depends on number and magnitude of Inputs and Outputs
The Sum of Compile time and Run timeCompile time doesnrsquot depend on IO Characteristics
February 2013 20
Time Complexity
bull The running time depends on the input
bull Parameterize the running time by the size of the input
bull Generally we seek upper bounds on the running time because everybody likes a guarantee
Algorithms Course Dr Aref Rashad
Focus only on Run time
Exact Formula Summation of time needed for all operations (eg add subtract multiplyhellip) helliphelliphelliphelliphelliphelliphellip Impossible task
Experimental Approach Type compile and run on a specific machinehellip Time may differ on multiuser system
February 2013 21
Time Complexity How to measure
Algorithms Course Dr Aref Rashad
Comments helliphelliphellip 0 StepAssignment helliphelliphellip 1 StepLoops helliphelliphelliphelliphelliphelliphellip No of steps account in control part
First Method Introduce a new variable Count into the Program
Second Method Build a table to list total No of steps contributed by each statement
February 2013 22
Time Complexity
Algorithms Course Dr Aref Rashad
Approximate Approach
Count Program steps (segment independent of Problem Characteristics) Determining No of Steps
February 2013 23
Algorithm ArraySum(an) Count=-0 Sum=0 Count=Count+1 For i=1 to n do Count=Count+1helliphelliphelliphelliphelliphelliphellip for For Sum=Sum + a(i) Count=Count+1hellipfor Assignment Count=Count+1helliphelliphelliphelliphelliphelliphellip for last time of For Count=Count+1helliphelliphelliphelliphelliphelliphellip For return return SumTotal number of steps 2n+3
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Time ComplexityFirst Method Introduce a new variable Count into the Program
Algorithms Course Dr Aref Rashad
Second MethodBuild a table to list total No of steps contributed by each statementDetermine the number of steps per execution (se) of the statement and the total number of times (ie frequency) each statement is executed
Algorithm ArraySum(an) se frequency total steps Sum=0 1 1 1 For i=1 to n do 1 n+1 n+1 Sum=Sum + a(i) 1 n n return Sum 1 1 1
Total 2n+3
T(n) = 2n+3February 2013 24
Time Complexity
Algorithms Course Dr Aref Rashad
Example Polynomial Evaluation
Two ways to evaluate the polynomialp(x) = 4x4+ 7x3ndash2x2+ 3x+ 6
Brute force methodp(x) = 4xxxx + 7xxx ndash2xx + 3x + 6
Hornerrsquos methodp(x) = (((4x + 7) x ndash2) x + 3) x + 6
General form of Polynomialp(x) = a
1+ a2x1+ helliphelliphellip + an+1xn
where an is non-zero for all n gt= 0
February 2013 25Algorithms Course Dr Aref Rashad
February 2013 26
T(n) = n22 + n2
Example Polynomial Evaluation
Pseudocode Brute force method
1 Input a1hellip an+1 and x2 Initialize poly = a1
3 for i = 1hellip n poly = poly + ai+1 x x hellip x (i factors x) end of for loop4 Output poly
Algorithms Course Dr Aref Rashad
February 2013 27
Example Polynomial Evaluation
Pseudocode Hornerrsquos method
1 Input a1hellip an+1 and x2 Initialize poly = an+1
3 for i = 1hellip n poly = poly x+ an-i+1 end of for loop4 Output poly
T(n) = 2n
Algorithms Course Dr Aref Rashad
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time Complexity is not dependent solely on the number of inputs and outputs Three kinds of step accountBest case worst case and average case
Ex Sequential search for K in an array of n integers
Begin at first element in array and look at each element in turn until K is found
Best case The first position of the array has KWorst case The last position in the array has KAverage case n2
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarks
• When tuning code it is important to gather good timing statistics.
• Be careful not to use tricks that make the program unreadable.
• Make use of compiler optimizations.
• Check that your optimizations really improve the program.
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from the faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16).
Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n log n running time.
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
For f(n) = Θ(n^a) and g(n) = Θ(n^b):
f(n) = O(g(n))  ⇔  a ≤ b
f(n) = Ω(g(n))  ⇔  a ≥ b
f(n) = Θ(g(n))  ⇔  a = b
f(n) = o(g(n))  ⇔  a < b
f(n) = ω(g(n))  ⇔  a > b

Limits (as n → ∞)
lim f(n)/g(n) = 0        ⇒  f(n) ∈ o(g(n))
lim f(n)/g(n) < ∞        ⇒  f(n) ∈ O(g(n))
0 < lim f(n)/g(n) < ∞    ⇒  f(n) ∈ Θ(g(n))
0 < lim f(n)/g(n)        ⇒  f(n) ∈ Ω(g(n))
lim f(n)/g(n) = ∞        ⇒  f(n) ∈ ω(g(n))
lim f(n)/g(n) undefined  ⇒  can't say
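The limit rules can be illustrated numerically (a sketch, not a proof): evaluate f(n)/g(n) at a large n and see which class the ratio suggests. The helper name `ratio` is illustrative, not from the slides.

```python
import math

# Estimate lim f(n)/g(n) at a large n to guess the asymptotic relation.
def ratio(f, g, n=10**6):
    return f(n) / g(n)

# f(n) = 5n^2 + 100n vs g(n) = 3n^2 + 2: ratio tends to 5/3, so f is Theta(g).
r1 = ratio(lambda n: 5*n*n + 100*n, lambda n: 3*n*n + 2)

# f(n) = lg^2 n vs g(n) = n^(1/2): ratio tends to 0, so f is o(g).
r2 = ratio(lambda n: math.log2(n)**2, lambda n: math.sqrt(n))

print(round(r1, 4), round(r2, 4))  # 1.6667 0.3973
```

A finite sample cannot replace taking the limit, but for polynomially bounded functions the ratio at n = 10^6 is usually already close to its limiting behavior.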
Logarithms
x = logb a is the exponent for a = b^x.
Natural log: ln a = loge a
Binary log: lg a = log2 a
lg^2 a = (lg a)^2
lg lg a = lg (lg a)

Useful identities:
a = b^(logb a)
logc (ab) = logc a + logc b
logb (a^n) = n logb a
logb a = logc a / logc b
logb (1/a) = -logb a
logb a = 1 / loga b
a^(logb c) = c^(logb a)
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10 n · log2 10 = log2 n.
The base of a logarithm is therefore not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n · 3^n.
Examples: express the functions in A in asymptotic notation using the functions in B.

A = 5n^2 + 100n,  B = 3n^2 + 2:
A ∈ Θ(n^2) and n^2 ∈ Θ(B), so A ∈ Θ(B).

A = log3(n^2),  B = log2(n^3):
logb a = logc a / logc b, so A = (2/lg 3) lg n and B = 3 lg n; A/B is a nonzero constant, so A ∈ Θ(B).

A = n^lg 4,  B = 3^lg n:
a^log b = b^log a, so B = 3^lg n = n^lg 3; A/B = n^lg(4/3) → ∞ as n → ∞, so A ∈ ω(B).

A = lg^2 n,  B = n^(1/2):
lim (lg^a n / n^b) = 0 (here a = 2 and b = 1/2), so A ∈ o(B).
Math Basics
Constant Series: for integers a and b, a ≤ b:
  sum_{i=a}^{b} 1 = b - a + 1
Linear Series (Arithmetic Series): for n ≥ 0:
  sum_{i=1}^{n} i = n(n+1)/2
Quadratic Series: for n ≥ 0:
  sum_{i=1}^{n} i^2 = n(n+1)(2n+1)/6
Cubic Series: for n ≥ 0:
  sum_{i=1}^{n} i^3 = n^2(n+1)^2/4
Geometric Series: for real x ≠ 1:
  sum_{k=0}^{n} x^k = (x^{n+1} - 1)/(x - 1)
  For |x| < 1: sum_{k=0}^{∞} x^k = 1/(1 - x)

Math Basics
Linear-Geometric Series: for n ≥ 0 and real c ≠ 1:
  sum_{i=1}^{n} i·c^i = (n·c^{n+2} - (n+1)·c^{n+1} + c)/(c - 1)^2
Harmonic Series: nth harmonic number, n ∈ I+:
  H_n = sum_{k=1}^{n} 1/k = 1 + 1/2 + 1/3 + ... + 1/n = ln(n) + O(1)

Math Basics
Telescoping Series:
  sum_{k=1}^{n} (a_k - a_{k-1}) = a_n - a_0
Differentiating Series: for |x| < 1:
  sum_{k=0}^{∞} k·x^k = x/(1 - x)^2
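The closed-form series identities above can be spot-checked numerically; a small sketch:

```python
import math

n = 50
# Linear, quadratic, and cubic series against their closed forms.
assert sum(range(1, n + 1)) == n * (n + 1) // 2
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
assert sum(i ** 3 for i in range(1, n + 1)) == (n * (n + 1) // 2) ** 2

# Geometric series for x = 3.
x = 3
assert sum(x ** k for k in range(n + 1)) == (x ** (n + 1) - 1) // (x - 1)

# Harmonic number H_n = ln(n) + O(1); the O(1) gap approaches ~0.577.
H_n = sum(1 / k for k in range(1, n + 1))
assert abs(H_n - math.log(n)) < 1

print("series identities verified for n =", n)
```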
Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires Θ(n log n). Examples:
- Sorting
- Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations

Remarks
• Comparative timing of programs is a difficult business:
- Experimental errors from uncontrolled factors (system load, language, compiler, etc.)
- Bias towards a program
- Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster?

T(n)       n     n'     Change              n'/n
10n        1000  10000  n' = 10n            10
20n        500   5000   n' = 10n            10
5n log n   250   1842   √10·n < n' < 10n    7.37
2n^2       70    223    n' = √10·n          3.16
2^n        13    16     n' = n + 3          -----
February 2013 89Algorithms Course Dr Aref Rashad
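The "Change" column follows from solving T(n') = 10·T(n) for n'. A hedged sketch (the function names are illustrative; the slide's quadratic row fixes a time budget, so its 223 differs slightly from √10 · 70):

```python
import math

# With a machine 10x faster, the new solvable size n' satisfies T(n') = 10*T(n).
def faster_linear(n, speedup=10):       # T(n) = a*n      ->  n' = speedup*n
    return speedup * n

def faster_quadratic(n, speedup=10):    # T(n) = a*n^2    ->  n' = sqrt(speedup)*n
    return math.sqrt(speedup) * n

def faster_exponential(n, speedup=10):  # T(n) = 2^n      ->  n' = n + lg(speedup)
    return n + math.log2(speedup)

print(faster_linear(1000))               # 10000
print(round(faster_quadratic(70), 1))    # ~221.4
print(round(faster_exponential(13), 1))  # ~16.3
```

The exponential row is the striking one: a tenfold speedup buys only about three more input items.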
February 2013 90
Implications of Dominance
• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

n^a dominates n^b if a > b, since
  lim_{n→∞} n^b / n^a = n^{b-a} → 0
n^a + o(n^a) doesn't dominate n^a, since
  lim_{n→∞} n^a / (n^a + o(n^a)) → 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
February 2013 14
Why study algorithms and performance?
• Algorithms help us to understand scalability.
• Performance often draws the line between what is feasible and what is impossible.
• Algorithmic mathematics provides a language for talking about program behavior.
• Performance is the currency of computing.
• They help us choose between different algorithms to solve a problem.
Algorithms Course Dr Aref Rashad
Comparing Algorithms
Empirical Approach
• (1) Implement each candidate: that could be lots of work, and error-prone.
• (2) Run it: on which inputs? What test data?
• (3) Time it: on what machines? Which OS?
February 2013 15
How to solve "which algorithm" problems without machines or test data?
Algorithms Course Dr Aref Rashad
Analytical Approach
February 2013 16
Problem
Algorithm 1
Algorithm 2hellip
Algorithm n
solve Efficiency
Grow proportionally with the amount of data
Algorithms Course Dr Aref Rashad
February 2013 17
Algorithm 1
Algorithm 2hellip
Algorithm n
Efficiency
Compare
Computational Complexity
Measure the degree of difficulty of an algorithm
Analytical Approach
Algorithms Course Dr Aref Rashad
February 2013 18
Computational Complexity
How much effort
How costly
Various ways of measuring
Our concern efficiency criteria of time and space
FocusHow can we measure time and space
Analytical Approach
Algorithms Course Dr Aref Rashad
Space Complexity
Measure of an algorithm's memory requirements during runtime:
• Data structures
• Temporary variables
February 2013 19
Algorithm ArraySum(a, n)
  Sum = 0
  For i = 1 to n do
    Sum = Sum + a(i)
  return Sum
Space needed is equal to the sum of:
- a fixed part, independent of the characteristics of the inputs and outputs (instruction space, space for variables and constants, etc.)
- a variable part, dependent on the problem instance
Algorithms Course Dr Aref Rashad
Focus on estimating the variable part, which depends on the number and magnitude of the inputs and outputs.
Time is the sum of compile time and run time; compile time doesn't depend on the I/O characteristics.
February 2013 20
Time Complexity
• The running time depends on the input.
• Parameterize the running time by the size of the input.
• Generally we seek upper bounds on the running time, because everybody likes a guarantee.
Algorithms Course Dr Aref Rashad
Focus only on Run time
Exact formula: summation of the time needed for all operations (e.g. add, subtract, multiply, ...) ............ an impossible task.
Experimental approach: type, compile and run on a specific machine... time may differ on a multiuser system.
February 2013 21
Time Complexity How to measure
Algorithms Course Dr Aref Rashad
Comments ............ 0 steps
Assignment ............ 1 step
Loops ............ number of steps accounted in the control part
First Method Introduce a new variable Count into the Program
Second Method Build a table to list total No of steps contributed by each statement
February 2013 22
Time Complexity
Algorithms Course Dr Aref Rashad
Approximate Approach
Count Program steps (segment independent of Problem Characteristics) Determining No of Steps
February 2013 23
Algorithm ArraySum(a, n)
  Count = 0
  Sum = 0; Count = Count + 1
  For i = 1 to n do
    Count = Count + 1 ............ for the For statement
    Sum = Sum + a(i); Count = Count + 1 ... for the assignment
  Count = Count + 1 ............ for the last test of For
  Count = Count + 1 ............ for the return
  return Sum
Total number of steps: 2n + 3
Algorithm ArraySum(a, n)
  Sum = 0
  For i = 1 to n do
    Sum = Sum + a(i)
  return Sum
Time ComplexityFirst Method Introduce a new variable Count into the Program
Algorithms Course Dr Aref Rashad
Second Method: build a table listing the total number of steps contributed by each statement. Determine the number of steps per execution (s/e) of each statement and the total number of times (i.e. frequency) each statement is executed.

Algorithm ArraySum(a, n)    s/e   frequency   total steps
  Sum = 0                    1     1           1
  For i = 1 to n do          1     n+1         n+1
    Sum = Sum + a(i)         1     n           n
  return Sum                 1     1           1
Total                                          2n+3

T(n) = 2n+3
February 2013 24
Time Complexity
Algorithms Course Dr Aref Rashad
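The first (Count) method transcribes directly into runnable code; a sketch in Python, following the slide's step-count conventions (one step for the assignment, each loop test, each loop body, and the return):

```python
# Instrumented version of ArraySum: returns the sum and the step count,
# which should match the slide's T(n) = 2n + 3.
def array_sum_steps(a):
    count = 0
    total = 0; count += 1            # Sum = 0
    for x in a:
        count += 1                   # loop-control test
        total += x; count += 1       # Sum = Sum + a(i)
    count += 1                       # final (failing) loop test
    count += 1                       # return
    return total, count

total, steps = array_sum_steps([4, 7, -2, 3, 6])
print(total, steps)  # 18 13, i.e. steps = 2*5 + 3
```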
Example Polynomial Evaluation
Two ways to evaluate the polynomial p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6 are:
Brute force method:
p(x) = 4·x·x·x·x + 7·x·x·x - 2·x·x + 3·x + 6

Horner's method:
p(x) = (((4x + 7)·x - 2)·x + 3)·x + 6
General form of a polynomial:
p(x) = a1 + a2·x + ......... + an+1·x^n
where an+1 is non-zero, for all n >= 0
February 2013 25Algorithms Course Dr Aref Rashad
February 2013 26
T(n) = n^2/2 + n/2
Example Polynomial Evaluation
Pseudocode Brute force method
1. Input a1, ..., an+1 and x
2. Initialize poly = a1
3. for i = 1, ..., n:
     poly = poly + ai+1 · x · x · ... · x   (i factors of x)
   end of for loop
4. Output poly
Algorithms Course Dr Aref Rashad
February 2013 27
Example Polynomial Evaluation
Pseudocode Hornerrsquos method
1. Input a1, ..., an+1 and x
2. Initialize poly = an+1
3. for i = 1, ..., n:
     poly = poly · x + an-i+1
   end of for loop
4. Output poly
T(n) = 2n
Algorithms Course Dr Aref Rashad
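Both evaluation strategies can be written out with multiplication counters, confirming n(n+1)/2 multiplications for brute force versus n for Horner's method. A sketch for the slide's p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6, with coefficients stored lowest degree first:

```python
# Brute force: rebuild x^i from scratch for each term, counting multiplications.
def brute_force(coeffs, x):
    mults = 0
    poly = coeffs[0]
    for i in range(1, len(coeffs)):
        term = coeffs[i]
        for _ in range(i):           # i factors of x
            term *= x
            mults += 1
        poly += term
    return poly, mults

# Horner: one multiplication and one addition per coefficient.
def horner(coeffs, x):
    mults = 0
    poly = coeffs[-1]
    for c in reversed(coeffs[:-1]):
        poly = poly * x + c
        mults += 1
    return poly, mults

coeffs = [6, 3, -2, 7, 4]            # 6 + 3x - 2x^2 + 7x^3 + 4x^4
print(brute_force(coeffs, 2))        # (124, 10): n(n+1)/2 = 10 multiplications
print(horner(coeffs, 2))             # (124, 4):  n = 4 multiplications
```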
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time complexity is not dependent solely on the number of inputs and outputs. Three kinds of step counts: best case, worst case, and average case.
Ex: sequential search for K in an array of n integers: begin at the first element in the array and look at each element in turn until K is found.
Best case: the first position of the array has K.
Worst case: the last position in the array has K.
Average case: n/2.
February 2013 29
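The three cases are easy to see with an instrumented sequential search (a sketch; the names are illustrative):

```python
# Sequential search with a comparison counter, illustrating best and worst case.
def seq_search(a, key):
    comparisons = 0
    for i, v in enumerate(a):
        comparisons += 1
        if v == key:
            return i, comparisons
    return -1, comparisons           # key absent: n comparisons

a = [9, 4, 7, 1, 5]
print(seq_search(a, 9))   # (0, 1)  best case: K in the first position
print(seq_search(a, 5))   # (4, 5)  worst case: K in the last position
```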
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case: normally we are not interested in the best case, because:
• It is too optimistic.
• It is not a fair characterization of the algorithms' running time.
• It is useful only in some rare cases where the best case has a high probability of occurring.

The Average Case: often we prefer to know the average-case running time.
• The average case reveals the typical behavior of the algorithm on inputs of size n.
• Average-case estimation is not always possible.
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case:
• Useful in many real-time applications.
• The algorithm must perform at least that well.
• Might not be a representative measure of the behavior of the algorithm on inputs of size n.

Note: if we know enough about the distribution of our input, we prefer the average-case analysis. If we do not know the distribution, then we must resort to worst-case analysis.
Algorithms Course Dr Aref Rashad
f(n) = n^2 + 100n + log10(n) + 1000
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
Worst-case time:
• It depends on the speed of our computer
• Relative speed (on the same machine)
• Absolute speed (on different machines)

BIG IDEA:
• Ignore machine-dependent constants
• Look at the growth of T(n) as n → ∞
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
"Asymptotic Analysis"
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3,   g(n) = 3n   →  O(n)
T(n) = 4n+8,   g(n) = 5n   →  O(n)
T(n) = n^2+8,  g(n) = n^2  →  O(n^2)
Upper bound on the number of steps as n grows large; other growth classes include O(n^3) and O(log n).
Algorithms Course Dr Aref Rashad
"Asymptotic Analysis"
Growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz.
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
• Big-Oh notation indicates an upper bound: how bad things can get; perhaps things are not nearly that bad.
• We want the lowest possible upper bound.
• Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
• Example: for linear search, n^2 is an upper bound, but n is the tightest upper bound.
February 2013 37Algorithms Course Dr Aref Rashad
The "Big-Oh" Notation
Complexity Analysis
Calculating Running Time T(n)
• Use the source/pseudo code.
• Ignore constants.
• Ignore lower-order terms.
• Explicitly assume either:
  - the average case (harder to do)
  - the worst case (easier)
• Most analysis uses the worst case.
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop

for x = 1 to n:
    constant-time operation

Note: constant-time means independent of the input size.
Total: sum_{x=1}^{n} 1 = n
2-Nested Loops: Quadratic

for x = 0 to n-1:
    for y = 0 to n-1:
        constant-time operation

Total: sum_{x=0}^{n-1} sum_{y=0}^{n-1} 1 = sum_{x=0}^{n-1} n = n·n = n^2
3-Nested Loops: Cubic

for x = 0 to n-1:
    for y = 0 to n-1:
        for z = 0 to n-1:
            constant-time operation

f(n) = n^3 ...... the number of nested loops determines the exponent
February 2013 Algorithms Course Dr Aref Rashad 40
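A quick empirical confirmation that the loop depth sets the exponent (a sketch; `count_ops` is an illustrative name):

```python
# Count constant-time operations executed by 1-, 2-, and 3-deep nested loops.
def count_ops(n, depth):
    ops = 0
    if depth == 1:
        for _ in range(n):
            ops += 1
    elif depth == 2:
        for _ in range(n):
            for _ in range(n):
                ops += 1
    else:
        for _ in range(n):
            for _ in range(n):
                for _ in range(n):
                    ops += 1
    return ops

n = 7
print(count_ops(n, 1), count_ops(n, 2), count_ops(n, 3))  # 7 49 343
```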
Add independent loops

for x = 0 to n-1:
    constant-time op
for y = 0 to n-1:
    for z = 0 to n-1:
        constant-time op
for w = 0 to n-1:
    constant-time op

f(n) = n + n^2 + n = n^2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops

for x = 1 to n:
    for y = 1 to x:
        constant-time operation

Total: sum_{x=1}^{n} x = n(n+1)/2 = n^2/2 + n/2 ≈ n^2/2

for z = 1 to n:
    for y = 1 to z:
        for x = 1 to y:
            constant-time op

Total: sum_{z=1}^{n} sum_{y=1}^{z} y = sum_{z=1}^{n} z(z+1)/2 = (n^3 + 3n^2 + 2n)/6 ≈ n^3/6
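Counting the operations by brute force confirms the two closed forms (illustrative function names):

```python
# Triangular loop: sum_{x=1}^{n} x operations.
def triangle(n):
    ops = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            ops += 1
    return ops

# Tetrahedral loop: sum_{z=1}^{n} sum_{y=1}^{z} y operations.
def tetrahedron(n):
    ops = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                ops += 1
    return ops

n = 10
print(triangle(n), n * (n + 1) // 2)                  # 55 55
print(tetrahedron(n), (n**3 + 3 * n**2 + 2 * n) // 6) # 220 220
```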
E.g. 3x^3 + 5x^2 - 9 = O(x^3). This doesn't mean
"3x^3 + 5x^2 - 9 equals the function O(x^3)". It actually means
"3x^3 + 5x^2 - 9 is dominated by x^3". Read as: "3x^3 + 5x^2 - 9 is big-Oh of x^3".
February 2013 42
The "Big-Oh" Notation
In fact, 3x^3 + 5x^2 - 9 is smaller than 5x^3 for large enough values of x.
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above).
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.:
• Drop lower-order terms.
• Drop constant factors.

Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".

Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".
Example 1: if T(n) = 3n^2, then T(n) is in O(n^2).
Example 2: T(n) = c1·n^2 + c2·n in the average case.
c1·n^2 + c2·n <= c1·n^2 + c2·n^2 <= (c1 + c2)·n^2 for all n > 1
T(n) <= c·n^2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n^2) by the definition.
Example 3: T(n) = c. We say this is in O(1).
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
7n - 2 is O(n)

By definition, we need to find:
• a real constant c > 0
• an integer constant n0 >= 1
such that 7n - 2 <= c·n for every integer n >= n0.

A possible choice is c = 7, n0 = 1: 7n - 2 <= 7n for n >= 1.
Algorithms Course Dr Aref Rashad
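The witness pair (c, n0) can be sanity-checked by brute force over a finite range. This is a smoke test, not a proof for all n:

```python
# Check whether 7n - 2 <= c*n holds for every n in [n0, upto).
def holds(c, n0, upto=10**4):
    return all(7 * n - 2 <= c * n for n in range(n0, upto))

print(holds(7, 1))   # True:  c = 7, n0 = 1 works
print(holds(6, 1))   # False: 7n - 2 > 6n once n > 2
```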
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning: for all data sets big enough (i.e. n > n0), the algorithm always executes in at least c·g(n) steps.
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1·n^2 + c2·n
c1·n^2 + c2·n >= c1·n^2 for all n > 1, so T(n) >= c·n^2 for c = c1 and n0 = 1.
Therefore T(n) is in Ω(n^2) by the definition.
We want the greatest lower bound.
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Θ, O, Ω

Intuition for Asymptotic Notation
• Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
• Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
• Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Comparing Algorithms
Empirical Approach
bull (1) Implement each candidate ndash That could be lots of work ndash also error-prone
bull (2) Run it ndash Which inputs Test data
bull (3) Time itndash What machines OS
February 2013 15
How to solve ldquowhich algorithmrdquo problems without machines or test data
Algorithms Course Dr Aref Rashad
Analytical Approach
February 2013 16
Problem
Algorithm 1
Algorithm 2hellip
Algorithm n
solve Efficiency
Grow proportionally with the amount of data
Algorithms Course Dr Aref Rashad
February 2013 17
Algorithm 1
Algorithm 2hellip
Algorithm n
Efficiency
Compare
Computational Complexity
Measure the degree of difficulty of an algorithm
Analytical Approach
Algorithms Course Dr Aref Rashad
February 2013 18
Computational Complexity
How much effort
How costly
Various ways of measuring
Our concern efficiency criteria of time and space
FocusHow can we measure time and space
Analytical Approach
Algorithms Course Dr Aref Rashad
Space Complexity
Measure of an algorithmrsquos memory requirements during runtime
bull Data structures bull Temporary variables
February 2013 19
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Space needed equal to the sum of- Fixed part independent of the Characteristics of Inputs and
Outputs (Instruction space space for variables and constants hellipetc
- Variable part dependent on the problem instances
Algorithms Course Dr Aref Rashad
Focus on estimating Variable part which depends on number and magnitude of Inputs and Outputs
The Sum of Compile time and Run timeCompile time doesnrsquot depend on IO Characteristics
February 2013 20
Time Complexity
bull The running time depends on the input
bull Parameterize the running time by the size of the input
bull Generally we seek upper bounds on the running time because everybody likes a guarantee
Algorithms Course Dr Aref Rashad
Focus only on Run time
Exact Formula Summation of time needed for all operations (eg add subtract multiplyhellip) helliphelliphelliphelliphelliphelliphellip Impossible task
Experimental Approach Type compile and run on a specific machinehellip Time may differ on multiuser system
February 2013 21
Time Complexity How to measure
Algorithms Course Dr Aref Rashad
Comments helliphelliphellip 0 StepAssignment helliphelliphellip 1 StepLoops helliphelliphelliphelliphelliphelliphellip No of steps account in control part
First Method Introduce a new variable Count into the Program
Second Method Build a table to list total No of steps contributed by each statement
February 2013 22
Time Complexity
Algorithms Course Dr Aref Rashad
Approximate Approach
Count Program steps (segment independent of Problem Characteristics) Determining No of Steps
February 2013 23
Algorithm ArraySum(an) Count=-0 Sum=0 Count=Count+1 For i=1 to n do Count=Count+1helliphelliphelliphelliphelliphelliphellip for For Sum=Sum + a(i) Count=Count+1hellipfor Assignment Count=Count+1helliphelliphelliphelliphelliphelliphellip for last time of For Count=Count+1helliphelliphelliphelliphelliphelliphellip For return return SumTotal number of steps 2n+3
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Time ComplexityFirst Method Introduce a new variable Count into the Program
Algorithms Course Dr Aref Rashad
Second MethodBuild a table to list total No of steps contributed by each statementDetermine the number of steps per execution (se) of the statement and the total number of times (ie frequency) each statement is executed
Algorithm ArraySum(an) se frequency total steps Sum=0 1 1 1 For i=1 to n do 1 n+1 n+1 Sum=Sum + a(i) 1 n n return Sum 1 1 1
Total 2n+3
T(n) = 2n+3February 2013 24
Time Complexity
Algorithms Course Dr Aref Rashad
Example Polynomial Evaluation
Two ways to evaluate the polynomialp(x) = 4x4+ 7x3ndash2x2+ 3x+ 6
Brute force methodp(x) = 4xxxx + 7xxx ndash2xx + 3x + 6
Hornerrsquos methodp(x) = (((4x + 7) x ndash2) x + 3) x + 6
General form of Polynomialp(x) = a
1+ a2x1+ helliphelliphellip + an+1xn
where an is non-zero for all n gt= 0
February 2013 25Algorithms Course Dr Aref Rashad
February 2013 26
T(n) = n22 + n2
Example Polynomial Evaluation
Pseudocode Brute force method
1 Input a1hellip an+1 and x2 Initialize poly = a1
3 for i = 1hellip n poly = poly + ai+1 x x hellip x (i factors x) end of for loop4 Output poly
Algorithms Course Dr Aref Rashad
February 2013 27
Example Polynomial Evaluation
Pseudocode Hornerrsquos method
1 Input a1hellip an+1 and x2 Initialize poly = an+1
3 for i = 1hellip n poly = poly x+ an-i+1 end of for loop4 Output poly
T(n) = 2n
Algorithms Course Dr Aref Rashad
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time Complexity is not dependent solely on the number of inputs and outputs Three kinds of step accountBest case worst case and average case
Ex Sequential search for K in an array of n integers
Begin at first element in array and look at each element in turn until K is found
Best case The first position of the array has KWorst case The last position in the array has KAverage case n2
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
"Asymptotic Analysis"

Time Complexity: growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz.
Algorithm Matters
- Big-Oh notation indicates an upper bound.
- How bad can things get? Perhaps things are not nearly that bad.
- We want the lowest possible upper bound.
- Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
- Example: for linear search, n² is an upper bound, but n is the tightest upper bound.
The "Big-Oh" Notation
Complexity Analysis
Calculating Running Time T(n)
- Use the source/pseudo code.
- Ignore constants.
- Ignore lower-order terms.
- Explicitly assume either the average case (harder to do) or the worst case (easier).
- Most analysis uses the worst case.
Linear-time Loop
for x = 1 to n
    constant-time operation

Note: "constant-time" means independent of the input size.
Total: Σ_{x=1}^{n} 1 = n, so the loop is O(n).
2-Nested Loops Quadratic
for x = 0 to n-1
    for y = 0 to n-1
        constant-time operation

Total: Σ_{x=0}^{n-1} Σ_{y=0}^{n-1} 1 = n · n = n², so the loops are O(n²).
3-Nested Loops Cubic
for x = 0 to n-1
    for y = 0 to n-1
        for z = 0 to n-1
            constant-time operation

f(n) = n³ …… the number of nested loops determines the exponent.
Add independent loops
for x = 0 to n-1
    constant-time op
for y = 0 to n-1
    for z = 0 to n-1
        constant-time op
for w = 0 to n-1
    constant-time op

f(n) = n + n² + n = n² + 2n, which is O(n²).
Non-trivial loops
for x = 1 to n
    for y = 1 to x
        constant-time operation

Total: Σ_{x=1}^{n} x = n(n + 1)/2 = n²/2 + n/2, so the loops are O(n²).
for z = 1 to n
    for y = 1 to z
        for x = 1 to y
            constant-time op

Total: Σ_{z=1}^{n} Σ_{y=1}^{z} Σ_{x=1}^{y} 1 = (n³ + 3n² + 2n)/6, so the loops are O(n³).
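A quick way to sanity-check these closed forms is to count the operations directly. The sketch below is mine (the function names are assumptions, not from the slides):

```python
def count_double(n):
    """Ops for: for x = 1..n, for y = 1..x, one constant-time op."""
    ops = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            ops += 1
    return ops

def count_triple(n):
    """Ops for: for z = 1..n, for y = 1..z, for x = 1..y, one constant-time op."""
    ops = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                ops += 1
    return ops

# the loop counts match the closed forms for every tested n
for n in (1, 5, 30):
    assert count_double(n) == n * (n + 1) // 2
    assert count_triple(n) == (n**3 + 3 * n**2 + 2 * n) // 6
print("loop counts match the closed forms")
```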
The "Big-Oh" Notation

E.g., 3x³ + 5x² − 9 = O(x³). This doesn't mean "3x³ + 5x² − 9 equals the function O(x³)". What it actually means is "3x³ + 5x² − 9 is dominated by x³", read as "3x³ + 5x² − 9 is big-Oh of x³".

In fact, 3x³ + 5x² − 9 is smaller than 5x³ for large enough values of x.
O-notation (upper bounds)

f(n) is O(g(n)) if f grows at most as fast as g: there exist constants c > 0 and n₀ ≥ 1 such that f(n) ≤ c·g(n) for all n ≥ n₀.

So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above), and c·g(n) is an approximation to f(n), bounding it from above.
Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d); i.e., drop lower-order terms and drop constant factors.
Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n²)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Example 1: If T(n) = 3n², then T(n) is in O(n²).
Example 2: T(n) = c₁n² + c₂n in the average case.
c₁n² + c₂n ≤ c₁n² + c₂n² = (c₁ + c₂)n² for all n > 1, so T(n) ≤ cn² for c = c₁ + c₂ and n₀ = 1.
Therefore T(n) is in O(n²) by the definition.
Example 3: T(n) = c. We say this is in O(1).
Show that 7n − 2 is O(n).
By definition we need to find a real constant c > 0 and an integer constant n₀ ≥ 1 such that 7n − 2 ≤ c·n for every integer n ≥ n₀. A possible choice is c = 7, n₀ = 1: 7n − 2 ≤ 7n for n ≥ 1.
More examples:
- 20n³ + 10n log n + 5 is O(n³): 20n³ + 10n log n + 5 ≤ 35n³ for n ≥ 1.
- 3 log n + log log n is O(log n): 3 log n + log log n ≤ 4 log n for n ≥ 2.
- 2¹⁰⁰ is O(1): 2¹⁰⁰ ≤ 2¹⁰⁰ · 1 for n ≥ 1.
- 5/n is O(1/n): 5/n ≤ 5 · (1/n) for n ≥ 1.
Set definition of O-notation:
O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
Ω-notation (lower bounds)

Meaning: for all data sets big enough (i.e., n > n₀), the algorithm always executes in at least c·g(n) steps.

Example: T(n) = c₁n² + c₂n.
c₁n² + c₂n ≥ c₁n² for all n > 1, so T(n) ≥ cn² for c = c₁ and n₀ = 1.
Therefore T(n) is in Ω(n²) by the definition.
For lower bounds we want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: An algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, Ω
Intuition for Asymptotic Notation
- Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
- Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
- Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time: O(1)

O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)

An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
The "Big-Oh" Notation
Quadratic Time: O(N²)

An algorithm runs in O(N²) if the number of operations required to perform a function is proportional to the square of the number of items being processed. Example: everyone in a group shaking hands with everyone else.

Logarithmic Time: O(log N) and O(N log N)

The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time increases only by a factor of log₁₀(1,000,000) = 6.
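As an illustration of logarithmic growth (a sketch I added, not from the slides): counting the worst-case probes of binary search shows that a thousand-fold larger input costs only about ten more probes.

```python
def binary_search_steps(n):
    """Worst-case number of probes binary search makes on n sorted items."""
    steps, lo, hi = 0, 0, n - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        lo = mid + 1  # assume the key always lies in the upper half
    return steps

print(binary_search_steps(1_000))      # about log2(1000), i.e. 10
print(binary_search_steps(1_000_000))  # about log2(1000000), i.e. 20
```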
Factorial Time: O(N!)
It is far worse than even O(N²) and O(N³). It's fairly unusual to encounter functions with this kind of behavior.
The "Big-Oh" Notation
Practical Considerations
There is no big difference in running time between Θ₁(n) and Θ₂(n log n):
- Θ₁(10000) = 10000
- Θ₂(10000) = 10000 · log₁₀10000 = 40000

There is an enormous difference between Θ₁(n²) and Θ₂(n log n):
- Θ₁(10000) = 100,000,000
- Θ₂(10000) = 10000 · log₁₀10000 = 40000
Remarks
- Most statements in a program do not have much effect on the running time of that program.
- There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
- Focus your attention on the parts of the program that have the most impact.
- The greatest time and space improvements come from a better data structure or algorithm.
- "First tune the algorithm, then tune the code."

Remarks
- When tuning code it is important to gather good timing statistics.
- Be careful not to use tricks that make the program unreadable.
- Make use of compiler optimizations.
- Check that your optimizations really improve the program.
An algorithm with time equation T(n) = 2n² does not receive nearly as great an improvement from the faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions

Think of f(n) = nᵃ and g(n) = nᵇ:
f(n) = O(g(n))  ↔  a ≤ b
f(n) = Ω(g(n))  ↔  a ≥ b
f(n) = Θ(g(n))  ↔  a = b
f(n) = o(g(n))  ↔  a < b
f(n) = ω(g(n))  ↔  a > b
Limits (as n → ∞)

lim f(n)/g(n) = 0        ⇒ f(n) ∈ o(g(n))
lim f(n)/g(n) < ∞        ⇒ f(n) ∈ O(g(n))
0 < lim f(n)/g(n) < ∞    ⇒ f(n) ∈ Θ(g(n))
0 < lim f(n)/g(n)        ⇒ f(n) ∈ Ω(g(n))
lim f(n)/g(n) = ∞        ⇒ f(n) ∈ ω(g(n))
lim f(n)/g(n) undefined  ⇒ can't say
Logarithms

x = log_b a is the exponent for a = bˣ.
Natural log: ln a = log_e a. Binary log: lg a = log₂ a.
lg²a = (lg a)²; lg lg a = lg(lg a).

Useful identities:
a = b^(log_b a)
log_c(ab) = log_c a + log_c b
log_b(aⁿ) = n · log_b a
log_b a = log_c a / log_c b
log_b(1/a) = −log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
Logarithms and exponentials

- If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log₁₀n · log₂10 = log₂n.
- So the base of a logarithm is not an issue in asymptotic notation.
- Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2ⁿ = (2/3)ⁿ · 3ⁿ.
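The base-change fact is easy to confirm numerically; this is a small sketch of mine (the isclose tolerances are defaults, an assumption):

```python
import math

for n in (10.0, 1000.0, 1_000_000.0):
    # change of base: log2(n) = log10(n) * log2(10), a constant factor
    assert math.isclose(math.log2(n), math.log10(n) * math.log2(10))
    # the ratio is the same constant for every n, so the base vanishes in O(...)
    assert math.isclose(math.log2(n) / math.log10(n), math.log2(10))

# exponentials differ by an *exponential* factor: 2**k = (2/3)**k * 3**k
k = 50
assert math.isclose(2.0**k, (2.0 / 3.0)**k * 3.0**k)
print("log and exponential identities confirmed")
```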
Examples: express each function in A in asymptotic notation using the function in B.

A = 5n² + 100n, B = 3n² + 2: A is Θ(n²) and n² is Θ(B), so A is Θ(B).
A = log₃(n²), B = log₂(n³): using log_b a = log_c a / log_c b, A = 2·lg n / lg 3 and B = 3·lg n, so A/B = 2/(3·lg 3), a constant; A is Θ(B).
A = n^(lg 4), B = 3^(lg n): using a^(log b) = b^(log a), B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n grows; A is ω(B).
A = lg²n, B = n^(1/2): since lim (lgᵃn / nᵇ) = 0 for constants a, b > 0 (here a = 2 and b = 1/2), A is o(B).
Math Basics

Constant Series: for integers a and b with a ≤ b: Σ_{i=a}^{b} 1 = b − a + 1.
Linear Series (Arithmetic Series): for n ≥ 0: Σ_{i=1}^{n} i = n(n + 1)/2.
Quadratic Series: for n ≥ 0: Σ_{i=1}^{n} i² = n(n + 1)(2n + 1)/6.
Cubic Series: for n ≥ 0: Σ_{i=1}^{n} i³ = n²(n + 1)²/4.
Geometric Series: for real x ≠ 1: Σ_{k=0}^{n} xᵏ = (x^(n+1) − 1)/(x − 1).
For |x| < 1: Σ_{k=0}^{∞} xᵏ = 1/(1 − x).
Math Basics

Linear-Geometric Series: for n ≥ 0 and real c ≠ 1:
Σ_{i=1}^{n} i·cⁱ = (n·c^(n+2) − (n+1)·c^(n+1) + c) / (c − 1)².
Harmonic Series: the nth harmonic number, for n ∈ ℤ⁺:
H_n = Σ_{k=1}^{n} 1/k = 1 + 1/2 + 1/3 + ⋯ + 1/n = ln(n) + O(1).
Math Basics

Telescoping Series: Σ_{k=1}^{n} (a_k − a_{k−1}) = a_n − a_0.
Differentiating Series: for |x| < 1: Σ_{k=0}^{∞} k·xᵏ = x/(1 − x)².
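The closed forms above can be spot-checked numerically. This sketch is my own; the bound on H_n relies on the fact that H_n − ln n tends to Euler's constant (≈ 0.577):

```python
import math

n = 50
assert sum(range(1, n + 1)) == n * (n + 1) // 2                               # linear
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6  # quadratic
assert sum(i**3 for i in range(1, n + 1)) == (n * (n + 1) // 2) ** 2          # cubic

x = 3
assert sum(x**k for k in range(n + 1)) == (x**(n + 1) - 1) // (x - 1)         # geometric

H_n = sum(1.0 / k for k in range(1, n + 1))
assert abs(H_n - math.log(n)) < 1.0                                  # H_n = ln n + O(1)

x = 0.5
assert abs(sum(k * x**k for k in range(200)) - x / (1 - x) ** 2) < 1e-9  # differentiating
print("all series identities check out for n =", n)
```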
Many problems whose obvious solution requires Θ(n²) time also have a solution that requires Θ(n log n). Examples: sorting, searching.
Practical Considerations
Remarks
- Comparative timing of programs is a difficult business:
  - experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  - bias towards a program
  - unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster?

T(n)       n     n'     Change              n'/n
10n        1000  10000  n' = 10n            10
20n        500   5000   n' = 10n            10
5n log n   250   1842   √10·n < n' < 10n    7.37
2n²        70    223    n' = √10·n          3.16
2ⁿ         13    16     n' = n + 3          -----
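The table can be reproduced by solving T(n′) ≤ 10·T(n) numerically. The sketch below is mine; the n log n and n² rows land within a couple of units of the table's entries, depending on log base and rounding.

```python
import math

def max_n(T, budget):
    """Largest integer n with T(n) <= budget, via doubling then binary search."""
    hi = 1
    while T(hi) <= budget:
        hi *= 2
    lo = hi // 2          # T(lo) <= budget was already verified
    while lo + 1 < hi:
        mid = (lo + hi) // 2
        if T(mid) <= budget:
            lo = mid
        else:
            hi = mid
    return lo

cases = [
    ("10n",      lambda n: 10 * n,               1000),
    ("20n",      lambda n: 20 * n,               500),
    ("5n log n", lambda n: 5 * n * math.log2(n), 250),
    ("2n^2",     lambda n: 2 * n * n,            70),
    ("2^n",      lambda n: 2.0 ** n,             13),
]

for name, T, n in cases:
    n_prime = max_n(T, 10 * T(n))  # same wall-clock budget on a 10x faster machine
    print(f"{name:9s} n={n:5d} -> n'={n_prime}")
```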
Implications of Dominance
- Exponential algorithms get hopeless fast.
- Quadratic algorithms get hopeless at or before n = 1,000,000.
- O(n log n) is possible to about one billion.
- O(log n) never sweats.

nᵃ dominates nᵇ if a > b, since lim_{n→∞} nᵇ/nᵃ = n^(b−a) → 0.
nᵃ + o(nᵃ) doesn't dominate nᵃ, since lim_{n→∞} nᵃ/(nᵃ + o(nᵃ)) → 1.
Analytical Approach

A problem can be solved by many algorithms: Algorithm 1, Algorithm 2, …, Algorithm n. Efficiency asks how the resources an algorithm uses grow proportionally with the amount of data.
Analytical Approach

To choose among Algorithm 1, Algorithm 2, …, Algorithm n, we compare their efficiency using computational complexity: a measure of the degree of difficulty of an algorithm.
Analytical Approach

Computational complexity asks: how much effort? how costly? There are various ways of measuring; our concern is the efficiency criteria of time and space. Focus: how can we measure time and space?
Space Complexity

A measure of an algorithm's memory requirements during runtime:
- data structures
- temporary variables

Algorithm ArraySum(a, n)
    Sum = 0
    for i = 1 to n do
        Sum = Sum + a(i)
    return Sum

Space needed equals the sum of:
- a fixed part, independent of the characteristics of inputs and outputs (instruction space, space for variables and constants, etc.);
- a variable part, dependent on the problem instance.
Focus on estimating the variable part, which depends on the number and magnitude of inputs and outputs.
Time Complexity

Time complexity is the sum of compile time and run time. Compile time doesn't depend on I/O characteristics, so focus only on run time.
- The running time depends on the input.
- Parameterize the running time by the size of the input.
- Generally we seek upper bounds on the running time, because everybody likes a guarantee.

How to measure?
- Exact formula: sum the time needed for all operations (e.g., add, subtract, multiply, …); an impossible task.
- Experimental approach: type, compile and run on a specific machine; times may differ on a multiuser system.
Time Complexity: Approximate Approach

Count program steps (a "step" is a program segment whose cost is independent of the problem characteristics):
- Comments: 0 steps.
- Assignments: 1 step.
- Loops: steps are counted in the control part.

Two methods of determining the number of steps:
1. Introduce a new variable, Count, into the program.
2. Build a table listing the total number of steps contributed by each statement.
First method: introduce a new variable, Count, into the program.

Algorithm ArraySum(a, n)
    Count = 0
    Sum = 0; Count = Count + 1
    for i = 1 to n do
        Count = Count + 1                      // for the for statement
        Sum = Sum + a(i); Count = Count + 1    // for the assignment
    Count = Count + 1                          // for the last test of the for loop
    Count = Count + 1                          // for the return
    return Sum

Total number of steps: 2n + 3.

Second method: build a table listing the total number of steps contributed by each statement. Determine the number of steps per execution (s/e) of each statement and the total number of times (i.e., frequency) each statement is executed.

Statement               s/e   frequency   total steps
Sum = 0                  1        1            1
for i = 1 to n do        1       n+1          n+1
    Sum = Sum + a(i)     1        n            n
return Sum               1        1            1
Total                                         2n+3

T(n) = 2n + 3
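The 2n + 3 count can be reproduced in runnable form; this sketch of mine mirrors the slide's bookkeeping:

```python
def array_sum_steps(a):
    """Sum an array while counting steps exactly as the slide does."""
    count = 0
    total = 0
    count += 1          # Sum = 0
    for value in a:
        count += 1      # one evaluation of the for-loop control
        total += value
        count += 1      # Sum = Sum + a(i)
    count += 1          # final for-loop test, the one that exits
    count += 1          # return Sum
    return total, count

# the counted steps match T(n) = 2n + 3 for every tested n
for n in (0, 1, 10, 1000):
    total, steps = array_sum_steps(list(range(n)))
    assert steps == 2 * n + 3
print("step count matches T(n) = 2n + 3")
```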
Example Polynomial Evaluation
Two ways to evaluate the polynomial p(x) = 4x⁴ + 7x³ − 2x² + 3x + 6 are:

Brute force method: p(x) = 4·x·x·x·x + 7·x·x·x − 2·x·x + 3·x + 6
Horner's method: p(x) = (((4x + 7)·x − 2)·x + 3)·x + 6
General form of a polynomial:
p(x) = a₁ + a₂x¹ + ⋯ + a_{n+1}xⁿ, where the leading coefficient a_{n+1} is non-zero and n ≥ 0.
T(n) = n²/2 + n/2
Example Polynomial Evaluation
Pseudocode, brute force method:

1. Input a₁, …, a_{n+1} and x
2. Initialize poly = a₁
3. For i = 1 … n: poly = poly + a_{i+1} · x · x · ⋯ · x (i factors of x); end of for loop
4. Output poly
Example Polynomial Evaluation
Pseudocode, Horner's method:

1. Input a₁, …, a_{n+1} and x
2. Initialize poly = a_{n+1}
3. For i = 1 … n: poly = poly · x + a_{n−i+1}; end of for loop
4. Output poly

T(n) = 2n
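To see the difference in multiplication counts, here is an instrumented sketch (my own addition; the slide's T(n) figures differ slightly because they tally other operations besides multiplications):

```python
def brute_force_eval(coeffs, x):
    """Evaluate a1 + a2*x + ... + a_{n+1}*x^n, rebuilding each power from scratch."""
    mults = 0
    poly = coeffs[0]
    for i in range(1, len(coeffs)):
        power = 1
        for _ in range(i):       # i factors of x
            power *= x
            mults += 1
        poly += coeffs[i] * power
        mults += 1
    return poly, mults

def horner_eval(coeffs, x):
    """Evaluate the same polynomial as (..((a_{n+1}*x + a_n)*x + ..)*x + a1."""
    mults = 0
    poly = coeffs[-1]
    for a in reversed(coeffs[:-1]):
        poly = poly * x + a
        mults += 1
    return poly, mults

coeffs = [6, 3, -2, 7, 4]   # 4x^4 + 7x^3 - 2x^2 + 3x + 6, lowest coefficient first
bf, bf_mults = brute_force_eval(coeffs, 2)
h, h_mults = horner_eval(coeffs, 2)
assert bf == h == 124       # same value either way
print(bf_mults, h_mults)    # 14 vs 4: quadratic vs linear growth in multiplications
```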
Time complexity is not dependent solely on the number of inputs and outputs. There are three kinds of step count: best case, worst case, and average case.

Ex: sequential search for K in an array of n integers: begin at the first element in the array and look at each element in turn until K is found.
Best case The first position of the array has KWorst case The last position in the array has KAverage case n2
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 17
Algorithm 1
Algorithm 2hellip
Algorithm n
Efficiency
Compare
Computational Complexity
Measure the degree of difficulty of an algorithm
Analytical Approach
Algorithms Course Dr Aref Rashad
February 2013 18
Computational Complexity
How much effort
How costly
Various ways of measuring
Our concern efficiency criteria of time and space
FocusHow can we measure time and space
Analytical Approach
Algorithms Course Dr Aref Rashad
Space Complexity

A measure of an algorithm's memory requirements during runtime:
• Data structures
• Temporary variables

Algorithm ArraySum(a, n)
    Sum = 0
    For i = 1 to n do
        Sum = Sum + a(i)
    return Sum

The space needed equals the sum of:
- a fixed part, independent of the characteristics of the inputs and outputs (instruction space, space for simple variables and constants, etc.)
- a variable part, dependent on the particular problem instance.

We focus on estimating the variable part, which depends on the number and magnitude of inputs and outputs.

Running time is the sum of compile time and run time; compile time doesn't depend on input/output characteristics.
Time Complexity

• The running time depends on the input.
• Parameterize the running time by the size of the input.
• Generally we seek upper bounds on the running time, because everybody likes a guarantee.

Time Complexity: How to measure?

Since compile time is input-independent, focus only on run time.

• Exact formula: sum the time needed for every operation (add, subtract, multiply, ...) ......... an impossible task.
• Experimental approach: type, compile, and run on a specific machine ... but times may differ on a multiuser system.
Counting conventions:
Comments ............ 0 steps
Assignment .......... 1 step
Loops ............... number of steps accounted in the control part

First method: introduce a new variable, Count, into the program.
Second method: build a table listing the total number of steps contributed by each statement.
Approximate approach: count program steps (a step is a program segment independent of the problem characteristics).

First method: introduce a new variable, Count, into the program.

Algorithm ArraySum(a, n)
    Count = 0
    Sum = 0
    Count = Count + 1          // for the assignment
    For i = 1 to n do
        Count = Count + 1      // for the For test
        Sum = Sum + a(i)
        Count = Count + 1      // for the assignment
    Count = Count + 1          // for the last test of For
    Count = Count + 1          // for the return
    return Sum

Total number of steps: 2n + 3

Second method: build a table listing the total number of steps contributed by each statement. Determine the number of steps per execution (s/e) of each statement and the total number of times (i.e., frequency) each statement is executed.

Algorithm ArraySum(a, n)       s/e    frequency    total steps
    Sum = 0                     1      1            1
    For i = 1 to n do           1      n+1          n+1
        Sum = Sum + a(i)        1      n            n
    return Sum                  1      1            1
Total                                               2n + 3

T(n) = 2n + 3
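The first counting method can be sketched in executable form. This is a Python rendition of the slide's instrumented ArraySum, charging one step per assignment, one per loop test (including the final test that exits), and one for the return, so the total comes out to 2n + 3:

```python
def array_sum_counted(a):
    """Sum an array while counting steps using the slide's scheme."""
    count = 0
    total = 0
    count += 1              # step for: Sum = 0
    for x in a:
        count += 1          # step for each For-loop test
        total += x
        count += 1          # step for the assignment in the body
    count += 1              # step for the final For test that exits
    count += 1              # step for the return
    return total, count

total, steps = array_sum_counted([3, 1, 4, 1, 5])
# n = 5 elements, so steps == 2*5 + 3 == 13
```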
Example: Polynomial Evaluation

Suppose that exponentiation is carried out using multiplications. Two ways to evaluate the polynomial
p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6
are:

Brute force method:
p(x) = 4*x*x*x*x + 7*x*x*x - 2*x*x + 3*x + 6

Horner's method:
p(x) = (((4x + 7)x - 2)x + 3)x + 6

General form of a degree-n polynomial:
p(x) = a1 + a2*x + ... + a(n+1)*x^n,  n >= 0, where a(n+1) is non-zero.
Pseudocode, brute force method:

1. Input a1, ..., a(n+1) and x
2. Initialize poly = a1
3. for i = 1, ..., n:
       poly = poly + a(i+1) * x * x * ... * x   (i factors of x)
   end of for loop
4. Output poly

T(n) = n^2/2 + n/2 multiplications
Pseudocode, Horner's method:

1. Input a1, ..., a(n+1) and x
2. Initialize poly = a(n+1)
3. for i = 1, ..., n:
       poly = poly * x + a(n-i+1)
   end of for loop
4. Output poly

T(n) = 2n (n multiplications and n additions)
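The two strategies can be compared by counting multiplications in Python. This sketch is my own rendering, not the slides'; coefficients are stored so that coeffs[i] holds a(i+1):

```python
def poly_brute_force(coeffs, x):
    """Evaluate a1 + a2*x + ... + a(n+1)*x^n, counting multiplications."""
    mults = 0
    poly = coeffs[0]
    for i in range(1, len(coeffs)):
        term = coeffs[i]
        for _ in range(i):          # i factors of x per term
            term *= x
            mults += 1
        poly += term
    return poly, mults

def poly_horner(coeffs, x):
    """Horner evaluation of the same polynomial, counting multiplications."""
    mults = 0
    poly = coeffs[-1]
    for i in range(len(coeffs) - 2, -1, -1):
        poly = poly * x + coeffs[i]
        mults += 1
    return poly, mults

# p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6, so coeffs = [6, 3, -2, 7, 4] and n = 4
coeffs = [6, 3, -2, 7, 4]
```

For n = 4, brute force performs n(n+1)/2 = 10 multiplications against Horner's n = 4, matching the T(n) expressions above (Horner's T(n) = 2n counts the n additions as well).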
Time complexity does not depend solely on the number of inputs and outputs. There are three kinds of step count: best case, worst case, and average case.

Example: sequential search for K in an array of n integers. Begin at the first element in the array and look at each element in turn until K is found.

Best case: the first position of the array has K.
Worst case: the last position of the array has K.
Average case: about n/2 positions are examined.
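A Python sketch of the sequential search makes the three cases concrete; the comparison counter is an illustration added here, not part of the slides:

```python
def sequential_search(arr, key):
    """Return (index, comparisons); index is -1 if key is absent."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == key:
            return i, comparisons
    return -1, comparisons

arr = [7, 2, 9, 4, 5]
# Best case: key in the first position -> 1 comparison
# Worst case: key in the last position -> n comparisons
# On average, about n/2 comparisons when the key is present
```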
Time Complexity: Best, Worst, and Average Case
The best case: normally we are not interested in the best case, because it is too optimistic and not a fair characterization of an algorithm's running time. It is useful in some rare cases where the best case has a high probability of occurring.

The average case: often we prefer to know the average-case running time, since it reveals the typical behavior of the algorithm on inputs of size n. Average-case estimation is not always possible, though.
The worst case: useful in many real-time applications; the algorithm must perform at least that well. It might not be a representative measure of the behavior of the algorithm on inputs of size n.

Note: if we know enough about the distribution of our input, we prefer the average-case analysis. If we do not know the distribution, then we must resort to worst-case analysis.
Growth Rate of the Time Complexity Function

Example: f(n) = n^2 + 100n + log10(n) + 1000
Upper Bound Concept

Worst-case time depends on the speed of our computer:
• relative speed (on the same machine)
• absolute speed (on different machines)

BIG IDEA:
• Ignore machine-dependent constants.
• Look at the growth of T(n) as n → ∞.

This is "asymptotic analysis": behavior for large n.
"Asymptotic Analysis": an upper bound on the number of steps for large n.

• Algorithm 1: T(n) = 2n + 3, bounded by g(n) = 3n  →  O(n)
• Algorithm 2: T(n) = 4n + 8, bounded by g(n) = 5n  →  O(n)
• Algorithm 3: T(n) = n^2 + 8, bounded by g(n) = n^2  →  O(n^2)

Other common classes include O(log n) and O(n^3).
"Asymptotic Analysis"

[Table: growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz.]
Algorithm Matters

• Big-Oh notation indicates an upper bound: how bad things can get (perhaps things are not nearly so bad).
• We want the lowest possible upper bound.
• Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
• Example: for linear search, n^2 is an upper bound, but n is the tightest upper bound.
The "Big-Oh" Notation: Complexity Analysis

Calculating running time T(n):
• Use the source/pseudo code.
• Ignore constants.
• Ignore lower-order terms.
• Explicitly assume either
  - the average case (harder to do), or
  - the worst case (easier).
• Most analysis uses the worst case.
Linear-time loop:

for x = 1 to n
    constant-time operation

f(n) = Σ_{x=1}^{n} 1 = n

Note: constant-time means independent of the input size.

2-nested loops (quadratic):

for x = 0 to n-1
    for y = 0 to n-1
        constant-time operation

f(n) = Σ_{x=0}^{n-1} Σ_{y=0}^{n-1} 1 = Σ_{x=0}^{n-1} n = n · n = n^2
3-nested loops (cubic):

for x = 0 to n-1
    for y = 0 to n-1
        for z = 0 to n-1
            constant-time operation

f(n) = n^3 ...... The number of nested loops determines the exponent.
Add independent loops:

for x = 0 to n-1
    constant-time op
for y = 0 to n-1
    for z = 0 to n-1
        constant-time op
for w = 0 to n-1
    constant-time op

f(n) = n + n^2 + n = n^2 + 2n
Non-trivial loops:

for x = 1 to n
    for y = 1 to x
        constant-time operation

f(n) = Σ_{x=1}^{n} x = n(n+1)/2 = n^2/2 + n/2

for z = 1 to n
    for y = 1 to z
        for x = 1 to y
            constant-time op

f(n) = Σ_{z=1}^{n} Σ_{y=1}^{z} y = n(n+1)(n+2)/6 = n^3/6 + n^2/2 + n/3
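Both closed forms can be verified empirically by actually running the loops; this quick Python check is an addition, not part of the slides:

```python
def double_loop_ops(n):
    """Count operations of: for x = 1..n, for y = 1..x."""
    ops = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            ops += 1
    return ops

def triple_loop_ops(n):
    """Count operations of: for z = 1..n, for y = 1..z, for x = 1..y."""
    ops = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                ops += 1
    return ops

n = 10
assert double_loop_ops(n) == n * (n + 1) // 2            # 55
assert triple_loop_ops(n) == n * (n + 1) * (n + 2) // 6  # 220
```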
The "Big-Oh" Notation

E.g. 3x^3 + 5x^2 - 9 = O(x^3). This doesn't mean "3x^3 + 5x^2 - 9 equals the function O(x^3)". What it actually means is "3x^3 + 5x^2 - 9 is dominated by x^3", read as "3x^3 + 5x^2 - 9 is big-Oh of x^3".

In fact, 3x^3 + 5x^2 - 9 is smaller than 5x^3 for large enough values of x.
O-notation (upper bounds)

g(n) is an asymptotic upper bound for f(n) as n increases: g(n) bounds f(n) from above. f(n) is O(g(n)) if f grows at most as fast as g; c·g(n) is an approximation to f(n), bounding it from above.
Big-Oh Rules

If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.:
• Drop lower-order terms.
• Drop constant factors.

Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Example 1: if T(n) = 3n^2, then T(n) is in O(n^2).

Example 2: T(n) = c1·n^2 + c2·n in the average case.
c1·n^2 + c2·n <= c1·n^2 + c2·n^2 <= (c1 + c2)·n^2 for all n > 1,
so T(n) <= c·n^2 for c = c1 + c2 and n0 = 1.
Therefore T(n) is in O(n^2) by the definition.

Example 3: T(n) = c. We say this is in O(1).
Example: 7n - 2 is O(n).

By definition, we need to find:
• a real constant c > 0
• an integer constant n0 >= 1
such that 7n - 2 <= c·n for every integer n >= n0.

A possible choice is c = 7, n0 = 1: 7n - 2 <= 7n for n >= 1.
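The witness pair (c, n0) can be sanity-checked numerically over a finite range; a spot check like this is not a proof, but it catches a wrong witness quickly:

```python
c, n0 = 7, 1
# 7n - 2 <= c*n must hold for every integer n >= n0
assert all(7 * n - 2 <= c * n for n in range(n0, 10_000))
```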
More examples:

• 20n^3 + 10n·log n + 5 is O(n^3): 20n^3 + 10n·log n + 5 <= 35n^3 for n >= 1.
• 3·log n + log log n is O(log n): 3·log n + log log n <= 4·log n for n >= 2.
• 2^100 is O(1): 2^100 <= 2^100 · 1 for n >= 1.
• 5/n is O(1/n): 5/n <= 5·(1/n) for n >= 1.
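The first bound can likewise be spot-checked in Python (base-2 logarithms assumed here, which only changes the constant):

```python
import math

# 20n^3 + 10*n*log(n) + 5 <= 35n^3 for all n >= 1 (finite spot check)
ok = all(
    20 * n**3 + 10 * n * math.log2(n) + 5 <= 35 * n**3
    for n in range(1, 500)
)
assert ok
```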
Set definition of Ω-notation

Meaning: for all data sets big enough (i.e., n > n0), the algorithm always executes in at least c·g(n) steps.

Example: T(n) = c1·n^2 + c2·n.
c1·n^2 + c2·n >= c1·n^2 for all n > 1, so T(n) >= c·n^2 for c = c1 and n0 = 1.
Therefore T(n) is in Ω(n^2) by the definition.

For a lower bound we want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, Ω

Intuition for asymptotic notation:
• Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
• Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
• Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time: O(1)

O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)

An algorithm runs in O(N) if the number of operations required is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
Quadratic Time: O(N^2)

An algorithm runs in O(N^2) if the number of operations required is proportional to the square of the number of items being processed. Example: a group handshake.

Logarithmic Time: O(log N) and O(N log N)

The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time increases only by a factor of log10(1,000,000) = 6.
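Binary search is the textbook O(log N) example (my illustration; the slides don't give one). On a million sorted items it needs at most about 20 probes, because each probe halves the remaining range:

```python
def binary_search(sorted_arr, key):
    """Return (index, probes); index is -1 if key is absent."""
    lo, hi = 0, len(sorted_arr) - 1
    probes = 0
    while lo <= hi:
        probes += 1
        mid = (lo + hi) // 2
        if sorted_arr[mid] == key:
            return mid, probes
        elif sorted_arr[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, probes

data = list(range(1_000_000))
index, probes = binary_search(data, 765_432)
# probes stays <= 20, since 2**20 > 1,000,000
```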
Factorial Time: O(N!)

It is far worse than even O(N^2) and O(N^3). It's fairly unusual to encounter functions with this kind of behavior.
Practical Considerations

There is no big difference in running time between Θ1(n) and Θ2(n log n):
• Θ1(10,000) = 10,000
• Θ2(10,000) = 10,000 · log10(10,000) = 40,000

There is an enormous difference between Θ1(n^2) and Θ2(n log n):
• Θ1(10,000) = 100,000,000
• Θ2(10,000) = 10,000 · log10(10,000) = 40,000
Remarks

• Most statements in a program do not have much effect on the running time of that program.
• There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
• Focus your attention on the parts of the program that have the most impact.
• The greatest time and space improvements come from a better data structure or algorithm: "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."
• When tuning code, it is important to gather good timing statistics.
• Be careful not to use tricks that make the program unreadable.
• Make use of compiler optimizations.
• Check that your optimizations really improve the program.
Remarks

An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions

For f(n) = Θ(n^a) and g(n) = Θ(n^b):
f(n) = O(g(n))  ⇔  a <= b
f(n) = Ω(g(n))  ⇔  a >= b
f(n) = Θ(g(n))  ⇔  a = b
f(n) = o(g(n))  ⇔  a < b
f(n) = ω(g(n))  ⇔  a > b

Limits (as n → ∞):
lim f(n)/g(n) = 0        ⇒  f(n) ∈ o(g(n))
lim f(n)/g(n) < ∞        ⇒  f(n) ∈ O(g(n))
0 < lim f(n)/g(n) < ∞    ⇒  f(n) ∈ Θ(g(n))
lim f(n)/g(n) > 0        ⇒  f(n) ∈ Ω(g(n))
lim f(n)/g(n) = ∞        ⇒  f(n) ∈ ω(g(n))
lim f(n)/g(n) undefined  ⇒  can't say
Logarithms

x = log_b a is the exponent for which a = b^x.
Natural log: ln a = log_e a
Binary log: lg a = log_2 a
lg^2 a = (lg a)^2
lg lg a = lg (lg a)

Useful identities (a, b, c > 0, bases ≠ 1):
a = b^(log_b a)
log_c(ab) = log_c a + log_c b
log_b a^n = n·log_b a
log_b a = log_c a / log_c b
log_b(1/a) = -log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10 n · log2 10 = log2 n. The base of a logarithm is therefore not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n · 3^n.
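A quick numeric check of both facts (my addition, using floating-point comparison):

```python
import math

n = 1000.0
# Changing log base multiplies by a constant: log10(n) * log2(10) == log2(n)
assert math.isclose(math.log10(n) * math.log2(10), math.log2(n))

# Different exponential bases differ by an exponential factor, (2/3)^k here
k = 30
assert math.isclose(2.0**k / 3.0**k, (2.0 / 3.0) ** k)
```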
Examples: express the functions in A in asymptotic notation using the functions in B.

A = 5n^2 + 100n, B = 3n^2 + 2:
A ∈ Θ(n^2) and n^2 ∈ Θ(B), so A ∈ Θ(B).

A = log3(n^2), B = log2(n^3):
using log_b a = log_c a / log_c b, both are Θ(log n), so A ∈ Θ(B).

A = n^(lg 4), B = 3^(lg n):
using a^(log b) = b^(log a), B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞, and A ∈ ω(B).

A = lg^2 n, B = n^(1/2):
lim (lg^a n / n^b) = 0 (here a = 2 and b = 1/2), so A ∈ o(B).
Math Basics

Constant series: for integers a and b with a <= b,
  Σ_{i=a}^{b} 1 = b - a + 1

Linear (arithmetic) series: for n >= 0,
  Σ_{i=1}^{n} i = n(n+1)/2

Quadratic series: for n >= 0,
  Σ_{i=1}^{n} i^2 = n(n+1)(2n+1)/6

Cubic series: for n >= 0,
  Σ_{i=1}^{n} i^3 = n^2(n+1)^2/4

Geometric series: for real x ≠ 1,
  Σ_{k=0}^{n} x^k = (x^{n+1} - 1)/(x - 1)
and for |x| < 1,
  Σ_{k=0}^{∞} x^k = 1/(1 - x)

Linear-geometric series: for n >= 0 and real c ≠ 1,
  Σ_{i=1}^{n} i·c^i = (n·c^{n+2} - (n+1)·c^{n+1} + c)/(c - 1)^2

Harmonic series: the nth harmonic number, for positive integer n,
  H_n = Σ_{k=1}^{n} 1/k = ln(n) + O(1)

Telescoping series:
  Σ_{k=1}^{n} (a_k - a_{k-1}) = a_n - a_0

Differentiating series: for |x| < 1,
  Σ_{k=0}^{∞} k·x^k = x/(1 - x)^2
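These closed forms are easy to sanity-check numerically; a small Python check (not from the slides):

```python
import math

n = 20
assert sum(range(1, n + 1)) == n * (n + 1) // 2                         # linear
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
assert sum(i**3 for i in range(1, n + 1)) == (n * (n + 1) // 2) ** 2    # cubic

x = 0.5
assert abs(sum(x**k for k in range(0, 60)) - 1 / (1 - x)) < 1e-12       # geometric
assert abs(sum(k * x**k for k in range(0, 200)) - x / (1 - x) ** 2) < 1e-12

# Harmonic: H_n - ln(n) stays bounded (it tends to Euler's constant, ~0.577)
h = sum(1 / k for k in range(1, 10_001))
assert 0 < h - math.log(10_000) < 1
```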
Practical Considerations

Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires Θ(n log n). Examples:
- Sorting
- Searching
Remarks

• Comparative timing of programs is a difficult business:
  - experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  - bias towards a program
  - unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster?

T(n)       | n     | n'     | Change            | n'/n
10n        | 1,000 | 10,000 | n' = 10n          | 10
20n        | 500   | 5,000  | n' = 10n          | 10
5n log n   | 250   | 1,842  | √10·n < n' < 10n  | 7.37
2n^2       | 70    | 223    | n' ≈ √10·n        | 3.16
2^n        | 13    | 16     | n' = n + 3        | ---
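The table's n values can be reproduced by solving for the largest n whose running time fits a fixed budget. The budget of 10,000 time units on the old machine (100,000 on the one ten times faster) is an assumption on my part, chosen to be consistent with the table:

```python
def max_problem_size(T, budget):
    """Largest n with T(n) <= budget, found by simple linear search."""
    n = 0
    while T(n + 1) <= budget:
        n += 1
    return n

old, new = 10_000, 100_000
assert max_problem_size(lambda n: 10 * n, old) == 1_000
assert max_problem_size(lambda n: 10 * n, new) == 10_000
assert max_problem_size(lambda n: 20 * n, old) == 500
assert max_problem_size(lambda n: 2 * n * n, old) == 70
assert max_problem_size(lambda n: 2 * n * n, new) == 223
assert max_problem_size(lambda n: 2**n, old) == 13
assert max_problem_size(lambda n: 2**n, new) == 16
```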
Implications of Dominance

• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

n^a dominates n^b if a > b, since lim_{n→∞} n^b / n^a = n^(b-a) → 0.
n^a + o(n^a) doesn't dominate n^a, since lim_{n→∞} n^a / (n^a + o(n^a)) → 1.
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 18
Computational Complexity
How much effort
How costly
Various ways of measuring
Our concern efficiency criteria of time and space
FocusHow can we measure time and space
Analytical Approach
Algorithms Course Dr Aref Rashad
Space Complexity
Measure of an algorithmrsquos memory requirements during runtime
bull Data structures bull Temporary variables
February 2013 19
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Space needed equal to the sum of- Fixed part independent of the Characteristics of Inputs and
Outputs (Instruction space space for variables and constants hellipetc
- Variable part dependent on the problem instances
Algorithms Course Dr Aref Rashad
Focus on estimating Variable part which depends on number and magnitude of Inputs and Outputs
The Sum of Compile time and Run timeCompile time doesnrsquot depend on IO Characteristics
February 2013 20
Time Complexity
bull The running time depends on the input
bull Parameterize the running time by the size of the input
bull Generally we seek upper bounds on the running time because everybody likes a guarantee
Algorithms Course Dr Aref Rashad
Focus only on Run time
Exact Formula Summation of time needed for all operations (eg add subtract multiplyhellip) helliphelliphelliphelliphelliphelliphellip Impossible task
Experimental Approach Type compile and run on a specific machinehellip Time may differ on multiuser system
February 2013 21
Time Complexity How to measure
Algorithms Course Dr Aref Rashad
Comments helliphelliphellip 0 StepAssignment helliphelliphellip 1 StepLoops helliphelliphelliphelliphelliphelliphellip No of steps account in control part
First Method Introduce a new variable Count into the Program
Second Method Build a table to list total No of steps contributed by each statement
February 2013 22
Time Complexity
Algorithms Course Dr Aref Rashad
Approximate Approach
Count Program steps (segment independent of Problem Characteristics) Determining No of Steps
February 2013 23
Algorithm ArraySum(an) Count=-0 Sum=0 Count=Count+1 For i=1 to n do Count=Count+1helliphelliphelliphelliphelliphelliphellip for For Sum=Sum + a(i) Count=Count+1hellipfor Assignment Count=Count+1helliphelliphelliphelliphelliphelliphellip for last time of For Count=Count+1helliphelliphelliphelliphelliphelliphellip For return return SumTotal number of steps 2n+3
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Time ComplexityFirst Method Introduce a new variable Count into the Program
Algorithms Course Dr Aref Rashad
Second MethodBuild a table to list total No of steps contributed by each statementDetermine the number of steps per execution (se) of the statement and the total number of times (ie frequency) each statement is executed
Algorithm ArraySum(an) se frequency total steps Sum=0 1 1 1 For i=1 to n do 1 n+1 n+1 Sum=Sum + a(i) 1 n n return Sum 1 1 1
Total 2n+3
T(n) = 2n+3February 2013 24
Time Complexity
Algorithms Course Dr Aref Rashad
Example Polynomial Evaluation
Two ways to evaluate the polynomialp(x) = 4x4+ 7x3ndash2x2+ 3x+ 6
Brute force methodp(x) = 4xxxx + 7xxx ndash2xx + 3x + 6
Hornerrsquos methodp(x) = (((4x + 7) x ndash2) x + 3) x + 6
General form of Polynomialp(x) = a
1+ a2x1+ helliphelliphellip + an+1xn
where an is non-zero for all n gt= 0
February 2013 25Algorithms Course Dr Aref Rashad
February 2013 26
T(n) = n22 + n2
Example Polynomial Evaluation
Pseudocode Brute force method
1 Input a1hellip an+1 and x2 Initialize poly = a1
3 for i = 1hellip n poly = poly + ai+1 x x hellip x (i factors x) end of for loop4 Output poly
Algorithms Course Dr Aref Rashad
February 2013 27
Example Polynomial Evaluation
Pseudocode Hornerrsquos method
1 Input a1hellip an+1 and x2 Initialize poly = an+1
3 for i = 1hellip n poly = poly x+ an-i+1 end of for loop4 Output poly
T(n) = 2n
Algorithms Course Dr Aref Rashad
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time Complexity is not dependent solely on the number of inputs and outputs Three kinds of step accountBest case worst case and average case
Ex Sequential search for K in an array of n integers
Begin at first element in array and look at each element in turn until K is found
Best case The first position of the array has KWorst case The last position in the array has KAverage case n2
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits (all limits as n → ∞)
lim f(n)/g(n) = 0        ⇒  f(n) ∈ o(g(n))
lim f(n)/g(n) < ∞        ⇒  f(n) ∈ O(g(n))
0 < lim f(n)/g(n) < ∞    ⇒  f(n) ∈ Θ(g(n))
0 < lim f(n)/g(n)        ⇒  f(n) ∈ Ω(g(n))
lim f(n)/g(n) = ∞        ⇒  f(n) ∈ ω(g(n))
lim f(n)/g(n) undefined  ⇒  can't say
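The limit rules above can be checked numerically for concrete functions: sample f(n)/g(n) at increasingly large n and see whether the ratio heads toward 0, a positive constant, or infinity. A minimal sketch (the function `ratio_trend` is illustrative, not from the slides):

```python
import math

def ratio_trend(f, g, ns=(10**3, 10**4, 10**5, 10**6)):
    """Sample f(n)/g(n) at growing n; the trend hints at o / Theta / omega."""
    return [f(n) / g(n) for n in ns]

# 5n^2 + 100n vs 3n^2 + 2: the ratio settles near 5/3, suggesting Theta.
settling = ratio_trend(lambda n: 5*n*n + 100*n, lambda n: 3*n*n + 2)

# n log n vs n^2: the ratio heads to 0, suggesting n log n is o(n^2).
vanishing = ratio_trend(lambda n: n * math.log(n), lambda n: n * n)
```

This is only a heuristic: a finite sample cannot prove a limit, but it quickly exposes which term dominates.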
Logarithms
x = logb(a) is the exponent for a = b^x.
Natural log: ln a = loge(a)
Binary log: lg a = log2(a)
lg^2 a = (lg a)^2
lg lg a = lg (lg a)

Useful identities:
a = b^(logb(a))
logc(ab) = logc(a) + logc(b)
logb(a^n) = n · logb(a)
logb(a) = logc(a) / logc(b)
logb(1/a) = −logb(a)
logb(a) = 1 / loga(b)
a^(logb(c)) = c^(logb(a))
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10(n) · log2(10) = log2(n).
The base of a logarithm is not an issue in asymptotic notation.
Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n · 3^n.
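Both claims are easy to verify numerically; a short check using Python's standard `math` module:

```python
import math

n = 1_000_000.0
# Changing the log base only multiplies by a constant: log2 n = log10 n * log2 10.
log2_via_log10 = math.log10(n) * math.log2(10)
log2_direct = math.log2(n)

# Changing an exponential's base is NOT a constant factor: 2^k / 3^k = (2/3)^k,
# a ratio that itself shrinks exponentially as k grows.
k = 50
ratio = 2.0**k / 3.0**k
```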
Examples
Express the functions in column A in asymptotic notation using the functions in column B.

A = 5n^2 + 100n, B = 3n^2 + 2:
  A ∈ Θ(n^2) and n^2 ∈ Θ(B), so A ∈ Θ(B).
A = log3(n^2), B = log2(n^3):
  Using logb(a) = logc(a)/logc(b): A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2/(3 lg 3), a constant; A ∈ Θ(B).
A = n^(lg 4), B = 3^(lg n):
  Using a^(log b) = b^(log a): B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞; A ∈ ω(B).
A = lg^2 n, B = n^(1/2):
  Since lim (lg^a n / n^b) = 0 (here a = 2 and b = 1/2): A ∈ o(B).
Math Basics

Constant Series: for integers a and b, a ≤ b:
  Σ_{i=a}^{b} 1 = b − a + 1
Linear Series (Arithmetic Series): for n ≥ 0:
  Σ_{i=1}^{n} i = 1 + 2 + ... + n = n(n+1)/2
Quadratic Series: for n ≥ 0:
  Σ_{i=1}^{n} i^2 = 1 + 4 + ... + n^2 = n(n+1)(2n+1)/6
Cubic Series: for n ≥ 0:
  Σ_{i=1}^{n} i^3 = 1 + 8 + ... + n^3 = n^2(n+1)^2/4
Geometric Series: for real x ≠ 1:
  Σ_{k=0}^{n} x^k = 1 + x + x^2 + ... + x^n = (x^(n+1) − 1)/(x − 1)
  For |x| < 1: Σ_{k=0}^{∞} x^k = 1/(1 − x)
Linear-Geometric Series: for n ≥ 0 and real c ≠ 1:
  Σ_{i=1}^{n} i·c^i = (n·c^(n+2) − (n+1)·c^(n+1) + c)/(c − 1)^2
Harmonic Series (nth harmonic number, n ∈ I+):
  H_n = Σ_{k=1}^{n} 1/k = 1 + 1/2 + 1/3 + ... + 1/n = ln(n) + O(1)
Telescoping Series:
  Σ_{k=1}^{n} (a_k − a_(k−1)) = a_n − a_0
Differentiating Series: for |x| < 1:
  Σ_{k=0}^{∞} k·x^k = x/(1 − x)^2
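The closed forms above can be sanity-checked against direct summation for a sample n:

```python
n = 100
linear    = sum(i for i in range(1, n + 1))        # n(n+1)/2
quadratic = sum(i * i for i in range(1, n + 1))    # n(n+1)(2n+1)/6
cubic     = sum(i ** 3 for i in range(1, n + 1))   # n^2 (n+1)^2 / 4

x = 0.5
geometric = sum(x ** k for k in range(n + 1))      # (x^(n+1) - 1) / (x - 1)
```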
Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires Θ(n log n). Examples: sorting, searching.
Practical Considerations

Remarks
• Comparative timing of programs is a difficult business:
  - Experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  - Bias towards a program
  - Unequal code tuning
Faster Computer or Algorithm?
What happens when we buy a computer 10 times faster? (n = problem size solvable in a fixed time on the old machine; n' = size solvable in the same time on the new machine.)

T(n)        n      n'      Change               n'/n
10n         1,000  10,000  n' = 10n             10
20n         500    5,000   n' = 10n             10
5n log n    250    1,842   √10·n < n' < 10n     7.37
2n^2        70     223     n' ≈ √10·n           3.16
2^n         13     16      n' = n + 3           -----
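The n and n' columns can be reproduced by finding the largest problem size that fits a fixed step budget; the budget of 10,000 steps (hence 100,000 on a machine ten times faster) is an assumption chosen to be consistent with the rows, and `max_size` is an illustrative helper:

```python
def max_size(T, budget):
    """Largest integer n with T(n) <= budget, via doubling then bisection.
    Assumes T is increasing and T(1) <= budget."""
    lo = hi = 1
    while T(hi) <= budget:      # grow until the budget is exceeded
        hi *= 2
    while hi - lo > 1:          # invariant: T(lo) <= budget < T(hi)
        mid = (lo + hi) // 2
        if T(mid) <= budget:
            lo = mid
        else:
            hi = mid
    return lo

rows = {
    "10n":  lambda n: 10 * n,
    "2n^2": lambda n: 2 * n * n,
    "2^n":  lambda n: 2.0 ** n,
}
old = {name: max_size(T, 1e4) for name, T in rows.items()}   # old machine
new = {name: max_size(T, 1e5) for name, T in rows.items()}   # 10x faster
```

Running this reproduces the linear, quadratic, and exponential rows of the table exactly.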
Implications of Dominance
• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

n^a dominates n^b if a > b, since lim_{n→∞} n^b / n^a = n^(b−a) → 0.
n^a + o(n^a) doesn't dominate n^a, since lim_{n→∞} n^a / (n^a + o(n^a)) → 1.
Space Complexity
A measure of an algorithm's memory requirements during runtime:
• Data structures
• Temporary variables
Algorithm ArraySum(a, n)
    Sum = 0
    For i = 1 to n do
        Sum = Sum + a(i)
    return Sum
The space needed is equal to the sum of:
- A fixed part, independent of the characteristics of inputs and outputs (instruction space, space for simple variables and constants, etc.)
- A variable part, dependent on the problem instance
Focus on estimating the variable part, which depends on the number and magnitude of inputs and outputs.
Time Complexity
The sum of compile time and run time. Compile time doesn't depend on the I/O characteristics.
• The running time depends on the input.
• Parameterize the running time by the size of the input.
• Generally we seek upper bounds on the running time, because everybody likes a guarantee.
Focus only on run time.
Exact formula: a summation of the time needed for all operations (e.g., add, subtract, multiply, ...) ......... an impossible task.
Experimental approach: type, compile, and run on a specific machine ... times may differ on a multiuser system.
Time Complexity: How to measure?
Comments ......... 0 steps
Assignment ......... 1 step
Loops ......... number of steps accounted in the control part

First method: introduce a new variable, Count, into the program.
Second method: build a table listing the total number of steps contributed by each statement.
Time Complexity
Approximate approach
Count program steps (a step is a program segment whose cost is independent of the problem characteristics). Determining the number of steps:
Time Complexity
First method: introduce a new variable, Count, into the program.

Algorithm ArraySum(a, n)
    Count = 0
    Sum = 0;  Count = Count + 1
    For i = 1 to n do
        Count = Count + 1        ... for the For statement
        Sum = Sum + a(i)
        Count = Count + 1        ... for the assignment
    Count = Count + 1            ... for the last test of the For condition
    Count = Count + 1            ... for the return
    return Sum

Total number of steps: 2n + 3
Second method: build a table listing the total number of steps contributed by each statement. Determine the number of steps per execution (s/e) of each statement and the total number of times (i.e., frequency) each statement is executed.

Algorithm ArraySum(a, n)      s/e    frequency    total steps
  Sum = 0                      1         1             1
  For i = 1 to n do            1       n + 1         n + 1
    Sum = Sum + a(i)           1         n             n
  return Sum                   1         1             1
                                            Total:   2n + 3

T(n) = 2n + 3
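The step-count bookkeeping above is easy to reproduce in code; a small Python sketch of the instrumented ArraySum, with the counter placed exactly as in the first method:

```python
def array_sum_counted(a):
    """Return (sum, step count) for the ArraySum algorithm."""
    count = 0
    total = 0
    count += 1                 # Sum = 0
    for x in a:
        count += 1             # one test of the For control
        total += x
        count += 1             # the assignment Sum = Sum + a(i)
    count += 1                 # final For test that exits the loop
    count += 1                 # the return statement
    return total, count        # count == 2n + 3

value, steps = array_sum_counted([3, 1, 4, 1, 5])
```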
Time Complexity
Example: Polynomial Evaluation

Two ways to evaluate the polynomial p(x) = 4x^4 + 7x^3 − 2x^2 + 3x + 6 are:
Brute force method: p(x) = 4·x·x·x·x + 7·x·x·x − 2·x·x + 3·x + 6
Horner's method: p(x) = (((4x + 7)·x − 2)·x + 3)·x + 6

General form of a polynomial: p(x) = a_1 + a_2·x + ......... + a_(n+1)·x^n, where a_(n+1) is non-zero, for all n ≥ 0.
Example: Polynomial Evaluation
Pseudocode, brute force method:
1. Input a_1, ..., a_(n+1) and x
2. Initialize poly = a_1
3. for i = 1, ..., n:
       poly = poly + a_(i+1) · x · x · ... · x   (i factors of x)
   end of for loop
4. Output poly

T(n) = n^2/2 + n/2
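A direct transcription that counts multiplications confirms T(n) = n^2/2 + n/2 for the brute-force method. Here `a[0]` plays the role of a_1 in the pseudocode, so `a` holds the coefficients of 1, x, ..., x^n in order:

```python
def poly_brute_force(a, x):
    """Evaluate a[0] + a[1]*x + ... + a[n]*x**n, counting multiplications."""
    poly = a[0]
    mults = 0
    n = len(a) - 1
    for i in range(1, n + 1):
        term = a[i]
        for _ in range(i):     # i factors of x => i multiplications
            term *= x
            mults += 1
        poly += term
    return poly, mults         # mults == 1 + 2 + ... + n == n(n+1)/2

# p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6 at x = 2
value, mults = poly_brute_force([6, 3, -2, 7, 4], 2)
```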
Example: Polynomial Evaluation
Pseudocode, Horner's method:
1. Input a_1, ..., a_(n+1) and x
2. Initialize poly = a_(n+1)
3. for i = 1, ..., n:
       poly = poly · x + a_(n−i+1)
   end of for loop
4. Output poly

T(n) = 2n
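Horner's rule needs only one multiplication and one addition per coefficient, which is where T(n) = 2n comes from. A sketch with the same coefficient ordering as the brute-force example (`a[0]` is the constant term):

```python
def poly_horner(a, x):
    """Evaluate a[0] + a[1]*x + ... + a[n]*x**n by Horner's rule, counting ops."""
    poly = a[-1]                       # start from the leading coefficient
    ops = 0
    for coeff in reversed(a[:-1]):
        poly = poly * x + coeff        # one multiplication + one addition
        ops += 2
    return poly, ops                   # ops == 2n

# Same polynomial as before: p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6 at x = 2
value, ops = poly_horner([6, 3, -2, 7, 4], 2)
```

Both methods return the same value (124 at x = 2), but Horner's uses 8 operations here instead of 10 multiplications plus the additions.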
Time complexity is not dependent solely on the number of inputs and outputs. There are three kinds of step counts: best case, worst case, and average case.

Ex: Sequential search for K in an array of n integers: begin at the first element in the array and look at each element in turn until K is found.
Best case: the first position of the array has K.
Worst case: the last position in the array has K.
Average case: n/2.
Time Complexity: best case, worst case, and average case
The best case: normally we are not interested in the best case, because it is too optimistic and not a fair characterization of the algorithm's running time. It is useful in some rare cases where the best case has a high probability of occurring.

The average case: often we prefer to know the average-case running time. The average case reveals the typical behavior of the algorithm on inputs of size n. Average-case estimation is not always possible.
The worst case: useful in many real-time applications, where the algorithm must perform at least that well. It might not be a representative measure of the behavior of the algorithm on inputs of size n.

Note: if we know enough about the distribution of our input, we prefer the average-case analysis. If we do not know the distribution, then we must resort to worst-case analysis.
Growth Rate of the Complexity Time Function
f(n) = n^2 + 100n + log10(n) + 1000
Upper Bound Concept
Worst-case time:
• It depends on the speed of our computer
• Relative speed (on the same machine)
• Absolute speed (on different machines)

BIG IDEA:
• Ignore machine-dependent constants
• Look at the growth of T(n) as n → ∞
Time Complexity
"Asymptotic Analysis" (n large)
Algorithm       T(n)            g(n)          Upper bound (no. of steps, n large)
Algorithm 1     T(n) = 2n + 3   g(n) = 3n     O(n)
Algorithm 2     T(n) = 4n + 8   g(n) = 5n     O(n)
Algorithm 3     T(n) = n^2 + 8  g(n) = n^2    O(n^2)

(Other common growth-rate classes include O(log n) and O(n^3).)
"Asymptotic Analysis"
Growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz.
Time Complexity
Algorithm Matters
• Big-Oh notation indicates an upper bound: how bad things can get; perhaps things are not nearly so bad.
• We want the lowest possible upper bound.
• Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
• Example: for linear search, n^2 is an upper bound, but n is the tightest upper bound.
The "Big-Oh" Notation
Complexity Analysis
Calculating Running Time T(n)
• Use the source/pseudo code
• Ignore constants
• Ignore lower-order terms
• Explicitly assume either:
  - the average case (harder to do)
  - the worst case (easier)
• Most analysis uses the worst case.
Linear-time Loop
for x = 1 to n:
    constant-time operation

Note: constant-time means independent of the input size.

Σ_{x=1}^{n} 1 = n
2-Nested Loops: Quadratic
for x = 0 to n-1:
    for y = 0 to n-1:
        constant-time operation

Σ_{x=0}^{n-1} Σ_{y=0}^{n-1} 1 = Σ_{x=0}^{n-1} n = n · n = n^2
3-Nested Loops: Cubic
for x = 0 to n-1:
    for y = 0 to n-1:
        for z = 0 to n-1:
            constant-time operation

f(n) = n^3 ...... The number of nested loops determines the exponent.
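The nesting rule is easy to sanity-check by counting iterations directly; a tiny recursive counter (illustrative, not from the slides) performs the nested loops and tallies the innermost operation:

```python
def count_nested(n, depth):
    """Iterations performed by `depth` perfectly nested loops of n each."""
    if depth == 0:
        return 1          # the innermost constant-time operation runs once
    return sum(count_nested(n, depth - 1) for _ in range(n))

c1 = count_nested(20, 1)   # n
c2 = count_nested(20, 2)   # n^2
c3 = count_nested(20, 3)   # n^3
```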
Add independent loops
for x = 0 to n-1:
    constant-time op
for y = 0 to n-1:
    for z = 0 to n-1:
        constant-time op
for w = 0 to n-1:
    constant-time op

f(n) = n + n^2 + n = n^2 + 2n
Non-trivial loops
for x = 1 to n:
    for y = 1 to x:
        constant-time operation

Σ_{x=1}^{n} Σ_{y=1}^{x} 1 = Σ_{x=1}^{n} x = n(n+1)/2 = n^2/2 + n/2

for z = 1 to n:
    for y = 1 to z:
        for x = 1 to y:
            constant-time op

Σ_{z=1}^{n} Σ_{y=1}^{z} Σ_{x=1}^{y} 1 = n(n+1)(n+2)/6 = n^3/6 + n^2/2 + n/3
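Both triangular sums above can be verified by literally running the loops and counting:

```python
def triangular(n):
    # for x = 1..n: for y = 1..x  =>  1 + 2 + ... + n = n(n+1)/2
    return sum(1 for x in range(1, n + 1) for y in range(1, x + 1))

def tetrahedral(n):
    # for z = 1..n: for y = 1..z: for x = 1..y  =>  n(n+1)(n+2)/6
    return sum(1 for z in range(1, n + 1)
                 for y in range(1, z + 1)
                 for x in range(1, y + 1))

t2 = triangular(10)    # 10*11/2  = 55
t3 = tetrahedral(10)   # 10*11*12/6 = 220
```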
The "Big-Oh" Notation
E.g., 3x^3 + 5x^2 − 9 = O(x^3). This doesn't mean "3x^3 + 5x^2 − 9 equals the function O(x^3)". What it actually means is "3x^3 + 5x^2 − 9 is dominated by x^3", read as "3x^3 + 5x^2 − 9 is big-Oh of x^3". In fact, 3x^3 + 5x^2 − 9 is smaller than 5x^3 for large enough values of x.
O-notation (upper bounds)
f(n) is O(g(n)) if there exist constants c > 0 and n0 ≥ 1 such that f(n) ≤ c·g(n) for all n ≥ n0.
So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above).
f(n) is O(g(n)) if f grows at most as fast as g; c·g(n) is an approximation to f(n), bounding it from above.
Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.:
  - Drop lower-order terms
  - Drop constant factors
Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Example 1: If T(n) = 3n^2, then T(n) is in O(n^2).
Example 2: T(n) = c1·n^2 + c2·n in the average case.
  c1·n^2 + c2·n ≤ c1·n^2 + c2·n^2 ≤ (c1 + c2)·n^2 for all n > 1
  T(n) ≤ c·n^2 for c = c1 + c2 and n0 = 1
  Therefore T(n) is in O(n^2) by the definition.
Example 3: T(n) = c. We say this is in O(1).
7n − 2 is O(n).
By definition, we need to find:
• a real constant c > 0
• an integer constant n0 ≥ 1
such that 7n − 2 ≤ c·n for every integer n ≥ n0.
A possible choice is c = 7 and n0 = 1: 7n − 2 ≤ 7n for n ≥ 1.
20n^3 + 10n log n + 5 is O(n^3):  20n^3 + 10n log n + 5 ≤ 35n^3 for n ≥ 1.
3 log n + log log n is O(log n):  3 log n + log log n ≤ 4 log n for n ≥ 2.
2^100 is O(1):  2^100 ≤ 2^100 for n ≥ 1.
5/n is O(1/n):  5/n ≤ 5·(1/n) for n ≥ 1.
Set definition of O-notation: O(g(n)) is the set of all functions f(n) for which there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.

Lower bound (big-Omega). Meaning: for all data sets big enough (i.e., n > n0), the algorithm always executes in more than c·g(n) steps.
T(n) = c1·n^2 + c2·n
c1·n^2 + c2·n ≥ c1·n^2 for all n > 1, so T(n) ≥ c·n^2 for c = c1 and n0 = 1.
Therefore T(n) is in Ω(n^2) by the definition.
We want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.
Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, Ω

Intuition for Asymptotic Notation:
Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time: O(1)
O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
The "Big-Oh" Notation
Quadratic Time: O(N^2)
An algorithm runs in O(N^2) if the number of operations required to perform a function is proportional to the square of the number of items being processed. Example: a group handshake.

Logarithmic Time: O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time only increases by some factor of log10(1,000,000) = 6.
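The "factor of a million becomes a factor of 6" claim uses base-10 logs; in base 2 the same million-fold growth costs a factor of about 20 (e.g., the comparison count of a binary search). A quick check:

```python
import math

# A million-fold larger input multiplies a log-time cost by only ~6 (base 10)...
factor_base10 = math.log10(1_000_000)

# ...or ~20 in base 2, e.g. binary-search comparisons: log2(10^6) ≈ 19.9.
comparisons = math.ceil(math.log2(1_000_000))
```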
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
The Sum of Compile time and Run timeCompile time doesnrsquot depend on IO Characteristics
February 2013 20
Time Complexity
bull The running time depends on the input
bull Parameterize the running time by the size of the input
bull Generally we seek upper bounds on the running time because everybody likes a guarantee
Algorithms Course Dr Aref Rashad
Focus only on Run time
Exact Formula Summation of time needed for all operations (eg add subtract multiplyhellip) helliphelliphelliphelliphelliphelliphellip Impossible task
Experimental Approach Type compile and run on a specific machinehellip Time may differ on multiuser system
February 2013 21
Time Complexity How to measure
Algorithms Course Dr Aref Rashad
Comments helliphelliphellip 0 StepAssignment helliphelliphellip 1 StepLoops helliphelliphelliphelliphelliphelliphellip No of steps account in control part
First Method Introduce a new variable Count into the Program
Second Method Build a table to list total No of steps contributed by each statement
February 2013 22
Time Complexity
Algorithms Course Dr Aref Rashad
Approximate Approach
Count Program steps (segment independent of Problem Characteristics) Determining No of Steps
February 2013 23
Algorithm ArraySum(an) Count=-0 Sum=0 Count=Count+1 For i=1 to n do Count=Count+1helliphelliphelliphelliphelliphelliphellip for For Sum=Sum + a(i) Count=Count+1hellipfor Assignment Count=Count+1helliphelliphelliphelliphelliphelliphellip for last time of For Count=Count+1helliphelliphelliphelliphelliphelliphellip For return return SumTotal number of steps 2n+3
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Time ComplexityFirst Method Introduce a new variable Count into the Program
Algorithms Course Dr Aref Rashad
Second MethodBuild a table to list total No of steps contributed by each statementDetermine the number of steps per execution (se) of the statement and the total number of times (ie frequency) each statement is executed
Algorithm ArraySum(an) se frequency total steps Sum=0 1 1 1 For i=1 to n do 1 n+1 n+1 Sum=Sum + a(i) 1 n n return Sum 1 1 1
Total 2n+3
T(n) = 2n+3February 2013 24
Time Complexity
Algorithms Course Dr Aref Rashad
Example Polynomial Evaluation
Two ways to evaluate the polynomialp(x) = 4x4+ 7x3ndash2x2+ 3x+ 6
Brute force methodp(x) = 4xxxx + 7xxx ndash2xx + 3x + 6
Hornerrsquos methodp(x) = (((4x + 7) x ndash2) x + 3) x + 6
General form of Polynomialp(x) = a
1+ a2x1+ helliphelliphellip + an+1xn
where an is non-zero for all n gt= 0
February 2013 25Algorithms Course Dr Aref Rashad
February 2013 26
T(n) = n22 + n2
Example Polynomial Evaluation
Pseudocode Brute force method
1 Input a1hellip an+1 and x2 Initialize poly = a1
3 for i = 1hellip n poly = poly + ai+1 x x hellip x (i factors x) end of for loop4 Output poly
Algorithms Course Dr Aref Rashad
February 2013 27
Example Polynomial Evaluation
Pseudocode Hornerrsquos method
1 Input a1hellip an+1 and x2 Initialize poly = an+1
3 for i = 1hellip n poly = poly x+ an-i+1 end of for loop4 Output poly
T(n) = 2n
Algorithms Course Dr Aref Rashad
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time Complexity is not dependent solely on the number of inputs and outputs Three kinds of step accountBest case worst case and average case
Ex Sequential search for K in an array of n integers
Begin at first element in array and look at each element in turn until K is found
Best case The first position of the array has KWorst case The last position in the array has KAverage case n2
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation:
O(g(n)) = { f(n) : there exist positive constants c and n0 such that f(n) <= c*g(n) for all n >= n0 }

Omega-notation (lower bounds)

Meaning: for all data sets big enough (i.e., n > n0), the algorithm always executes in at least c*g(n) steps.

Example: T(n) = c1*n^2 + c2*n.
c1*n^2 + c2*n >= c1*n^2 for all n > 1, so T(n) >= c*n^2 for c = c1 and n0 = 1.
Therefore T(n) is in Omega(n^2) by the definition.

We want the greatest lower bound.

Theta-notation

When big-Oh and Omega meet, we indicate this by using Theta (big-Theta) notation.
Definition: an algorithm is said to be Theta(h(n)) if it is in O(h(n)) and it is in Omega(h(n)).
Relations Between Theta, O, Omega

Intuition for Asymptotic Notation:
big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
big-Omega: f(n) is Omega(g(n)) if f(n) is asymptotically greater than or equal to g(n).
big-Theta: f(n) is Theta(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time: O(1)
O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)
An algorithm runs in O(N) if the number of operations required is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.

Quadratic Time: O(N^2)
An algorithm runs in O(N^2) if the number of operations required is proportional to the square of the number of items being processed. Example: everyone in a group shaking hands with everyone else.

Logarithmic Time: O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time increases only by a factor of log(1,000,000) = 6 (using base-10 logs).

Factorial Time: O(N!)
Far worse than even O(N^2) and O(N^3); it's fairly unusual to encounter functions with this kind of behavior.
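The classes above can be compared with a small illustrative Python sketch (the helper name and sample sizes are ours): multiply the input size by one million and watch how much each class's cost grows.

```python
import math

# How much does each running-time class grow when the input size
# is multiplied by one million? (Illustrative functions; base-10 logs
# as in the text, though any log base only changes a constant factor.)
def growth_factor(f, n, scale=10**6):
    return f(n * scale) / f(n)

classes = [("O(1)",       lambda n: 1.0),
           ("O(log N)",   lambda n: math.log10(n)),
           ("O(N)",       lambda n: float(n)),
           ("O(N log N)", lambda n: n * math.log10(n))]
for name, f in classes:
    print(f"{name:10s} grows by a factor of {growth_factor(f, 1000):,.1f}")
```

Only the logarithmic class stays tame: it grows by a small constant factor while the linear classes grow a million-fold.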
Practical Considerations

There is no such big difference in running time between T1(n) = n and T2(n) = n log n:
- T1(10,000) = 10,000
- T2(10,000) = 10,000 * log10(10,000) = 40,000

There is an enormous difference between T1(n) = n^2 and T2(n) = n log n:
- T1(10,000) = 100,000,000
- T2(10,000) = 10,000 * log10(10,000) = 40,000
Remarks
- Most statements in a program do not have much effect on the running time of that program.
- There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
- Focus your attention on the parts of the program that have the most impact.
- The greatest time and space improvements come from a better data structure or algorithm: "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE".
- When tuning code, it is important to gather good timing statistics.
- Be careful not to use tricks that make the program unreadable.
- Make use of compiler optimizations.
- Check that your optimizations really improve the program.
Remarks

An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (about 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions

The asymptotic relations between functions f and g behave like comparisons between numbers a and b:
f(n) = O(g(n))      ~  a <= b
f(n) = Omega(g(n))  ~  a >= b
f(n) = Theta(g(n))  ~  a = b
f(n) = o(g(n))      ~  a < b
f(n) = w(g(n))      ~  a > b

Limits (as n -> infinity)

lim f(n)/g(n) = 0                    =>  f(n) in o(g(n))
lim f(n)/g(n) < infinity             =>  f(n) in O(g(n))
0 < lim f(n)/g(n) < infinity         =>  f(n) in Theta(g(n))
lim f(n)/g(n) > 0                    =>  f(n) in Omega(g(n))
lim f(n)/g(n) = infinity             =>  f(n) in w(g(n))
lim f(n)/g(n) undefined              =>  can't say
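The limit rules above can be approximated numerically. Below is a rough Python heuristic (our own, not from the slides): it samples the ratio f(n)/g(n) at two sizes instead of taking a true limit, so it is a sanity check rather than a proof.

```python
import math

# Crude limit estimate: if the ratio f(n)/g(n) shrinks a lot between two
# large sample points, guess o(g); if it grows a lot, guess omega(g);
# otherwise guess Theta(g). Thresholds are arbitrary.
def classify(f, g, n1=10**4, n2=10**8):
    r1, r2 = f(n1) / g(n1), f(n2) / g(n2)
    if r2 < r1 / 10:
        return "f is o(g)"       # ratio heading toward 0
    if r2 > r1 * 10:
        return "f is omega(g)"   # ratio growing without bound
    return "f is Theta(g)"       # ratio roughly constant

assert classify(lambda n: math.log(n)**2, lambda n: n**0.5) == "f is o(g)"
assert classify(lambda n: 5*n*n + 100*n,  lambda n: 3*n*n + 2) == "f is Theta(g)"
assert classify(lambda n: n*n,            lambda n: n) == "f is omega(g)"
print("numeric estimates agree with the limit rules")
```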
Logarithms

x = log_b(a) is the exponent for which a = b^x.
Natural log: ln a = log_e(a).
Binary log: lg a = log_2(a).
lg^2 a = (lg a)^2.
lg lg a = lg(lg a).

Useful identities:
a = b^(log_b a)
log_c(ab) = log_c a + log_c b
log_b(a^n) = n log_b a
log_b a = log_c a / log_c b
log_b(1/a) = -log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10(n) * log2(10) = log2(n). So the base of a logarithm is not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n * 3^n.

Examples: express the functions in A in asymptotic notation using the functions in B.

A = 5n^2 + 100n, B = 3n^2 + 2: A in Theta(n^2) and n^2 in Theta(B), so A in Theta(B).
A = log3(n^2), B = log2(n^3): using log_b a = log_c a / log_c b, A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2/(3 lg 3), a constant: A in Theta(B).
A = n^(lg 4), B = 3^(lg n): using a^(log b) = b^(log a), B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) -> infinity as n -> infinity: A in w(B).
A = lg^2 n, B = n^(1/2): lim (lg^a n / n^b) = 0 for constants a and b > 0 (here a = 2 and b = 1/2): A in o(B).
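Both base-change claims are easy to confirm numerically (Python sketch; n = 12345 is an arbitrary sample point):

```python
import math

# Change of base alters a log only by a constant factor:
# log2(n) = log10(n) * log2(10), so all log bases are Theta of each other.
n = 12345.0
assert abs(math.log2(n) - math.log10(n) * math.log2(10)) < 1e-9

# Exponentials with different bases differ by an *exponential* factor:
# 3^n / 2^n = 1.5^n, which itself grows without bound.
for n in (10, 20, 30):
    print(n, 3**n / 2**n)   # the ratio keeps growing: not a constant factor
```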
Math Basics: Series

Constant Series: for integers a and b, a <= b:
    sum_{i=a}^{b} 1 = b - a + 1

Linear Series (Arithmetic Series): for n >= 0:
    sum_{i=1}^{n} i = 1 + 2 + ... + n = n(n+1)/2

Quadratic Series: for n >= 0:
    sum_{i=1}^{n} i^2 = 1 + 4 + ... + n^2 = n(n+1)(2n+1)/6 = (2n^3 + 3n^2 + n)/6

Cubic Series: for n >= 0:
    sum_{i=1}^{n} i^3 = 1 + 8 + ... + n^3 = n^2(n+1)^2/4 = (n^4 + 2n^3 + n^2)/4

Geometric Series: for real x != 1:
    sum_{k=0}^{n} x^k = 1 + x + x^2 + ... + x^n = (x^(n+1) - 1)/(x - 1)
For |x| < 1:
    sum_{k=0}^{infinity} x^k = 1/(1 - x)

Linear-Geometric Series: for n >= 0 and real c != 1:
    sum_{i=1}^{n} i*c^i = c + 2c^2 + ... + n*c^n = (n*c^(n+2) - (n+1)*c^(n+1) + c)/(c - 1)^2

Harmonic Series: the nth harmonic number, for n a positive integer:
    H_n = sum_{k=1}^{n} 1/k = 1 + 1/2 + 1/3 + ... + 1/n = ln(n) + O(1)

Telescoping Series:
    sum_{k=1}^{n} (a_k - a_{k-1}) = a_n - a_0

Differentiating Series: for |x| < 1:
    sum_{k=0}^{infinity} k*x^k = x/(1 - x)^2
Practical Considerations

Many problems whose obvious solution requires Theta(n^2) time also have a solution that requires Theta(n log n). Examples: sorting, searching.

Remarks
- Comparative timing of programs is a difficult business:
  - experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  - bias towards a program
  - unequal code tuning
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster? (n = largest problem solvable in a fixed time on the old machine; n' = on the faster machine)

T(n)       | n     | n'     | Change               | n'/n
10n        | 1,000 | 10,000 | n' = 10n             | 10
20n        | 500   | 5,000  | n' = 10n             | 10
5n log n   | 250   | 1,842  | sqrt(10)n < n' < 10n | 7.37
2n^2       | 70    | 223    | n' = sqrt(10)n       | 3.16
2^n        | 13    | 16     | n' = n + 3           | -----
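The table can be reproduced with a short Python sketch (the 10,000-step budget is our modeling assumption; bisection finds the largest solvable n, and the results land on or next to the table's values, since rounding conventions differ slightly):

```python
import math

# n is the largest problem solvable within a budget of 10,000 "steps";
# a machine 10x faster affords 100,000 steps in the same wall-clock time.
def largest_n(T, budget, hi=10**6):
    lo = 1                          # assumes T(1) <= budget
    while hi - lo > 1:              # integer bisection on monotone T
        mid = (lo + hi) // 2
        try:
            ok = T(mid) <= budget
        except OverflowError:       # 2**mid too large for a float
            ok = False
        if ok:
            lo = mid
        else:
            hi = mid
    return lo

rows = [("10n",      lambda n: 10.0 * n),
        ("20n",      lambda n: 20.0 * n),
        ("5n log n", lambda n: 5.0 * n * math.log2(n)),
        ("2n^2",     lambda n: 2.0 * n * n),
        ("2^n",      lambda n: 2.0 ** n)]
for name, T in rows:
    n_old = largest_n(T, 10_000)
    n_new = largest_n(T, 100_000)
    print(f"{name:8s}  n = {n_old:5d}  n' = {n_new:6d}  n'/n = {n_new / n_old:.2f}")
```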
Implications of Dominance
- Exponential algorithms get hopeless fast.
- Quadratic algorithms get hopeless at or before n = 1,000,000.
- O(n log n) is possible to about one billion.
- O(log n) never sweats.

n^a dominates n^b if a > b, since lim_{n -> infinity} n^b / n^a = n^(b-a) -> 0.
n^a + o(n^a) doesn't dominate n^a, since lim_{n -> infinity} n^a / (n^a + o(n^a)) -> 1.
Time Complexity: How to Measure?

Exact formula: a summation of the time needed for all operations (e.g., add, subtract, multiply, ...) ... an impossible task.
Experimental approach: type, compile, and run on a specific machine ... but time may differ on a multiuser system.
Time Complexity: Approximate Approach

Count program steps (a step is a program segment whose execution time is independent of the problem characteristics). Determining the number of steps:
- Comments ......... 0 steps
- Assignments ......... 1 step
- Loops ......... number of steps accounted in the control part

First method: introduce a new variable, Count, into the program.

    Algorithm ArraySum(a, n)
        Sum = 0
        For i = 1 to n do
            Sum = Sum + a(i)
        return Sum

    Algorithm ArraySum(a, n)
        Count = 0
        Sum = 0; Count = Count + 1
        For i = 1 to n do
            Count = Count + 1                    // for the For test
            Sum = Sum + a(i); Count = Count + 1  // for the assignment
        Count = Count + 1                        // for the last For test
        Count = Count + 1                        // for the return
        return Sum

    Total number of steps: 2n + 3

Second method: build a table listing the total number of steps contributed by each statement. Determine the number of steps per execution (s/e) of the statement and the total number of times (i.e., frequency) each statement is executed.

    Statement          | s/e | frequency | total steps
    Sum = 0            |  1  |     1     |      1
    For i = 1 to n do  |  1  |    n+1    |     n+1
    Sum = Sum + a(i)   |  1  |     n     |      n
    return Sum         |  1  |     1     |      1
    Total              |     |           |    2n+3

T(n) = 2n + 3
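Rendered in Python (a sketch of our own, mirroring where the pseudocode places each Count increment):

```python
# Python version of the Count method for ArraySum(a, n):
# every counted step increments count; the total comes out to 2n + 3.
def array_sum_with_count(a):
    count = 0
    total = 0; count += 1            # the assignment Sum = 0
    for x in a:
        count += 1                   # one step per loop-control test
        total += x; count += 1       # the assignment inside the loop
    count += 1                       # the final (failing) loop test
    count += 1                       # the return statement
    return total, count

for n in (0, 1, 5, 100):
    total, count = array_sum_with_count(list(range(n)))
    assert count == 2 * n + 3
    assert total == sum(range(n))
print("step count matches T(n) = 2n + 3")
```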
Example: Polynomial Evaluation

Suppose that exponentiation is carried out using multiplications. Two ways to evaluate the polynomial p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6 are:

Brute force method: p(x) = 4*x*x*x*x + 7*x*x*x - 2*x*x + 3*x + 6
Horner's method: p(x) = (((4x + 7)*x - 2)*x + 3)*x + 6

General form of a polynomial of degree n: p(x) = a1 + a2*x + ... + a(n+1)*x^n, where the leading coefficient a(n+1) is non-zero.

Pseudocode, brute force method:

    1. Input a1, ..., a(n+1) and x
    2. Initialize poly = a1
    3. for i = 1, ..., n:
           poly = poly + a(i+1)*x*x*...*x   (i factors of x)
       end of for loop
    4. Output poly

T(n) = n^2/2 + n/2 (the x^i term costs i multiplications, and 1 + 2 + ... + n = n(n+1)/2).

Pseudocode, Horner's method:

    1. Input a1, ..., a(n+1) and x
    2. Initialize poly = a(n+1)
    3. for i = 1, ..., n:
           poly = poly*x + a(n-i+1)
       end of for loop
    4. Output poly

T(n) = 2n (one multiplication and one addition per iteration).
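A Python sketch of both strategies (our own rendering of the pseudocode), counting only multiplications to show the n(n+1)/2 versus n gap:

```python
# Evaluate p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6 both ways,
# counting multiplications by x and comparing the results.
coeffs = [6, 3, -2, 7, 4]          # a1..a(n+1): p(x) = sum of a(i+1) * x^i

def brute_force(coeffs, x):
    mults = 0
    poly = coeffs[0]
    for i in range(1, len(coeffs)):
        term = coeffs[i]
        for _ in range(i):         # i factors of x for the x^i term
            term *= x
            mults += 1
        poly += term
    return poly, mults

def horner(coeffs, x):
    mults = 0
    poly = coeffs[-1]
    for a in reversed(coeffs[:-1]):
        poly = poly * x + a        # one multiplication, one addition
        mults += 1
    return poly, mults

x = 2
assert brute_force(coeffs, x)[0] == horner(coeffs, x)[0] == 4*x**4 + 7*x**3 - 2*x**2 + 3*x + 6
print("brute force multiplications:", brute_force(coeffs, x)[1])  # n(n+1)/2 = 10 for n = 4
print("Horner multiplications:", horner(coeffs, x)[1])            # n = 4
```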
Time Complexity: Best, Worst and Average Case

Time complexity is not dependent solely on the number of inputs and outputs. Three kinds of step counts: best case, worst case and average case.

Ex: sequential search for K in an array of n integers: begin at the first element in the array and look at each element in turn until K is found.
Best case: the first position of the array has K.
Worst case: the last position in the array has K.
Average case: n/2.

The best case: normally we are not interested in the best case, because it is too optimistic and not a fair characterization of the algorithms' running time. It is useful in some rare cases where the best case has a high probability of occurring.

The average case: often we prefer to know the average-case running time, because it reveals the typical behavior of the algorithm on inputs of size n. Average-case estimation is not always possible.

The worst case: useful in many real-time applications; the algorithm must perform at least that well. It might not be a representative measure of the behavior of the algorithm on inputs of size n.

Note: if we know enough about the distribution of our input, we prefer the average-case analysis; if we do not know the distribution, then we must resort to worst-case analysis.
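The sequential-search cases are easy to observe directly (Python sketch; the exact average over all positions is (n+1)/2, which is the slide's n/2 for large n):

```python
# Sequential search for key, counting element comparisons,
# to exhibit the best, worst, and average cases.
def sequential_search(a, key):
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1
        if x == key:
            return i, comparisons
    return -1, comparisons

a = list(range(100))
assert sequential_search(a, 0)  == (0, 1)      # best case: K in first position
assert sequential_search(a, 99) == (99, 100)   # worst case: K in last position

# average over all possible positions of K
avg = sum(sequential_search(a, k)[1] for k in a) / len(a)
print("average comparisons:", avg)             # (n+1)/2 = 50.5 here
```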
Growth Rate of the Complexity Time Function

f(n) = n^2 + 100n + log10(n) + 1000

Upper Bound Concept

Worst-case time depends on the speed of our computer:
- relative speed (on the same machine)
- absolute speed (on different machines)

BIG IDEA:
- Ignore machine-dependent constants.
- Look at the growth of T(n) as n -> infinity.

"Asymptotic Analysis" (n large)
"Asymptotic Analysis"

Algorithm    | T(n)    | g(n) | Upper bound (No. of steps, n large)
Algorithm 1  | 2n + 3  | 3n   | O(n)
Algorithm 2  | 4n + 8  | 5n   | O(n)
Algorithm 3  | n^2 + 8 | n^2  | O(n^2)

Other upper-bound classes shown on the slide include O(n^3) and O(log n).

[Figure: growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz.]
Algorithm Matters
- Big-Oh notation indicates an upper bound: how bad things can get; perhaps things are not nearly that bad.
- We want the lowest possible upper bound.
- Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
- Example: for linear search, n^2 is an upper bound, but n is the tightest upper bound.

Complexity Analysis: Calculating Running Time T(n)
- Use the source/pseudo code.
- Ignore constants.
- Ignore lower-order terms.
- Explicitly assume either the average case (harder to do) or the worst case (easier).
- Most analysis uses the worst case.
Linear-time loop:

    for x = 1 to n:
        constant-time operation

(Note: constant-time means independent of the input size.)
sum_{x=1}^{n} 1 = n, so f(n) = n.

2-Nested Loops: Quadratic

    for x = 0 to n-1:
        for y = 0 to n-1:
            constant-time operation

sum_{x=0}^{n-1} sum_{y=0}^{n-1} 1 = sum_{x=0}^{n-1} n = n * n = n^2, so f(n) = n^2.

3-Nested Loops: Cubic

    for x = 0 to n-1:
        for y = 0 to n-1:
            for z = 0 to n-1:
                constant-time operation

f(n) = n^3. The number of nested loops determines the exponent.

Add independent loops:

    for x = 0 to n-1:
        constant-time op
    for y = 0 to n-1:
        for z = 0 to n-1:
            constant-time op
    for w = 0 to n-1:
        constant-time op

f(n) = n + n^2 + n = n^2 + 2n, so f(n) is O(n^2).
Non-trivial loops:

    for x = 1 to n:
        for y = 1 to x:
            constant-time operation

sum_{x=1}^{n} sum_{y=1}^{x} 1 = sum_{x=1}^{n} x = n(n+1)/2 = n^2/2 + n/2, so f(n) is O(n^2).

    for z = 1 to n:
        for y = 1 to z:
            for x = 1 to y:
                constant-time op

sum_{z=1}^{n} sum_{y=1}^{z} sum_{x=1}^{y} 1 = n(n+1)(n+2)/6 = (n^3 + 3n^2 + 2n)/6, so f(n) is O(n^3).
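The two closed forms can be confirmed by brute-force counting (a short Python sketch):

```python
# Count the constant-time operations actually executed by the two
# "non-trivial" loop shapes and compare with the closed forms.
def double_loop(n):
    ops = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            ops += 1
    return ops

def triple_loop(n):
    ops = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                ops += 1
    return ops

for n in (1, 5, 20):
    assert double_loop(n) == n * (n + 1) // 2              # ~ n^2/2
    assert triple_loop(n) == n * (n + 1) * (n + 2) // 6    # ~ n^3/6
print("loop counts match the summation formulas")
```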
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Comments helliphelliphellip 0 StepAssignment helliphelliphellip 1 StepLoops helliphelliphelliphelliphelliphelliphellip No of steps account in control part
First Method Introduce a new variable Count into the Program
Second Method Build a table to list total No of steps contributed by each statement
February 2013 22
Time Complexity
Algorithms Course Dr Aref Rashad
Approximate Approach
Count Program steps (segment independent of Problem Characteristics) Determining No of Steps
February 2013 23
Algorithm ArraySum(an) Count=-0 Sum=0 Count=Count+1 For i=1 to n do Count=Count+1helliphelliphelliphelliphelliphelliphellip for For Sum=Sum + a(i) Count=Count+1hellipfor Assignment Count=Count+1helliphelliphelliphelliphelliphelliphellip for last time of For Count=Count+1helliphelliphelliphelliphelliphelliphellip For return return SumTotal number of steps 2n+3
Algorithm ArraySum(an) Sum=0 For i=1 to n do Sum=Sum + a(i) return Sum
Time ComplexityFirst Method Introduce a new variable Count into the Program
Algorithms Course Dr Aref Rashad
Second MethodBuild a table to list total No of steps contributed by each statementDetermine the number of steps per execution (se) of the statement and the total number of times (ie frequency) each statement is executed
Algorithm ArraySum(an) se frequency total steps Sum=0 1 1 1 For i=1 to n do 1 n+1 n+1 Sum=Sum + a(i) 1 n n return Sum 1 1 1
Total 2n+3
T(n) = 2n+3February 2013 24
Time Complexity
Algorithms Course Dr Aref Rashad
Example Polynomial Evaluation
Two ways to evaluate the polynomialp(x) = 4x4+ 7x3ndash2x2+ 3x+ 6
Brute force methodp(x) = 4xxxx + 7xxx ndash2xx + 3x + 6
Hornerrsquos methodp(x) = (((4x + 7) x ndash2) x + 3) x + 6
General form of Polynomialp(x) = a
1+ a2x1+ helliphelliphellip + an+1xn
where an is non-zero for all n gt= 0
February 2013 25Algorithms Course Dr Aref Rashad
February 2013 26
T(n) = n22 + n2
Example Polynomial Evaluation
Pseudocode Brute force method
1 Input a1hellip an+1 and x2 Initialize poly = a1
3 for i = 1hellip n poly = poly + ai+1 x x hellip x (i factors x) end of for loop4 Output poly
Algorithms Course Dr Aref Rashad
February 2013 27
Example Polynomial Evaluation
Pseudocode Hornerrsquos method
1 Input a1hellip an+1 and x2 Initialize poly = an+1
3 for i = 1hellip n poly = poly x+ an-i+1 end of for loop4 Output poly
T(n) = 2n
Algorithms Course Dr Aref Rashad
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time Complexity is not dependent solely on the number of inputs and outputs Three kinds of step accountBest case worst case and average case
Ex Sequential search for K in an array of n integers
Begin at first element in array and look at each element in turn until K is found
Best case The first position of the array has KWorst case The last position in the array has KAverage case n2
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops

for x = 0 to n-1
    constant-time op

for y = 0 to n-1
    for z = 0 to n-1
        constant-time op

for w = 0 to n-1
    constant-time op

f(n) = n + n² + n = n² + 2n
Non-trivial loops

for x = 1 to n
    for y = 1 to x
        constant-time operation

Total steps: Σ_{x=1}^{n} x = n(n+1)/2 = n²/2 + n/2, which is O(n²).

for z = 1 to n
    for y = 1 to z
        for x = 1 to y
            constant-time op

Total steps: Σ_{z=1}^{n} Σ_{y=1}^{z} y = Σ_{z=1}^{n} z(z+1)/2 = n(n+1)(n+2)/6 = (n³ + 3n² + 2n)/6, which is O(n³).
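The two closed forms above can be checked empirically by running the loops and counting iterations (an illustrative check, not part of the slides):

```python
n = 20

# for x = 1..n: for y = 1..x  ->  sum_{x=1..n} x = n(n+1)/2
double = sum(1 for x in range(1, n + 1)
               for y in range(1, x + 1))
assert double == n * (n + 1) // 2

# for z = 1..n: for y = 1..z: for x = 1..y  ->  n(n+1)(n+2)/6
triple = sum(1 for z in range(1, n + 1)
               for y in range(1, z + 1)
               for x in range(1, y + 1))
assert triple == n * (n + 1) * (n + 2) // 6
print(double, triple)  # 210 1540
```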
The "Big-Oh" Notation

E.g. 3x³ + 5x² - 9 = O(x³). This doesn't mean "3x³ + 5x² - 9 equals the function O(x³)". It actually means "3x³ + 5x² - 9 is dominated by x³", and it is read as "3x³ + 5x² - 9 is big-Oh of x³".

In fact, 3x³ + 5x² - 9 is smaller than 5x³ for large enough values of x.
O-notation (upper bounds)

f(n) is O(g(n)) if f grows at most as fast as g. So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above), and c·g(n) is an approximation to f(n), bounding it from above.
Big-Oh Rules

If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e. drop lower-order terms and drop constant factors.
Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n²)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Example 1: If T(n) = 3n², then T(n) is in O(n²).
Example 2: T(n) = c₁n² + c₂n in the average case.
c₁n² + c₂n <= c₁n² + c₂n² <= (c₁ + c₂)n² for all n > 1
T(n) <= cn² for c = c₁ + c₂ and n₀ = 1
Therefore, T(n) is in O(n²) by the definition.
Example 3: T(n) = c. We say this is in O(1).
Example: 7n - 2 is O(n).

By definition we need to find:
- a real constant c > 0
- an integer constant n₀ >= 1
such that 7n - 2 <= c·n for every integer n >= n₀.

A possible choice is c = 7 and n₀ = 1: 7n - 2 <= 7n for n >= 1.
More examples:

- 20n³ + 10n log n + 5 is O(n³): 20n³ + 10n log n + 5 <= 35n³ for n >= 1
- 3 log n + log log n is O(log n): 3 log n + log log n <= 4 log n for n >= 2
- 2¹⁰⁰ is O(1): 2¹⁰⁰ <= 2¹⁰⁰ · 1 for n >= 1
- 5/n is O(1/n): 5/n <= 5 · (1/n) for n >= 1
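Each of these claims supplies a witness pair (c, n₀). A small Python spot-checker (the helper `holds` is my own; it gives numeric evidence on a sampled range, not a proof):

```python
import math

def holds(f, g, c, n0, upto=10_000):
    """Spot-check f(n) <= c * g(n) for all n0 <= n < upto."""
    return all(f(n) <= c * g(n) for n in range(n0, upto))

assert holds(lambda n: 7*n - 2, lambda n: n, c=7, n0=1)
assert holds(lambda n: 20*n**3 + 10*n*math.log(n) + 5,
             lambda n: n**3, c=35, n0=1)
assert holds(lambda n: 3*math.log(n) + math.log(math.log(n)),
             lambda n: math.log(n), c=4, n0=2)
print("all sampled bounds hold")
```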
Set definition of O-notation; the lower-bound analogue is Ω-notation.

Ω meaning: for all data sets big enough (i.e. n > n₀), the algorithm always executes in more than c·g(n) steps.

Example: T(n) = c₁n² + c₂n.
c₁n² + c₂n >= c₁n² for all n > 1, so T(n) >= cn² for c = c₁ and n₀ = 1.
Therefore, T(n) is in Ω(n²) by the definition.
For a lower bound we want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, Ω

Intuition for Asymptotic Notation:
- big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)
- big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)
- big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)
Constant Time: O(1)

O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)

An algorithm runs in O(N) if the number of operations required is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.

Quadratic Time: O(N²)

An algorithm runs in O(N²) if the number of operations required is proportional to the square of the number of items being processed. Example: everyone in a group shaking hands with everyone else.

Logarithmic Time: O(log N) and O(N log N)

The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time only increases by a factor of log₁₀(1,000,000) = 6.
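One way to see logarithmic growth concretely is to count halving steps, as in binary search. An illustrative Python sketch (not from the slides; it counts base-2 halvings rather than the base-10 logs used above):

```python
def binary_search_steps(n):
    """Halving steps needed to narrow a range of n items down to one,
    i.e. the comparison count of an idealized binary search."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

print(binary_search_steps(1_000))          # 9  (about log2 of 1000)
print(binary_search_steps(1_000_000_000))  # 29 (a billion items)
```

A billion-fold larger input costs only about 20 extra steps, not a billion times more work.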
Factorial Time: O(N!)

It is far worse than even O(N²) and O(N³). It's fairly unusual to encounter functions with this kind of behavior.
Practical Considerations

There is no such big difference in running time between Θ₁(n) and Θ₂(n log n):
- Θ₁(10000) = 10000
- Θ₂(10000) = 10000 · log₁₀(10000) = 40000

There is an enormous difference between Θ₁(n²) and Θ₂(n log n):
- Θ₁(10000) = 100,000,000
- Θ₂(10000) = 10000 · log₁₀(10000) = 40000
Remarks
- Most statements in a program do not have much effect on the running time of that program.
- There is little point to cutting in half the running time of a subroutine that accounts for only 1% of the total.
- Focus your attention on the parts of the program that have the most impact.
- The greatest time and space improvements come from a better data structure or algorithm.
- "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE"

Remarks
- When tuning code, it is important to gather good timing statistics.
- Be careful not to use tricks that make the program unreadable.
- Make use of compiler optimizations.
- Check that your optimizations really improve the program.
Remarks

An algorithm with time equation T(n) = 2n² does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions

If f(n) = Θ(nᵃ) and g(n) = Θ(nᵇ):
- f(n) = O(g(n)) iff a <= b
- f(n) = Ω(g(n)) iff a >= b
- f(n) = Θ(g(n)) iff a = b
- f(n) = o(g(n)) iff a < b
- f(n) = ω(g(n)) iff a > b

Limits (as n → ∞):
- lim f(n)/g(n) = 0 ⟹ f(n) ∈ o(g(n))
- lim f(n)/g(n) < ∞ ⟹ f(n) ∈ O(g(n))
- 0 < lim f(n)/g(n) < ∞ ⟹ f(n) ∈ Θ(g(n))
- lim f(n)/g(n) > 0 ⟹ f(n) ∈ Ω(g(n))
- lim f(n)/g(n) = ∞ ⟹ f(n) ∈ ω(g(n))
- lim f(n)/g(n) undefined ⟹ can't say
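The limit tests can be applied numerically for a quick guess; e.g., for the earlier example 3x³ + 5x² - 9 versus x³ (a heuristic check, not a proof):

```python
# Estimate lim f(n)/g(n) numerically to guess the relation between
# f and g: a finite nonzero ratio suggests f is Theta(g).
def ratio(f, g, n=10**6):
    return f(n) / g(n)

f = lambda n: 3*n**3 + 5*n**2 - 9
g = lambda n: n**3
r = ratio(f, g)
print(round(r, 3))  # approaches 3, so f is Theta(g), hence also O(g)
```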
Logarithms

x = log_b a is the exponent for a = bˣ.
Natural log: ln a = log_e a. Binary log: lg a = log₂ a.
lg² a = (lg a)², lg lg a = lg(lg a).

Useful identities:
- a = b^(log_b a)
- log_c(ab) = log_c a + log_c b
- log_b aⁿ = n · log_b a
- log_b a = log_c a / log_c b
- log_b(1/a) = -log_b a
- log_b a = 1 / (log_a b)
- a^(log_b c) = c^(log_b a)
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log₁₀ n · log₂ 10 = log₂ n. So the base of a logarithm is not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2ⁿ = (2/3)ⁿ · 3ⁿ.
Examples: express the functions in A in asymptotic notation using the functions in B.

1. A = 5n² + 100n, B = 3n² + 2:
   A ∈ Θ(n²) and n² ∈ Θ(B), so A ∈ Θ(B).
2. A = log₃(n²), B = log₂(n³):
   using log_b a = log_c a / log_c b, A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2/(3 lg 3), a nonzero constant, and A ∈ Θ(B).
3. A = n^(lg 4), B = 3^(lg n):
   using a^(log b) = b^(log a), B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞ and A ∈ ω(B).
4. A = lg² n, B = n^(1/2):
   lim (lgᵃ n / nᵇ) = 0 (here a = 2 and b = 1/2), so A ∈ o(B).
Math Basics: Series

Constant Series: for integers a and b, a <= b:
  Σ_{i=a}^{b} 1 = b - a + 1

Linear (Arithmetic) Series: for n >= 0:
  Σ_{i=1}^{n} i = 1 + 2 + … + n = n(n+1)/2

Quadratic Series: for n >= 0:
  Σ_{i=1}^{n} i² = 1 + 4 + … + n² = n(n+1)(2n+1)/6

Cubic Series: for n >= 0:
  Σ_{i=1}^{n} i³ = 1 + 8 + … + n³ = n²(n+1)²/4

Geometric Series: for real x ≠ 1:
  Σ_{k=0}^{n} xᵏ = 1 + x + x² + … + xⁿ = (x^{n+1} - 1)/(x - 1)
For |x| < 1:
  Σ_{k=0}^{∞} xᵏ = 1/(1 - x)
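These closed forms are easy to verify for small n in Python (illustrative check):

```python
n = 50
# Linear: n(n+1)/2
assert sum(range(1, n + 1)) == n * (n + 1) // 2
# Quadratic: n(n+1)(2n+1)/6
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
# Cubic: n^2 (n+1)^2 / 4, i.e. the square of the linear sum
assert sum(i ** 3 for i in range(1, n + 1)) == (n * (n + 1) // 2) ** 2
# Geometric with x = 3: (x^{n+1} - 1) / (x - 1)
x = 3
assert sum(x ** k for k in range(n + 1)) == (x ** (n + 1) - 1) // (x - 1)
print("series identities hold for n =", n)
```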
Math Basics: More Series

Linear-Geometric Series: for n >= 0 and real c ≠ 1:
  Σ_{i=1}^{n} i·cⁱ = c + 2c² + … + n·cⁿ = (n·c^{n+2} - (n+1)·c^{n+1} + c)/(c - 1)²

Harmonic Series: the nth harmonic number, for n ∈ I⁺:
  H_n = Σ_{k=1}^{n} 1/k = 1 + 1/2 + 1/3 + … + 1/n = ln(n) + O(1)
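The harmonic-number bound can be observed numerically; the O(1) gap settles near the Euler-Mascheroni constant (≈ 0.5772), a fact not stated on the slide but consistent with H_n = ln n + O(1):

```python
import math

def H(n):
    """nth harmonic number, H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

# H(n) - ln(n) stays bounded and approaches ~0.5772.
for n in (10, 1_000, 100_000):
    print(n, round(H(n) - math.log(n), 4))
```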
Math Basics: More Series

Telescoping Series:
  Σ_{k=1}^{n} (a_k - a_{k-1}) = a_n - a_0

Differentiating Series: for |x| < 1:
  Σ_{k=0}^{∞} k·xᵏ = x/(1 - x)²
Many problems whose obvious solution requires Θ(n²) time also have a solution that requires Θ(n log n). Examples:
- Sorting
- Searching

Practical Considerations: Remarks
- Comparative timing of programs is a difficult business:
  - experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  - bias towards a program
  - unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster?

T(n)     | n    | n'    | Change           | n'/n
---------|------|-------|------------------|------
10n      | 1000 | 10000 | n' = 10n         | 10
20n      | 500  | 5000  | n' = 10n         | 10
5n log n | 250  | 1842  | √10·n < n' < 10n | 7.37
2n²      | 70   | 223   | n' = √10·n       | 3.16
2ⁿ       | 13   | 16    | n' = n + 3       | -----
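The table's entries follow from a fixed time budget: 10,000 time units on the old machine and 100,000 on the 10x-faster one. A Python sketch reproducing a few rows (the budget model is my reading of the table, and the helper name is my own):

```python
def max_n(T, budget):
    """Largest problem size n with T(n) <= budget (linear scan)."""
    n = 1
    while T(n + 1) <= budget:
        n += 1
    return n

# Old machine: 10,000 time units; 10x faster machine: 100,000 units.
for label, T in [("10n", lambda n: 10 * n),
                 ("2n^2", lambda n: 2 * n * n),
                 ("2^n", lambda n: 2 ** n)]:
    print(label, max_n(T, 10_000), "->", max_n(T, 100_000))
```

The quadratic algorithm's usable n grows only by √10, and the exponential one gains just 3, matching the table.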
Implications of Dominance
- Exponential algorithms get hopeless fast.
- Quadratic algorithms get hopeless at or before 1,000,000.
- O(n log n) is possible to about one billion.
- O(log n) never sweats.

- nᵃ dominates nᵇ if a > b, since lim_{n→∞} nᵇ/nᵃ = n^{b-a} → 0.
- nᵃ + o(nᵃ) doesn't dominate nᵃ, since lim_{n→∞} nᵃ/(nᵃ + o(nᵃ)) → 1.
Time Complexity

First Method: introduce a new variable, Count, into the program.

Algorithm ArraySum(a, n)
    Sum = 0
    for i = 1 to n do
        Sum = Sum + a(i)
    return Sum

With counting:

Algorithm ArraySum(a, n)
    Count = 0
    Sum = 0
    Count = Count + 1              // for the assignment
    for i = 1 to n do
        Count = Count + 1          // for the for statement
        Sum = Sum + a(i)
        Count = Count + 1          // for the assignment
    Count = Count + 1              // for the last test of the for
    Count = Count + 1              // for the return
    return Sum

Total number of steps: 2n + 3.

Second Method: build a table listing the total number of steps contributed by each statement. Determine the number of steps per execution (s/e) of each statement and the total number of times (i.e. frequency) each statement is executed.

Algorithm ArraySum(a, n)  | s/e | frequency | total steps
Sum = 0                   |  1  |     1     |     1
for i = 1 to n do         |  1  |   n + 1   |   n + 1
    Sum = Sum + a(i)      |  1  |     n     |     n
return Sum                |  1  |     1     |     1
Total                     |     |           |  2n + 3

T(n) = 2n + 3
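The first method translates directly into code. A Python version of ArraySum with the Count instrumentation (variable names are mine):

```python
def array_sum_counted(a):
    """ArraySum with an explicit step counter, mirroring the slides'
    Count instrumentation (one step per assignment/test/return)."""
    count = 0
    total = 0
    count += 1              # Sum = 0
    for x in a:
        count += 1          # for-loop test, n times
        total += x
        count += 1          # Sum = Sum + a(i)
    count += 1              # final for-loop test
    count += 1              # return
    return total, count

value, steps = array_sum_counted([4, 7, -2, 3, 6])
print(value, steps)         # 18 and 2n + 3 = 13 for n = 5
```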
Example: Polynomial Evaluation

Two ways to evaluate the polynomial p(x) = 4x⁴ + 7x³ - 2x² + 3x + 6:

Brute force method:
p(x) = 4·x·x·x·x + 7·x·x·x - 2·x·x + 3·x + 6

Horner's method:
p(x) = (((4x + 7)·x - 2)·x + 3)·x + 6

General form of a polynomial:
p(x) = a₁ + a₂x¹ + ……… + a_{n+1}xⁿ, where a_{n+1} is non-zero for all n >= 0.
Example: Polynomial Evaluation

Pseudocode, brute force method:

1. Input a₁, …, a_{n+1} and x
2. Initialize poly = a₁
3. for i = 1, …, n:
       poly = poly + a_{i+1} · x · x · … · x   (i factors of x)
   end of for loop
4. Output poly

T(n) = n²/2 + n/2
Example: Polynomial Evaluation

Pseudocode, Horner's method:

1. Input a₁, …, a_{n+1} and x
2. Initialize poly = a_{n+1}
3. for i = 1, …, n:
       poly = poly · x + a_{n-i+1}
   end of for loop
4. Output poly

T(n) = 2n
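Both evaluation methods can be implemented with explicit multiplication counters to confirm the n(n+1)/2 versus n behavior (a sketch; the coefficient ordering follows the slides' a₁ … a_{n+1} convention):

```python
def brute_force(coeffs, x):
    """Evaluate p(x) = a1 + a2*x + ... + a_{n+1}*x^n, counting
    multiplications (term i costs i of them)."""
    poly, mults = coeffs[0], 0
    for i in range(1, len(coeffs)):
        term = coeffs[i]
        for _ in range(i):          # x * x * ... * x, i factors
            term *= x
            mults += 1
        poly += term
    return poly, mults

def horner(coeffs, x):
    """Same polynomial via Horner's rule: one multiply per step."""
    poly, mults = coeffs[-1], 0
    for a in reversed(coeffs[:-1]):
        poly = poly * x + a
        mults += 1
    return poly, mults

coeffs = [6, 3, -2, 7, 4]           # 4x^4 + 7x^3 - 2x^2 + 3x + 6
vb, mb = brute_force(coeffs, 2)
vh, mh = horner(coeffs, 2)
assert vb == vh == 124
print(mb, mh)                       # 10 = n(n+1)/2 vs 4 = n, for n = 4
```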
Time Complexity: Best case, worst case, and average case

Time complexity is not dependent solely on the number of inputs and outputs. Three kinds of step counts exist: best case, worst case, and average case.

Ex: sequential search for K in an array of n integers: begin at the first element in the array and look at each element in turn until K is found.

Best case: the first position of the array has K.
Worst case: the last position in the array has K.
Average case: n/2.
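The three cases for sequential search can be demonstrated directly by counting comparisons (illustrative Python, helper name mine):

```python
def seq_search(a, key):
    """Sequential search; returns (index, comparisons), with index -1
    if the key is absent."""
    for i, v in enumerate(a):
        if v == key:
            return i, i + 1
    return -1, len(a)

a = list(range(10))                          # n = 10
print(seq_search(a, 0)[1])                   # best case: 1 comparison
print(seq_search(a, 9)[1])                   # worst case: n = 10
avg = sum(seq_search(a, k)[1] for k in a) / len(a)
print(avg)                                   # average ~ n/2 (5.5 here)
```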
The Best Case: normally we are not interested in the best case, because:
- It is too optimistic.
- It is not a fair characterization of the algorithms' running time.
- It is useful only in some rare cases where the best case has a high probability of occurring.

The Average Case: often we prefer to know the average-case running time.
- The average case reveals the typical behavior of the algorithm on inputs of size n.
- Average-case estimation is not always possible.

The Worst Case: useful in many real-time applications.
- The algorithm must perform at least that well.
- It might not be a representative measure of the behavior of the algorithm on inputs of size n.

Note: if we know enough about the distribution of our input, we prefer the average-case analysis. If we do not know the distribution, then we must resort to worst-case analysis.
Growth Rate of the Complexity Time Function

f(n) = n² + 100n + log₁₀ n + 1000

Upper Bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Second MethodBuild a table to list total No of steps contributed by each statementDetermine the number of steps per execution (se) of the statement and the total number of times (ie frequency) each statement is executed
Algorithm ArraySum(an) se frequency total steps Sum=0 1 1 1 For i=1 to n do 1 n+1 n+1 Sum=Sum + a(i) 1 n n return Sum 1 1 1
Total 2n+3
T(n) = 2n+3February 2013 24
Time Complexity
Algorithms Course Dr Aref Rashad
Example Polynomial Evaluation
Two ways to evaluate the polynomialp(x) = 4x4+ 7x3ndash2x2+ 3x+ 6
Brute force methodp(x) = 4xxxx + 7xxx ndash2xx + 3x + 6
Hornerrsquos methodp(x) = (((4x + 7) x ndash2) x + 3) x + 6
General form of Polynomialp(x) = a
1+ a2x1+ helliphelliphellip + an+1xn
where an is non-zero for all n gt= 0
February 2013 25Algorithms Course Dr Aref Rashad
February 2013 26
T(n) = n22 + n2
Example Polynomial Evaluation
Pseudocode Brute force method
1 Input a1hellip an+1 and x2 Initialize poly = a1
3 for i = 1hellip n poly = poly + ai+1 x x hellip x (i factors x) end of for loop4 Output poly
Algorithms Course Dr Aref Rashad
February 2013 27
Example Polynomial Evaluation
Pseudocode Hornerrsquos method
1 Input a1hellip an+1 and x2 Initialize poly = an+1
3 for i = 1hellip n poly = poly x+ an-i+1 end of for loop4 Output poly
T(n) = 2n
Algorithms Course Dr Aref Rashad
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time Complexity is not dependent solely on the number of inputs and outputs Three kinds of step accountBest case worst case and average case
Ex Sequential search for K in an array of n integers
Begin at first element in array and look at each element in turn until K is found
Best case The first position of the array has KWorst case The last position in the array has KAverage case n2
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
Complexity Analysis: Calculating Running Time T(n)
- Use the source/pseudo code.
- Ignore constants.
- Ignore lower-order terms.
- Explicitly assume either:
  - the average case (harder to do), or
  - the worst case (easier).
- Most analysis uses the worst case.
Linear-time Loop

for x = 1 to n
    constant-time operation

Note: constant-time means independent of the input size.

f(n) = sum_{x=1}^{n} 1 = n
2-Nested Loops -> Quadratic

for x = 0 to n-1
    for y = 0 to n-1
        constant-time operation

f(n) = sum_{x=0}^{n-1} sum_{y=0}^{n-1} 1 = sum_{x=0}^{n-1} n = n * n = n^2
3-Nested Loops -> Cubic

for x = 0 to n-1
    for y = 0 to n-1
        for z = 0 to n-1
            constant-time operation

f(n) = n^3 ...... The number of nested loops determines the exponent.
Add independent loops:

for x = 0 to n-1
    constant-time op

for y = 0 to n-1
    for z = 0 to n-1
        constant-time op

for w = 0 to n-1
    constant-time op

f(n) = n + n^2 + n = n^2 + 2n
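These counting rules can be checked directly by instrumenting the loops. A minimal Python sketch (the function name is ours, not from the slides):

```python
def count_ops(n):
    """Count constant-time operations executed by the three independent loops."""
    ops = 0
    for x in range(n):           # first loop: n ops
        ops += 1
    for y in range(n):           # nested pair: n * n ops
        for z in range(n):
            ops += 1
    for w in range(n):           # last loop: n ops
        ops += 1
    return ops

print(count_ops(10))  # 120 = 10 + 10**2 + 10
```

For n = 10 this gives 120 = n^2 + 2n, matching the formula above.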
Non-trivial loops

for x = 1 to n
    for y = 1 to x
        constant-time operation

f(n) = sum_{x=1}^{n} x = n(n+1)/2, which is about n^2/2, i.e. O(n^2).

for z = 1 to n
    for y = 1 to z
        for x = 1 to y
            constant-time op

f(n) = (n^3 + 3n^2 + 2n)/6, which is about n^3/6, i.e. O(n^3).
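The two loop-count sums can be verified by brute-force enumeration (a sketch; function names are ours):

```python
def tri_ops(n):
    # for x = 1..n: for y = 1..x: one op  ->  n(n+1)/2 ops
    return sum(1 for x in range(1, n + 1) for y in range(1, x + 1))

def tet_ops(n):
    # for z = 1..n: for y = 1..z: for x = 1..y: one op  ->  (n^3 + 3n^2 + 2n)/6 ops
    return sum(1 for z in range(1, n + 1)
                 for y in range(1, z + 1)
                 for x in range(1, y + 1))

n = 20
assert tri_ops(n) == n * (n + 1) // 2
assert tet_ops(n) == (n**3 + 3 * n**2 + 2 * n) // 6
print("closed forms match for n =", n)
```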
The "Big-Oh" Notation

E.g., 3x^3 + 5x^2 - 9 = O(x^3) doesn't mean "3x^3 + 5x^2 - 9 equals the function O(x^3)".
It actually means "3x^3 + 5x^2 - 9 is dominated by x^3", and is read as "3x^3 + 5x^2 - 9 is big-Oh of x^3".

In fact, 3x^3 + 5x^2 - 9 is smaller than 5x^3 for large enough values of x.
O-notation (upper bounds)

f(n) is O(g(n)) if f grows at most as fast as g: for large n, c*g(n) approximates f(n), bounding it from above.
So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above).
Big-Oh Rules
- If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e., drop lower-order terms and constant factors.
- Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".
- Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".
Example 1: If T(n) = 3n^2, then T(n) is in O(n^2).

Example 2: T(n) = c1*n^2 + c2*n (average case).
c1*n^2 + c2*n <= c1*n^2 + c2*n^2 <= (c1 + c2)*n^2 for all n > 1,
so T(n) <= c*n^2 for c = c1 + c2 and n0 = 1.
Therefore T(n) is in O(n^2) by the definition.

Example 3: T(n) = c. We say this is in O(1).
Example: 7n - 2 is O(n).

By definition we need to find:
- a real constant c > 0
- an integer constant n0 >= 1
such that 7n - 2 <= c*n for every integer n >= n0.

A possible choice is c = 7 and n0 = 1: 7n - 2 <= 7n for n >= 1.
More examples:
- 20n^3 + 10n log n + 5 is O(n^3): 20n^3 + 10n log n + 5 <= 35n^3 for n >= 1.
- 3 log n + log log n is O(log n): 3 log n + log log n <= 4 log n for n >= 2.
- 2^100 is O(1): 2^100 <= 2^100 * 1 for n >= 1.
- 5/n is O(1/n): 5/n <= 5 * (1/n) for n >= 1.
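The constants chosen in these examples can be spot-checked numerically (a sanity check over a finite range, not a proof; natural log is used here, and the inequalities also hold for other common bases):

```python
import math

# 7n - 2 <= 7n for n >= 1
assert all(7 * n - 2 <= 7 * n for n in range(1, 1000))

# 20n^3 + 10 n log n + 5 <= 35 n^3 for n >= 1
assert all(20 * n**3 + 10 * n * math.log(n) + 5 <= 35 * n**3
           for n in range(1, 1000))

# 3 log n + log log n <= 4 log n for n >= 2
assert all(3 * math.log(n) + math.log(math.log(n)) <= 4 * math.log(n)
           for n in range(2, 1000))

print("all bounds hold for n < 1000")
```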
Set definition of O-notation:
O(g(n)) = { f(n) : there exist positive constants c and n0 such that f(n) <= c*g(n) for all n >= n0 }

Lower bound (big-Omega): T(n) is in Omega(g(n)) if there exist positive constants c and n0 such that T(n) >= c*g(n) for all n > n0.
Meaning: for all data sets big enough (i.e., n > n0), the algorithm always executes in at least c*g(n) steps.
Example: T(n) = c1*n^2 + c2*n.
c1*n^2 + c2*n >= c1*n^2 for all n > 1, so T(n) >= c*n^2 for c = c1 and n0 = 1.
Therefore T(n) is in Omega(n^2) by the definition.

Note: we want the greatest lower bound.
When big-Oh and Omega meet, we indicate this by using Theta (big-Theta) notation.

Definition: An algorithm is said to be Theta(h(n)) if it is in O(h(n)) and it is in Omega(h(n)).
Relations Between Theta, O, Omega

Intuition for Asymptotic Notation
- big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
- big-Omega: f(n) is Omega(g(n)) if f(n) is asymptotically greater than or equal to g(n).
- big-Theta: f(n) is Theta(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time: O(1)
O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)
An algorithm runs in O(N) if the number of operations required is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
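Linear search, mentioned earlier as the canonical O(N) algorithm, can be sketched in a few lines of Python (our own minimal version):

```python
def linear_search(items, key):
    """Scan the list front to back; O(N) comparisons in the worst case."""
    for i, item in enumerate(items):
        if item == key:
            return i
    return -1  # key not present

print(linear_search([4, 8, 15, 16, 23, 42], 23))  # 4
```

In the worst case (key absent or in the last position) every one of the N items is examined.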
Quadratic Time: O(N^2)
An algorithm runs in O(N^2) if the number of operations is proportional to the square of the number of items being processed. Example: everyone in a group shaking hands with everyone else.

Logarithmic Time: O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time only increases by some factor of log10(1,000,000) = 6.

Factorial Time: O(N!)
It is far worse than even O(N^2) and O(N^3). It's fairly unusual to encounter functions with this kind of behavior.
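The standard example of O(log N) behavior is binary search on sorted data, sketched below (our own illustration; the slides do not give code):

```python
def binary_search(sorted_items, key):
    """Halve the search range each step: O(log N) comparisons."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == key:
            return mid
        if sorted_items[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # key not present

print(binary_search(list(range(0, 100, 2)), 42))  # 21
```

Doubling the input adds only one extra comparison, which is why a million-fold growth in input barely moves the run time.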
Practical Considerations

There is no such big difference in running time between T1(n) = n and T2(n) = n log n:
- T1(10000) = 10000
- T2(10000) = 10000 * log10(10000) = 40000

There is an enormous difference between T1(n) = n^2 and T2(n) = n log n:
- T1(10000) = 100,000,000
- T2(10000) = 10000 * log10(10000) = 40000

Remarks
- Most statements in a program do not have much effect on the running time of that program.
- There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
- Focus your attention on the parts of the program that have the most impact.
- The greatest time and space improvements come from a better data structure or algorithm.
- "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."

Remarks
- When tuning code, it is important to gather good timing statistics.
- Be careful not to use tricks that make the program unreadable.
- Make use of compiler optimizations.
- Check that your optimizations really improve the program.
Remarks
An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (about 3.16).

Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions

The asymptotic relations between f and g behave like comparisons between numbers a and b:
f(n) = O(g(n))      ~  a <= b
f(n) = Omega(g(n))  ~  a >= b
f(n) = Theta(g(n))  ~  a = b
f(n) = o(g(n))      ~  a < b
f(n) = omega(g(n))  ~  a > b
Limits (as n -> infinity)

lim [f(n) / g(n)] = 0             =>  f(n) in o(g(n))
lim [f(n) / g(n)] < infinity      =>  f(n) in O(g(n))
0 < lim [f(n) / g(n)] < infinity  =>  f(n) in Theta(g(n))
0 < lim [f(n) / g(n)]             =>  f(n) in Omega(g(n))
lim [f(n) / g(n)] = infinity      =>  f(n) in omega(g(n))
lim [f(n) / g(n)] undefined       =>  can't say
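The limit test can be explored numerically by watching the ratio f(n)/g(n) at growing n. This is only evidence, not a proof, but it often reveals which case applies (the helper name is ours):

```python
import math

def ratios(f, g, ns):
    """f(n)/g(n) at growing n; the trend hints at which limit case applies."""
    return [f(n) / g(n) for n in ns]

# lg^2 n vs sqrt(n): the ratio heads toward 0, suggesting lg^2 n is o(sqrt(n)).
r = ratios(lambda n: math.log2(n) ** 2,
           lambda n: math.sqrt(n),
           [10, 10**3, 10**5, 10**7])
print(r[0] > r[-1], r[-1] < 1)  # True True
```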
Logarithms

x = log_b a is the exponent such that a = b^x.
Natural log: ln a = log_e a
Binary log:  lg a = log_2 a
lg^2 a = (lg a)^2
lg lg a = lg (lg a)
Useful identities (a, b, c > 0; bases != 1):
a = b^(log_b a)
log_c (ab) = log_c a + log_c b
log_b (a^n) = n log_b a
log_b a = (log_c a) / (log_c b)
log_b (1/a) = -log_b a
log_b a = 1 / (log_a b)
a^(log_b c) = c^(log_b a)
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10 n * log2 10 = log2 n. So the base of the logarithm is not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n * 3^n.
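Both claims can be checked numerically for a sample n (a sketch; n = 50 keeps 3^n within float range):

```python
import math

n = 50

# Base change is a constant factor: log10(n) * log2(10) == log2(n)
assert math.isclose(math.log10(n) * math.log2(10), math.log2(n))

# Different exponential bases differ by an exponential factor: 2^n = (2/3)^n * 3^n
assert math.isclose((2 / 3) ** n * 3 ** n, 2 ** n)

print("both identities hold for n =", n)
```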
Examples: express functions in A in asymptotic notation using functions in B.

A = 5n^2 + 100n,  B = 3n^2 + 2:
  A is Theta(n^2) and n^2 is Theta(B), so A is Theta(B).

A = log3(n^2),  B = log2(n^3):
  log_b a = log_c a / log_c b, so A = 2 lg n / lg 3 and B = 3 lg n;
  A/B = 2/(3 lg 3), a constant, so A is Theta(B).

A = n^(lg 4),  B = 3^(lg n):
  a^(log b) = b^(log a), so B = 3^(lg n) = n^(lg 3);
  A/B = n^(lg(4/3)) -> infinity as n grows, so A is omega(B).

A = lg^2 n,  B = n^(1/2):
  lim (lg^a n / n^b) = 0 for b > 0 (here a = 2 and b = 1/2), so A is o(B).
Math Basics: Series

Constant series: for integers a and b, a <= b:
  sum_{i=a}^{b} 1 = b - a + 1

Linear (arithmetic) series: for n >= 0:
  sum_{i=1}^{n} i = n(n+1)/2

Quadratic series: for n >= 0:
  sum_{i=1}^{n} i^2 = n(n+1)(2n+1)/6
Cubic series: for n >= 0:
  sum_{i=1}^{n} i^3 = n^2 (n+1)^2 / 4

Geometric series: for real x != 1:
  sum_{k=0}^{n} x^k = (x^(n+1) - 1) / (x - 1)

For |x| < 1:
  sum_{k=0}^{infinity} x^k = 1 / (1 - x)
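The closed forms above can be spot-checked against direct summation (exact integer arithmetic, one sample n):

```python
n = 50

# Linear, quadratic, and cubic series
assert sum(i for i in range(1, n + 1)) == n * (n + 1) // 2
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
assert sum(i**3 for i in range(1, n + 1)) == n * n * (n + 1) ** 2 // 4

# Finite geometric series with x = 3
x = 3
assert sum(x**k for k in range(n + 1)) == (x ** (n + 1) - 1) // (x - 1)

print("all series closed forms match for n =", n)
```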
Math Basics

Linear-geometric series: for n >= 0 and real c != 1:
  sum_{i=1}^{n} i c^i = ( n c^(n+2) - (n+1) c^(n+1) + c ) / (c - 1)^2

Harmonic series: the nth harmonic number, for n a positive integer:
  H_n = sum_{k=1}^{n} 1/k = 1 + 1/2 + 1/3 + ... + 1/n = ln(n) + O(1)
Math Basics

Telescoping series:
  sum_{k=1}^{n} (a_k - a_(k-1)) = a_n - a_0

Differentiating series: for |x| < 1:
  sum_{k=0}^{infinity} k x^k = x / (1 - x)^2
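Two of these facts lend themselves to a quick numeric check: H_n - ln(n) stays bounded (it approaches the Euler-Mascheroni constant, about 0.5772), and the partial sums of k*x^k approach x/(1-x)^2:

```python
import math

# H_n = ln(n) + O(1): the difference stays within a fixed band
for n in (10, 1000, 100_000):
    h = sum(1 / k for k in range(1, n + 1))
    assert 0.5 < h - math.log(n) < 1.0

# Differentiating series at x = 0.5: partial sum approaches x / (1 - x)^2 = 2
x = 0.5
partial = sum(k * x**k for k in range(200))
assert math.isclose(partial, x / (1 - x) ** 2)

print("harmonic and differentiating series behave as stated")
```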
Many problems whose obvious solution requires Theta(n^2) time also have a solution that requires Theta(n log n). Examples:
- Sorting
- Searching
Practical Considerations: Remarks
- Comparative timing of programs is a difficult business:
  - experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  - bias towards a program
  - unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster? (n is the largest problem size solvable in a fixed time on the old machine; n' is the same for the machine 10 times faster.)

T(n)       n     n'     Change                n'/n
10n        1000  10000  n' = 10n              10
20n        500   5000   n' = 10n              10
5n log n   250   1842   sqrt(10)n < n' < 10n  7.37
2n^2       70    223    n' = sqrt(10) n       3.16
2^n        13    16     n' = n + 3            -----
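The quadratic row can be reproduced by solving for the largest n that fits a step budget; the 10,000-step budget below is our assumption, chosen so the old-machine size comes out to the table's n = 70:

```python
import math

def max_n_quadratic(budget):
    """Largest integer n with T(n) = 2n^2 <= budget."""
    return int(math.sqrt(budget / 2))

print(max_n_quadratic(10_000))   # 70  (old machine)
print(max_n_quadratic(100_000))  # 223 (machine 10x faster: 10x the budget)
```

The ratio 223/70 is about 3.19, close to sqrt(10) = 3.16 as the table states.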
Implications of Dominance
- Exponential algorithms get hopeless fast.
- Quadratic algorithms get hopeless at or before n = 1,000,000.
- O(n log n) is possible up to about one billion.
- O(log n) never sweats.

n^a dominates n^b if a > b, since lim_{n->infinity} n^b / n^a = n^(b-a) -> 0.
n^a + o(n^a) doesn't dominate n^a, since lim_{n->infinity} n^a / (n^a + o(n^a)) -> 1.
Example: Polynomial Evaluation

Two ways to evaluate the polynomial p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6 are:

Brute force method:
p(x) = 4*x*x*x*x + 7*x*x*x - 2*x*x + 3*x + 6

Horner's method:
p(x) = (((4x + 7) x - 2) x + 3) x + 6

General form of a polynomial of degree n:
p(x) = a1 + a2 x + ... + a_(n+1) x^n, where a_(n+1) is non-zero.
Pseudocode: Brute force method

1. Input a1, ..., a_(n+1) and x
2. Initialize poly = a1
3. for i = 1, ..., n
       poly = poly + a_(i+1) * x * x * ... * x   (i factors of x)
   end of for loop
4. Output poly

T(n) = n^2/2 + n/2
Pseudocode: Horner's method

1. Input a1, ..., a_(n+1) and x
2. Initialize poly = a_(n+1)
3. for i = 1, ..., n
       poly = poly * x + a_(n-i+1)
   end of for loop
4. Output poly

T(n) = 2n
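Both pseudocode methods can be sketched in Python with multiplication counters, confirming the n(n+1)/2 vs. n gap (our own translation of the pseudocode above):

```python
def brute_force(coeffs, x):
    """p(x) = a1 + a2*x + ... + a_(n+1)*x^n, recomputing each power of x."""
    total, mults = coeffs[0], 0
    for i in range(1, len(coeffs)):
        term = coeffs[i]
        for _ in range(i):        # i multiplications to build a_(i+1) * x^i
            term *= x
            mults += 1
        total += term
    return total, mults

def horner(coeffs, x):
    """Same polynomial, one multiplication per coefficient."""
    total, mults = coeffs[-1], 0
    for a in reversed(coeffs[:-1]):
        total = total * x + a
        mults += 1
    return total, mults

coeffs = [6, 3, -2, 7, 4]          # p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6
print(brute_force(coeffs, 2))      # (124, 10): n(n+1)/2 = 10 multiplications, n = 4
print(horner(coeffs, 2))           # (124, 4): n multiplications
```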
Time Complexity: Best Case, Worst Case, and Average Case

Time complexity is not dependent solely on the number of inputs and outputs. Three kinds of step counts exist: best case, worst case, and average case.

Ex: Sequential search for K in an array of n integers: begin at the first element in the array and look at each element in turn until K is found.
- Best case: the first position of the array has K.
- Worst case: the last position in the array has K.
- Average case: n/2 positions are examined.
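The three cases can be observed by instrumenting a sequential search and averaging over every possible position of K (a sketch; the average comes out to (n+1)/2, i.e., about n/2):

```python
def comparisons(arr, key):
    """Number of elements examined by sequential search before finding key."""
    for count, item in enumerate(arr, start=1):
        if item == key:
            return count
    return len(arr)  # worst case: key absent, all n examined

n = 100
arr = list(range(n))
counts = [comparisons(arr, k) for k in arr]
print(min(counts), max(counts), sum(counts) / n)  # 1 100 50.5
```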
The Best Case
Normally we are not interested in the best case, because:
- It is too optimistic.
- It is not a fair characterization of the algorithms' running time.
- It is useful only in some rare cases where the best case has a high probability of occurring.

The Average Case
Often we prefer to know the average-case running time:
- The average case reveals the typical behavior of the algorithm on inputs of size n.
- Average-case estimation is not always possible.
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 26
T(n) = n22 + n2
Example Polynomial Evaluation
Pseudocode Brute force method
1 Input a1hellip an+1 and x2 Initialize poly = a1
3 for i = 1hellip n poly = poly + ai+1 x x hellip x (i factors x) end of for loop4 Output poly
Algorithms Course Dr Aref Rashad
February 2013 27
Example Polynomial Evaluation
Pseudocode Hornerrsquos method
1 Input a1hellip an+1 and x2 Initialize poly = an+1
3 for i = 1hellip n poly = poly x+ an-i+1 end of for loop4 Output poly
T(n) = 2n
Algorithms Course Dr Aref Rashad
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time Complexity is not dependent solely on the number of inputs and outputs Three kinds of step accountBest case worst case and average case
Ex Sequential search for K in an array of n integers
Begin at first element in array and look at each element in turn until K is found
Best case The first position of the array has KWorst case The last position in the array has KAverage case n2
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
E.g., 3x³ + 5x² - 9 = O(x³). This doesn't mean
"3x³ + 5x² - 9 equals the function O(x³)." What it actually means is
"3x³ + 5x² - 9 is dominated by x³." Read it as "3x³ + 5x² - 9 is big-Oh of x³."

February 2013 42

The "Big-Oh" Notation

In fact, 3x³ + 5x² - 9 is smaller than 5x³ for large enough values of x.

Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above).
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n), bounding it from above.
Algorithms Course Dr Aref Rashad
Big-Oh Rules

If f(n) is a polynomial of degree d, then f(n) is O(n^d); i.e., drop lower-order terms and drop constant factors.

Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n²)".

Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".
Example 1: If T(n) = 3n², then T(n) is in O(n²).

Example 2: T(n) = c₁n² + c₂n in the average case.
c₁n² + c₂n <= c₁n² + c₂n² = (c₁ + c₂)n² for all n > 1,
so T(n) <= cn² for c = c₁ + c₂ and n₀ = 1.
Therefore T(n) is in O(n²) by the definition.

Example 3: T(n) = c. We say this is in O(1).
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
Show that 7n - 2 is O(n).

By definition we need to find:
• a real constant c > 0
• an integer constant n₀ >= 1

such that 7n - 2 <= c·n for every integer n >= n₀.

A possible choice is c = 7 and n₀ = 1: 7n - 2 <= 7n for n >= 1.
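A witness pair (c, n₀) can be spot-checked empirically over a finite range; a hypothetical helper (the name and the check range are illustrative, not from the slides, and a finite scan is evidence rather than a proof):

```python
def witnesses_big_oh(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for every integer n in [n0, n_max]."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 7n - 2 is O(n), with witnesses c = 7 and n0 = 1:
ok = witnesses_big_oh(lambda n: 7 * n - 2, lambda n: n, c=7, n0=1)
```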
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
20n³ + 10n log n + 5 is O(n³):
20n³ + 10n log n + 5 <= 35n³ for n >= 1

3 log n + log log n is O(log n):
3 log n + log log n <= 4 log n for n >= 2

2¹⁰⁰ is O(1):
2¹⁰⁰ <= 2¹⁰⁰ · 1 for n >= 1

5/n is O(1/n):
5/n <= 5(1/n) for n >= 1
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Ω meaning: For all data sets big enough (i.e., n > n₀), the algorithm always executes in more than cg(n) steps.
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c₁n² + c₂n

c₁n² + c₂n >= c₁n² for all n > 1, so T(n) >= cn² for c = c₁ and n₀ = 1.
Therefore T(n) is in Ω(n²) by the definition.

For a lower bound, we want the greatest lower bound.
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: An algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Θ, O, and Ω
Intuition for Asymptotic Notation
Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)

big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)

big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N²) if the number of operations required is proportional to the square of the number of items being processed. Example: a group handshake.
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time grows only by about log₁₀(1,000,000) = 6 steps.
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N!): It is far worse than even O(N²) and O(N³). It's fairly unusual to encounter functions with this kind of behavior.
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical Considerations

There is no huge difference in running time between T₁(n) = n and T₂(n) = n log n:
• T₁(10,000) = 10,000
• T₂(10,000) = 10,000 · log₁₀ 10,000 = 40,000

February 2013 65Algorithms Course Dr Aref Rashad

There is an enormous difference between T₁(n) = n² and T₂(n) = n log n:
• T₁(10,000) = 100,000,000
• T₂(10,000) = 10,000 · log₁₀ 10,000 = 40,000
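A quick arithmetic check of these figures (plain Python; log base 10, matching the slide's convention):

```python
import math

n = 10_000
t_linear    = n                  # T1(n) = n
t_nlogn     = n * math.log10(n)  # T2(n) = n * log10(n)
t_quadratic = n * n              # T3(n) = n^2
# t_linear = 10,000; t_nlogn = 40,000.0; t_quadratic = 100,000,000
```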
Remarks
• Most statements in a program do not have much effect on the running time of that program.
• There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
• Focus your attention on the parts of the program that have the most impact.
February 2013 66Algorithms Course Dr Aref Rashad
• The greatest time and space improvements come from a better data structure or algorithm.
• "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."

Remarks
• When tuning code, it is important to gather good timing statistics.
• Be careful not to use tricks that make the program unreadable.
• Make use of compiler optimizations.
• Check that your optimizations really improve the program.
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n² does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate. Instead of an improvement by a factor of ten, the improvement in solvable problem size is only the square root of 10 (≈ 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n log n running time.
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions

Think of comparing functions f and g as analogous to comparing two numbers a and b:

f(n) = O(g(n))  ~  a <= b
f(n) = Ω(g(n))  ~  a >= b
f(n) = Θ(g(n))  ~  a = b
f(n) = o(g(n))  ~  a < b
f(n) = ω(g(n))  ~  a > b

Limits (as n → ∞)

lim f(n)/g(n) = 0        ⇒  f(n) ∈ o(g(n))
lim f(n)/g(n) < ∞        ⇒  f(n) ∈ O(g(n))
0 < lim f(n)/g(n) < ∞    ⇒  f(n) ∈ Θ(g(n))
0 < lim f(n)/g(n)        ⇒  f(n) ∈ Ω(g(n))
lim f(n)/g(n) = ∞        ⇒  f(n) ∈ ω(g(n))
lim f(n)/g(n) undefined  ⇒  can't say
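The limit rules can be illustrated numerically by evaluating f(n)/g(n) at increasingly large n; a numeric sketch with illustrative examples of my own (the trend hints at the limit, it is not a proof):

```python
def ratio_trend(f, g, ns=(10**3, 10**4, 10**5, 10**6)):
    """Evaluate f(n)/g(n) at increasingly large n."""
    return [f(n) / g(n) for n in ns]

# n is o(n^2): the ratio tends to 0.
to_zero = ratio_trend(lambda n: n, lambda n: n * n)
# 3n^2 + 5n is Theta(n^2): the ratio tends to the constant 3.
to_const = ratio_trend(lambda n: 3 * n * n + 5 * n, lambda n: n * n)
```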
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg²a = (lg a)²
lg lg a = lg (lg a)

Useful identities:
a = b^(log_b a)
log_c(ab) = log_c a + log_c b
log_b(aⁿ) = n log_b a
log_b a = log_c a / log_c b
log_b(1/a) = -log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log₁₀ n · log₂ 10 = log₂ n. So the base of a logarithm is not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2ⁿ = (2/3)ⁿ · 3ⁿ.
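These identities are easy to spot-check numerically; a sketch using Python's math module (the sample values are arbitrary):

```python
import math

a, b, c, n = 5.0, 2.0, 3.0, 4.0

# b ** log_b(a) recovers a
assert math.isclose(b ** math.log(a, b), a)
# log_c(a*b) = log_c(a) + log_c(b)
assert math.isclose(math.log(a * b, c), math.log(a, c) + math.log(b, c))
# log_b(a^n) = n * log_b(a)
assert math.isclose(math.log(a ** n, b), n * math.log(a, b))
# change of base: log_b(a) = log_c(a) / log_c(b)
assert math.isclose(math.log(a, b), math.log(a, c) / math.log(b, c))
# log_b(1/a) = -log_b(a)
assert math.isclose(math.log(1 / a, b), -math.log(a, b))
# log_b(a) = 1 / log_a(b)
assert math.isclose(math.log(a, b), 1 / math.log(b, a))
# a^(log_b c) = c^(log_b a)
assert math.isclose(a ** math.log(c, b), c ** math.log(a, b))
```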
Examples: Express the functions in A in asymptotic notation using the functions in B.

A = 5n² + 100n,  B = 3n² + 2:
A ∈ Θ(n²) and n² ∈ Θ(B), so A ∈ Θ(B).

A = log₃(n²),  B = log₂(n³):
Using log_b a = log_c a / log_c b: A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2/(3 lg 3), a constant. A ∈ Θ(B).

A = n^(lg 4),  B = 3^(lg n):
Using a^(log_b c) = c^(log_b a): B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞. A ∈ ω(B).

A = lg²n,  B = n^(1/2):
Since lim (lgᵃn / nᵇ) = 0 (here a = 2 and b = 1/2), A ∈ o(B).
Math Basics

Constant Series: for integers a and b, a <= b:
Σ_{i=a}^{b} 1 = b - a + 1

Linear Series (Arithmetic Series): for n >= 0:
Σ_{i=1}^{n} i = 1 + 2 + … + n = n(n+1)/2

Quadratic Series: for n >= 0:
Σ_{i=1}^{n} i² = 1 + 4 + … + n² = n(n+1)(2n+1)/6

Cubic Series: for n >= 0:
Σ_{i=1}^{n} i³ = 1 + 8 + … + n³ = n²(n+1)²/4

Geometric Series: for real x ≠ 1:
Σ_{k=0}^{n} x^k = 1 + x + x² + … + xⁿ = (x^{n+1} - 1)/(x - 1)

For |x| < 1:
Σ_{k=0}^{∞} x^k = 1/(1 - x)

Math Basics

Linear-Geometric Series: for n >= 0, real c ≠ 1:
Σ_{i=1}^{n} i·cⁱ = (n·c^{n+2} - (n+1)·c^{n+1} + c) / (c - 1)²

Harmonic Series: the nth harmonic number, n ∈ I⁺:
H_n = 1 + 1/2 + 1/3 + … + 1/n = Σ_{k=1}^{n} 1/k = ln(n) + O(1)

Math Basics

Telescoping Series:
Σ_{k=1}^{n} (a_k - a_{k-1}) = a_n - a_0

Differentiating Series: for |x| < 1:
Σ_{k=0}^{∞} k·x^k = x/(1 - x)²
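The closed forms can be verified against direct summation for small n; an illustrative sketch (the function name is mine):

```python
def check_series(n=50):
    """Verify several closed forms against brute-force summation."""
    # Linear: sum i = n(n+1)/2
    assert sum(range(1, n + 1)) == n * (n + 1) // 2
    # Quadratic: sum i^2 = n(n+1)(2n+1)/6
    assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
    # Cubic: sum i^3 = n^2 (n+1)^2 / 4
    assert sum(i ** 3 for i in range(1, n + 1)) == n * n * (n + 1) ** 2 // 4
    # Geometric (x != 1): sum_{k=0}^{n} x^k = (x^{n+1} - 1)/(x - 1)
    x = 3
    assert sum(x ** k for k in range(n + 1)) == (x ** (n + 1) - 1) // (x - 1)
    return True
```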
Math Basics

Many problems whose obvious solution requires Θ(n²) time also have a solution that requires Θ(n log n). Examples:
– Sorting
– Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations

Remarks
• Comparative timing of programs is a difficult business:
  – experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  – bias towards a program
  – unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster?

T(n)       n      n'      Change             n'/n
10n        1,000  10,000  n' = 10n           10
20n        500    5,000   n' = 10n           10
5n log n   250    1,842   √10·n < n' < 10n   7.37
2n²        70     223     n' = √10·n         3.16
2ⁿ         13     16      n' = n + 3         -----
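The n and n' columns can be reproduced by assuming an hour's work budget of 10,000 basic operations on the old machine (an assumption consistent with the table's values, not stated on the slide) and a tenfold budget on the faster one:

```python
def largest_n(cost, budget):
    """Largest problem size n with cost(n) <= budget
    (a linear scan is fine at these illustrative sizes)."""
    n = 0
    while cost(n + 1) <= budget:
        n += 1
    return n

BUDGET = 10_000  # assumed basic operations per hour (old machine)
rows = {
    "10n":  lambda n: 10 * n,
    "20n":  lambda n: 20 * n,
    "2n^2": lambda n: 2 * n * n,
    "2^n":  lambda n: 2 ** n,
}
before = {name: largest_n(f, BUDGET) for name, f in rows.items()}
after  = {name: largest_n(f, BUDGET * 10) for name, f in rows.items()}
# before == {'10n': 1000, '20n': 500, '2n^2': 70, '2^n': 13}
# after  == {'10n': 10000, '20n': 5000, '2n^2': 223, '2^n': 16}
```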
February 2013 90
Implications of Dominance
• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

Implications of Dominance
• nᵃ dominates nᵇ if a > b, since lim_{n→∞} nᵇ/nᵃ = n^(b-a) → 0.
• nᵃ + o(nᵃ) doesn't dominate nᵃ, since lim_{n→∞} nᵃ/(nᵃ + o(nᵃ)) → 1.
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
February 2013 27
Example Polynomial Evaluation
Pseudocode: Horner's method

1. Input a₁, …, a_{n+1} and x
2. Initialize poly = a_{n+1}
3. for i = 1 … n: poly = poly · x + a_{n-i+1}  (end of for loop)
4. Output poly

T(n) = 2n (one multiplication and one addition per iteration)
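A runnable sketch of both evaluation methods (Python; function names are illustrative, and coefficients are stored lowest degree first):

```python
def brute_force_eval(coeffs, x):
    """Evaluate a_0 + a_1*x + ... + a_n*x^n, computing each power
    by repeated multiplication (the brute-force method)."""
    total = 0
    for i, a in enumerate(coeffs):
        term = a
        for _ in range(i):   # i multiplications to form a * x^i
            term *= x
        total += term
    return total

def horner_eval(coeffs, x):
    """Horner's method: one multiplication and one addition per
    coefficient, i.e. T(n) = 2n operations for degree n."""
    poly = coeffs[-1]
    for a in reversed(coeffs[:-1]):
        poly = poly * x + a
    return poly

# p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6 (constant term first)
p = [6, 3, -2, 7, 4]
```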
Algorithms Course Dr Aref Rashad
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time complexity is not dependent solely on the number of inputs and outputs. There are three kinds of step counts: best case, worst case, and average case.

Ex: Sequential search for K in an array of n integers:

Begin at the first element in the array and look at each element in turn until K is found.

Best case: the first position of the array has K.
Worst case: the last position in the array has K.
Average case: n/2.
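A sketch of sequential search instrumented to count comparisons, confirming the best and worst cases (the counting wrapper is mine, not from the slides):

```python
def sequential_search(arr, key):
    """Return (index, comparisons); index is -1 if key is absent."""
    comparisons = 0
    for i, v in enumerate(arr):
        comparisons += 1
        if v == key:
            return i, comparisons
    return -1, comparisons

data = list(range(1, 11))            # an array of n = 10 integers
best = sequential_search(data, 1)    # key in first position: 1 comparison
worst = sequential_search(data, 10)  # key in last position: n comparisons
```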
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case: Normally we are not interested in the best case, because:
• It is too optimistic.
• It is not a fair characterization of the algorithms' running time.
• It is useful only in some rare cases where the best case has a high probability of occurring.

The Average Case: Often we prefer to know the average-case running time.
• The average case reveals the typical behavior of the algorithm on inputs of size n.
• Average-case estimation is not always possible.
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case: Useful in many real-time applications.
• The algorithm must perform at least that well.
• It might not be a representative measure of the behavior of the algorithm on inputs of size n.

Note:
If we know enough about the distribution of our input, we prefer the average-case analysis.
If we do not know the distribution, then we must resort to worst-case analysis.
Algorithms Course Dr Aref Rashad
f(n) = n² + 100n + log₁₀ n + 1000
February 2013 32
Growth Rate of the Time-Complexity Function
Algorithms Course Dr Aref Rashad
Upper Bound Concept

Worst-case time:
• It depends on the speed of our computer
• relative speed (on the same machine)
• absolute speed (on different machines)

BIG IDEA:
• Ignore machine-dependent constants
• Look at the growth of T(n) as n → ∞
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
"Asymptotic Analysis" (n large)
February 2013 34
Algorithm      T(n) (no. of steps)   g(n), n large   Upper bound
Algorithm 1    T(n) = 2n + 3         g(n) = 3n       O(n)
Algorithm 2    T(n) = 4n + 8         g(n) = 5n       O(n)
Algorithm 3    T(n) = n² + 8         g(n) = n²       O(n²)

[Chart comparing growth rates O(log n), O(n), O(n²), O(n³) omitted]
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 28
Example Polynomial Evaluation
Algorithms Course Dr Aref Rashad
Time Complexity is not dependent solely on the number of inputs and outputs Three kinds of step accountBest case worst case and average case
Ex Sequential search for K in an array of n integers
Begin at first element in array and look at each element in turn until K is found
Best case The first position of the array has KWorst case The last position in the array has KAverage case n2
February 2013 29
Time ComplexityBest case worst case and average case
Algorithms Course Dr Aref Rashad
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Θ, O, Ω
Intuition for Asymptotic Notation
Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time: O(1)
O(1) means that an algorithm takes constant time to run; in other words, performance is not affected by the size of the problem.
Linear Time: O(N)
An algorithm runs in O(N) if the number of operations required is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
Quadratic Time: O(N^2)
An algorithm runs in O(N^2) if the number of operations required is proportional to the square of the number of items being processed. Example: everyone in a group shaking hands with everyone else.
Logarithmic Time: O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time only increases by a factor of log10(1,000,000) = 6.
Factorial Time: O(N!)
It is far worse than even O(N^2) and O(N^3). It is fairly unusual to encounter functions with this kind of behavior.
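To make the classes above concrete, here is an illustrative sketch (not from the slides) that tabulates rough step counts for each growth rate at a few input sizes; the constant factors are arbitrary:

```python
import math

def steps(n):
    """Rough step counts for common complexity classes at input size n."""
    return {
        "O(1)":       1,
        "O(log n)":   int(math.log2(n)),
        "O(n)":       n,
        "O(n log n)": int(n * math.log2(n)),
        "O(n^2)":     n ** 2,
    }

# Print how each class grows as the input size grows
for n in (10, 1000, 1_000_000):
    print(n, steps(n))
```

Running it shows why the class, not the constant, dominates: going from n = 10 to n = 1,000,000 multiplies the O(log n) count by about 5 but the O(n^2) count by 10^10.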
Practical Considerations
There is no huge difference in running time between Θ1(n) and Θ2(n log n):
• Θ1(10000) = 10,000
• Θ2(10000) = 10,000 · log10(10000) = 40,000
There is an enormous difference between Θ1(n^2) and Θ2(n log n):
• Θ1(10000) = 100,000,000
• Θ2(10000) = 10,000 · log10(10000) = 40,000
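The arithmetic above is easy to reproduce (the slides use base-10 logarithms):

```python
import math

n = 10_000
linear    = n                      # Θ(n)
nlogn     = n * math.log10(n)      # Θ(n log n), base-10 log as on the slide
quadratic = n * n                  # Θ(n^2)

assert linear == 10_000
assert nlogn == 40_000             # log10(10000) = 4
assert quadratic == 100_000_000    # 2,500 times larger than n log n here
```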
Remarks
• Most statements in a program do not have much effect on the running time of that program.
• There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
• Focus your attention on the parts of the program that have the most impact.
• The greatest time and space improvements come from a better data structure or algorithm.
• "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."
Remarks
• When tuning code it is important to gather good timing statistics.
• Be careful not to use tricks that make the program unreadable.
• Make use of compiler optimizations.
• Check that your optimizations really improve the program.
An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate. Instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions
Think of comparing f and g asymptotically like comparing numbers a and b:
f(n) = O(g(n))  ~  a ≤ b
f(n) = Ω(g(n))  ~  a ≥ b
f(n) = Θ(g(n))  ~  a = b
f(n) = o(g(n))  ~  a < b
f(n) = ω(g(n))  ~  a > b

Limits (as n → ∞)
lim f(n)/g(n) = 0         ⇒ f(n) ∈ o(g(n))
lim f(n)/g(n) < ∞         ⇒ f(n) ∈ O(g(n))
0 < lim f(n)/g(n) < ∞     ⇒ f(n) ∈ Θ(g(n))
0 < lim f(n)/g(n)         ⇒ f(n) ∈ Ω(g(n))
lim f(n)/g(n) = ∞         ⇒ f(n) ∈ ω(g(n))
lim f(n)/g(n) undefined   ⇒ can't say
Logarithms
x = log_b a is the exponent for which a = b^x.
Natural log: ln a = log_e a
Binary log: lg a = log_2 a
lg^2 a = (lg a)^2
lg lg a = lg (lg a)

Useful identities:
log_c(ab) = log_c a + log_c b
log_b a^n = n log_b a
log_b a = log_c a / log_c b
log_b (1/a) = −log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
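The identities above can be sanity-checked numerically; a quick sketch using arbitrary positive constants (math.isclose guards against floating-point round-off):

```python
from math import log, isclose

a, b, c, n = 5.0, 2.0, 3.0, 7.0

assert isclose(log(a * c, b), log(a, b) + log(c, b))   # log_b(ac) = log_b a + log_b c
assert isclose(log(a ** n, b), n * log(a, b))          # log_b a^n = n log_b a
assert isclose(log(a, b), log(a, c) / log(b, c))       # change of base
assert isclose(log(1 / a, b), -log(a, b))              # log_b(1/a) = -log_b a
assert isclose(log(a, b), 1 / log(b, a))               # log_b a = 1/log_a b
assert isclose(a ** log(c, b), c ** log(a, b))         # a^(log_b c) = c^(log_b a)
```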
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log2 n = log10 n · log2 10. The base of the logarithm is therefore not an issue in asymptotic notation.
Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n · 3^n.
Examples: express the functions in A in asymptotic notation using the functions in B.

A = 5n^2 + 100n,  B = 3n^2 + 2:
A ∈ Θ(n^2) and n^2 ∈ Θ(B), so A ∈ Θ(B).

A = log_3(n^2),  B = log_2(n^3):
using log_b a = log_c a / log_c b, A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2/(3 lg 3), a constant, and A ∈ Θ(B).

A = n^(lg 4),  B = 3^(lg n):
using a^(log_b c) = c^(log_b a), B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞, and A ∈ ω(B).

A = lg^2 n,  B = n^(1/2):
lim (lg^a n / n^b) = 0 (here a = 2 and b = 1/2), so A ∈ o(B).
Math Basics: Series

Constant series: for integers a and b with a ≤ b, Σ_{i=a}^{b} 1 = b − a + 1.

Linear series (arithmetic series): for n ≥ 0, Σ_{i=1}^{n} i = n(n + 1)/2.

Quadratic series: for n ≥ 0, Σ_{i=1}^{n} i^2 = n(n + 1)(2n + 1)/6.

Cubic series: for n ≥ 0, Σ_{i=1}^{n} i^3 = n^2 (n + 1)^2 / 4.

Geometric series: for real x ≠ 1, Σ_{k=0}^{n} x^k = (x^{n+1} − 1)/(x − 1). For |x| < 1, Σ_{k=0}^{∞} x^k = 1/(1 − x).

Linear-geometric series: for n ≥ 0 and real c ≠ 1, Σ_{i=1}^{n} i·c^i = (n·c^{n+2} − (n + 1)·c^{n+1} + c)/(c − 1)^2.

Harmonic series: the nth harmonic number, for n ∈ I+, is H_n = Σ_{k=1}^{n} 1/k = ln n + O(1).

Telescoping series: Σ_{k=1}^{n} (a_k − a_{k−1}) = a_n − a_0.

Differentiating series: for |x| < 1, Σ_{k=0}^{∞} k·x^k = x/(1 − x)^2.
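The closed forms above can be checked against brute-force sums; a small sketch (the helper `check` is illustrative, not course code):

```python
def check(n=50):
    """Verify several closed-form series formulas against direct summation."""
    assert sum(1 for i in range(1, n + 1)) == n                              # constant
    assert sum(i for i in range(1, n + 1)) == n * (n + 1) // 2               # linear
    assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
    assert sum(i ** 3 for i in range(1, n + 1)) == n * n * (n + 1) ** 2 // 4
    x = 3                                                                    # geometric
    assert sum(x ** k for k in range(n + 1)) == (x ** (n + 1) - 1) // (x - 1)
    return True

assert check()
```

Using integer arithmetic keeps every comparison exact, so a passing run really does confirm the formulas for that n.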
Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires Θ(n log n). Examples: sorting, searching.
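Sorting illustrates this gap well. The sketch below (not the course's code) counts element comparisons for a quadratic sort (insertion sort) and an n log n sort (merge sort) on the same random input:

```python
import random

def insertion_sort_comparisons(a):
    """Return the number of element comparisons insertion sort makes."""
    a, comps = list(a), 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comps += 1
            if a[j - 1] <= a[j]:
                break
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return comps

def merge_sort(a):
    """Return (sorted list, number of element comparisons)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_sort(a[:mid])
    right, cr = merge_sort(a[mid:])
    merged, i, j, comps = [], 0, 0, cl + cr
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged += left[i:] + right[j:]
    return merged, comps

random.seed(1)
data = [random.randrange(10_000) for _ in range(2_000)]
quad = insertion_sort_comparisons(data)
_, fast = merge_sort(data)
# On 2000 random items the quadratic sort does vastly more work
assert quad > 10 * fast
```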
Practical Considerations
Remarks
• Comparative timing of programs is a difficult business:
– Experimental errors from uncontrolled factors (system load, language, compiler, etc.)
– Bias towards a program
– Unequal code tuning
Faster Computer or Algorithm?
What happens when we buy a computer 10 times faster? (n is the largest size solvable before; n' the largest size solvable after.)

T(n)       n      n'      Change             n'/n
10n        1,000  10,000  n' = 10n           10
20n        500    5,000   n' = 10n           10
5n log n   250    1,842   √10·n < n' < 10n   7.37
2n^2       70     223     n' = √10·n         3.16
2^n        13     16      n' = n + 3         -----
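The n' column can be recomputed by bisection. This is a sketch under the assumption (consistent with the table) that the old and new machines get fixed budgets of 10,000 and 100,000 basic operations in the same amount of time:

```python
def max_size(T, budget):
    """Largest integer n with T(n) <= budget, for an increasing T with T(1) <= budget."""
    lo, hi = 1, 2
    while T(hi) <= budget:        # grow an upper bracket by doubling
        hi *= 2
    while lo + 1 < hi:            # bisect down to the boundary
        mid = (lo + hi) // 2
        if T(mid) <= budget:
            lo = mid
        else:
            hi = mid
    return lo

assert max_size(lambda n: 10 * n, 10_000) == 1_000      # old machine
assert max_size(lambda n: 10 * n, 100_000) == 10_000    # 10x machine: n' = 10n
assert max_size(lambda n: 2 * n * n, 100_000) == 223    # n' = sqrt(10) * n
assert max_size(lambda n: 2 ** n, 100_000) == 16        # n' = n + 3 (roughly)
```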
Implications of Dominance
• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

n^a dominates n^b if a > b, since lim_{n→∞} n^b / n^a = n^{b−a} → 0.
n^a + o(n^a) doesn't dominate n^a, since lim_{n→∞} n^a / (n^a + o(n^a)) → 1.
Time complexity is not dependent solely on the number of inputs and outputs. There are three kinds of step counts: best case, worst case, and average case.
Ex: sequential search for K in an array of n integers. Begin at the first element in the array and look at each element in turn until K is found.
Best case: the first position of the array has K (1 comparison).
Worst case: the last position in the array has K (n comparisons).
Average case: n/2 comparisons.
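The three cases are easy to see by instrumenting the search. A minimal sketch (the comparison-counting return value is added for illustration):

```python
def sequential_search(arr, key):
    """Return (index, number of comparisons); index is -1 if key is absent."""
    for i, v in enumerate(arr):
        if v == key:
            return i, i + 1          # found after i + 1 comparisons
    return -1, len(arr)              # examined every element

data = list(range(100))              # n = 100 distinct keys
assert sequential_search(data, 0)[1] == 1       # best case: first element
assert sequential_search(data, 99)[1] == 100    # worst case: last element

# Average over all successful searches is (n + 1)/2, i.e. about n/2
avg = sum(sequential_search(data, k)[1] for k in data) / len(data)
assert avg == 50.5
```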
The best case: normally we are not interested in the best case, because it is too optimistic and not a fair characterization of the algorithm's running time. It is useful in some rare cases where the best case has a high probability of occurring.
The average case: often we prefer to know the average-case running time. The average case reveals the typical behavior of the algorithm on inputs of size n. Average-case estimation is not always possible.
The worst case: useful in many real-time applications, where the algorithm must perform at least that well. It might not be a representative measure of the behavior of the algorithm on inputs of size n.
Note: if we know enough about the distribution of our input, we prefer average-case analysis. If we do not know the distribution, then we must resort to worst-case analysis.
Growth Rate of the Complexity Time Function
Ex: f(n) = n^2 + 100n + log10 n + 1000
Upper Bound Concept
Worst-case time depends on the speed of our computer:
• relative speed (on the same machine)
• absolute speed (on different machines)
BIG IDEA:
• Ignore machine-dependent constants
• Look at the growth of T(n) as n → ∞
"Asymptotic Analysis": upper bound on the number of steps, for large n
Algorithm 1: T(n) = 2n + 3,  g(n) = 3n   →  O(n)
Algorithm 2: T(n) = 4n + 8,  g(n) = 5n   →  O(n)
Algorithm 3: T(n) = n^2 + 8, g(n) = n^2  →  O(n^2)
Other algorithms fall into classes such as O(n^3), O(log n), ...
Growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz.
Algorithm Matters
• Big-Oh notation indicates an upper bound: how bad things can get; perhaps things are not nearly that bad.
• We want the lowest possible upper bound.
• Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
• Example: for linear search, n^2 is an upper bound, but n is the tightest upper bound.
Complexity Analysis: Calculating Running Time T(n)
• Use the source/pseudo code
• Ignore constants
• Ignore lower-order terms
• Explicitly assume either the average case (harder to do) or the worst case (easier)
• Most analysis uses the worst case
Linear-time loop:
for x = 1 to n
    constant-time operation
Note: constant-time means independent of the input size.
Total: Σ_{x=1}^{n} 1 = n
2-Nested Loops (Quadratic):
for x = 0 to n-1
    for y = 0 to n-1
        constant-time operation
Total: Σ_{x=0}^{n-1} Σ_{y=0}^{n-1} 1 = Σ_{x=0}^{n-1} n = n · n = n^2
3-Nested Loops (Cubic):
for x = 0 to n-1
    for y = 0 to n-1
        for z = 0 to n-1
            constant-time operation
f(n) = n^3. The number of nested loops determines the exponent.
Add independent loops:
for x = 0 to n-1
    constant-time op
for y = 0 to n-1
    for z = 0 to n-1
        constant-time op
for w = 0 to n-1
    constant-time op
f(n) = n + n^2 + n = n^2 + 2n
Non-trivial loops:
for x = 1 to n
    for y = 1 to x
        constant-time operation
Total: Σ_{x=1}^{n} x = n(n + 1)/2, which is Θ(n^2).

for z = 1 to n
    for y = 1 to z
        for x = 1 to y
            constant-time op
Total: Σ_{z=1}^{n} Σ_{y=1}^{z} y = Σ_{z=1}^{n} z(z + 1)/2 = (n^3 + 3n^2 + 2n)/6 = n(n + 1)(n + 2)/6, which is Θ(n^3).
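The loop sums above can be verified by actually counting operations. A small sketch (the counter functions are illustrative, not course code):

```python
def count_double(n):
    """Operations done by two independent 0..n-1 nested loops."""
    ops = 0
    for x in range(n):
        for y in range(n):
            ops += 1
    return ops

def count_triangle(n):
    """Operations when the inner loop runs 1..x (triangular nest)."""
    ops = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            ops += 1
    return ops

def count_tetra(n):
    """Operations for the triple nest z >= y >= x."""
    ops = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                ops += 1
    return ops

n = 12
assert count_double(n) == n * n                          # n^2
assert count_triangle(n) == n * (n + 1) // 2             # n(n+1)/2
assert count_tetra(n) == n * (n + 1) * (n + 2) // 6      # n(n+1)(n+2)/6
```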
E.g. 3x^3 + 5x^2 − 9 = O(x^3). This doesn't mean "3x^3 + 5x^2 − 9 equals the function O(x^3)". It actually means "3x^3 + 5x^2 − 9 is dominated by x^3", and is read as "3x^3 + 5x^2 − 9 is big-Oh of x^3". In fact, 3x^3 + 5x^2 − 9 is smaller than 5x^3 for large enough values of x.
O-notation (upper bounds)
g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above). f(n) is O(g(n)) if f grows at most as fast as g.
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
The Best Case Normally we are not interested in the best case because It is too optimistic Not a fair characterization of the algorithmsrsquo running time Useful in some rare cases where the best case has high probability of occurring
The average case Often we prefer to know the average-case running time
The average case reveals the typical behavior of the algorithm on inputs of size nAverage case estimation is not always possible
February 2013 30Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case Useful in many real-time applicationsAlgorithm must perform at least that wellMight not be a representative measure of the behavior of the algorithm on inputs of size n
NoteIf we know enough about the distribution of our input we prefer the average-case analysis
If we do not know the distribution then we must resort to worst-case analysis
Algorithms Course Dr Aref Rashad
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions

For f(n) = Θ(n^a) and g(n) = Θ(n^b):

f(n) = O(g(n))  ⇔  a ≤ b
f(n) = Ω(g(n))  ⇔  a ≥ b
f(n) = Θ(g(n))  ⇔  a = b
f(n) = o(g(n))  ⇔  a < b
f(n) = ω(g(n))  ⇔  a > b
Limits (as n → ∞)

lim f(n)/g(n) = 0        ⇒ f(n) ∈ o(g(n))
lim f(n)/g(n) < ∞        ⇒ f(n) ∈ O(g(n))
0 < lim f(n)/g(n) < ∞    ⇒ f(n) ∈ Θ(g(n))
lim f(n)/g(n) > 0        ⇒ f(n) ∈ Ω(g(n))
lim f(n)/g(n) = ∞        ⇒ f(n) ∈ ω(g(n))
lim f(n)/g(n) undefined  ⇒ can't say
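These limit rules can be probed numerically; a crude sketch (my illustration, not from the slides: a one-point probe at large n, not a real limit, with arbitrary cutoff thresholds):

```python
import math

def limit_class(f, g, n=10**6):
    """Crude numeric guess at f's relation to g from the ratio f(n)/g(n) at large n."""
    r = f(n) / g(n)
    if r < 1e-4:
        return "o"        # ratio heading to 0        -> f in o(g), hence also O(g)
    if r > 1e4:
        return "omega"    # ratio heading to infinity -> f in omega(g), hence also Omega(g)
    return "theta"        # ratio near a positive constant -> f in Theta(g)

assert limit_class(lambda n: n, lambda n: n * n) == "o"
assert limit_class(lambda n: 3 * n * n + 5, lambda n: n * n) == "theta"
assert limit_class(lambda n: n * n, lambda n: n * math.log2(n)) == "omega"
print("numeric probes agree with the limit rules")
```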
Logarithms

x = log_b a is the exponent for which a = b^x.
Natural log: ln a = log_e a
Binary log: lg a = log_2 a
lg² a = (lg a)²
lg lg a = lg (lg a)

Useful identities:
a = b^(log_b a)
log_c (ab) = log_c a + log_c b
log_b aⁿ = n log_b a
log_b a = log_c a / log_c b
log_b (1/a) = −log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
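These identities can be spot-checked numerically; a sketch (the values a = 5, b = 2, c = 10, n = 3 are arbitrary):

```python
import math

# Numerically spot-check the logarithm identities above
a, b, c, n = 5.0, 2.0, 10.0, 3.0
log = math.log  # log(x, base)

assert math.isclose(b ** log(a, b), a)                       # a = b^(log_b a)
assert math.isclose(log(a * b, c), log(a, c) + log(b, c))    # log of a product
assert math.isclose(log(a ** n, b), n * log(a, b))           # log of a power
assert math.isclose(log(a, b), log(a, c) / log(b, c))        # change of base
assert math.isclose(log(1 / a, b), -log(a, b))               # log of a reciprocal
assert math.isclose(log(a, b), 1 / log(b, a))                # swap base and argument
assert math.isclose(a ** log(c, b), c ** log(a, b))          # a^(log_b c) = c^(log_b a)
print("all identities hold")
```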
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10 n · log2 10 = log2 n. Hence the base of a logarithm is not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2ⁿ = (2/3)ⁿ · 3ⁿ.
Examples: express the functions in A in asymptotic notation using the functions in B.

A = 5n² + 100n,  B = 3n² + 2:
  A ∈ Θ(n²) and n² ∈ Θ(B), so A ∈ Θ(B)

A = log3(n²),  B = log2(n³):
  log_b a = log_c a / log_c b, so A = 2 lg n / lg 3 and B = 3 lg n; A/B = 2/(3 lg 3), so A ∈ Θ(B)

A = n^(lg 4),  B = 3^(lg n):
  a^(log b) = b^(log a), so B = 3^(lg n) = n^(lg 3); A/B = n^(lg(4/3)) → ∞ as n → ∞, so A ∈ ω(B)

A = lg² n,  B = n^(1/2):
  lim (lg^a n / n^b) = 0 (here a = 2 and b = 1/2), so A ∈ o(B)
Math Basics

Constant Series: for integers a and b, a ≤ b:
  Σ_{i=a}^{b} 1 = b − a + 1

Linear Series (Arithmetic Series): for n ≥ 0:
  Σ_{i=1}^{n} i = n(n+1)/2

Quadratic Series: for n ≥ 0:
  Σ_{i=1}^{n} i² = n(n+1)(2n+1)/6

Cubic Series: for n ≥ 0:
  Σ_{i=1}^{n} i³ = n²(n+1)²/4

Geometric Series: for real x ≠ 1:
  Σ_{k=0}^{n} x^k = (x^{n+1} − 1)/(x − 1)

For |x| < 1:
  Σ_{k=0}^{∞} x^k = 1/(1 − x)
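The closed forms can be verified by brute-force summation; a sketch (the defaults n = 50 and x = 0.5 are arbitrary choices):

```python
def check_series(n=50, x=0.5):
    # constant series with a = 3: b - a + 1 terms
    assert sum(1 for i in range(3, n + 1)) == n - 3 + 1
    # linear, quadratic, and cubic series
    assert sum(i for i in range(1, n + 1)) == n * (n + 1) // 2
    assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
    assert sum(i ** 3 for i in range(1, n + 1)) == n * n * (n + 1) ** 2 // 4
    # finite geometric series, x != 1
    geo = sum(x ** k for k in range(n + 1))
    assert abs(geo - (x ** (n + 1) - 1) / (x - 1)) < 1e-9
    # infinite geometric series, |x| < 1, truncated at 200 terms
    assert abs(sum(x ** k for k in range(200)) - 1 / (1 - x)) < 1e-9
    return True

print(check_series())
```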
Math Basics

Linear-Geometric Series: for n ≥ 0 and real c ≠ 1:
  Σ_{i=1}^{n} i·c^i = (n·c^{n+2} − (n+1)·c^{n+1} + c) / (c − 1)²

Harmonic Series: the nth harmonic number, for n ∈ I⁺:
  H_n = 1 + 1/2 + 1/3 + … + 1/n = Σ_{k=1}^{n} 1/k = ln(n) + O(1)
Math Basics

Telescoping Series:
  Σ_{k=1}^{n} (a_k − a_{k−1}) = a_n − a_0

Differentiating Series: for |x| < 1:
  Σ_{k=0}^{∞} k·x^k = x / (1 − x)²
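Both can again be checked by direct summation; a sketch (the sample sequence and x = 0.25 are arbitrary):

```python
def telescoping(a):
    # sum_{k=1}^{n} (a_k - a_{k-1}) collapses to a_n - a_0
    return sum(a[k] - a[k - 1] for k in range(1, len(a)))

a = [3, 1, 4, 1, 5, 9, 2, 6]
assert telescoping(a) == a[-1] - a[0]

x = 0.25
approx = sum(k * x ** k for k in range(200))  # sum of k x^k, truncated at 200 terms
exact = x / (1 - x) ** 2                      # closed form for |x| < 1
assert abs(approx - exact) < 1e-12
print("ok")
```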
Many problems whose obvious solution requires Θ(n²) time also have a solution that requires Θ(n log n). Examples:
– Sorting
– Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations

Remarks
• Comparative timing of programs is a difficult business:
 – Experimental errors from uncontrolled factors (system load, language, compiler, etc.)
 – Bias towards a program
 – Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster?

T(n)       n     n'     Change             n'/n
10n        1000  10000  n' = 10n           10
20n        500   5000   n' = 10n           10
5n log n   250   1842   √10·n < n' < 10n   7.37
2n²        70    223    n' = √10·n         3.16
2ⁿ         13    16     n' = n + 3         --
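The table can be reproduced by finding the largest n' whose cost fits ten times the old budget; a sketch (the round 10,000-step budget per row is assumed from the slide, and bisection is my own choice):

```python
import math

def largest_n(T, budget, hi=10**9):
    """Largest integer n with T(n) <= budget, found by bisection (T must be increasing)."""
    lo = 1
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if T(mid) <= budget:
            lo = mid
        else:
            hi = mid - 1
    return lo

budget = 10 * 10_000          # a 10x faster machine does 10x the old 10,000-step budget
print(largest_n(lambda n: 10 * n, budget))                # 10000
print(largest_n(lambda n: 20 * n, budget))                # 5000
print(largest_n(lambda n: 5 * n * math.log2(n), budget))  # ~1842
print(largest_n(lambda n: 2 * n * n, budget))             # 223
print(largest_n(lambda n: 2 ** n, budget, hi=64))         # 16
```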
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance
• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

• n^a dominates n^b if a > b, since
  lim_{n→∞} n^b / n^a = n^{b−a} → 0
• n^a + o(n^a) doesn't dominate n^a, since
  lim_{n→∞} n^a / (n^a + o(n^a)) → 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
February 2013 31
The Worst Case
• Useful in many real-time applications.
• The algorithm must perform at least that well.
• Might not be a representative measure of the behavior of the algorithm on inputs of size n.

Note:
If we know enough about the distribution of our input, we prefer the average-case analysis.
If we do not know the distribution, then we must resort to worst-case analysis.
Algorithms Course Dr Aref Rashad
f(n) = n² + 100n + log10 n + 1000
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper Bound Concept

Worst-case time:
• It depends on the speed of our computer:
 – relative speed (on the same machine)
 – absolute speed (on different machines)

BIG IDEA:
• Ignore machine-dependent constants.
• Look at the growth of T(n) as n → ∞.
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
"Asymptotic Analysis"
n large
February 2013 34
Algorithm     T(n)            g(n)        Upper bound (no. of steps, n large)
Algorithm 1   T(n) = 2n + 3   g(n) = 3n   O(n)
Algorithm 2   T(n) = 4n + 8   g(n) = 5n   O(n)
Algorithm 3   T(n) = n² + 8   g(n) = n²   O(n²)
"Asymptotic Analysis"
Growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz.
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
• Big-oh notation indicates an upper bound.
• How bad can things get? Perhaps things are not nearly that bad; we want the lowest possible upper bound.
• Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
• Example: for linear search, n² is an upper bound, but n is the tightest upper bound.
February 2013 37Algorithms Course Dr Aref Rashad
The "Big-Oh" Notation
Complexity Analysis
Calculating Running Time T(n)
• Use the source/pseudo code.
• Ignore constants.
• Ignore lower order terms.
• Explicitly assume either:
 – the average case (harder to do)
 – the worst case (easier)
• Most analysis uses the worst case.
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop

for x = 1 to n
    constant-time operation

Note: constant-time means independent of the input size.

Σ_{x=1}^{n} 1 = n
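Counting real iterations confirms the sum; a minimal sketch (the pseudocode's 1..n bounds are preserved):

```python
def linear(n):
    count = 0
    for x in range(1, n + 1):   # for x = 1 to n
        count += 1              # constant-time operation
    return count

# sum_{x=1}^{n} 1 = n
assert linear(100) == 100
print(linear(100))
```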
2-Nested Loops: Quadratic

for x = 0 to n-1
    for y = 0 to n-1
        constant-time operation

Σ_{x=0}^{n−1} Σ_{y=0}^{n−1} 1 = Σ_{x=0}^{n−1} n = n · n = n²
3-Nested Loops: Cubic

for x = 0 to n-1
    for y = 0 to n-1
        for z = 0 to n-1
            constant-time operation

f(n) = n³ …… The number of nested loops determines the exponent.
Add independent loops

for x = 0 to n-1
    constant-time op
for y = 0 to n-1
    for z = 0 to n-1
        constant-time op
for w = 0 to n-1
    constant-time op

f(n) = n + n² + n = n² + 2n
Non-trivial loops

for x = 1 to n
    for y = 1 to x
        constant-time operation

Σ_{x=1}^{n} Σ_{y=1}^{x} 1 = Σ_{x=1}^{n} x = n(n+1)/2 ≈ n²/2

for z = 1 to n
    for y = 1 to z
        for x = 1 to y
            constant-time op

Σ_{z=1}^{n} Σ_{y=1}^{z} Σ_{x=1}^{y} 1 = Σ_{z=1}^{n} z(z+1)/2 = (n³ + 3n² + 2n)/6 ≈ n³/6
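Counting iterations confirms both closed forms; a sketch (the loop bounds mirror the pseudocode above):

```python
def triangle(n):
    # for x = 1 to n: for y = 1 to x: constant-time op
    count = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            count += 1
    return count

def tetrahedral(n):
    # for z = 1 to n: for y = 1 to z: for x = 1 to y: constant-time op
    count = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                count += 1
    return count

n = 30
assert triangle(n) == n * (n + 1) // 2                # n(n+1)/2
assert tetrahedral(n) == n * (n + 1) * (n + 2) // 6   # (n^3 + 3n^2 + 2n)/6
print(triangle(n), tetrahedral(n))
```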
E.g., 3x³ + 5x² − 9 = O(x³). This doesn't mean "3x³ + 5x² − 9 equals the function O(x³)". What it actually means is "3x³ + 5x² − 9 is dominated by x³"; read it as "3x³ + 5x² − 9 is big-Oh of x³".
February 2013 42
The "Big-Oh" Notation

In fact, 3x³ + 5x² − 9 is smaller than 5x³ for large enough values of x.
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above).
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
c·g(n) is an approximation to f(n), bounding it from above.
Algorithms Course Dr Aref Rashad
Big-Oh Rules

If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.:
• Drop lower-order terms.
• Drop constant factors.

Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n²)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Example 1: If T(n) = 3n², then T(n) is in O(n²).

Example 2: T(n) = c1·n² + c2·n in the average case.
c1·n² + c2·n ≤ c1·n² + c2·n² ≤ (c1 + c2)·n² for all n > 1
T(n) ≤ c·n² for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n²) by the definition.

Example 3: T(n) = c. We say this is in O(1).
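Example 2's witnesses c = c1 + c2 and n0 = 1 can be checked mechanically; a sketch (c1 = 3 and c2 = 7 are arbitrary positive constants):

```python
c1, c2 = 3, 7                  # arbitrary positive constants for the average case
c, n0 = c1 + c2, 1             # the witnesses from Example 2

def T(n):
    return c1 * n * n + c2 * n

# T(n) <= c n^2 holds for every sampled n >= n0
assert all(T(n) <= c * n * n for n in range(n0, 10_000))
print("T(n) <= (c1 + c2) * n^2 for all sampled n >= n0")
```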
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
Show that 7n − 2 is O(n).

By definition, we need to find:
• a real constant c > 0
• an integer constant n0 ≥ 1
such that 7n − 2 ≤ c·n for every integer n ≥ n0.

A possible choice is:
• c = 7
• n0 = 1
7n − 2 ≤ 7n for n ≥ 1.
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
20n³ + 10n log n + 5 is O(n³):
  20n³ + 10n log n + 5 ≤ 35n³ for n ≥ 1

3 log log n + log n is O(log n):
  3 log log n + log n ≤ 4 log n for n ≥ 2

2^100 is O(1):
  2^100 ≤ 2^100 · 1 for n ≥ 1

5/n is O(1/n):
  5/n ≤ 5 · (1/n) for n ≥ 1
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning (of Ω): for all data sets big enough (i.e., n > n0), the algorithm always executes in at least c·g(n) steps.
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1·n² + c2·n
c1·n² + c2·n ≥ c1·n² for all n > 1, so T(n) ≥ c·n² for c = c1 and n0 = 1.
Therefore T(n) is in Ω(n²) by the definition.

We want the greatest lower bound.
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: An algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Θ, O, Ω
Intuition for Asymptotic Notation

Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time: O(1)

O(1) actually means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)

An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
February 2013 60Algorithms Course Dr Aref Rashad
The "Big-Oh" Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time: O(N²)

An algorithm runs in O(N²) if the number of operations required to perform a function is proportional to the square of the number of items being processed. Example: handshakes in a group.
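The handshake example concretely: n people each shaking hands with every other person once gives n(n−1)/2 handshakes, which grows as Θ(n²). A sketch:

```python
def handshakes(n):
    # every unordered pair of people shakes hands exactly once
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1
    return count

assert handshakes(10) == 10 * 9 // 2   # n(n-1)/2 handshakes for n = 10 people
print(handshakes(10))
```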
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
1000log100 102 nnnnf
February 2013 32
Growth Rate of the Complexity time function
Algorithms Course Dr Aref Rashad
Upper bound Concept
worst-case timebullIt depends on the speed of our computerbullrelative speed (on the same machine)bullabsolute speed (on different machines)
BIG IDEAbullIgnore machine-dependent constantsbullLook at growth of T(n)as nrarrinfin
February 2013 33
Time Complexity
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
n large
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Time Complexity: "Asymptotic Analysis" (n large)

Worst-case time depends on the speed of our computer:
- relative speed (on the same machine)
- absolute speed (on different machines)

BIG IDEA: ignore machine-dependent constants, and look at the growth of T(n) as n -> infinity.
"Asymptotic Analysis": an upper bound on the number of steps for n large.

Algorithm     T(n)        g(n)      Upper bound
1             2n + 3      3n        O(n)
2             4n + 8      5n        O(n)
3             n^2 + 8     n^2       O(n^2)
Growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at 1 GHz. (Table omitted.)
Time Complexity
Algorithm Matters
- Big-oh notation indicates an upper bound: how bad things can get (perhaps things are not nearly that bad).
- We want the lowest possible upper bound.
- Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
- Example: for linear search, n^2 is an upper bound, but n is the tightest upper bound.
The "Big-Oh" Notation

Complexity Analysis
Calculating Running Time T(n)
- Use the source/pseudo code.
- Ignore constants.
- Ignore lower-order terms.
- Explicitly assume either the average case (harder to do) or the worst case (easier).
- Most analysis uses the worst case.
Linear-time Loop

for x = 1 to n
    constant-time operation

Note: constant-time means independent of the input size.
Total steps: sum_{x=1}^{n} 1 = n.
2-Nested Loops: Quadratic

for x = 0 to n-1
    for y = 0 to n-1
        constant-time operation

Total steps: sum_{x=0}^{n-1} sum_{y=0}^{n-1} 1 = n * n = n^2.
3-Nested Loops: Cubic

for x = 0 to n-1
    for y = 0 to n-1
        for z = 0 to n-1
            constant-time operation

f(n) = n^3. The number of nested loops determines the exponent.
Add independent loops

for x = 0 to n-1
    constant-time op
for y = 0 to n-1
    for z = 0 to n-1
        constant-time op
for w = 0 to n-1
    constant-time op

f(n) = n + n^2 + n = n^2 + 2n.
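These step counts can be checked by literally counting the constant-time operations. A small Python sketch of the four loop patterns (the counter functions are illustrative, not from the slides):

```python
def count_linear(n):
    # for x = 1 to n: one constant-time operation per iteration -> n
    ops = 0
    for x in range(1, n + 1):
        ops += 1
    return ops

def count_quadratic(n):
    # two nested loops over 0..n-1 -> n * n
    ops = 0
    for x in range(n):
        for y in range(n):
            ops += 1
    return ops

def count_cubic(n):
    # three nested loops -> n^3; the number of nested loops gives the exponent
    ops = 0
    for x in range(n):
        for y in range(n):
            for z in range(n):
                ops += 1
    return ops

def count_independent(n):
    # three independent loops add up: n + n^2 + n = n^2 + 2n
    ops = 0
    for x in range(n):
        ops += 1
    for y in range(n):
        for z in range(n):
            ops += 1
    for w in range(n):
        ops += 1
    return ops

n = 10
print(count_linear(n), count_quadratic(n), count_cubic(n), count_independent(n))
# 10 100 1000 120
```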
Non-trivial loops

for x = 1 to n
    for y = 1 to x
        constant-time operation

Total steps: sum_{x=1}^{n} x = n(n+1)/2, which is on the order of n^2/2.

for z = 1 to n
    for y = 1 to z
        for x = 1 to y
            constant-op

Total steps: sum_{z=1}^{n} sum_{y=1}^{z} y = (n^3 + 3n^2 + 2n)/6 = n(n+1)(n+2)/6, on the order of n^3/6.
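The two closed forms above can be verified numerically with counters that mirror the pseudocode (a quick sanity check, not a proof):

```python
def count_triangular(n):
    # for x = 1 to n: for y = 1 to x  ->  n(n+1)/2 operations
    ops = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            ops += 1
    return ops

def count_tetrahedral(n):
    # for z = 1 to n: for y = 1 to z: for x = 1 to y  ->  n(n+1)(n+2)/6
    ops = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                ops += 1
    return ops

for n in (1, 5, 20, 60):
    assert count_triangular(n) == n * (n + 1) // 2
    assert count_tetrahedral(n) == n * (n + 1) * (n + 2) // 6
```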
The "Big-Oh" Notation

E.g. 3x^3 + 5x^2 - 9 = O(x^3). This doesn't mean "3x^3 + 5x^2 - 9 equals the function O(x^3)". What it actually means is "3x^3 + 5x^2 - 9 is dominated by x^3", read as "3x^3 + 5x^2 - 9 is big-Oh of x^3".

In fact, 3x^3 + 5x^2 - 9 is smaller than 5x^3 for large enough values of x.
O-notation (upper bounds)

f(n) is O(g(n)) if f grows at most as fast as g. So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above), and c*g(n) is an approximation to f(n), bounding it from above.
Big-Oh Rules

- If f(n) is a polynomial of degree d, then f(n) is O(n^d): drop lower-order terms and drop constant factors.
- Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".
- Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Example 1: If T(n) = 3n^2, then T(n) is in O(n^2).

Example 2: T(n) = c1*n^2 + c2*n in the average case.
c1*n^2 + c2*n <= c1*n^2 + c2*n^2 <= (c1 + c2)*n^2 for all n > 1,
so T(n) <= c*n^2 for c = c1 + c2 and n0 = 1.
Therefore T(n) is in O(n^2) by the definition.

Example 3: T(n) = c. We say this is in O(1).
Example: 7n - 2 is O(n).

By definition we need to find:
- a real constant c > 0
- an integer constant n0 >= 1
such that 7n - 2 <= c*n for every integer n >= n0.
A possible choice is c = 7, n0 = 1: 7n - 2 <= 7n for n >= 1.
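The witness pair (c, n0) can be checked mechanically over a finite range. A sketch (the helper name is illustrative; a finite scan is a sanity check, not a proof):

```python
def holds_big_oh_witness(f, g, c, n0, n_max=100_000):
    # verify f(n) <= c * g(n) for every integer n0 <= n <= n_max
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 7n - 2 is O(n) with witnesses c = 7, n0 = 1
assert holds_big_oh_witness(lambda n: 7 * n - 2, lambda n: n, c=7, n0=1)

# c = 6 is too small: 7n - 2 <= 6n only while n <= 2, so the scan fails
assert not holds_big_oh_witness(lambda n: 7 * n - 2, lambda n: n, c=6, n0=1)
```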
More examples:
- 20n^3 + 10n log n + 5 is O(n^3): 20n^3 + 10n log n + 5 <= 35n^3 for n >= 1.
- 3 log n + log log n is O(log n): 3 log n + log log n <= 4 log n for n >= 2.
- 2^100 is O(1): 2^100 <= 2^100 * 1 for n >= 1.
- 5/n is O(1/n): 5/n <= 5 * (1/n) for n >= 1.
Set definition of O-notation: O(g(n)) = { f(n) : there exist positive constants c and n0 such that f(n) <= c*g(n) for all n >= n0 }.
Lower bounds: Ω-notation. Meaning: for all data sets big enough (i.e. n > n0), the algorithm always executes in more than c*g(n) steps.

Example: T(n) = c1*n^2 + c2*n.
c1*n^2 + c2*n >= c1*n^2 for all n > 1, so T(n) >= c*n^2 for c = c1 and n0 = 1.
Therefore T(n) is in Ω(n^2) by the definition.

We want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, and Ω
Intuition for Asymptotic Notation
- Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
- Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
- Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time: O(1)
O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)
An algorithm runs in O(N) if the number of operations required is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
Quadratic Time: O(N^2)
An algorithm runs in O(N^2) if the number of operations required is proportional to the square of the number of items being processed. Example: a group handshake.

Logarithmic Time: O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time only increases by a factor of log10(1,000,000) = 6.
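Binary search is the classic O(log N) example: each step halves the remaining range. A minimal sketch with a step counter (the helper is illustrative, not from the slides):

```python
def binary_search(items, target):
    # items must be sorted; returns (index or -1, number of halving steps)
    lo, hi, steps = 0, len(items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid, steps
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

data = list(range(1_000_000))
index, steps = binary_search(data, -1)   # worst case: absent element
assert index == -1 and steps <= 20       # about log2(1,000,000), i.e. ~20 steps
```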
Factorial Time: O(N!)
It is far worse than even O(N^2) and O(N^3). It's fairly unusual to encounter functions with this kind of behavior.
Practical Considerations

There is no such big difference in running time between Θ1(n) = n and Θ2(n log n):
- Θ1(10000) = 10,000
- Θ2(10000) = 10000 * log10(10000) = 40,000

There is an enormous difference between Θ1(n^2) and Θ2(n log n):
- Θ1(10000) = 100,000,000
- Θ2(10000) = 10000 * log10(10000) = 40,000
Remarks
- Most statements in a program do not have much effect on the running time of that program.
- There is little point in halving the running time of a subroutine that accounts for only 1% of the total.
- Focus your attention on the parts of the program that have the most impact.
- The greatest time and space improvements come from a better data structure or algorithm.
- "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."

Remarks
- When tuning code, it is important to gather good timing statistics.
- Be careful not to use tricks that make the program unreadable.
- Make use of compiler optimizations.
- Check that your optimizations really improve the program.
Remarks

An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (about 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.
Lecture 1: End
Comparison of Functions

The asymptotic relations between f and g behave like comparisons between numbers a and b:
- f(n) = O(g(n))  ~  a <= b
- f(n) = Ω(g(n))  ~  a >= b
- f(n) = Θ(g(n))  ~  a = b
- f(n) = o(g(n))  ~  a < b
- f(n) = ω(g(n))  ~  a > b
Limits (as n -> infinity)
- lim f(n)/g(n) = 0              =>  f(n) ∈ o(g(n))
- lim f(n)/g(n) < infinity       =>  f(n) ∈ O(g(n))
- 0 < lim f(n)/g(n) < infinity   =>  f(n) ∈ Θ(g(n))
- 0 < lim f(n)/g(n)              =>  f(n) ∈ Ω(g(n))
- lim f(n)/g(n) = infinity       =>  f(n) ∈ ω(g(n))
- lim f(n)/g(n) undefined        =>  can't say
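These limit rules can be probed numerically for concrete function pairs by evaluating the ratio at large n (a rough check under the assumption that the ratio has settled, not a proof):

```python
def ratio(f, g, n):
    return f(n) / g(n)

# f grows like g up to a constant: ratio tends to 5/3, so f ∈ Θ(g)
f = lambda n: 5 * n**2 + 100 * n
g = lambda n: 3 * n**2 + 2
assert abs(ratio(f, g, 10**8) - 5 / 3) < 1e-5

# f2 is strictly slower than g2: ratio tends to 0, so f2 ∈ o(g2)
f2 = lambda n: 1000 * n
g2 = lambda n: n**2
assert ratio(f2, g2, 10**9) < 1e-5
```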
Logarithms

x = log_b(a) is the exponent for a = b^x.
- Natural log: ln a = log_e(a)
- Binary log: lg a = log_2(a)
- lg^2 a = (lg a)^2
- lg lg a = lg(lg a)

Useful identities:
- a = b^(log_b a)
- log_c(ab) = log_c a + log_c b
- log_b(a^n) = n log_b a
- log_b a = log_c a / log_c b
- log_b(1/a) = -log_b a
- log_b a = 1 / log_a b
- a^(log_b c) = c^(log_b a)
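The identities are easy to spot-check with Python's two-argument math.log (floating point, hence isclose):

```python
from math import log, isclose

a, b, c, n = 5.0, 2.0, 3.0, 7.0

assert isclose(a, b ** log(a, b))                      # a = b^(log_b a)
assert isclose(log(a * b, c), log(a, c) + log(b, c))   # log of a product
assert isclose(log(a**n, b), n * log(a, b))            # log of a power
assert isclose(log(a, b), log(a, c) / log(b, c))       # change of base
assert isclose(log(1 / a, b), -log(a, b))              # reciprocal
assert isclose(log(a, b), 1 / log(b, a))               # swap base and argument
assert isclose(a ** log(c, b), c ** log(a, b))         # a^(log_b c) = c^(log_b a)
```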
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10(n) * log2(10) = log2(n). So the base of a logarithm is not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n * 3^n.
Examples: express the functions in A in asymptotic notation using the functions in B.

1) A = 5n^2 + 100n, B = 3n^2 + 2.
   A ∈ Θ(n^2) and n^2 ∈ Θ(B), so A ∈ Θ(B).

2) A = log_3(n^2), B = log_2(n^3).
   Using log_b a = log_c a / log_c b: A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2/(3 lg 3), a constant, and A ∈ Θ(B).

3) A = n^(lg 4), B = 3^(lg n).
   Using a^(log b) = b^(log a): B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) -> infinity as n -> infinity, and A ∈ ω(B).

4) A = lg^2 n, B = n^(1/2).
   lim (lg^a n / n^b) = 0 (here a = 2 and b = 1/2), so A ∈ o(B).
Math Basics: Series

- Constant series: for integers a and b with a <= b, sum_{i=a}^{b} 1 = b - a + 1.
- Linear (arithmetic) series: for n >= 0, sum_{i=1}^{n} i = n(n+1)/2.
- Quadratic series: for n >= 0, sum_{i=1}^{n} i^2 = n(n+1)(2n+1)/6.
- Cubic series: for n >= 0, sum_{i=1}^{n} i^3 = n^2 (n+1)^2 / 4.
- Geometric series: for real x != 1, sum_{k=0}^{n} x^k = (x^(n+1) - 1) / (x - 1).
- For |x| < 1: sum_{k=0}^{infinity} x^k = 1 / (1 - x).
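Each closed form can be verified against a brute-force sum; with integer inputs the checks are exact:

```python
n = 50

# linear, quadratic, and cubic series
assert sum(i for i in range(1, n + 1)) == n * (n + 1) // 2
assert sum(i**2 for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
assert sum(i**3 for i in range(1, n + 1)) == n**2 * (n + 1)**2 // 4

# constant series over a..b
a, b = 7, 31
assert sum(1 for i in range(a, b + 1)) == b - a + 1

# geometric series, x != 1 (exact for an integer ratio)
x = 3
assert sum(x**k for k in range(0, n + 1)) == (x**(n + 1) - 1) // (x - 1)

# for |x| < 1 the partial sums approach 1/(1-x)
x = 0.5
assert abs(sum(x**k for k in range(0, 100)) - 1 / (1 - x)) < 1e-12
```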
Math Basics

- Linear-geometric series: for n >= 0 and real c != 1,
  sum_{i=1}^{n} i*c^i = (n*c^(n+2) - (n+1)*c^(n+1) + c) / (c - 1)^2.
- Harmonic series: the nth harmonic number, for n a positive integer,
  H_n = 1 + 1/2 + 1/3 + ... + 1/n = sum_{k=1}^{n} 1/k = ln(n) + O(1).
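Both claims can be checked numerically; a sketch (the bound 0.5 < H_n - ln n < 1.0 is a convenient window around Euler's constant, ~0.5772, for the sizes tried):

```python
from math import log

def harmonic(n):
    return sum(1 / k for k in range(1, n + 1))

# H_n - ln(n) stays bounded, which is exactly the claim H_n = ln(n) + O(1)
for n in (10, 1_000, 100_000):
    assert 0.5 < harmonic(n) - log(n) < 1.0

# linear-geometric series, checked exactly for an integer ratio c
n, c = 20, 3
lhs = sum(i * c**i for i in range(1, n + 1))
rhs = (n * c**(n + 2) - (n + 1) * c**(n + 1) + c) // (c - 1)**2
assert lhs == rhs
```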
Math Basics

- Telescoping series: sum_{k=1}^{n} (a_k - a_(k-1)) = a_n - a_0.
- Differentiating series: for |x| < 1, sum_{k=0}^{infinity} k*x^k = x / (1 - x)^2.
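Both identities admit a quick numeric check (the sequence a_k below is an arbitrary illustrative choice):

```python
# telescoping: consecutive differences collapse to a_n - a_0
a = [k * k + 3 for k in range(0, 21)]            # arbitrary sequence a_0..a_20
assert sum(a[k] - a[k - 1] for k in range(1, 21)) == a[20] - a[0]

# differentiating series: sum k*x^k = x / (1 - x)^2 for |x| < 1
x = 0.5
partial = sum(k * x**k for k in range(0, 200))   # tail beyond k=200 is negligible
assert abs(partial - x / (1 - x)**2) < 1e-12
```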
Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires Θ(n log n). Examples: sorting, searching.
Practical Considerations: Remarks

Comparative timing of programs is a difficult business:
- experimental errors from uncontrolled factors (system load, language, compiler, etc.)
- bias towards a program
- unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster? If n is the largest problem size solvable in an hour on the old machine and n' the largest size on the new one:

T(n)       n      n'      Change                 n'/n
10n        1,000  10,000  n' = 10n               10
20n        500    5,000   n' = 10n               10
5n log n   250    1,842   sqrt(10)n < n' < 10n   7.37
2n^2       70     223     n' = sqrt(10)n         3.16
2^n        13     16      n' = n + 3             -----
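The table can be reproduced by asking for the largest n with T(n) within a fixed time budget, then multiplying the budget by 10. A sketch (the budget of 10,000 time units is an assumption that matches the table's rows; the 5n log n row needs a real-valued log and lands between the two extremes, around 1,842):

```python
def largest_n(T, budget):
    # largest problem size n with T(n) <= budget (simple linear scan)
    n = 0
    while T(n + 1) <= budget:
        n += 1
    return n

budget = 10_000   # assumed time units available on the old machine
rows = [
    (lambda n: 10 * n,    1_000, 10_000),   # n' = 10n
    (lambda n: 20 * n,      500,  5_000),   # n' = 10n
    (lambda n: 2 * n * n,    70,    223),   # n' ~ sqrt(10) * n
    (lambda n: 2 ** n,       13,     16),   # n' = n + 3
]
for T, old_n, new_n in rows:
    assert largest_n(T, budget) == old_n          # solvable before
    assert largest_n(T, 10 * budget) == new_n     # solvable on the 10x machine
```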
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 34
Algorithm 1
Algorithm 2
Algorithm 3
Algorithm
T(n) = 2n+3 g(n) = 3n
O(n)
T(n) = 4n+8 g(n) =5n
T(n) = n2+8 g(n) =n2 O(n2)
Upper boundNo of Steps n large
O(n3)
O(log n)
Algorithms Course Dr Aref Rashad
ldquoAsymptotic Analysisrdquo
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Growth Rates of the Time Complexities of Algorithms with respect to increasing problem sizes On a machine running at 1 GHz
February 2013 35Algorithms Course Dr Aref Rashad
Time Complexity
February 2013 Algorithms Course Dr Aref Rashad 36
Algorithm Matters
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarks
• Most statements in a program do not have much effect on the running time of that program.
• There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
• Focus your attention on the parts of the program that have the most impact.
• The greatest time and space improvements come from a better data structure or algorithm.
• "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE"

Remarks
• When tuning code, it is important to gather good timing statistics.
• Be careful not to use tricks that make the program unreadable.
• Make use of compiler optimizations.
• Check that your optimizations really improve the program.
An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate. Instead of an improvement by a factor of ten, the improvement is only the square root of 10 (about 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n log n running time.
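That square-root effect follows directly from T(n) = 2n^2: with a time budget 10 times larger, the solvable problem size grows only by about √10 (a quick check in Python; the budget numbers are ours):

```python
import math

def max_n_quadratic(budget):
    # largest n with 2*n^2 <= budget
    return math.isqrt(budget // 2)

old = max_n_quadratic(10_000)    # old machine's budget
new = max_n_quadratic(100_000)   # machine 10x faster
# the ratio new/old is about sqrt(10) ~= 3.16, not 10
```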
Lecture 1 End
Comparison of Functions

Think of the growth rates of f and g as numbers a and b:

f(n) = O(g(n))   corresponds to   a <= b
f(n) = Ω(g(n))   corresponds to   a >= b
f(n) = Θ(g(n))   corresponds to   a = b
f(n) = o(g(n))   corresponds to   a < b
f(n) = ω(g(n))   corresponds to   a > b
Limits (as n → ∞)

lim f(n)/g(n) = 0          ⇒ f(n) ∈ o(g(n))
lim f(n)/g(n) < ∞          ⇒ f(n) ∈ O(g(n))
0 < lim f(n)/g(n) < ∞      ⇒ f(n) ∈ Θ(g(n))
lim f(n)/g(n) > 0          ⇒ f(n) ∈ Ω(g(n))
lim f(n)/g(n) = ∞          ⇒ f(n) ∈ ω(g(n))
lim f(n)/g(n) undefined    ⇒ can't say
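These limit rules can be probed numerically (illustrative Python; sampling the ratio at one large n only suggests the limit, it does not prove it):

```python
import math

def ratio(f, g, n=10**6):
    # sample f(n)/g(n) at a large n to estimate the limit
    return f(n) / g(n)

# f(n) = n, g(n) = n^2: ratio -> 0, suggesting n is o(n^2)
r_small = ratio(lambda n: n, lambda n: n**2)

# f(n) = 3n^2 + 5, g(n) = n^2: ratio -> 3 (finite, nonzero), suggesting Θ
r_theta = ratio(lambda n: 3 * n**2 + 5, lambda n: n**2)

# f(n) = n^2, g(n) = n log n: ratio grows without bound, suggesting ω
r_big = ratio(lambda n: n**2, lambda n: n * math.log(n))
```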
Logarithms

x = log_b a is the exponent for which a = b^x.
Natural log: ln a = log_e a
Binary log: lg a = log_2 a
lg^2 a = (lg a)^2
lg lg a = lg (lg a)

Useful identities:
a = b^(log_b a)
log_c (ab) = log_c a + log_c b
log_b (a^n) = n log_b a
log_b a = log_c a / log_c b
log_b (1/a) = -log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10 n * log2 10 = log2 n. The base of a logarithm is therefore not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n * 3^n.
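Both facts are easy to confirm numerically (illustrative Python):

```python
import math

n = 1000.0
# base change is a constant factor: log10(n) * log2(10) == log2(n)
lhs = math.log10(n) * math.log2(10)
rhs = math.log2(n)

# different exponential bases differ by an exponential factor:
# 2^k = (2/3)^k * 3^k, and (2/3)^k is itself exponential in k
k = 20
prod = (2 / 3) ** k * 3 ** k
```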
Examples: express the functions in A in asymptotic notation using the functions in B.

A = 5n^2 + 100n,  B = 3n^2 + 2:
  A ∈ Θ(n^2) and n^2 ∈ Θ(B), so A ∈ Θ(B).

A = log3(n^2),  B = log2(n^3):
  log_b a = log_c a / log_c b, so A = 2 lg n / lg 3 and B = 3 lg n; A/B = 2/(3 lg 3), a constant, so A ∈ Θ(B).

A = n^(lg 4),  B = 3^(lg n):
  a^(log b) = b^(log a), so B = 3^(lg n) = n^(lg 3); A/B = n^(lg(4/3)) → ∞ as n → ∞, so A ∈ ω(B).

A = lg^2 n,  B = n^(1/2):
  lim (lg^a n / n^b) = 0 (here a = 2 and b = 1/2), so A ∈ o(B).
Math Basics

Constant Series: for integers a and b, a <= b:
  Σ_{i=a..b} 1 = b - a + 1

Linear Series (Arithmetic Series): for n >= 0:
  Σ_{i=1..n} i = 1 + 2 + ... + n = n(n+1)/2

Quadratic Series: for n >= 0:
  Σ_{i=1..n} i^2 = 1 + 4 + ... + n^2 = n(n+1)(2n+1)/6

Cubic Series: for n >= 0:
  Σ_{i=1..n} i^3 = 1 + 8 + ... + n^3 = n^2 (n+1)^2 / 4

Geometric Series: for real x ≠ 1:
  Σ_{k=0..n} x^k = 1 + x + x^2 + ... + x^n = (x^(n+1) - 1)/(x - 1)

  For |x| < 1:  Σ_{k=0..∞} x^k = 1/(1 - x)
Math Basics

Linear-Geometric Series: for n >= 0, real c ≠ 1:
  Σ_{i=1..n} i c^i = c + 2c^2 + ... + n c^n = (n c^(n+2) - (n+1) c^(n+1) + c)/(c - 1)^2

Harmonic Series: the nth harmonic number, n ∈ I+:
  H_n = 1 + 1/2 + 1/3 + ... + 1/n = Σ_{k=1..n} 1/k = ln(n) + O(1)

Math Basics

Telescoping Series:
  Σ_{k=1..n} (a_k - a_{k-1}) = a_n - a_0

Differentiating Series: for |x| < 1:
  Σ_{k=0..∞} k x^k = x/(1 - x)^2
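These closed forms can be sanity-checked against brute-force sums (illustrative Python):

```python
import math

n = 50

assert sum(range(1, n + 1)) == n * (n + 1) // 2                         # linear
assert sum(i * i for i in range(1, n + 1)) == n*(n+1)*(2*n+1) // 6      # quadratic
assert sum(i**3 for i in range(1, n + 1)) == n*n*(n+1)*(n+1) // 4       # cubic

x = 3
assert sum(x**k for k in range(n + 1)) == (x**(n + 1) - 1) // (x - 1)   # geometric

# harmonic: H_n = ln(n) + O(1); the gap stays bounded
H = sum(1 / k for k in range(1, n + 1))
assert 0 < H - math.log(n) < 1
```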
Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires only Θ(n log n). Examples:
- Sorting
- Searching
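Merge sort is the classic Θ(n log n) replacement for a Θ(n^2) sort: log n levels of splitting, with Θ(n) merge work per level. A standard textbook sketch (not code from the slides):

```python
def merge_sort(items):
    # base case: a list of 0 or 1 elements is already sorted
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # merge the two sorted halves in linear time
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```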
Practical Considerations

Remarks
• Comparative timing of programs is a difficult business:
  - Experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  - Bias towards a program
  - Unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster? (n is the largest problem solvable within a fixed time budget on the old machine; n' is the largest solvable on the faster machine.)

T(n)       n      n'      Change              n'/n
10n        1,000  10,000  n' = 10n            10
20n        500    5,000   n' = 10n            10
5n log n   250    1,842   √10 n < n' < 10n    7.37
2n^2       70     223     n' = √10 n          3.16
2^n        13     16      n' = n + 3          -----
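The table entries come from fixing a time budget and asking how large a problem fits (illustrative Python; the budget units are arbitrary):

```python
def max_solvable(T, budget):
    # largest n with T(n) <= budget (a linear scan is fine at this scale)
    n = 0
    while T(n + 1) <= budget:
        n += 1
    return n

# old machine: budget 10,000; a machine 10x faster: budget 100,000
assert max_solvable(lambda n: 10 * n, 10_000) == 1_000
assert max_solvable(lambda n: 10 * n, 100_000) == 10_000   # n' = 10n
assert max_solvable(lambda n: 2 * n * n, 10_000) == 70
assert max_solvable(lambda n: 2 * n * n, 100_000) == 223   # n' ~= sqrt(10) n
assert max_solvable(lambda n: 2 ** n, 10_000) == 13
assert max_solvable(lambda n: 2 ** n, 100_000) == 16       # n' = n + 3
```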
Implications of Dominance

• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

n^a dominates n^b if a > b, since
  lim n→∞ n^b / n^a = n^(b-a) → 0.
n^a + o(n^a) doesn't dominate n^a, since
  lim n→∞ n^a / (n^a + o(n^a)) → 1.
Algorithm Matters
• Big-Oh notation indicates an upper bound.
• It tells how bad things can get; perhaps things are not nearly that bad in practice, so we want the lowest possible upper bound.
• Asymptotic analysis is a useful tool to help structure our thinking when n gets large enough.
• Example: for linear search, n^2 is an upper bound, but n is the tightest upper bound.
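Counting comparisons makes the tight bound concrete: in the worst case linear search examines exactly n items, never n^2 (illustrative Python):

```python
def linear_search(items, target):
    # returns (index, number of comparisons made)
    comparisons = 0
    for i, v in enumerate(items):
        comparisons += 1
        if v == target:
            return i, comparisons
    return -1, comparisons

# worst case: target absent, exactly n comparisons made;
# n is the tight upper bound, while n^2 = 10,000 is a loose one
idx, comps = linear_search(list(range(100)), -5)
```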
Complexity Analysis
Calculating Running Time T(n)

• Use the source/pseudo code.
• Ignore constants.
• Ignore lower-order terms.
• Explicitly assume either:
  - The average case (harder to do)
  - The worst case (easier)
• Most analysis uses the worst case.
Linear-time Loop

for x = 1 to n:
    constant-time operation

Note: constant-time means independent of the input size.

f(n) = Σ_{x=1..n} 1 = n
2-Nested Loops (Quadratic)

for x = 0 to n-1:
    for y = 0 to n-1:
        constant-time operation

f(n) = Σ_{x=0..n-1} Σ_{y=0..n-1} 1 = Σ_{x=0..n-1} n = n * n = n^2
3-Nested Loops (Cubic)

for x = 0 to n-1:
    for y = 0 to n-1:
        for z = 0 to n-1:
            constant-time operation

f(n) = n^3 ... The number of nested loops determines the exponent.
Add independent loops

for x = 0 to n-1:
    constant-time op

for y = 0 to n-1:
    for z = 0 to n-1:
        constant-time op

for w = 0 to n-1:
    constant-time op

f(n) = n + n^2 + n = n^2 + 2n
Non-trivial loops

for x = 1 to n:
    for y = 1 to x:
        constant-time operation

f(n) = Σ_{x=1..n} x = n(n+1)/2 = n^2/2 + n/2

for z = 1 to n:
    for y = 1 to z:
        for x = 1 to y:
            constant-time op

f(n) = Σ_{z=1..n} Σ_{y=1..z} y = Σ_{z=1..n} z(z+1)/2 = (n^3 + 3n^2 + 2n)/6 = n(n+1)(n+2)/6
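Counting iterations directly confirms both closed forms (illustrative Python):

```python
def triangular_count(n):
    # iterations of: for x in 1..n, for y in 1..x
    count = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            count += 1
    return count

def tetrahedral_count(n):
    # iterations of: for z in 1..n, for y in 1..z, for x in 1..y
    count = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                count += 1
    return count

n = 10
assert triangular_count(n) == n * (n + 1) // 2
assert tetrahedral_count(n) == n * (n + 1) * (n + 2) // 6
```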
The "Big-Oh" Notation

E.g. 3x^3 + 5x^2 - 9 = O(x^3). This doesn't mean "3x^3 + 5x^2 - 9 equals the function O(x^3)". What it actually means is "3x^3 + 5x^2 - 9 is dominated by x^3", and it is read as "3x^3 + 5x^2 - 9 is big-Oh of x^3".

In fact, 3x^3 + 5x^2 - 9 is smaller than 5x^3 for large enough values of x.
O-notation (upper bounds)

f(n) is O(g(n)) if f grows at most as fast as g. So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above), and c*g(n) is an approximation to f(n), bounding from above.
Big-Oh Rules

If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.:
  - Drop lower-order terms.
  - Drop constant factors.

Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Example 1: If T(n) = 3n^2, then T(n) is in O(n^2).

Example 2: T(n) = c1*n^2 + c2*n in the average case.
  c1*n^2 + c2*n <= c1*n^2 + c2*n^2 <= (c1 + c2)*n^2 for all n > 1
  T(n) <= c*n^2 for c = c1 + c2 and n0 = 1
  Therefore, T(n) is in O(n^2) by the definition.

Example 3: T(n) = c. We say this is in O(1).
Example: 7n - 2 is O(n).

By definition, we need to find:
• a real constant c > 0
• an integer constant n0 >= 1
such that 7n - 2 <= c*n for every integer n >= n0.

A possible choice is c = 7 and n0 = 1: 7n - 2 <= 7n for n >= 1.
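The definition suggests a mechanical check: given candidate witnesses c and n0, verify the inequality over a sample range (illustrative Python; a finite check cannot prove the bound for all n, but it can refute bad witnesses):

```python
def check_big_oh(f, g, c, n0, up_to=10_000):
    # verify f(n) <= c*g(n) for n0 <= n <= up_to
    return all(f(n) <= c * g(n) for n in range(n0, up_to + 1))

# 7n - 2 is O(n) with witnesses c = 7, n0 = 1
ok = check_big_oh(lambda n: 7 * n - 2, lambda n: n, c=7, n0=1)

# the same witnesses fail for n^2 vs n, as expected
bad = check_big_oh(lambda n: n * n, lambda n: n, c=7, n0=1)
```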
More examples:

• 20n^3 + 10n log n + 5 is O(n^3):
    20n^3 + 10n log n + 5 <= 35n^3 for n >= 1
• 3 log n + log log n is O(log n):
    3 log n + log log n <= 4 log n for n >= 2
• 2^100 is O(1):
    2^100 <= 2^100 * 1 for n >= 1
• 5/n is O(1/n):
    5/n <= 5 * (1/n) for n >= 1
Set definition of O-notation:

O(g(n)) = { f(n) : there exist positive constants c and n0 such that f(n) <= c*g(n) for all n >= n0 }

Big-Omega (lower bound). Meaning: for all data sets big enough (i.e., n > n0), the algorithm always executes in more than c*g(n) steps.
Example: T(n) = c1*n^2 + c2*n.
  c1*n^2 + c2*n >= c1*n^2 for all n > 1
  T(n) >= c*n^2 for c = c1 and n0 = 1
Therefore, T(n) is in Ω(n^2) by the definition.

We want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: An algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
bull Big-oh notation indicates an upper bound
bull How bad things can get ndash perhaps things are not nearly badbull Lowest possible upper bound
bull Asymptotic analysis is a useful tool to help to structure our thinking when n gets large enough
bull Example linear search n2 is an upper bound but n is the tightest upper bound
February 2013 37Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
Complexity Analysis
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Calculating Running Time T(n)
bull Use the sourcepseudo codebull Ignore constantsbull Ignore lower order termsbull Explicitly assume either
ndash The average case (harder to do)ndash The worst case (easier)
bull Most analysis uses the worst case
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
The "Big-Oh" Notation

E.g. 3x^3 + 5x^2 - 9 = O(x^3). This doesn't mean
"3x^3 + 5x^2 - 9 equals the function O(x^3)". What it actually means is
"3x^3 + 5x^2 - 9 is dominated by x^3". Read as "3x^3 + 5x^2 - 9 is big-Oh of x^3".

In fact, 3x^3 + 5x^2 - 9 is smaller than 5x^3 for large enough values of x.
O-notation (upper bounds)
So g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above).
f(n) is O(g(n)) if f grows at most as fast as g.
c*g(n) is an approximation to f(n), bounding it from above.
Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.:
- drop lower-order terms
- drop constant factors

Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Example 1: If T(n) = 3n^2, then T(n) is in O(n^2).

Example 2: T(n) = c1*n^2 + c2*n (average case)
c1*n^2 + c2*n <= c1*n^2 + c2*n^2 <= (c1 + c2)*n^2 for all n > 1
So T(n) <= c*n^2 for c = c1 + c2 and n0 = 1.
Therefore T(n) is in O(n^2) by the definition.

Example 3: T(n) = c. We say this is in O(1).
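The witness pair c = c1 + c2, n0 = 1 from Example 2 can be checked numerically (a sketch; the constants chosen are arbitrary illustrations):

```python
def witness_holds(c1, c2, n_max=2000):
    """Check T(n) = c1*n^2 + c2*n <= (c1 + c2)*n^2 for 1 <= n <= n_max."""
    c = c1 + c2
    return all(c1 * n * n + c2 * n <= c * n * n for n in range(1, n_max + 1))

print(witness_holds(3.0, 5.0))   # -> True
```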
Example: 7n - 2 is O(n).

By definition, we need to find
- a real constant c > 0
- an integer constant n0 >= 1
such that 7n - 2 <= c*n for every integer n >= n0.

A possible choice is c = 7 and n0 = 1: 7n - 2 <= 7n for n >= 1.
More examples:
- 20n^3 + 10n log n + 5 is O(n^3): 20n^3 + 10n log n + 5 <= 35n^3 for n >= 1
- 3 log n + log log n is O(log n): 3 log n + log log n <= 4 log n for n >= 2
- 2^100 is O(1): 2^100 <= 2^100 * 1 for n >= 1
- 5/n is O(1/n): 5/n <= 5 * (1/n) for n >= 1
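The first two bounds can be sanity-checked over a range of n (a sketch using natural logs; the slide does not fix a log base, and the constants 35 and 4 are the ones stated above):

```python
import math

def bounds_hold(n_max=5000):
    """Verify 20n^3 + 10n ln n + 5 <= 35n^3 (n >= 1) and
    3 ln n + ln ln n <= 4 ln n (n >= 2)."""
    for n in range(1, n_max + 1):
        if not (20 * n**3 + 10 * n * math.log(n) + 5 <= 35 * n**3):
            return False
        if n >= 2 and not (3 * math.log(n) + math.log(math.log(n)) <= 4 * math.log(n)):
            return False
    return True

print(bounds_hold())   # -> True
```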
Set definition of O-notation
Ω-notation (lower bounds)

Meaning: for all data sets big enough (i.e., n > n0), the algorithm always executes in at least c*g(n) steps.

Example: T(n) = c1*n^2 + c2*n
c1*n^2 + c2*n >= c1*n^2 for all n > 1
T(n) >= c*n^2 for c = c1 and n0 = 1
Therefore T(n) is in Ω(n^2) by the definition.

We want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.
Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, Ω

Intuition for Asymptotic Notation
- Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
- big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
- big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
Quadratic Time O(N^2)

An algorithm runs in O(N^2) if the number of operations required to perform a function is proportional to the square of the number of items being processed. Example: everyone in a group shaking hands with everyone else.

Logarithmic Time O(log N) and O(N log N)

The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time only increases by a factor of log10(1,000,000) = 6.
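The "factor of a million becomes an additive 6" claim is easy to confirm; the 6 is in base 10, and changing the base only changes the constant (a minimal check):

```python
import math

factor = 1_000_000
print(math.log10(factor))   # -> 6.0
print(math.log2(factor))    # about 19.93: a different base, still a small constant
```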
Factorial Time O(N!)

It is far worse than even O(N^2) and O(N^3). It's fairly unusual to encounter functions with this kind of behavior.
Practical Considerations

There is no big difference in running time between Θ1(n) = n and Θ2(n) = n log n:
- Θ1(10000) = 10,000
- Θ2(10000) = 10000 * log10(10000) = 40,000

There is an enormous difference between Θ1(n) = n^2 and Θ2(n) = n log n:
- Θ1(10000) = 100,000,000
- Θ2(10000) = 10000 * log10(10000) = 40,000
Remarks
- Most statements in a program do not have much effect on the running time of that program.
- There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
- Focus your attention on the parts of the program that have the most impact.
- The greatest time and space improvements come from a better data structure or algorithm.
- "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE"

Remarks
- When tuning code, it is important to gather good timing statistics.
- Be careful not to use tricks that make the program unreadable.
- Make use of compiler optimizations.
- Check that your optimizations really improve the program.
Remarks

An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate. Instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16).
Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions
The asymptotic relations between functions f and g mirror the order relations between numbers a and b:
f(n) = O(g(n))  ~  a <= b
f(n) = Ω(g(n))  ~  a >= b
f(n) = Θ(g(n))  ~  a = b
f(n) = o(g(n))  ~  a < b
f(n) = ω(g(n))  ~  a > b
Limits
lim_{n→∞} f(n)/g(n) = 0        ⇒ f(n) ∈ o(g(n))
lim_{n→∞} f(n)/g(n) < ∞        ⇒ f(n) ∈ O(g(n))
0 < lim_{n→∞} f(n)/g(n) < ∞    ⇒ f(n) ∈ Θ(g(n))
0 < lim_{n→∞} f(n)/g(n)        ⇒ f(n) ∈ Ω(g(n))
lim_{n→∞} f(n)/g(n) = ∞        ⇒ f(n) ∈ ω(g(n))
lim_{n→∞} f(n)/g(n) undefined  ⇒ can't say
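The limit test can be approximated numerically by sampling the ratio at growing n (a heuristic sketch, not a proof; `ratio_limit_estimate` is an illustrative helper):

```python
def ratio_limit_estimate(f, g, ns=(10**3, 10**4, 10**5, 10**6)):
    """Sample f(n)/g(n) at increasingly large n to guess the limit."""
    return [f(n) / g(n) for n in ns]

# f = n, g = n^2: the ratios head to 0, suggesting n is o(n^2)
print(ratio_limit_estimate(lambda n: n, lambda n: n * n))

# f = 3n^2 + 5n, g = n^2: the ratios head to 3 (finite, nonzero),
# suggesting f is Theta(n^2)
print(ratio_limit_estimate(lambda n: 3 * n * n + 5 * n, lambda n: n * n))
```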
Logarithms
x = log_b a is the exponent x for which a = b^x.
Natural log: ln a = log_e a
Binary log: lg a = log_2 a
lg^2 a = (lg a)^2
lg lg a = lg(lg a)

Useful identities:
a = b^(log_b a)
log_c(ab) = log_c a + log_c b
log_b(a^n) = n log_b a
log_b a = log_c a / log_c b
log_b(1/a) = -log_b a
log_b a = 1 / (log_a b)
a^(log_b c) = c^(log_b a)
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log2 n = log10 n * log2 10.
The base of a logarithm is therefore not an issue in asymptotic notation.
Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n * 3^n.
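Both facts can be checked directly (a minimal sketch; the particular n and exponents are arbitrary):

```python
import math

n = 1000.0
# Change of base: log2 n = log10 n * log2 10 (a constant factor, log2 10)
assert math.isclose(math.log2(n), math.log10(n) * math.log2(10))

# Exponentials with different bases differ by an *exponential* factor:
# 2^n = (2/3)^n * 3^n, so 3^n / 2^n = 1.5^n, which itself grows without bound.
for k in (10, 20, 30):
    assert math.isclose(3.0**k / 2.0**k, 1.5**k)
print("identities hold")
```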
Examples: express the functions in column A in asymptotic notation using the functions in column B.

A: 5n^2 + 100n        B: 3n^2 + 2
A ∈ Θ(n^2) and n^2 ∈ Θ(B), so A ∈ Θ(B).

A: log3(n^2)          B: log2(n^3)
By log_b a = log_c a / log_c b: A = 2 lg n / lg 3, B = 3 lg n, so A/B = 2 / (3 lg 3) and A ∈ Θ(B).

A: n^(lg 4)           B: 3^(lg n)
By a^(log b) = b^(log a): B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞ and A ∈ ω(B).

A: lg^2 n             B: n^(1/2)
Since lim_{n→∞} (lg^a n / n^b) = 0 (here a = 2 and b = 1/2), A ∈ o(B).
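The third example's identity and conclusion can be spot-checked numerically (a sketch; the sample values of n are arbitrary):

```python
import math

# B = 3^(lg n) equals n^(lg 3) by the identity a^(log_b c) = c^(log_b a)
for n in (8.0, 100.0, 1e6):
    assert math.isclose(3.0 ** math.log2(n), n ** math.log2(3.0))

# A/B = n^(lg(4/3)) keeps growing, consistent with n^(lg 4) being omega(3^(lg n))
ratios = [(n ** math.log2(4.0)) / (3.0 ** math.log2(n)) for n in (10.0, 1e3, 1e6)]
assert ratios[0] < ratios[1] < ratios[2]
print("checks pass")
```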
Math Basics

Constant series: for integers a and b, a <= b:
sum_{i=a}^{b} 1 = b - a + 1

Linear series (arithmetic series): for n >= 0:
sum_{i=1}^{n} i = n(n+1)/2 ≈ n^2/2

Quadratic series: for n >= 0:
sum_{i=1}^{n} i^2 = n(n+1)(2n+1)/6 ≈ n^3/3

Cubic series: for n >= 0:
sum_{i=1}^{n} i^3 = n^2(n+1)^2/4 ≈ n^4/4

Geometric series: for real x ≠ 1:
sum_{k=0}^{n} x^k = (x^(n+1) - 1)/(x - 1)
For |x| < 1:
sum_{k=0}^{∞} x^k = 1/(1 - x)
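The closed forms above can be verified by brute-force summation (a sketch; `check_series` and its defaults are illustrative):

```python
def check_series(n=50):
    """Brute-force the linear, quadratic, cubic, and geometric closed forms."""
    assert sum(range(1, n + 1)) == n * (n + 1) // 2                       # linear
    assert sum(i * i for i in range(1, n + 1)) == n*(n+1)*(2*n+1) // 6    # quadratic
    assert sum(i**3 for i in range(1, n + 1)) == (n * (n + 1) // 2) ** 2  # cubic: n^2(n+1)^2/4
    x = 3
    assert sum(x**k for k in range(n + 1)) == (x**(n + 1) - 1) // (x - 1) # geometric
    return True

print(check_series())   # -> True
```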
Math Basics

Linear-geometric series: for n >= 0 and real c ≠ 1:
sum_{i=1}^{n} i*c^i = (n*c^(n+2) - (n+1)*c^(n+1) + c) / (c - 1)^2

Harmonic series: the nth harmonic number, for n ∈ I+:
H_n = sum_{k=1}^{n} 1/k = 1 + 1/2 + 1/3 + ... + 1/n = ln(n) + O(1)
Math Basics

Telescoping series:
sum_{k=1}^{n} (a_k - a_(k-1)) = a_n - a_0

Differentiating series: for |x| < 1:
sum_{k=0}^{∞} k*x^k = x / (1 - x)^2
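The harmonic, telescoping, and differentiating series can all be checked numerically (a sketch; the tolerance, sample sequence a_k = k^2, and x = 0.5 are arbitrary choices):

```python
import math

def check_more_series(n=10_000, x=0.5):
    # H_n = ln(n) + O(1); the gap tends to Euler's constant ~0.5772
    H_n = sum(1.0 / k for k in range(1, n + 1))
    assert abs(H_n - math.log(n) - 0.5772) < 1e-3

    # Telescoping: sum_{k=1}^{n} (a_k - a_{k-1}) = a_n - a_0, here with a_k = k^2
    a = [k * k for k in range(n + 1)]
    assert sum(a[k] - a[k - 1] for k in range(1, n + 1)) == a[n] - a[0]

    # Differentiating series: sum_{k>=0} k*x^k = x/(1-x)^2 for |x| < 1
    assert math.isclose(sum(k * x**k for k in range(200)), x / (1 - x) ** 2)
    return True

print(check_more_series())   # -> True
```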
Practical Considerations

Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires Θ(n log n). Examples:
- Sorting
- Searching

Remarks
- Comparative timing of programs is a difficult business:
  - experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  - bias towards a program
  - unequal code tuning
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster?

T(n)       n      n'      Change                   n'/n
10n        1,000  10,000  n' = 10n                 10
20n        500    5,000   n' = 10n                 10
5n log n   250    1,842   sqrt(10)n < n' < 10n     7.37
2n^2       70     223     n' = sqrt(10)n           3.16
2^n        13     16      n' = n + 3               -----
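The table can be reproduced by fixing an operation budget and scanning for the largest solvable input (a sketch; `largest_n` is an illustrative helper, and the budgets 10,000 and 100,000 are the assumed before/after capacities behind the table):

```python
def largest_n(T, budget):
    """Largest n with T(n) <= budget, found by linear scan."""
    n = 0
    while T(n + 1) <= budget:
        n += 1
    return n

old, new = 10_000, 100_000   # assumed ops per unit time, before and after
assert largest_n(lambda n: 10 * n, old) == 1_000
assert largest_n(lambda n: 10 * n, new) == 10_000      # linear: n' = 10n
assert largest_n(lambda n: 2 * n * n, old) == 70       # 2*70^2 = 9800
assert largest_n(lambda n: 2 * n * n, new) == 223      # n' ~ sqrt(10)*n
assert largest_n(lambda n: 2 ** n, new) == largest_n(lambda n: 2 ** n, old) + 3  # n' = n + 3
print("table reproduced")
```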
Implications of Dominance
- Exponential algorithms get hopeless fast.
- Quadratic algorithms get hopeless at or before n = 1,000,000.
- O(n log n) is possible to about one billion.
- O(log n) never sweats.

n^a dominates n^b if a > b, since
lim_{n→∞} n^b / n^a = n^(b-a) → 0
n^a + o(n^a) doesn't dominate n^a, since
lim_{n→∞} n^a / (n^a + o(n^a)) → 1
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 Algorithms Course Dr Aref Rashad 39
Linear-time Loop
for x = 1 to n constant-time operation
Note Constant-time mean independent of the input size
n
x
n1
111111
2-Nested Loops Quadratic
for x = 0 to n-1for y = 0 to n-1
1
0
1
0
1n
x
n
y
1
0
n
x
n nnn 2nnn
3-Nested Loops Cubic
for x = 0 to n-1for y = 0 to n-1
for z = 0 to n-1constant-time
operationf(n) = n3 helliphellip The number of nested loops determines the exponent
constant-time operation
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 Algorithms Course Dr Aref Rashad 40
Add independent loops
for x = 0 to n-1constant-time op
for y = 0 to n-1for z = 0 to n-1
constant-time op
for w = 0 to n-1constant-time op
f(n) = n + n2 + n
= n2 + 2n
February 2013 Algorithms Course Dr Aref Rashad 41
Non-trivial loops
for x = 1 to nfor y = 1 to x
constant-time operation
n
z
z
y
y
x1 1 1
1
n
x
x1 2
)1(
nn 22
2n
nn
for z = 1 to nfor y = 1 to z
for x = 1 to yconstant-
op
323
6
23n
nnn
n
x
x
y1 1
1
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Non-trivial loops

for x = 1 to n
    for y = 1 to x
        constant-time operation

$$\sum_{x=1}^{n} \sum_{y=1}^{x} 1 = \sum_{x=1}^{n} x = \frac{n(n+1)}{2} = \frac{n^2}{2} + \frac{n}{2} = \Theta(n^2)$$

for z = 1 to n
    for y = 1 to z
        for x = 1 to y
            constant-time op

$$\sum_{z=1}^{n} \sum_{y=1}^{z} \sum_{x=1}^{y} 1 = \sum_{z=1}^{n} \sum_{y=1}^{z} y = \sum_{z=1}^{n} \frac{z(z+1)}{2} = \frac{n^3 + 3n^2 + 2n}{6} = \Theta(n^3)$$
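Counting the constant-time operation directly confirms the two closed forms above (a small sketch; function names are illustrative):

```python
def double_loop(n):
    count = 0
    for x in range(1, n + 1):
        for y in range(1, x + 1):
            count += 1          # constant-time operation
    return count

def triple_loop(n):
    count = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                count += 1      # constant-time op
    return count

n = 30
assert double_loop(n) == n * (n + 1) // 2                    # Theta(n^2) sum
assert triple_loop(n) == n * (n + 1) * (n + 2) // 6          # Theta(n^3) sum
```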
The "Big-Oh" Notation

E.g., 3x³ + 5x² − 9 = O(x³). This doesn't mean "3x³ + 5x² − 9 equals the function O(x³)". What it actually means is "3x³ + 5x² − 9 is dominated by x³", read as "3x³ + 5x² − 9 is big-Oh of x³". In fact, 3x³ + 5x² − 9 is smaller than 5x³ for large enough values of x.
O-notation (upper bounds)

f(n) is O(g(n)) if f grows at most as fast as g: g(n) is an asymptotic upper bound for f(n) as n increases (g(n) bounds f(n) from above), and c·g(n) is an approximation to f(n), bounding it from above. Formally, f(n) is O(g(n)) if there are a real constant c > 0 and an integer constant n₀ ≥ 1 such that f(n) ≤ c·g(n) for every n ≥ n₀.
Big-Oh Rules

If f(n) is a polynomial of degree d, then f(n) is O(n^d); i.e., drop lower-order terms and drop constant factors.
- Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n²)".
- Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".

Example 1: If T(n) = 3n², then T(n) is in O(n²).
Example 2: T(n) = c₁n² + c₂n (average case).
c₁n² + c₂n ≤ c₁n² + c₂n² ≤ (c₁ + c₂)n² for all n > 1,
so T(n) ≤ cn² for c = c₁ + c₂ and n₀ = 1.
Therefore T(n) is in O(n²) by the definition.
Example 3: T(n) = c. We say this is in O(1).
Example: 7n − 2 is O(n). By definition we need to find a real constant c > 0 and an integer constant n₀ ≥ 1 such that 7n − 2 ≤ c·n for every integer n ≥ n₀. A possible choice is c = 7, n₀ = 1: 7n − 2 ≤ 7n for n ≥ 1.
More examples:
- 20n³ + 10n log n + 5 is O(n³): 20n³ + 10n log n + 5 ≤ 35n³ for n ≥ 1.
- 3 log n + log log n is O(log n): 3 log n + log log n ≤ 4 log n for n ≥ 2.
- 2¹⁰⁰ is O(1): 2¹⁰⁰ ≤ 2¹⁰⁰ · 1 for n ≥ 1.
- 5/n is O(1/n): 5/n ≤ 5 · (1/n) for n ≥ 1.
Set definition of O-notation: O(g(n)) = { f(n) : there exist positive constants c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀ }.

Ω-notation (lower bounds). Meaning: for all data sets big enough (i.e., n > n₀), the algorithm always executes in more than c·g(n) steps.
Example: T(n) = c₁n² + c₂n.
c₁n² + c₂n ≥ c₁n² for all n > 1, so T(n) ≥ cn² for c = c₁ and n₀ = 1.
Therefore T(n) is in Ω(n²) by the definition.
For a lower bound we want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.
Definition: An algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, and Ω

Intuition for Asymptotic Notation
- Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
- Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
- Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time: O(1)

O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)

An algorithm runs in O(N) if the number of operations required is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
Quadratic Time: O(N²)

An algorithm runs in O(N²) if the number of operations required is directly proportional to the square of the number of items being processed. Example: everyone in a group shaking hands with everyone else.

Logarithmic Time: O(log N) and O(N log N)

The running time of a logarithmic algorithm increases with the log of the problem size: when the size of the input data set increases by a factor of a million, the run time increases only by a factor of about log₁₀(1,000,000) = 6.
Factorial Time: O(N!)

Far worse than even O(N²) and O(N³); it's fairly unusual to encounter functions with this kind of behavior.
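A small table makes the gap between these growth classes concrete (a sketch, printing each class as n doubles):

```python
import math

# How the common growth classes scale as n doubles.
print(f"{'n':>4} {'log n':>6} {'n log n':>8} {'n^2':>6} {'2^n':>22} {'n!':>10}")
for n in (8, 16, 32, 64):
    print(f"{n:>4} {math.log2(n):>6.1f} {n * math.log2(n):>8.0f} "
          f"{n * n:>6} {2 ** n:>22} {float(math.factorial(n)):>10.2e}")
```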
Practical Considerations

There is no such big difference in running time between Θ₁(n) and Θ₂(n log n):
- Θ₁(10,000) = 10,000
- Θ₂(10,000) = 10,000 · log₁₀(10,000) = 40,000

But there is an enormous difference between Θ₁(n²) and Θ₂(n log n):
- Θ₁(10,000) = 100,000,000
- Θ₂(10,000) = 10,000 · log₁₀(10,000) = 40,000
Remarks
- Most statements in a program do not have much effect on the running time of that program.
- There is little point to cutting in half the running time of a subroutine that accounts for only 1% of the total.
- Focus your attention on the parts of the program that have the most impact.
- The greatest time and space improvements come from a better data structure or algorithm.
- "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."

Remarks
- When tuning code, it is important to gather good timing statistics.
- Be careful not to use tricks that make the program unreadable.
- Make use of compiler optimizations.
- Check that your optimizations really improve the program.
Remarks

An algorithm with time equation T(n) = 2n² does not receive nearly as great an improvement from the faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions

For f(n) ~ nᵃ and g(n) ~ nᵇ (constants a, b):
- f(n) = O(g(n)) ⇔ a ≤ b
- f(n) = Ω(g(n)) ⇔ a ≥ b
- f(n) = Θ(g(n)) ⇔ a = b
- f(n) = o(g(n)) ⇔ a < b
- f(n) = ω(g(n)) ⇔ a > b
Limits (as n → ∞)
- lim f(n)/g(n) = 0 ⇒ f(n) ∈ o(g(n))
- lim f(n)/g(n) < ∞ ⇒ f(n) ∈ O(g(n))
- 0 < lim f(n)/g(n) < ∞ ⇒ f(n) ∈ Θ(g(n))
- 0 < lim f(n)/g(n) ⇒ f(n) ∈ Ω(g(n))
- lim f(n)/g(n) = ∞ ⇒ f(n) ∈ ω(g(n))
- lim f(n)/g(n) undefined ⇒ can't say
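These limit rules suggest a quick empirical check: watch f(n)/g(n) at growing n. A sketch (`ratio_trend` is a hypothetical helper, and a finite sample only suggests, never proves, the limit):

```python
import math

def ratio_trend(f, g, sizes=(10**2, 10**4, 10**6)):
    """f(n)/g(n) at growing n: ->0 suggests o(g), ->const>0 Theta, ->inf omega."""
    return [f(n) / g(n) for n in sizes]

print(ratio_trend(lambda n: math.log2(n) ** 2, lambda n: n ** 0.5))         # shrinks: o
print(ratio_trend(lambda n: 5 * n * n + 100 * n, lambda n: 3 * n * n + 2))  # -> 5/3: Theta
print(ratio_trend(lambda n: n ** 1.5, lambda n: n))                         # grows: omega
```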
Logarithms

x = log_b a is the exponent for a = bˣ.
Natural log: ln a = log_e a. Binary log: lg a = log₂ a.
lg² a = (lg a)²; lg lg a = lg (lg a).

Useful identities (for positive reals, bases ≠ 1):
$$a = b^{\log_b a} \qquad \log_c(ab) = \log_c a + \log_c b \qquad \log_b a^n = n \log_b a$$
$$\log_b a = \frac{\log_c a}{\log_c b} \qquad \log_b \frac{1}{a} = -\log_b a \qquad \log_b a = \frac{1}{\log_a b} \qquad a^{\log_b c} = c^{\log_b a}$$
Logarithms and exponentials
- If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log₁₀ n · log₂ 10 = log₂ n. So the base of a logarithm is not an issue in asymptotic notation.
- Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2ⁿ = (2/3)ⁿ · 3ⁿ.
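Both claims are easy to verify numerically (a sketch):

```python
import math

n = 1_000_000
# Base change on a log is only a constant factor:
assert abs(math.log10(n) * math.log2(10) - math.log2(n)) < 1e-9

# Base change on an exponential is itself exponential: 2^k = (2/3)^k * 3^k,
# and the (2/3)^k factor vanishes exponentially rather than staying constant.
k = 20
assert abs((2 / 3) ** k * 3 ** k - 2 ** k) < 1e-3
assert (2 / 3) ** k < 1e-3
```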
Examples: Express functions in A in asymptotic notation using functions in B.

1. A = 5n² + 100n, B = 3n² + 2. A ∈ Θ(n²) and n² ∈ Θ(B), so A ∈ Θ(B).
2. A = log₃(n²), B = log₂(n³). Using log_b a = log_c a / log_c b: A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2/(3 lg 3), a constant: A ∈ Θ(B).
3. A = n^(lg 4), B = 3^(lg n). Using a^(log b) = b^(log a): B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞: A ∈ ω(B).
4. A = lg² n, B = n^(1/2). Since lim (lgᵃ n / nᵇ) = 0 for constants a and b > 0 (here a = 2 and b = 1/2): A ∈ o(B).
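Example 3 can be checked numerically (a sketch; the printed ratio is n^(lg(4/3)), which grows without bound):

```python
import math

# A = n^{lg 4}, B = 3^{lg n}: by a^{log b} = b^{log a}, B = n^{lg 3},
# so A / B = n^{lg(4/3)} keeps growing -> A in omega(B).
for n in (10, 1000, 100_000):
    ratio = n ** math.log2(4) / 3 ** math.log2(n)
    print(n, round(ratio, 2))
```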
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
EG 3x 3 + 5x 2 ndash 9 = O (x 3)Doesnrsquot mean
ldquo3x 3 + 5x 2 ndash 9 equals the function O (x 3)rdquo Which actually means
ldquo3x 3+5x 2 ndash9 is dominated by x 3rdquoRead as ldquo3x 3+5x 2 ndash9 is big-Oh of x 3rdquo
February 2013 42
The ldquoBig-Ohrdquo Notation
In fact 3x3+5x2 ndash9 is smaller than 5x3 for large enough values of x
Algorithms Course Dr Aref Rashad
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
O-notation (upper bounds)
February 2013 43
So g(n) is an asymptotic upper-bound for f(n) as n increases(g(n) bounds f(n) from above)
Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
February 2013 44
cg(n) is an approximation to f(n) bounding from above
Algorithms Course Dr Aref Rashad
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
f(n) is O(g(n)) if f grows at most as fast as g
c·g(n) is an approximation to f(n), bounding it from above.
Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.: drop lower-order terms, drop constant factors.

Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n²)".

Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".
Example 1: If T(n) = 3n², then T(n) is in O(n²).

Example 2: T(n) = c₁n² + c₂n (average case).
 c₁n² + c₂n ≤ c₁n² + c₂n² = (c₁ + c₂)n² for all n > 1,
 so T(n) ≤ c·n² for c = c₁ + c₂ and n₀ = 1.
 Therefore T(n) is in O(n²) by the definition.

Example 3: T(n) = c. We say this is in O(1).
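The witnesses in Example 2 can be checked mechanically. A sketch with illustrative constants c₁ = 3 and c₂ = 5 (so c = c₁ + c₂ = 8 and n₀ = 1; these specific values are assumptions, not from the slides):

```python
def T(n, c1=3, c2=5):
    """Hypothetical running-time function T(n) = c1*n^2 + c2*n."""
    return c1 * n * n + c2 * n

# Witnesses for T(n) in O(n^2): c = c1 + c2, n0 = 1.
c, n0 = 3 + 5, 1
violations = [n for n in range(n0, 10_000) if T(n) > c * n * n]
print(violations)   # empty: T(n) <= c*n^2 holds for all tested n >= n0
```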
Example: 7n − 2 is O(n).

By definition we need to find:
• a real constant c > 0
• an integer constant n₀ ≥ 1
such that 7n − 2 ≤ c·n for every integer n ≥ n₀.

A possible choice is c = 7 and n₀ = 1: 7n − 2 ≤ 7n for all n ≥ 1.
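The same definition check can be run directly; the sketch below also shows that a constant chosen too small fails for larger n (the helper name `bounded` is illustrative):

```python
def bounded(c, n0, limit=10_000):
    """Does 7n - 2 <= c*n hold for every n0 <= n < limit?"""
    return all(7 * n - 2 <= c * n for n in range(n0, limit))

print(bounded(7, 1))   # True: c = 7, n0 = 1 are valid witnesses
print(bounded(5, 1))   # False: 7n - 2 > 5n already at n = 2
```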
20n³ + 10n log n + 5 is O(n³):
 20n³ + 10n log n + 5 ≤ 35n³ for n ≥ 1

3 log n + log log n is O(log n):
 3 log n + log log n ≤ 4 log n for n ≥ 2

2¹⁰⁰ is O(1):
 2¹⁰⁰ ≤ 2¹⁰⁰ · 1 for n ≥ 1

5/n is O(1/n):
 5/n ≤ 5 · (1/n) for n ≥ 1
Set definition of O-notation
Meaning: for all data sets big enough (i.e., n > n₀), the algorithm always executes in more than c·g(n) steps.
T(n) = c₁n² + c₂n.
 c₁n² + c₂n ≥ c₁n² for all n > 1, so T(n) ≥ c·n² for c = c₁ and n₀ = 1.
 Therefore T(n) is in Ω(n²) by the definition.

We want the greatest lower bound.
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: An algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, Ω
Intuition for Asymptotic Notation
Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).

Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).

Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
Constant Time: O(1)

O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)

An algorithm runs in O(N) if the number of operations required is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
The "Big-Oh" Notation
Quadratic Time: O(N²)

An algorithm runs in O(N²) if the number of operations required grows with the square of the number of items being processed. Example: everyone in a group shaking hands with everyone else.

Logarithmic Time: O(log N) and O(N log N)

The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time increases only by a factor proportional to log₁₀(1,000,000) = 6.
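Binary search is a standard illustration of this behavior (the slides do not name it here, so this is a sketch): multiplying the input size by a thousand adds only about ten halving steps.

```python
import math

def binary_search_steps(n, target):
    """Count halving steps binary search takes on range [0, n) to hit target."""
    lo, hi, steps = 0, n, 0
    while lo < hi:
        steps += 1
        mid = (lo + hi) // 2
        if mid == target:
            return steps
        if mid < target:
            lo = mid + 1
        else:
            hi = mid
    return steps

# Steps grow like log2(n): ~10 for a thousand, ~20 for a million,
# ~30 for a billion items.
for n in (1_000, 1_000_000, 1_000_000_000):
    print(n, binary_search_steps(n, n - 1), math.ceil(math.log2(n)))
```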
Factorial Time: O(N!)

It is far worse than even O(N²) and O(N³). It's fairly unusual to encounter functions with this kind of behavior.
Practical Considerations

There is no big difference in running time between Θ₁(n) and Θ₂(n log n):
• Θ₁(10,000) = 10,000
• Θ₂(10,000) = 10,000 · log₁₀(10,000) = 40,000

There is an enormous difference between Θ₁(n²) and Θ₂(n log n):
• Θ₁(10,000) = 100,000,000
• Θ₂(10,000) = 10,000 · log₁₀(10,000) = 40,000
Remarks
• Most statements in a program do not have much effect on the running time of that program.
• There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
• Focus your attention on the parts of the program that have the most impact.
• The greatest time and space improvements come from a better data structure or algorithm.
• "First tune the algorithm, then tune the code."

Remarks
• When tuning code, it is important to gather good timing statistics.
• Be careful not to use tricks that make the program unreadable.
• Make use of compiler optimizations.
• Check that your optimizations really improve the program.
An algorithm with time equation T(n) = 2n² does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16).

Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions
Comparing f(n) = n^a with g(n) = n^b:
 f(n) = O(g(n))  ⇔ a ≤ b
 f(n) = Ω(g(n))  ⇔ a ≥ b
 f(n) = Θ(g(n))  ⇔ a = b
 f(n) = o(g(n))  ⇔ a < b
 f(n) = ω(g(n))  ⇔ a > b
Limits (as n → ∞):
 lim f(n)/g(n) = 0          ⇒ f(n) ∈ o(g(n))
 lim f(n)/g(n) < ∞          ⇒ f(n) ∈ O(g(n))
 0 < lim f(n)/g(n) < ∞      ⇒ f(n) ∈ Θ(g(n))
 0 < lim f(n)/g(n)          ⇒ f(n) ∈ Ω(g(n))
 lim f(n)/g(n) = ∞          ⇒ f(n) ∈ ω(g(n))
 lim f(n)/g(n) undefined    ⇒ can't say
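The limit rules can be applied numerically. A sketch using the pair from the examples that follow, f(n) = 5n² + 100n and g(n) = 3n² + 2: the ratio converges to the finite, nonzero constant 5/3, which by the third rule puts f in Θ(g).

```python
f = lambda n: 5 * n * n + 100 * n
g = lambda n: 3 * n * n + 2

# Ratio approaches 5/3 as n grows, so f is Theta(g).
for n in (10, 10**3, 10**6):
    print(n, f(n) / g(n))
```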
Logarithms

x = log_b a is the exponent for which a = b^x.

Natural log: ln a = log_e a
Binary log: lg a = log₂ a
lg² a = (lg a)²
lg lg a = lg (lg a)

Useful identities:
 a = b^{log_b a}
 log_c(ab) = log_c a + log_c b
 log_b a^n = n · log_b a
 log_b a = log_c a / log_c b
 log_b (1/a) = −log_b a
 log_b a = 1 / log_a b
 a^{log_b c} = c^{log_b a}
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log₁₀ n · log₂ 10 = log₂ n. The base of a logarithm is therefore not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n · 3^n.
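Both claims are quick to verify numerically (a sketch):

```python
import math

# Changing a log's base is a constant factor:
# log10(n) * log2(10) equals log2(n) exactly.
n = 1000
print(math.log10(n) * math.log2(10), math.log2(n))

# Changing an exponential's base is NOT a constant factor:
# the ratio 2^m / 3^m = (2/3)^m shrinks toward 0 as m grows.
for m in (10, 100, 1000):
    print(m, (2 / 3) ** m)
```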
Examples: Express each function in A in asymptotic notation using the function in B.

A = 5n² + 100n, B = 3n² + 2:
 A ∈ Θ(n²) and n² ∈ Θ(B), so A ∈ Θ(B).

A = log₃(n²), B = log₂(n³):
 Using log_b a = log_c a / log_c b: A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2 / (3 lg 3), a constant; A ∈ Θ(B).

A = n^{lg 4}, B = 3^{lg n}:
 Using a^{log b} = b^{log a}: B = 3^{lg n} = n^{lg 3}, so A/B = n^{lg(4/3)} → ∞ as n → ∞; A ∈ ω(B).

A = lg² n, B = n^{1/2}:
 lim (lg^a n / n^b) = 0 (here a = 2 and b = 1/2), so A ∈ o(B).
Math Basics

Constant Series (for integers a and b, a ≤ b):
 Σ_{i=a..b} 1 = b − a + 1

Linear Series (Arithmetic Series) (for n ≥ 0):
 Σ_{i=1..n} i = 1 + 2 + ... + n = n(n+1) / 2

Quadratic Series (for n ≥ 0):
 Σ_{i=1..n} i² = 1 + 4 + 9 + ... + n² = n(n+1)(2n+1) / 6
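The closed forms above (together with the cubic series given earlier) can be verified by brute force for any concrete n. A minimal sketch (the helper name `check` is illustrative):

```python
def check(n=500):
    """Verify the constant, linear, quadratic, and cubic series closed forms."""
    xs = range(1, n + 1)
    assert sum(1 for i in range(3, n + 1)) == n - 3 + 1               # constant series, a=3, b=n
    assert sum(xs) == n * (n + 1) // 2                                 # linear series
    assert sum(i * i for i in xs) == n * (n + 1) * (2 * n + 1) // 6    # quadratic series
    assert sum(i ** 3 for i in xs) == (n * (n + 1) // 2) ** 2          # cubic: n^2(n+1)^2/4
    return True

print(check())
```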
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Big-Oh Rules
If is f(n) a polynomial of degree d then f(n) is O(nd) ie Drop lower-order terms Drop constant factors
Use the smallest possible class of functions Say ldquo2n is O(n)rdquo instead of ldquo2n is O(n2)rdquo
Use the simplest expression of the class Say ldquo3n + 5 is O(n)rdquo instead of ldquo3n + 5 is O(3n)rdquo
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Example 1 If T(n) = 3n2 then T(n) is in O(n2)
Example 2 T(n) = c1n2 + c2n in average case
c1n2 + c2n lt= c1n2 + c2n2 lt= (c1 + c2)n2 for all n gt 1
T(n) lt= cn2 for c = c1 + c2 and n0 = 1
Therefore T(n) is in O(n2) by the definition
Example 3 T(n) = c We say this is in O(1)
February 2013 46Algorithms Course Dr Aref Rashad
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 47
27 n )(nO
By definition we need to find bull a real constant c gt 0bull an integer constant n0 gt= 1
Such that 7n - 2 lt= c n for every integer n gt= 0
Possible choice isbull c = 7bull n = 17n - 2 lt= 7n for n gt= 1
Algorithms Course Dr Aref Rashad
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Relations Between Θ, O, Ω

Intuition for Asymptotic Notation
bull Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)
bull big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)
bull big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)
Constant Time: O(1)

O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)

An algorithm runs in O(N) if the number of operations required is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
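The two cases can be contrasted in code; a small sketch (the function names are illustrative, not from the course):

```python
def first_item(items):
    # O(1): a single index operation, regardless of len(items)
    return items[0]

def total(items):
    # O(N): the loop body runs once per item
    s = 0
    for x in items:
        s += x
    return s

data = list(range(1_000_000))
print(first_item(data))  # constant work
print(total(data))       # work proportional to len(data)
```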
The "Big-Oh" Notation
Quadratic Time: O(N^2)

An algorithm runs in O(N^2) if the number of operations required is proportional to the square of the number of items being processed. Example: everyone in a group shaking hands with everyone else.
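The handshake example can be counted directly; a sketch assuming a simple double loop:

```python
def handshakes(n):
    # Every pair of distinct people shakes hands once: two nested loops,
    # so the work grows as O(N^2).
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1
    return count

print(handshakes(10))   # 45, which equals 10 * 9 / 2
print(handshakes(100))  # 4950: 10x the people, ~100x the handshakes
```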
Logarithmic Time: O(log N) and O(N log N)

The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time increases only by a factor of log10(1,000,000) = 6.
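The million-fold claim is easy to check (base-10 log, as on the slide):

```python
import math

# A million-fold increase in input size adds only log10(10^6) = 6 to the
# logarithmic cost: compare n = 10^3 with n = 10^9.
for n in (1_000, 1_000_000_000):
    print(n, round(math.log10(n), 2))
```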
Factorial Time: O(N!)

It is far worse than even O(N^2) and O(N^3). It's fairly unusual to encounter functions with this kind of behavior.
Practical Considerations

There is no dramatic difference in running time between Θ1(n) and Θ2(n log n):
bull Θ1(10000) = 10,000
bull Θ2(10000) = 10000 · log10(10000) = 40,000

There is an enormous difference between Θ1(n^2) and Θ2(n log n):
bull Θ1(10000) = 100,000,000
bull Θ2(10000) = 10000 · log10(10000) = 40,000
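The arithmetic above is easy to reproduce (base-10 log, as the slide uses):

```python
import math

n = 10_000
theta_n = n                                # Θ1(n)
theta_nlogn = round(n * math.log10(n))     # Θ2(n log n), base-10 log
theta_n2 = n * n                           # Θ1(n^2)
print(theta_n, theta_nlogn, theta_n2)      # 10000 40000 100000000
```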
Remarks
bull Most statements in a program do not have much effect on the running time of that program.
bull There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
bull Focus your attention on the parts of the program that have the most impact.
bull The greatest time and space improvements come from a better data structure or algorithm.
bull "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."

Remarks
bull When tuning code, it is important to gather good timing statistics.
bull Be careful not to use tricks that make the program unreadable.
bull Make use of compiler optimizations.
bull Check that your optimizations really improve the program.
Remarks

An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate. Instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16).

Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions

Comparing f and g asymptotically is like comparing two numbers a and b:
f(n) = O(g(n))  ~  a <= b
f(n) = Ω(g(n))  ~  a >= b
f(n) = Θ(g(n))  ~  a = b
f(n) = o(g(n))  ~  a < b
f(n) = ω(g(n))  ~  a > b

Limits (as n → ∞)
lim [f(n) / g(n)] = 0        ⇒ f(n) ∈ o(g(n))
lim [f(n) / g(n)] < ∞        ⇒ f(n) ∈ O(g(n))
0 < lim [f(n) / g(n)] < ∞    ⇒ f(n) ∈ Θ(g(n))
0 < lim [f(n) / g(n)]        ⇒ f(n) ∈ Ω(g(n))
lim [f(n) / g(n)] = ∞        ⇒ f(n) ∈ ω(g(n))
lim [f(n) / g(n)] undefined  ⇒ can't say
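The limit test can be approximated by evaluating the ratio at one large n; a heuristic sketch (not a proof, and the sample point 10**6 is an arbitrary choice):

```python
import math

def ratio(f, g, n=10**6):
    # Approximate lim f(n)/g(n) by sampling at a single large n.
    return f(n) / g(n)

print(ratio(lambda n: 5 * n + 3, lambda n: n * n))      # near 0  -> f in o(g)
print(ratio(lambda n: 3 * n * n + n, lambda n: n * n))  # near 3  -> f in Theta(g)
print(ratio(lambda n: n * math.log(n), lambda n: n))    # keeps growing with n -> f in omega(g)
```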
Logarithms

x = logb a is the exponent for which a = b^x.
Natural log: ln a = loge a
Binary log: lg a = log2 a
lg^2 a = (lg a)^2
lg lg a = lg (lg a)

Useful identities:
a = b^(logb a)
logc (ab) = logc a + logc b
logb a^n = n · logb a
logb a = logc a / logc b
logb (1/a) = -logb a
logb a = 1 / loga b
a^(logb c) = c^(logb a)
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10 n · log2 10 = log2 n. The base of a logarithm is therefore not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n · 3^n.
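Both facts are easy to verify numerically:

```python
import math

n = 1000.0
# Base change is only a constant factor: log2 n = log10 n * log2 10.
print(math.isclose(math.log2(n), math.log10(n) * math.log2(10)))  # True

m = 50
# Different exponential bases differ by an exponential factor: 2^m = (2/3)^m * 3^m.
print(math.isclose(2.0**m, (2.0 / 3.0)**m * 3.0**m))  # True
```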
Examples
Express functions in A in asymptotic notation using functions in B.

A = 5n^2 + 100n, B = 3n^2 + 2:
A ∈ Θ(n^2) and n^2 ∈ Θ(B), so A ∈ Θ(B).

A = log3(n^2), B = log2(n^3):
using logb a = logc a / logc b, A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2/(3 lg 3), a constant, and A ∈ Θ(B).

A = n^(lg 4), B = 3^(lg n):
using a^(log b) = b^(log a), B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞, and A ∈ ω(B).

A = lg^2 n, B = n^(1/2):
since lim (lg^a n / n^b) = 0 (here a = 2 and b = 1/2), A ∈ o(B).
Math Basics

Constant Series: for integers a and b, a <= b:
Σ_{i=a}^{b} 1 = b - a + 1

Linear Series (Arithmetic Series): for n >= 0:
Σ_{i=1}^{n} i = 1 + 2 + ... + n = n(n+1)/2

Quadratic Series: for n >= 0:
Σ_{i=1}^{n} i^2 = 1 + 4 + ... + n^2 = n(n+1)(2n+1)/6

Cubic Series: for n >= 0:
Σ_{i=1}^{n} i^3 = 1 + 8 + ... + n^3 = n^2(n+1)^2/4

Geometric Series: for real x ≠ 1:
Σ_{k=0}^{n} x^k = 1 + x + x^2 + ... + x^n = (x^(n+1) - 1)/(x - 1)
For |x| < 1: Σ_{k=0}^{∞} x^k = 1/(1 - x)

Linear-Geometric Series: for n >= 0 and real c ≠ 1:
Σ_{i=1}^{n} i·c^i = c + 2c^2 + ... + n·c^n = (n·c^(n+2) - (n+1)·c^(n+1) + c)/(c - 1)^2

Harmonic Series: the nth harmonic number, for n ∈ I+:
Hn = 1 + 1/2 + 1/3 + ... + 1/n = Σ_{k=1}^{n} 1/k = ln(n) + O(1)

Telescoping Series:
Σ_{k=1}^{n} (a_k - a_{k-1}) = a_n - a_0

Differentiating Series: for |x| < 1:
Σ_{k=0}^{∞} k·x^k = x/(1 - x)^2
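These closed forms can be spot-checked against direct summation; a minimal sketch:

```python
n = 100
assert sum(1 for i in range(5, n + 1)) == n - 5 + 1                  # constant series, a = 5
assert sum(range(1, n + 1)) == n * (n + 1) // 2                      # linear series
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
assert sum(i**3 for i in range(1, n + 1)) == n**2 * (n + 1)**2 // 4  # cubic series
x = 0.5
assert abs(sum(x**k for k in range(200)) - 1 / (1 - x)) < 1e-12      # geometric, |x| < 1
print("all series identities hold")
```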
Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires only Θ(n log n). Examples: sorting, searching.
Remarks
bull Comparative timing of programs is a difficult business:
ndash experimental errors from uncontrolled factors (system load, language, compiler, etc.)
ndash bias towards a program
ndash unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster?

T(n)       n      n'      Change             n'/n
10n        1,000  10,000  n' = 10n           10
20n        500    5,000   n' = 10n           10
5n log n   250    1,842   √10·n < n' < 10n   7.37
2n^2       70     223     n' = √10·n         3.16
2^n        13     16      n' = n + 3         -----
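The table can be reproduced by brute force, assuming a time budget of 10,000 basic steps on the old machine and 100,000 on the one ten times faster; the budget values are an assumption, since the slide does not state them:

```python
def largest_n(T, budget):
    # Largest problem size n whose cost T(n) fits within the budget.
    n = 1
    while T(n + 1) <= budget:
        n += 1
    return n

for name, T in [("10n", lambda n: 10 * n),
                ("2n^2", lambda n: 2 * n * n),
                ("2^n", lambda n: 2**n)]:
    old = largest_n(T, 10_000)
    new = largest_n(T, 100_000)
    print(name, old, new)  # matches the table's n and n' columns
```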
Implications of Dominance
bull Exponential algorithms get hopeless fast.
bull Quadratic algorithms get hopeless at or before 1,000,000.
bull O(n log n) is possible to about one billion.
bull O(log n) never sweats.

n^a dominates n^b if a > b, since lim n^b / n^a = n^(b-a) → 0.
n^a + o(n^a) doesn't dominate n^a, since lim n^a / (n^a + o(n^a)) → 1.
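A rough way to see these thresholds is to count the steps each growth rate needs at a given n; a sketch assuming one elementary step per unit of T(n) and a machine doing on the order of 10^9 steps per second:

```python
import math

def steps(n):
    # Step counts for the main growth rates at problem size n.
    return {"log n": math.log2(n),
            "n log n": n * math.log2(n),
            "n^2": n * n}

for n in (1_000_000, 1_000_000_000):
    s = steps(n)
    print(n, {k: f"{v:.2e}" for k, v in s.items()})
# At n = 10^6, n^2 already needs 10^12 steps (hopeless at ~10^9 steps/sec),
# while n log n needs only ~2x10^7; even at n = 10^9, n log n is ~3x10^10.
```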
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 48Algorithms Course Dr Aref Rashad
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 49
5log1020 3 nnn )( 3nO
1355log1020 33 nfornnnn
nn logloglog3 )(lognO
2log4logloglog3 nfornnn
1002 )1(O 1122 100100 nfor
n5 )1( nO 1)1(55 nfornn
Algorithms Course Dr Aref Rashad
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Set definition of O-notation
February 2013 50Algorithms Course Dr Aref Rashad
Meaning For all data sets big enough (ie n gt n0) the algorithm always executes in more than cg(n) steps
February 2013 51Algorithms Course Dr Aref Rashad
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics: Series

Constant Series: for integers a and b, a ≤ b:
Σ_{i=a..b} 1 = b − a + 1

Linear Series (Arithmetic Series): for n ≥ 0:
Σ_{i=1..n} i = 1 + 2 + … + n = n(n + 1)/2

Quadratic Series: for n ≥ 0:
Σ_{i=1..n} i^2 = 1 + 4 + … + n^2 = n(n + 1)(2n + 1)/6

Cubic Series: for n ≥ 0:
Σ_{i=1..n} i^3 = 1 + 8 + … + n^3 = n^2 (n + 1)^2 / 4

Geometric Series: for real x ≠ 1:
Σ_{k=0..n} x^k = 1 + x + x^2 + … + x^n = (x^(n+1) − 1)/(x − 1)
For |x| < 1: Σ_{k=0..∞} x^k = 1/(1 − x)

Linear-Geometric Series: for n ≥ 0 and real c ≠ 1:
Σ_{i=1..n} i·c^i = c + 2c^2 + … + n·c^n = (n·c^(n+2) − (n + 1)·c^(n+1) + c)/(c − 1)^2

Harmonic Series: the nth harmonic number, n ∈ I+:
H_n = 1 + 1/2 + 1/3 + … + 1/n = Σ_{k=1..n} 1/k = ln(n) + O(1)

Telescoping Series:
Σ_{k=1..n} (a_k − a_(k−1)) = a_n − a_0

Differentiating Series: for |x| < 1:
Σ_{k=0..∞} k·x^k = x/(1 − x)^2
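The series closed forms are easy to check against direct summation (a sketch; n = 50 and x = 3 are arbitrary choices):

```python
import math

n = 50
assert sum(i for i in range(1, n + 1)) == n * (n + 1) // 2             # linear series
assert sum(i**2 for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
assert sum(i**3 for i in range(1, n + 1)) == n**2 * (n + 1)**2 // 4    # cubic series

x = 3
assert sum(x**k for k in range(n + 1)) == (x**(n + 1) - 1) // (x - 1)  # geometric series

H_n = sum(1 / k for k in range(1, n + 1))
assert abs(H_n - math.log(n)) < 1                                      # H_n = ln(n) + O(1)
print("series identities verified")
```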
Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires Θ(n log n) time. Examples:
– Sorting
– Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations

Remarks
• Comparative timing of programs is a difficult business:
– Experimental errors from uncontrolled factors (system load, language, compiler, etc.)
– Bias towards a program
– Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster?

T(n)       n      n'      Change             n'/n
10n        1,000  10,000  n' = 10n           10
20n        500    5,000   n' = 10n           10
5n log n   250    1,842   √10·n < n' < 10n   7.37
2n^2       70     223     n' ≈ √10·n         3.16
2^n        13     16      n' = n + 3         —
February 2013 89Algorithms Course Dr Aref Rashad
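The table's entries can be reproduced mechanically. A sketch, assuming the slide's setup is a fixed budget of 10,000 operations on the old machine versus 100,000 on the machine that is 10 times faster:

```python
def max_size(T, budget):
    """Largest n such that T(n) does not exceed the operation budget."""
    n = 1
    while T(n + 1) <= budget:
        n += 1
    return n

growth_rates = [("10n",  lambda n: 10 * n),
                ("20n",  lambda n: 20 * n),
                ("2n^2", lambda n: 2 * n * n),
                ("2^n",  lambda n: 2 ** n)]

for name, T in growth_rates:
    n_old = max_size(T, 10_000)    # old machine
    n_new = max_size(T, 100_000)   # 10x faster machine
    print(name, n_old, n_new, round(n_new / n_old, 2))
# 10n  1000 10000 10.0
# 20n  500  5000  10.0
# 2n^2 70   223   3.19
# 2^n  13   16    1.23
```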
February 2013 90

Implications of Dominance
• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

n^a dominates n^b if a > b, since lim n→∞ n^b / n^a = n^(b−a) → 0.
n^a + o(n^a) doesn't dominate n^a, since lim n→∞ n^a / (n^a + o(n^a)) → 1.

Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
Meaning: for all data sets big enough (i.e., n > n0), the algorithm always executes in more than c·g(n) steps.

February 2013 Algorithms Course Dr Aref Rashad 51

Example: T(n) = c1·n^2 + c2·n.
Since c1·n^2 + c2·n ≥ c1·n^2 for all n > 1, we have T(n) ≥ c·n^2 for c = c1 and n0 = 1.
Therefore T(n) is in Ω(n^2) by the definition.
We want the greatest lower bound.
February 2013 52Algorithms Course Dr Aref Rashad
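The witnesses c = c1 and n0 = 1 in the lower-bound example can be checked exhaustively over a range of n (a sketch; c1 = 3 and c2 = 7 are arbitrary positive constants):

```python
c1, c2 = 3, 7                       # arbitrary positive constants (illustrative)
T = lambda n: c1 * n**2 + c2 * n

c, n0 = c1, 1                       # the witnesses from the example
assert all(T(n) >= c * n**2 for n in range(n0 + 1, 10_000))
print("T(n) >= c*n^2 for all tested n > n0, so T(n) is in Omega(n^2)")
```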
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54

When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.

Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Θ, O, and Ω

Intuition for Asymptotic Notation

Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
Big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
Big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time: O(1)

O(1) means that an algorithm takes constant time to run; in other words, performance isn't affected by the size of the problem.

Linear Time: O(N)

An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed. Example: a waiting line at a supermarket.
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time: O(N^2)

An algorithm runs in O(N^2) if the number of operations required to perform a function is proportional to the square of the number of items being processed. Example: everyone in a group shaking hands with everyone else.

Logarithmic Time: O(log N) and O(N log N)

The running time of a logarithmic algorithm increases with the log of the problem size. When the size of the input data set increases by a factor of a million, the run time will only increase by some factor of log_10(1,000,000) = 6.
February 2013 62Algorithms Course Dr Aref Rashad
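Both growth classes can be made concrete by counting operations (an illustrative sketch, not code from the slides):

```python
def handshakes(n):
    """Every pair in a group of n shakes hands once: n(n-1)/2 shakes, i.e. O(N^2)."""
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1
    return count

def halvings(n):
    """How many times n can be halved before reaching 1: O(log N) growth."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

print(handshakes(10))       # -> 45, i.e. 10*9/2
print(halvings(1_000_000))  # -> 19, about log2 of a million
```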
Factorial Time: O(N!)

It is far worse than even O(N^2) and O(N^3). It's fairly unusual to encounter functions with this kind of behavior.
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical Considerations

There is no big difference in running time between Θ1(n) and Θ2(n log n):
• Θ1(10,000) = 10,000
• Θ2(10,000) = 10,000 × log_10 10,000 = 40,000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n^2) and Θ2(n log n):
• Θ1(10,000) = 100,000,000
• Θ2(10,000) = 10,000 × log_10 10,000 = 40,000
Remarks
• Most statements in a program do not have much effect on the running time of that program.
• There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
• Focus your attention on the parts of the program that have the most impact.
February 2013 66Algorithms Course Dr Aref Rashad
• The greatest time and space improvements come from a better data structure or algorithm.
• "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."

Remarks
• When tuning code, it is important to gather good timing statistics.
• Be careful not to use tricks that make the program unreadable.
• Make use of compiler optimizations.
• Check that your optimizations really improve the program.
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
T(n) = c1n2 + c2n
c1n2 + c2n gt= c1n2 for all n gt 1T(n) gt= cn2 for c = c1 and n0 = 1
Therefore T(n) is in (n2) by the definition
We want the greatest lower bound
February 2013 52Algorithms Course Dr Aref Rashad
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 53Algorithms Course Dr Aref Rashad
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 54
When big-Oh and meet we indicate this by using (big-Theta) notation
Definition An algorithm is said to be (h(n)) if it is in O(h(n)) and it is in (h(n))
Algorithms Course Dr Aref Rashad
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
Remarks
An algorithm with time equation T(n) = 2n² does not receive nearly as great an improvement from a machine ten times faster as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions

The asymptotic relations between f(n) and g(n) behave like comparisons between two numbers a and b:
f(n) = O(g(n))  ↔  a ≤ b
f(n) = Ω(g(n))  ↔  a ≥ b
f(n) = Θ(g(n))  ↔  a = b
f(n) = o(g(n))  ↔  a < b
f(n) = ω(g(n))  ↔  a > b
Limits (as n → ∞)

lim f(n)/g(n) = 0         ⇒ f(n) ∈ o(g(n))
lim f(n)/g(n) < ∞         ⇒ f(n) ∈ O(g(n))
0 < lim f(n)/g(n) < ∞     ⇒ f(n) ∈ Θ(g(n))
0 < lim f(n)/g(n)         ⇒ f(n) ∈ Ω(g(n))
lim f(n)/g(n) = ∞         ⇒ f(n) ∈ ω(g(n))
lim f(n)/g(n) undefined   ⇒ can't say
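These limit rules can be probed numerically; a rough sketch, with an arbitrary large n standing in for n → ∞ (crude: slowly diverging ratios, e.g. log factors, can be misread at finite n):

```python
def classify(f, g, n=10**6):
    """Crude numeric stand-in for lim f(n)/g(n): report o / Theta / omega."""
    r = f(n) / g(n)
    if r < 1e-3:
        return "f in o(g)"        # ratio tends to 0
    if r > 1e3:
        return "f in omega(g)"    # ratio tends to infinity
    return "f in Theta(g)"        # ratio tends to a positive constant

print(classify(lambda n: n, lambda n: n * n))          # n / n^2 -> 0
print(classify(lambda n: 5 * n * n, lambda n: n * n))  # ratio -> 5
print(classify(lambda n: n * n, lambda n: n))          # ratio -> infinity
```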
Logarithms

x = log_b a is the exponent for which a = b^x.
Natural log: ln a = log_e a
Binary log: lg a = log_2 a
lg² a = (lg a)²
lg lg a = lg (lg a)

Useful identities (for a, b, c > 0 and real n):
a = b^(log_b a)
log_c (ab) = log_c a + log_c b
log_b a^n = n · log_b a
log_b a = log_c a / log_c b
log_b (1/a) = −log_b a
log_b a = 1 / (log_a b)
a^(log_b c) = c^(log_b a)
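The identities can be spot-checked numerically with Python's math module (the values a, b, c, n below are arbitrary):

```python
import math

a, b, c, n = 5.0, 2.0, 3.0, 4.0
log = math.log  # log(x, base)

assert math.isclose(b ** log(a, b), a)                      # a = b^(log_b a)
assert math.isclose(log(a * c, b), log(a, b) + log(c, b))   # log of a product
assert math.isclose(log(a ** n, b), n * log(a, b))          # log of a power
assert math.isclose(log(a, b), log(a, c) / log(b, c))       # change of base
assert math.isclose(log(1 / a, b), -log(a, b))              # log of a reciprocal
assert math.isclose(log(a, b), 1 / log(b, a))               # swap base and argument
assert math.isclose(a ** log(c, b), c ** log(a, b))         # a^(log_b c) = c^(log_b a)
print("all identities hold")
```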
Logarithms and exponentials

If the base of a logarithm is changed from one constant to another, the value is altered by a constant factor. Ex: log10 n · log2 10 = log2 n. So the base of a logarithm is not an issue in asymptotic notation.

Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2^n = (2/3)^n · 3^n.
Examples
Express the functions in A in asymptotic notation using the functions in B.

A = 5n² + 100n,  B = 3n² + 2
A ∈ Θ(n²) and n² ∈ Θ(B), so A ∈ Θ(B).

A = log3(n²),  B = log2(n³)
Using log_b a = log_c a / log_c b: A = 2 lg n / lg 3 and B = 3 lg n, so A/B = 2/(3 lg 3), a constant: A ∈ Θ(B).

A = n^(lg 4),  B = 3^(lg n)
Using a^(log b) = b^(log a): B = 3^(lg n) = n^(lg 3), so A/B = n^(lg(4/3)) → ∞ as n → ∞: A ∈ ω(B).

A = lg² n,  B = n^(1/2)
lim ( lg^a n / n^b ) = 0 (here a = 2 and b = 1/2): A ∈ o(B).
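The third example can be sanity-checked numerically: since 3^(lg n) = n^(lg 3), the ratio n^(lg 4) / 3^(lg n) = n^(lg(4/3)) grows without bound (a small illustrative sketch):

```python
import math

def ratio(n):
    a = n ** math.log2(4)   # n^(lg 4), i.e. n^2
    b = 3 ** math.log2(n)   # 3^(lg n), which equals n^(lg 3)
    return a / b

# The ratio n^(lg(4/3)) keeps growing, so A is in omega(B).
for n in (10, 1000, 100000):
    print(n, ratio(n))
```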
Math Basics

Constant Series: for integers a and b with a ≤ b,
  Σ_{i=a}^{b} 1 = b − a + 1

Linear Series (Arithmetic Series): for n ≥ 0,
  Σ_{i=1}^{n} i = 1 + 2 + … + n = n(n+1)/2

Quadratic Series: for n ≥ 0,
  Σ_{i=1}^{n} i² = 1 + 4 + … + n² = n(n+1)(2n+1)/6

Cubic Series: for n ≥ 0,
  Σ_{i=1}^{n} i³ = 1 + 8 + … + n³ = n²(n+1)²/4

Geometric Series: for real x ≠ 1,
  Σ_{k=0}^{n} x^k = 1 + x + x² + … + x^n = (x^{n+1} − 1)/(x − 1)
  For |x| < 1:  Σ_{k=0}^{∞} x^k = 1/(1 − x)
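The closed forms above can be verified against brute-force sums (small n and an arbitrary x chosen for illustration):

```python
n, x = 10, 0.5

# Each closed form matches the explicit sum term for term.
assert sum(i for i in range(1, n + 1)) == n * (n + 1) // 2                    # linear
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6  # quadratic
assert sum(i ** 3 for i in range(1, n + 1)) == (n * (n + 1) // 2) ** 2        # cubic
geometric = sum(x ** k for k in range(n + 1))
assert abs(geometric - (x ** (n + 1) - 1) / (x - 1)) < 1e-12                  # geometric
print("series identities verified")
```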
Math Basics

Linear-Geometric Series: for n ≥ 0 and real c ≠ 1,
  Σ_{i=1}^{n} i·c^i = c + 2c² + … + n·c^n = (n·c^{n+2} − (n+1)·c^{n+1} + c) / (c − 1)²

Harmonic Series: the nth harmonic number, for n ∈ I+,
  H_n = Σ_{k=1}^{n} 1/k = 1 + 1/2 + 1/3 + … + 1/n = ln(n) + O(1)

Math Basics

Telescoping Series:
  Σ_{k=1}^{n} (a_k − a_{k−1}) = a_n − a_0

Differentiating Series: for |x| < 1,
  Σ_{k=0}^{∞} k·x^k = x / (1 − x)²
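A similar spot check for the harmonic, telescoping, and differentiating series (the O(1) term in H_n approaches Euler's constant ≈ 0.5772; the sequence a_k below is an arbitrary example):

```python
import math

n = 10_000
H_n = sum(1 / k for k in range(1, n + 1))
# H_n = ln n + O(1); the O(1) term is near Euler's constant 0.5772...
assert 0.5 < H_n - math.log(n) < 0.6

a = [k * k for k in range(n + 1)]  # arbitrary sequence a_0 .. a_n
assert sum(a[k] - a[k - 1] for k in range(1, n + 1)) == a[n] - a[0]  # telescoping

x = 0.5
approx = sum(k * x ** k for k in range(200))  # partial sum; converges fast for |x| < 1
assert abs(approx - x / (1 - x) ** 2) < 1e-12  # sum k*x^k = x/(1-x)^2
print("checks passed")
```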
Many problems whose obvious solution requires Θ(n²) time also have a solution that requires Θ(n log n). Examples: sorting, searching.
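For instance, detecting a duplicate in a list has an obvious Θ(n²) double-loop solution and a sort-first Θ(n log n) solution (a minimal sketch; the function names are ours):

```python
def has_duplicate_quadratic(items):
    """Obvious solution: compare every pair, Theta(n^2)."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_nlogn(items):
    """Sort first (Theta(n log n)), then scan adjacent elements once."""
    s = sorted(items)
    return any(s[k] == s[k + 1] for k in range(len(s) - 1))

data = [7, 3, 9, 3, 1]
print(has_duplicate_quadratic(data), has_duplicate_nlogn(data))  # True True
```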
Practical Considerations

Remarks
• Comparative timing of programs is a difficult business:
  – Experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  – Bias towards a program
  – Unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster?

T(n)       n      n'      Change              n'/n
10n        1,000  10,000  n' = 10n            10
20n        500    5,000   n' = 10n            10
5n log n   250    1,842   √10·n < n' < 10n    7.37
2n²        70     223     n' = √10·n          3.16
2^n        13     16      n' = n + 3          ---
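The Change column follows from solving T(n′) = 10·T(n) for n′; a sketch of the arithmetic for three representative growth rates:

```python
import math

speedup = 10.0

# Linear T(n) = c*n: c*n' = 10*c*n, so the faster machine handles 10x the input.
print("linear:      n'/n =", speedup)

# Quadratic T(n) = c*n^2: c*n'^2 = 10*c*n^2  =>  n' = sqrt(10)*n.
print("quadratic:   n'/n =", round(math.sqrt(speedup), 2))

# Exponential T(n) = 2^n: 2^n' = 10*2^n  =>  n' = n + log2(10), about n + 3.
print("exponential: n' - n =", round(math.log2(speedup), 2))
```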
Implications of Dominance
• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion items.
• O(log n) never sweats.

n^a dominates n^b if a > b, since lim (n → ∞) n^b / n^a = n^(b−a) → 0.
n^a + o(n^a) doesn't dominate n^a, since lim (n → ∞) n^a / (n^a + o(n^a)) → 1.
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 55Algorithms Course Dr Aref Rashad
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Relations Between Q O W
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Intuition for Asymptotic Notation
Big-Oh f(n) is O(g(n)) if f(n) is asymptotically less than or
equal to g(n)
big-Omega f(n) is (g(n)) if f(n) is asymptotically greater than
or equal to g(n)
big-Theta f(n) is (g(n)) if f(n) is asymptotically equal to g(n)
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 58Algorithms Course Dr Aref Rashad
February 2013 59Algorithms Course Dr Aref Rashad
Constant Time O(1)
O(1) actually means that an algorithm takes constant time to run in other words performance isnrsquot affected by the size of the problem
Linear Time O(N)
An algorithm runs in O(N) if the number of operations required to perform a function is directly proportional to the number of items being processed Example waiting line at a supermarket
February 2013 60Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 Algorithms Course Dr Aref Rashad 61
Quadratic Time O(N2)
An algorithm runs in O(N2) if the number of operations required to perform a function is directly proportional to the quadratic number of items being processed Example Group shakehand
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
Remarks
An algorithm with time equation T(n) = 2n² does not benefit nearly as much from a faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16). Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.
Lecture 1 End
Comparison of Functions
For f(n) = Θ(nᵃ) and g(n) = Θ(nᵇ):
f(n) = O(g(n))  ⇔  a ≤ b
f(n) = Ω(g(n))  ⇔  a ≥ b
f(n) = Θ(g(n))  ⇔  a = b
f(n) = o(g(n))  ⇔  a < b
f(n) = ω(g(n))  ⇔  a > b
Limits (as n → ∞)
lim f(n)/g(n) = 0       ⇒  f(n) ∈ o(g(n))
lim f(n)/g(n) < ∞       ⇒  f(n) ∈ O(g(n))
0 < lim f(n)/g(n) < ∞   ⇒  f(n) ∈ Θ(g(n))
lim f(n)/g(n) > 0       ⇒  f(n) ∈ Ω(g(n))
lim f(n)/g(n) = ∞       ⇒  f(n) ∈ ω(g(n))
lim f(n)/g(n) undefined ⇒  can't say
Logarithms
x = log_b a is the exponent for which a = bˣ.
Natural log: ln a = log_e a. Binary log: lg a = log₂ a.
lg² a = (lg a)²;  lg lg a = lg(lg a).

Useful identities:
a = b^(log_b a)
log_c(ab) = log_c a + log_c b
log_b(aⁿ) = n log_b a
log_b a = log_c a / log_c b
log_b(1/a) = −log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
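The identities above can be spot-checked numerically. A small sketch using Python's two-argument math.log (the values a, b, c, n are arbitrary test inputs):

```python
import math

a, b, c, n = 7.0, 2.0, 5.0, 3.0
log = math.log  # log(x, base)

assert math.isclose(b ** log(a, b), a)                     # a = b^(log_b a)
assert math.isclose(log(a * c, b), log(a, b) + log(c, b))  # log of a product
assert math.isclose(log(a ** n, b), n * log(a, b))         # log of a power
assert math.isclose(log(a, b), log(a, c) / log(b, c))      # change of base
assert math.isclose(log(1 / a, b), -log(a, b))             # log of a reciprocal
assert math.isclose(log(a, b), 1 / log(b, a))              # swap base and argument
assert math.isclose(a ** log(c, b), c ** log(a, b))        # a^(log_b c) = c^(log_b a)
print("all identities hold")
```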
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another, the value is altered only by a constant factor. Ex: log₁₀ n · log₂ 10 = log₂ n. The base of a logarithm is therefore not an issue in asymptotic notation.
Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2ⁿ = (2/3)ⁿ · 3ⁿ.
Examples
Express the functions in column A in asymptotic notation using the functions in column B.

A: 5n² + 100n   B: 3n² + 2
  A ∈ Θ(n²) and n² ∈ Θ(B), so A ∈ Θ(B).

A: log₃(n²)     B: log₂(n³)
  log_b a = log_c a / log_c b, so A = 2 lg n / lg 3 and B = 3 lg n; A/B = 2/(3 lg 3), a constant, so A ∈ Θ(B).

A: n^(lg 4)     B: 3^(lg n)
  a^(log b) = b^(log a), so B = 3^(lg n) = n^(lg 3); A/B = n^(lg(4/3)) → ∞ as n → ∞, so A ∈ ω(B).

A: lg² n        B: n^(1/2)
  lim (lgᵃ n / nᵇ) = 0 (here a = 2 and b = 1/2), so A ∈ o(B).
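Evaluating the ratio A/B at growing n illustrates each verdict empirically; a sketch (the lambda names are mine):

```python
import math

def ratio(f, g, n):
    return f(n) / g(n)

A1 = lambda n: 5 * n**2 + 100 * n     # Θ(n^2)
B1 = lambda n: 3 * n**2 + 2
A3 = lambda n: n ** math.log2(4)      # n^(lg 4) = n^2
B3 = lambda n: 3 ** math.log2(n)      # = n^(lg 3)
A4 = lambda n: math.log2(n) ** 2      # lg^2 n
B4 = lambda n: math.sqrt(n)           # n^(1/2)

for n in (10**3, 10**6):
    print(ratio(A1, B1, n))   # settles near 5/3: Θ relationship
    print(ratio(A3, B3, n))   # grows without bound: ω relationship
    print(ratio(A4, B4, n))   # shrinks toward 0 (slowly): o relationship
```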
Math Basics: Series
Constant series: for integers a ≤ b, Σ_{i=a}^{b} 1 = b − a + 1.
Linear (arithmetic) series: for n ≥ 0, Σ_{i=1}^{n} i = n(n+1)/2.
Quadratic series: for n ≥ 0, Σ_{i=1}^{n} i² = n(n+1)(2n+1)/6.
Cubic series: for n ≥ 0, Σ_{i=1}^{n} i³ = n²(n+1)²/4.
Geometric series: for real x ≠ 1, Σ_{k=0}^{n} xᵏ = (x^{n+1} − 1)/(x − 1).
For |x| < 1, Σ_{k=0}^{∞} xᵏ = 1/(1 − x).
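Each closed form above can be verified against a direct summation; a short sketch:

```python
n = 50
# constant series with a = 3: b - a + 1 terms
assert sum(1 for i in range(3, n + 1)) == n - 3 + 1
# linear series: n(n+1)/2
assert sum(range(1, n + 1)) == n * (n + 1) // 2
# quadratic series: n(n+1)(2n+1)/6
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
# cubic series: (n(n+1)/2)^2
assert sum(i**3 for i in range(1, n + 1)) == (n * (n + 1) // 2) ** 2
# geometric series with x = 3: (x^(n+1) - 1)/(x - 1)
x = 3
assert sum(x**k for k in range(n + 1)) == (x**(n + 1) - 1) // (x - 1)
print("series formulas verified")
```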
Math Basics: More Series
Linear-geometric series: for n ≥ 0 and real c ≠ 1, Σ_{i=1}^{n} i·cⁱ = (n·c^{n+2} − (n+1)·c^{n+1} + c) / (c − 1)².
Harmonic series: the nth harmonic number, for n ∈ I⁺, is H_n = Σ_{k=1}^{n} 1/k = 1 + 1/2 + 1/3 + … + 1/n = ln(n) + O(1).
Telescoping series: Σ_{k=1}^{n} (a_k − a_{k−1}) = a_n − a_0.
Differentiating series: for |x| < 1, Σ_{k=0}^{∞} k·xᵏ = x/(1 − x)².
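These can also be spot-checked numerically; a sketch (the constant 0.5772… is the Euler–Mascheroni constant, used here only to tighten the harmonic-number check):

```python
import math

# Harmonic series: H_n = ln n + O(1)
n = 100_000
H_n = sum(1.0 / k for k in range(1, n + 1))
assert abs(H_n - (math.log(n) + 0.5772156649)) < 1e-4

# Telescoping: the sum collapses to a_n - a_0 for any sequence
a = [k * k for k in range(11)]
telescoped = sum(a[k] - a[k - 1] for k in range(1, 11))
assert telescoped == a[10] - a[0]

# Differentiating series at x = 1/2: sum k*x^k = x/(1-x)^2 = 2
x = 0.5
approx = sum(k * x**k for k in range(200))   # truncated, converges fast
assert abs(approx - x / (1 - x) ** 2) < 1e-12
print("checks pass")
```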
Many problems whose obvious solution requires Θ(n²) time also have a solution that requires Θ(n log n). Examples: sorting, searching.
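The sorting example can be made concrete by counting comparisons: a minimal sketch contrasting a Θ(n²) insertion sort with a Θ(n log n) merge sort (both written here purely for illustration):

```python
def insertion_sort_compares(a):
    # Θ(n^2) in the worst case; returns the number of comparisons made
    a, c = list(a), 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            c += 1                           # one comparison
            if a[j - 1] <= a[j]:
                break
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return c

def merge_sort_compares(a):
    # Θ(n log n); returns (sorted list, number of comparisons)
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_sort_compares(a[:mid])
    right, cr = merge_sort_compares(a[mid:])
    merged, c, i, j = [], cl + cr, 0, 0
    while i < len(left) and j < len(right):
        c += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged += left[i:] + right[j:]
    return merged, c

worst = list(range(500, 0, -1))          # reversed input: worst case for insertion sort
print(insertion_sort_compares(worst))    # about n^2/2 comparisons
print(merge_sort_compares(worst)[1])     # about n*lg(n) comparisons
```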
Practical Considerations: Remarks
• Comparative timing of programs is a difficult business:
  – experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  – bias towards a program
  – unequal code tuning
Faster Computer or Algorithm?
What happens when we buy a computer 10 times faster?

T(n)       n      n'      Change             n'/n
10n        1,000  10,000  n' = 10n           10
20n        500    5,000   n' = 10n           10
5n log n   250    1,842   √10·n < n' < 10n   7.37
2n²        70     223     n' = √10·n         3.16
2ⁿ         13     16      n' = n + 3         -----
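The table's entries can be reproduced by solving "largest n with T(n) ≤ budget" for a budget of 10,000 steps and then a 10× larger one. A sketch (max_n is a helper of mine, and a base-2 logarithm is assumed in the 5n log n row, which matches the table's 7.37 ratio):

```python
import math

def max_n(T, budget, hi=10**7):
    # Largest n with T(n) <= budget, by binary search (T must be increasing).
    lo, best = 1, 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if T(mid) <= budget:
            best, lo = mid, mid + 1
        else:
            hi = mid - 1
    return best

costs = {
    "10n":      lambda n: 10 * n,
    "20n":      lambda n: 20 * n,
    "5n log n": lambda n: 5 * n * math.log2(n),
    "2n^2":     lambda n: 2 * n * n,
    "2^n":      lambda n: 2 ** n,     # exact integer power to avoid float overflow
}

for name, T in costs.items():
    n_old = max_n(T, 10_000)          # original machine: 10,000 steps
    n_new = max_n(T, 100_000)         # a machine 10 times faster
    print(name, n_old, n_new, round(n_new / n_old, 2))
```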
Implications of Dominance
• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

nᵃ dominates nᵇ if a > b, since lim_{n→∞} nᵇ/nᵃ = n^{b−a} → 0.
nᵃ + o(nᵃ) doesn't dominate nᵃ, since lim_{n→∞} nᵃ/(nᵃ + o(nᵃ)) → 1.
Contents
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Algorithms and Programs
- Algorithm Properties
- Algorithm Specification
- Translating a Problem into an Algorithm
- Comparing Algorithms: Empirical Approach
- Analytical Approach
- Space Complexity
- Complexity Analysis
- Calculating Running Time T(n)
- Practical Considerations
- Remarks
Logarithmic Time O(log N) and O(N log N)
The running time of a logarithmic algorithm increases with the log of the problem size When the size of the input data set increases by a factor of a million the run time will only increase by some factor of log(1000000) = 6
February 2013 62Algorithms Course Dr Aref Rashad
Factorial Time O(N)It is far worse than even O(N2) and O(N3)Itrsquos fairly unusual to encounter functions with this kind of behavior
February 2013 63Algorithms Course Dr Aref Rashad
The ldquoBig-Ohrdquo Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics: Series

Constant series: for integers a and b with a ≤ b,
  Σ_{i=a}^{b} 1 = b − a + 1

Linear series (arithmetic series): for n ≥ 0,
  Σ_{i=1}^{n} i = n(n + 1)/2

Quadratic series: for n ≥ 0,
  Σ_{i=1}^{n} i^2 = n(n + 1)(2n + 1)/6

Cubic series: for n ≥ 0,
  Σ_{i=1}^{n} i^3 = n^2 (n + 1)^2 / 4

Geometric series: for real x ≠ 1,
  Σ_{k=0}^{n} x^k = (x^{n+1} − 1)/(x − 1)
and for |x| < 1,
  Σ_{k=0}^{∞} x^k = 1/(1 − x)

Linear-geometric series: for n ≥ 0 and real c ≠ 1,
  Σ_{i=1}^{n} i c^i = (n c^{n+2} − (n + 1) c^{n+1} + c)/(c − 1)^2

Harmonic series: the nth harmonic number, for integer n ≥ 1,
  H_n = Σ_{k=1}^{n} 1/k = ln n + O(1)

Telescoping series:
  Σ_{k=1}^{n} (a_k − a_{k−1}) = a_n − a_0

Differentiating series: for |x| < 1,
  Σ_{k=0}^{∞} k x^k = x/(1 − x)^2
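The finite closed forms above can be spot-checked against brute-force sums; a sketch in exact integer arithmetic (the values n = 50, x = 3, c = 2 are arbitrary):

```python
# Brute-force spot-checks of the closed forms (small n, exact integer arithmetic)
n = 50
assert sum(1 for i in range(3, n + 1)) == n - 3 + 1                            # constant
assert sum(range(1, n + 1)) == n * (n + 1) // 2                                # linear
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6   # quadratic
assert sum(i ** 3 for i in range(1, n + 1)) == n * n * (n + 1) ** 2 // 4       # cubic
x = 3
assert sum(x ** k for k in range(n + 1)) == (x ** (n + 1) - 1) // (x - 1)      # geometric
c = 2
assert sum(i * c ** i for i in range(1, n + 1)) == \
    (n * c ** (n + 2) - (n + 1) * c ** (n + 1) + c) // (c - 1) ** 2            # linear-geometric
seq = [k * k for k in range(n + 1)]
assert sum(seq[k] - seq[k - 1] for k in range(1, n + 1)) == seq[n] - seq[0]    # telescoping
print("all series identities check out")
```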
Many problems whose obvious solution requires Θ(n^2) time also have a solution that requires only Θ(n log n) time. Examples:
– Sorting
– Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
• Comparative timing of programs is a difficult business:
  – experimental errors from uncontrolled factors (system load, language, compiler, etc.)
  – bias towards a program
  – unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster?

T(n)      | n     | n'     | Change             | n'/n
10n       | 1,000 | 10,000 | n' = 10n           | 10
20n       | 500   | 5,000  | n' = 10n           | 10
5n log n  | 250   | 1,842  | √10·n < n' < 10n   | 7.37
2n^2      | 70    | 223    | n' = √10·n         | 3.16
2^n       | 13    | 16     | n' = n + 3         | -----
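The table's arithmetic can be reproduced with a small search: given a running-time function and the old problem size, find the largest n' whose cost fits in a 10× budget. A sketch (`new_n` is a name invented here; the budget is taken as 10·T(n_old)):

```python
def new_n(T, n_old, speedup=10):
    """Largest n' with T(n') <= speedup * T(n_old), found by doubling then bisection."""
    budget = speedup * T(n_old)
    lo, hi = n_old, n_old
    while T(hi) <= budget:      # grow an upper bound
        hi *= 2
    while hi - lo > 1:          # bisect: T(lo) <= budget < T(hi)
        mid = (lo + hi) // 2
        if T(mid) <= budget:
            lo = mid
        else:
            hi = mid
    return lo

print(new_n(lambda n: 10 * n, 1000))   # 10000: a linear algorithm scales by the full 10
print(new_n(lambda n: 2 * n * n, 70))  # 221 ≈ √10·70 (the slide's 223 assumes a round 10^5-step budget)
print(new_n(lambda n: 2 ** n, 13))     # 16: an exponential algorithm gains only about +3
```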
Implications of Dominance

• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

n^a dominates n^b if a > b, since
  lim_{n→∞} n^b / n^a = n^(b−a) → 0.
n^a + o(n^a) doesn't dominate n^a, since
  lim_{n→∞} n^a / (n^a + o(n^a)) → 1.
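The second claim can be seen numerically; a sketch with a concrete lower-order term (the coefficient 50 is arbitrary):

```python
def f(n):
    """n^2 plus a lower-order, i.e. o(n^2), term."""
    return n**2 + 50 * n

for n in (10**3, 10**6, 10**9):
    print(n, f(n) / n**2)   # the ratio approaches 1, not infinity
```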
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 63Algorithms Course Dr Aref Rashad
The "Big-Oh" Notation
February 2013 64Algorithms Course Dr Aref Rashad
Practical Considerations

There is no big difference in running time between Θ1(n) and Θ2(n log n):
• Θ1(10,000) = 10,000
• Θ2(10,000) = 10,000 · log_10 10,000 = 40,000

There is an enormous difference between Θ1(n^2) and Θ2(n log n):
• Θ1(10,000) = 100,000,000
• Θ2(10,000) = 10,000 · log_10 10,000 = 40,000
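The slide's arithmetic, reproduced directly (note it uses log base 10, which only shifts the count by a constant factor):

```python
import math

n = 10_000
print(n)                          # Θ1(n): 10,000
print(round(n * math.log10(n)))   # Θ2(n log n), log base 10 as on the slide: 40,000
print(n * n)                      # Θ1(n^2): 100,000,000
```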
Remarks

• Most statements in a program do not have much effect on the running time of that program.
• There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
• Focus your attention on the parts of the program that have the most impact.
February 2013 66Algorithms Course Dr Aref Rashad
• The greatest time and space improvements come from a better data structure or algorithm.
• "FIRST TUNE THE ALGORITHM, THEN TUNE THE CODE."
Remarks

• When tuning code, it is important to gather good timing statistics.
• Be careful not to use tricks that make the program unreadable.
• Make use of compiler optimizations.
• Check that your optimizations really improve the program.
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from the faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only the square root of 10 (≈ 3.16).
Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.
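A minimal sketch of that trade, using an invented example (duplicate detection): the obvious Θ(n^2) pairwise scan versus a Θ(n log n) sort-and-scan. Absolute times vary by machine; only the growing gap matters:

```python
import random
import time

def has_duplicate_quadratic(xs):
    """Obvious Θ(n^2) solution: compare every pair."""
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if xs[i] == xs[j]:
                return True
    return False

def has_duplicate_nlogn(xs):
    """Θ(n log n) solution: sort, then scan adjacent elements."""
    ys = sorted(xs)
    return any(ys[k] == ys[k + 1] for k in range(len(ys) - 1))

xs = random.sample(range(10**9), 2000)   # 2000 distinct values: worst case for both
t0 = time.perf_counter()
r1 = has_duplicate_quadratic(xs)
t1 = time.perf_counter()
r2 = has_duplicate_nlogn(xs)
t2 = time.perf_counter()
assert r1 == r2 == False
print(f"quadratic: {t1 - t0:.4f} s, sort-based: {t2 - t1:.4f} s")
```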
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 64Algorithms Course Dr Aref Rashad
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Practical ConsiderationsNo such big difference in running time between Θ1(n) and
Θ2(nlogn)
bull Θ1(10000) = 10000
bull Θ2(10000) =10000 log1010000 = 40000
February 2013 65Algorithms Course Dr Aref Rashad
There is an enormous difference between Θ1(n2) and
Θ2(nlogn)
bull Θ1(10000) = 100000000
bull Θ2(10000) = 10000 log10 10000 = 40000
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Remarksbull Most statements in a program do not have much effect on the
running time of that program
bull There is little point to cutting in half the running time of a subroutine that accounts for only 1 of the total
bull Focus your attention on the parts of the program that have the most impact
February 2013 66Algorithms Course Dr Aref Rashad
bull The greatest time and space improvements come from a better data structure or algorithm
bull ldquoFIRST TUNE THE ALGORITHM THEN TUNE THE CODErdquo
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Remarksbull When tuning code it is important to gather
good timing statisticsbull Be careful not to use tricks that make the
program unreadablebull Make use of compiler optimizationsbull Check that your optimizations really improve
the program
February 2013 67Algorithms Course Dr Aref Rashad
An algorithm with time equation T(n) = 2n2 does not receive nearly as great an improvement from the faster machine as an algorithm with linear growth rate
Instead of an improvement by a factor of ten the improvement is only the square root of 10 (asymp316)Instead of buying a faster computer consider what happens if you replace an algorithm with quadratic running time with a new algorithm with n logn running time
February 2013 68
Remarks
Algorithms Course Dr Aref Rashad
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
Faster Computer or Algorithm?

What happens when we buy a computer 10 times faster? (n is the largest problem size solvable within a fixed budget on the old machine; n' is the same for the new machine.)

T(n)      | n     | n'     | Change            | n'/n
10n       | 1,000 | 10,000 | n' = 10n          | 10
20n       | 500   | 5,000  | n' = 10n          | 10
5n log n  | 250   | 1,842  | √10·n < n' < 10n  | 7.37
2n^2      | 70    | 223    | n' = √10·n        | 3.16
2^n       | 13    | 16     | n' = n + 3        | -----
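The table can be reproduced by finding the largest n with T(n) ≤ budget; the 10,000-operation budget below is an assumption chosen to match the rows (the n log n entry lands within one of the slide's 1,842, depending on rounding):

```python
import math

def solve(T, budget):
    # Largest integer n with T(n) <= budget (linear scan; fine for a sketch).
    n = 1
    while T(n + 1) <= budget:
        n += 1
    return n

budget = 10_000  # hypothetical operation budget for the old machine
rows = {
    "10n":     lambda n: 10 * n,
    "20n":     lambda n: 20 * n,
    "5n lg n": lambda n: 5 * n * math.log2(n),
    "2n^2":    lambda n: 2 * n * n,
    "2^n":     lambda n: 2 ** n,
}
# n' uses a machine 10 times faster, i.e. a 10x larger budget.
results = {name: (solve(T, budget), solve(T, 10 * budget)) for name, T in rows.items()}
for name, (n_old, n_new) in results.items():
    print(name, n_old, n_new, round(n_new / n_old, 2))
```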
Implications of Dominance
• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

n^a dominates n^b if a > b, since lim_{n→∞} n^b / n^a = lim_{n→∞} n^(b−a) = 0.
n^a + o(n^a) doesn't dominate n^a, since lim_{n→∞} n^a / (n^a + o(n^a)) = 1.
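Both limits can be illustrated numerically (a Python sketch with the assumed exponents a = 2, b = 1.5, so n^b is o(n^a)):

```python
a, b = 2.0, 1.5

# n^b / n^a -> 0, so n^a dominates n^b.
ratios_ab = [n**b / n**a for n in (10**2, 10**4, 10**6)]
assert ratios_ab[0] > ratios_ab[1] > ratios_ab[2]

# n^a / (n^a + n^b) -> 1, so the lower-order term changes nothing.
ratios_same = [n**a / (n**a + n**b) for n in (10**2, 10**4, 10**6)]
assert ratios_same[-1] > 0.99
```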
Outline
- Algorithm Course (Dr Aref Rashad)
- Course Objectives
- Algorithms and Programs
- Algorithm Properties
- Algorithm Specification
- Translating a Problem into an Algorithm
- Comparing Algorithms: Empirical Approach
- Analytical Approach
- Space Complexity
- Complexity Analysis
- Calculating Running Time T(n)
- Practical Considerations
- Remarks
Remarks
An algorithm with time equation T(n) = 2n^2 does not receive nearly as great an improvement from a faster machine as an algorithm with a linear growth rate: instead of an improvement by a factor of ten, the improvement is only √10 ≈ 3.16. Instead of buying a faster computer, consider what happens if you replace an algorithm with quadratic running time by a new algorithm with n log n running time.

Lecture 1 End
Comparison of Functions
The asymptotic relations between functions f and g parallel comparisons between numbers a and b:
  f(n) = O(g(n))  ↔  a ≤ b
  f(n) = Ω(g(n))  ↔  a ≥ b
  f(n) = Θ(g(n))  ↔  a = b
  f(n) = o(g(n))  ↔  a < b
  f(n) = ω(g(n))  ↔  a > b
Limits
  lim_{n→∞} f(n)/g(n) = 0         ⇒ f(n) ∈ o(g(n))
  lim_{n→∞} f(n)/g(n) < ∞         ⇒ f(n) ∈ O(g(n))
  0 < lim_{n→∞} f(n)/g(n) < ∞     ⇒ f(n) ∈ Θ(g(n))
  0 < lim_{n→∞} f(n)/g(n)         ⇒ f(n) ∈ Ω(g(n))
  lim_{n→∞} f(n)/g(n) = ∞         ⇒ f(n) ∈ ω(g(n))
  lim_{n→∞} f(n)/g(n) undefined   ⇒ can't say
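A numeric sketch of the first rule, with the assumed pair f(n) = n·ln n and g(n) = n^2:

```python
import math

f = lambda n: n * math.log(n)
g = lambda n: n * n

# The ratio f/g tends to 0 as n grows, so f ∈ o(g).
r = [f(n) / g(n) for n in (10**2, 10**4, 10**6)]
assert r[0] > r[1] > r[2]
```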
Logarithms
x = log_b a is the exponent for which a = b^x.
Natural log: ln a = log_e a. Binary log: lg a = log_2 a.
lg^2 a = (lg a)^2; lg lg a = lg(lg a).

Useful identities:
  a = b^(log_b a)
  log_c(ab) = log_c a + log_c b
  log_b(a^n) = n log_b a
  log_b a = log_c a / log_c b
  log_b(1/a) = −log_b a
  log_b a = 1 / log_a b
  a^(log_b c) = c^(log_b a)
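These identities can be spot-checked over random bases and arguments (a Python sketch):

```python
import math
import random

# log(x, base) via the base-change rule, built from natural logs.
log = lambda x, base: math.log(x) / math.log(base)

random.seed(0)
for _ in range(100):
    a, b, c = (random.uniform(2, 10) for _ in range(3))
    assert math.isclose(b ** log(a, b), a)                     # a = b^(log_b a)
    assert math.isclose(log(a * c, b), log(a, b) + log(c, b))  # log of a product
    assert math.isclose(log(a, b), 1 / log(b, a))              # reciprocal rule
    assert math.isclose(a ** log(c, b), c ** log(a, b))        # a^(log_b c) = c^(log_b a)
```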
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 Algorithms Course Dr Aref Rashad 69
Lecture 1 End
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Comparison of Functions
f g a b
f (n) = O(g(n)) a bf (n) = (g(n)) a bf (n) = (g(n)) a = b
f (n) = o(g(n)) a lt b
f (n) = w (g(n)) a gt b
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Limits
lim [f (n) g (n)] = 0 THORN f (n) Icirc o (g (n)) n
lim [f (n) g (n)] lt THORN f (n) Icirc O(g (n)) n
0 lt lim [f (n) g (n)] lt THORN f (n) Icirc Q(g (n)) n
0 lt lim [f (n) g (n)] THORN f (n) Icirc (g (n)) n
lim [f (n) g (n)] = THORN f (n) Icirc w (g (n)) n
lim [f (n) g (n)] undefined THORN canrsquot say
n
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Logarithms
x = logba is the exponent for a = bx
Natural log ln a = logea
Binary log lg a = log2a
lg2a = (lg a)2
lg lg a = lg (lg a)ac
ab
bb
c
cb
bn
b
ccc
a
bb
b
ca
ba
aa
b
aa
ana
baab
ba
loglog
log
log
1log
log)1(log
log
loglog
loglog
loglog)(log
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Logarithms and exponentials
If the base of a logarithm is changed from one constant to another the value is altered by a constant factor Ex log10 n log210 = log2 n
Base of logarithm is not an issue in asymptotic notation
Exponentials with different bases differ by a exponential factor (not a constant factor) Ex 2n = (23)n3n
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
ExamplesExpress functions in A in asymptotic notation using functions in B
A B
5n2 + 100n 3n2 + 2
A (n2) n2 (B) A (B)
log3(n2) log2(n3)
logba = logca logcb A = 2lgn lg3 B = 3lgn AB =2(3lg3)
nlg4 3lg n
alog b = blog a B =3lg n=nlg 3 AB =nlg(43) as n lg2n n12
lim ( lga n nb ) = 0 (here a = 2 and b = 12) A o (B) n
A (B)
A (B)
A (B)
A o (B)
Math Basics Constant Series For integers a and b a b
Linear Series (Arithmetic Series) For n 0
Quadratic Series For n 0
b
ai
ab 11
2)1(
211
nnni
n
i
n
i
nnnni
1
2222
6)12)(1(
21
Cubic Series For n 0
Geometric Series For real x 1
For |x| lt 1
n
i
nnni
1
223333
4)1(
21
n
k
nnk
xx
xxxx0
12
11
1
0 11
k
k
xx
Math Basics
Linear-Geometric Series For n 0 real c 1
Harmonic Series nth harmonic number nIcircI+
n
i
nnni
ccnccn
ncccic1
2
12
)1()1(
2
nH n
131
21
1
n
k
Onk1
)1()ln(1
Math Basics
Telescoping Series
Differentiating Series For |x| lt 1
n
knkk aaaa
101
021k
k
x
xkx
Math Basics
Many problems whose obvious solution requires Θ(n2) time also has a solution that requires Θ(nlogn) Examplesndash Sortingndash Searching
February 2013 79Algorithms Course Dr Aref Rashad
Practical Considerations
Remarks
bull Comparative timing of programs is a difficult businessndash Experimental errors from uncontrolled factors
(system load language compiler etc)ndash Bias towards a programndash Unequal code tuning
February 2013 80Algorithms Course Dr Aref Rashad
February 2013 81Algorithms Course Dr Aref Rashad
February 2013 82Algorithms Course Dr Aref Rashad
February 2013 83Algorithms Course Dr Aref Rashad
February 2013 84Algorithms Course Dr Aref Rashad
February 2013 85Algorithms Course Dr Aref Rashad
February 2013 86Algorithms Course Dr Aref Rashad
February 2013 87Algorithms Course Dr Aref Rashad
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
Implications of Dominance
• Exponential algorithms get hopeless fast.
• Quadratic algorithms get hopeless at or before n = 1,000,000.
• O(n log n) is possible to about one billion.
• O(log n) never sweats.

n^a dominates n^b if a > b, since lim_{n→∞} n^b / n^a = n^{b−a} → 0.
n^a + o(n^a) doesn't dominate n^a, since lim_{n→∞} n^a / (n^a + o(n^a)) → 1.
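The two limits above are easy to see numerically; a small illustration with a = 3, b = 2 (values chosen by us):

```python
a, b = 3, 2
ns = (10, 100, 1000)

# n^b / n^a -> 0 when a > b, so n^a dominates n^b.
dominated = [n ** b / n ** a for n in ns]
assert dominated[0] > dominated[1] > dominated[2]

# n^a / (n^a + n^b) -> 1: adding a lower-order term does not change the order.
same_order = [n ** a / (n ** a + n ** b) for n in ns]
assert same_order[0] < same_order[1] < same_order[2] < 1
```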
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 88Algorithms Course Dr Aref Rashad
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
Faster Computer or Algorithm
What happens when we buy a computer 10 times faster
T(n) n nrsquo Change nrsquon
10n 1000
10000nrsquo = 10n 10
20n 500 5000 nrsquo = 10n 10
5n log n
250 1842 10 n lt nrsquo lt 10n
737
2n2 70 223 nrsquo = 10n 316
2n 13 16 nrsquo = n + 3 -----
February 2013 89Algorithms Course Dr Aref Rashad
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 90
Implications of Dominance Exponential algorithms get hopeless fast Quadratic algorithms get hopeless at or
before 1000000 O(n log n) is possible to about one billion
O(log n) never sweatsImplications of Dominance
na dominates nb if a gt b sincelim n1
nb=na = nb1048576a 0 na + o(na) doesnrsquot dominate na since
lim n1na=(na + o(na)) 1
Algorithms Course Dr Aref Rashad
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
February 2013 91Algorithms Course Dr Aref Rashad
- Algorithm Course Dr Aref Rashad
- Course Objectives
- Slide 3
- Algorithms and Programs
- Algorithm Properties
- Slide 6
- Algorithm Specification
- Slide 8
- Slide 9
- Translating a Problem into an Algorithm
- Slide 11
- Slide 12
- Slide 13
- Slide 14
- Comparing Algorithms Empirical Approach
- Analytical Approach
- Slide 17
- Slide 18
- Space Complexity
- Slide 20
- Slide 21
- Slide 22
- Slide 23
- Slide 24
- Slide 25
- Slide 26
- Slide 27
- Slide 28
- Slide 29
- Slide 30
- Slide 31
- Slide 32
- Slide 33
- Slide 34
- Slide 35
- Slide 36
- Complexity Analysis
- Calculating Running Time T(n)
- Slide 39
- Slide 40
- Slide 41
- Slide 42
- Slide 43
- Slide 44
- Slide 45
- Slide 46
- Slide 47
- Slide 48
- Slide 49
- Slide 50
- Slide 51
- Slide 52
- Slide 53
- Slide 54
- Slide 55
- Slide 56
- Slide 57
- Slide 58
- Slide 59
- Slide 60
- Slide 61
- Slide 62
- Slide 63
- Slide 64
- Practical Considerations
- Remarks
- Remarks (2)
- Slide 68
- Slide 69
- Slide 70
- Slide 71
- Slide 72
- Slide 73
- Slide 74
- Slide 75
- Slide 76
- Slide 77
- Slide 78
- Slide 79
- Remarks (3)
- Slide 81
- Slide 82
- Slide 83
- Slide 84
- Slide 85
- Slide 86
- Slide 87
- Slide 88
- Slide 89
- Slide 90
- Slide 91
-
top related