CSCI 3160 Design and Analysis of Algorithms
Tutorial 1
Chengyu Lin
About me
• Name: Chengyu Lin
• Email: cylin@cse.cuhk.edu.hk
• Office: SHB 117
• Office hour: Friday 14:00 – 16:00
• You can always send me emails to make appointments
Asymptotic Notations
• O(g(n))
  o Big O
  o Asymptotic upper bound
• Ω(g(n))
  o Big Omega
  o Asymptotic lower bound
• Θ(g(n))
  o Big Theta
  o Asymptotic tight bound
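As a quick illustration (an example of mine, not from the slides): for t(n) = 3n² + 5n, all three notations hold with g(n) = n².

\[
3n^2 + 5n \le 8n^2 \ \text{for } n \ge 1 \;\Rightarrow\; 3n^2 + 5n = O(n^2),
\qquad
3n^2 + 5n \ge 3n^2 \ \text{for } n \ge 1 \;\Rightarrow\; 3n^2 + 5n = \Omega(n^2),
\]

and since both the upper and lower bound hold, 3n² + 5n = Θ(n²).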
Asymptotic Notations
• The time (space) complexity of an algorithm usually depends on the input size, where the input could be
  o vertices/edges of a graph
  o a sequence of integers
• Usually the running time of an algorithm grows with the input size
• Usually the running time is written as a function t(n), where n is the input size
Asymptotic Notations
• Suppose we want to analyze the running time of a program, e.g.

    for (int i = 1; i <= n; ++i) {
      for (int j = i; j <= n; ++j) {
        for (int r = 1; r <= n; ++r) {
          for (int s = r; s <= n; ++s) {
            puts("hello");
          }
        }
      }
    }

• This takes t(n) = n(n+1)/2 · n(n+1)/2 = n²(n+1)²/4 steps.
• Calculating the exact running time is tedious.
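A small sanity check of the closed form (my own sketch, not part of the tutorial): replace puts("hello") with a counter and compare the count against n²(n+1)²/4 for a few values of n.

    #include <stdio.h>

    /* Sketch: count the iterations of the four nested loops above and
     * compare against the closed form n^2 (n+1)^2 / 4. */
    int main(void) {
        for (long long n = 1; n <= 6; ++n) {
            long long steps = 0;
            for (long long i = 1; i <= n; ++i)
                for (long long j = i; j <= n; ++j)
                    for (long long r = 1; r <= n; ++r)
                        for (long long s = r; s <= n; ++s)
                            ++steps;                    /* one "hello" */
            long long formula = n * n * (n + 1) * (n + 1) / 4;
            printf("n = %lld: counted %lld, formula %lld\n", n, steps, formula);
        }
        return 0;
    }

The two numbers agree: the (i, j) loops contribute n(n+1)/2 iterations, the (r, s) loops another n(n+1)/2 each time.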
Asymptotic Notations
• Asymptotic notation simplifies the calculations

    for (int i = 1; i <= n; ++i) {
      for (int j = i; j <= n; ++j) {
        for (int r = 1; r <= n; ++r) {
          for (int s = r; s <= n; ++s) {
            puts("hello");
          }
        }
      }
    }

• This takes t(n) = O(n²) · O(n²) = O(n⁴) steps
• and yet it still captures the behavior of t(n) very well when n is "large" enough
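One way to see the O(n²) · O(n²) step (a sketch in my words): count the pairs visited by each half of the nest and bound each count by n².

\[
\#\{(i,j) : 1 \le i \le j \le n\} = \sum_{i=1}^{n} (n - i + 1) = \frac{n(n+1)}{2} \le n^2,
\]

and the same bound applies to the pairs (r, s), so the total number of iterations is at most n² · n² = n⁴, i.e. t(n) = O(n⁴).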
Big O notation
• We say that t(n) = O(g(n)) if there exists a constant c > 0 such that
  t(n) ≤ c · g(n) for every sufficiently large n
  [Figure: the curves c·g(n) and f(n), with f(n) lying below c·g(n) once n is large, illustrating f(n) = O(g(n))]
• Intuitively, if t(n) is the running time, it tells us
  – the running time cannot be worse than g(n) by a constant multiplicative factor
• Usually, g(n) looks simpler than t(n) (e.g. g(n) = n², n!, log n, …)
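Unpacking "for every sufficiently large n" (my restatement of the same definition):

\[
t(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 \ \text{such that}\ t(n) \le c \cdot g(n) \ \text{for all } n \ge n_0.
\]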
Big O notation
• We say that t(n) = O(g(n)) if there exists a constant c > 0 such that
  t(n) ≤ c · g(n) for every sufficiently large n
1. The inequality only needs to hold for large n
  – e.g. n² = O(2ⁿ)
  – n² ≤ 2ⁿ is not true when n = 3
  – It holds for every n ≥ 4, so it's okay
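A quick check of where n² ≤ 2ⁿ starts to hold (my own sketch, not from the slides):

    #include <stdio.h>

    /* Sketch: compare n^2 with 2^n for small n. The inequality fails
     * at n = 3 but holds for every n >= 4 (and also for n = 1, 2),
     * matching "sufficiently large n" in the definition. */
    int main(void) {
        for (int n = 1; n <= 10; ++n) {
            long long lhs = (long long)n * n;   /* n^2 */
            long long rhs = 1LL << n;           /* 2^n */
            printf("n = %2d: n^2 = %4lld, 2^n = %5lld, n^2 <= 2^n? %s\n",
                   n, lhs, rhs, lhs <= rhs ? "yes" : "no");
        }
        return 0;
    }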
Big O notation
• We say that t(n) = O(g(n)) if there exists a constant c > 0 such that
  t(n) ≤ c · g(n) for every sufficiently large n
2. We can multiply g(n) by a positive constant
  – e.g. 4n² = O(n²)
  – 4n² ≤ n² is not true for any n
  – But 4n² ≤ 4 · n² holds, so it's okay
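Checked against the definition (my restatement): take c = 4, and the inequality holds from n = 1 on:

\[
4n^2 \le 4 \cdot n^2 \quad \text{for all } n \ge 1, \qquad \text{hence } 4n^2 = O(n^2).
\]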
Big O notation
• We say that t(n) = O(g(n)) if there exists a constant c > 0 such that
  t(n) ≤ c · g(n) for every sufficiently large n
3. g(n) can be any upper bound
  – e.g. n² = O(n²)
  – It is also true to say n² = O(n¹⁰⁰) or n² = O(2ⁿ)