Sums, Approximations, and Asymptotics II

Proof. We must show that there exist constants x₀ and c > 0 such that |5x + 100| ≤ c · x for all x > x₀. Let c = 10 and x₀ = 20 and note that:

|5x + 100| ≤ 5x + 5x = 10x    for all x > 20

Here's another claim that points out a very common error involving O notation.

Claim 3. x = O(x²)

Proof. We must show that there exist constants x₀ and c > 0 such that |x| ≤ c · x² for all x > x₀. Let c = 1 and x₀ = 1 and note that:

|x| ≤ 1 · x²    for all x > 1

Many people fail to realize that a statement of the form f(x) = O(g(x)) only places an upper bound on the growth of the function f. So, for example, you'll often hear people say things like, "I can't use an algorithm that runs in O(x²) steps because that's too slow." People who say things like that are dumb and you should make fun of them until they cry. Okay, maybe not. But they are incorrect; we just proved that a fast algorithm that requires only x steps also runs in time O(x²)! One properly expresses a lower bound using the Ω notation, which we'll come to presently.

What about the reverse of the previous claim? Is x² = O(x)? On an informal basis, this means x² grows no faster than x, which is false. Let's prove this formally.

Claim 4. x² ≠ O(x)

Proof. We argue by contradiction; suppose that there exist constants x₀ and c such that:

|x²| ≤ c · x    for all x > x₀

Dividing both sides of the inequality by x gives:

x ≤ c    for all x > x₀

But this is false when x = 1 + max(x₀, c).

We can show that x² ≠ O(100x) by essentially the same argument; intuitively, x² grows quadratically, while 100x grows only linearly. Generally, changes in multiplicative constants do not affect the validity of an assertion involving O. However, constants in exponentials can be critical:

Claim 5. 4ˣ ≠ O(2ˣ)
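The witness constants in these proofs can be sanity-checked numerically. The sketch below (the helper name `check_upper_bound` and the sampled ranges are our own, not from the notes) verifies |f(x)| ≤ c · g(x) over sampled x > x₀ for the first two claims, and replays the contradiction from Claim 4: for any proposed c and x₀, the point x = 1 + max(x₀, c) violates x² ≤ c · x. Sampling is evidence, not a proof; only the arguments above establish the claims for all x.

```python
# Sanity check (not a proof) of the witness constants used in the proofs above.
# check_upper_bound is an illustrative helper, not part of the notes.

def check_upper_bound(f, g, c, x0, xs):
    """Return True if |f(x)| <= c * g(x) for every sampled x > x0."""
    return all(abs(f(x)) <= c * g(x) for x in xs if x > x0)

xs = range(1, 10_000)

# 5x + 100 = O(x), with witnesses c = 10, x0 = 20.
assert check_upper_bound(lambda x: 5 * x + 100, lambda x: x, c=10, x0=20, xs=xs)

# Claim 3: x = O(x^2), with witnesses c = 1, x0 = 1.
assert check_upper_bound(lambda x: x, lambda x: x * x, c=1, x0=1, xs=xs)

# Claim 4: x^2 != O(x).  Whatever c and x0 are proposed, the point
# x = 1 + max(x0, c) breaks the bound x^2 <= c * x, exactly as in the proof.
for c, x0 in [(1, 1), (100, 50), (10**6, 10**3)]:
    x = 1 + max(x0, c)
    assert x * x > c * x

print("all checks passed")
```

Note that the last loop only confirms the refuting point for a few sample (c, x₀) pairs; the proof by contradiction is what rules out every pair.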