ON-LINE LIST COLOURING OF RANDOM GRAPHS

Let $G \in \mathcal{G}(n,p)$. Then, a.a.s.,
\[
\chi_P(G) \sim \frac{n}{2\log_b(np)} \sim \chi(G),
\]
where $b = 1/(1-p)$. Note that if $p = o(1)$, then $\log(1/(1-p)) \sim p$, and hence
\[
\frac{n}{2\log_b(np)} = \frac{n\log(1/(1-p))}{2\log(np)} \sim \frac{np}{2\log(np)} = \Theta\!\left(\frac{np}{\log(np)}\right).
\]
For constant $p$ it is not true that $\log(1/(1-p)) \sim p$, but the order is preserved, provided $p \le 1-\varepsilon$ for some $\varepsilon > 0$.

For sparser graphs we are less successful in determining the asymptotic behaviour of $\chi_P(\mathcal{G}(n,p))$. Nevertheless, we can prove the following two theorems, which determine the order of the graph parameter we study.

Theorem 2. Let $\varepsilon > 0$ be any constant, and suppose that
\[
\frac{(\log n)^{2+\varepsilon}}{n} \le p = O\!\left((\log\log n)^{1/3}(\log n)^{2} n^{-1/3}\right).
\]
Let $G \in \mathcal{G}(n,p)$.
Then, a.a.s.,
\[
\chi_P(G) = \Theta\!\left(\frac{np}{\log(np)}\right) = \Theta(\chi(G)).
\]
Moreover, if $np = (\log n)^{C+o(1)}$, then a.a.s.
\[
\chi(G) \le \chi_P(G) \le (1+o(1)) \cdot
\begin{cases}
2\chi(G) & \text{if } C \to \infty, \\
\frac{2C}{C-2}\,\chi(G) & \text{if } C \in [4,\infty), \\
4\chi(G) & \text{if } C \in (2,4).
\end{cases}
\]

Finally, for very sparse graphs we have the following.

Theorem 3. Let $G \in \mathcal{G}(n,p)$ with $p = O(1/n)$. Then, a.a.s., $\chi_P(G) = \Theta(1) = \Theta(\chi(G))$.

2. Preliminaries

Most of the time, we will use the following version of Chernoff's bound. Suppose that $X \in \mathrm{Bin}(n,p)$ is a binomial random variable with expectation $\mu = np$. If $0 < \delta < 1$, then
\[
\Pr[X < (1-\delta)\mu] \le \exp\!\left(-\frac{\delta^2\mu}{2}\right),
\]
and if $\delta > 0$,
\[
\Pr[X > (1+\delta)\mu] \le \exp\!\left(-\frac{\delta^2\mu}{2+\delta}\right).
\]
However, at some point we will need the following, stronger, version: for any $t \ge 0$, we have
\[
(3) \qquad \Pr[X \ge \mu + t] \le \exp\!\left(-\mu\,\varphi\!\left(\frac{t}{\mu}\right)\right),
\]
where $\varphi(x) = (1+x)\log(1+x) - x$. These inequalities are well known and can be found, for example, in [6].
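As an illustration (not part of the paper), the three bounds above can be checked numerically against exact binomial tail probabilities. The Python sketch below uses illustrative parameters $n = 1000$, $p = 0.1$ (so $\mu = 100$) and $\delta = 0.3$, chosen only for this example.

```python
import math

def binom_pmf(n, p, i):
    """Exact Pr[X = i] for X ~ Bin(n, p)."""
    return math.comb(n, i) * p**i * (1 - p)**(n - i)

def tail_le(n, p, k):
    """Exact Pr[X <= k]."""
    return sum(binom_pmf(n, p, i) for i in range(0, k + 1))

def tail_ge(n, p, k):
    """Exact Pr[X >= k]."""
    return sum(binom_pmf(n, p, i) for i in range(k, n + 1))

def chernoff_lower(mu, delta):
    """Bound on Pr[X < (1 - delta) mu], valid for 0 < delta < 1."""
    return math.exp(-delta**2 * mu / 2)

def chernoff_upper(mu, delta):
    """Bound on Pr[X > (1 + delta) mu], valid for delta > 0."""
    return math.exp(-delta**2 * mu / (2 + delta))

def chernoff_strong(mu, t):
    """Bound (3): Pr[X >= mu + t] <= exp(-mu * phi(t/mu))."""
    x = t / mu
    return math.exp(-mu * ((1 + x) * math.log(1 + x) - x))

# Illustrative parameters, not taken from the paper.
n, p = 1000, 0.1
mu = n * p          # 100
delta = 0.3
t = delta * mu      # 30

# Pr[X < 70] = Pr[X <= 69]; Pr[X > 130] = Pr[X >= 131]; Pr[X >= 130].
lower_exact = tail_le(n, p, math.ceil((1 - delta) * mu) - 1)
upper_exact = tail_ge(n, p, math.floor((1 + delta) * mu) + 1)
strong_exact = tail_ge(n, p, math.ceil(mu + t))

print(lower_exact, "<=", chernoff_lower(mu, delta))
print(upper_exact, "<=", chernoff_upper(mu, delta))
print(strong_exact, "<=", chernoff_strong(mu, t))
```

At this parameter choice the $\varphi$-based bound (3) is the sharper of the two upper-tail estimates, consistent with the inequality $\varphi(x) \ge x^2/(2+x)$ for $x \ge 0$.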