These include the celebrated AKS sorting network [AKS83], and the variety of communication and computation networks which appear in [WZ93] and its extensive list of references.

1.3.2. Construction of good error correcting codes. We now turn to Shannon's problem concerning communicating over a noisy channel and present a solution due to Sipser and Spielman [SS96]. We observe a simple but useful property of magical graphs. Let G be such a graph with n left vertices and 3n/4 right vertices. We show that for every nonempty S ⊂ L with s = |S| ≤ n/(10d) there exists a vertex u ∈ R with exactly one neighbor in S, namely, |Γ(u) ∩ S| = 1. To see this, consider e(S, Γ(S)), the number of edges between S and Γ(S). Clearly, e(S, Γ(S)) = d · |S| = ds. On the other hand, since |Γ(S)| ≥ 5ds/8, the average number of neighbors in S that a vertex of Γ(S) has is at most 8/5 < 2. But every vertex in Γ(S) has at least one neighbor in S, so there must be some (indeed, many) vertices in Γ(S) with exactly one neighbor in S.
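To make the counting argument concrete, the following small Python sketch (purely illustrative; the toy graph and the name unique_neighbors are ours, not the survey's) takes the left-to-right adjacency lists of a bipartite graph and lists the right vertices having exactly one neighbor in a given set S:

    from collections import Counter

    def unique_neighbors(adj, S):
        # adj maps each left vertex to the list of its d right neighbors;
        # S is a set of left vertices.
        counts = Counter(r for v in S for r in adj[v])
        return [r for r, c in counts.items() if c == 1]

    # Toy example (not a magical graph, just an illustration):
    # n = 4 left vertices, 3n/4 = 3 right vertices, left degree d = 2.
    adj = {0: [0, 1], 1: [1, 2], 2: [0, 2], 3: [0, 1]}
    print(unique_neighbors(adj, {0, 1}))
    # right vertices 0 and 2 each have exactly one neighbor in {0, 1}

In a magical graph, the averaging argument above guarantees that this list is nonempty for every nonempty S of size at most n/(10d).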
We use the magical graph G to construct a code C ⊂ {0,1}^n with rate at least 1/4 and distance at least 1/(10d). To this end, represent the magical graph G = (R, L, E) by a matrix A with row set R and column set L, where a_ij equals 1 or 0 depending on whether or not the i-th vertex in R is adjacent to the j-th vertex in L. The code is defined as the right kernel of A, viz. C = {x ∈ {0,1}^n | Ax = 0}. (Here calculations are done over the field with two elements.) Clearly C is a linear subspace of {0,1}^n of dimension ≥ n/4. Hence |C| ≥ 2^{n/4}, yielding the claimed lower bound on the code's rate.

To prove a lower bound on the distance, first observe that since C is a linear code (i.e. a linear subspace of {0,1}^n), the smallest distance between two of its codewords equals the smallest weight of a nonzero codeword. Let x ≠ 0 be an n-bit vector with support S = {j ∈ L : x_j = 1}. If |S| < n/(10d), then, as we saw, there is some i ∈ R with |Γ(i) ∩ S| = 1. It follows that the i-th coordinate of Ax is 1, and so x is not in the right kernel of A and cannot be a codeword. It follows that the normalized distance of C is at least 1/(10d).
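The parity-check view of the construction is easy to state in code. The sketch below (again with an illustrative toy graph and hypothetical helper names, not code from the survey) builds the matrix A from the adjacency lists and tests membership in its right kernel over GF(2); since A has only 3n/4 rows, the kernel automatically has dimension at least n − 3n/4 = n/4, which is the rate bound used above:

    import numpy as np

    def parity_check_matrix(adj, num_right):
        # a_ij = 1 iff the i-th right vertex is adjacent to the j-th left vertex.
        n = len(adj)
        A = np.zeros((num_right, n), dtype=np.uint8)
        for j, neighbors in adj.items():
            for i in neighbors:
                A[i, j] = 1
        return A

    def is_codeword(A, x):
        # x lies in C = {x : Ax = 0} iff every parity check vanishes over GF(2).
        return not np.any(A.dot(x) % 2)

    # Same toy graph as before: 4 left vertices, 3 right vertices, left degree 2.
    adj = {0: [0, 1], 1: [1, 2], 2: [0, 2], 3: [0, 1]}
    A = parity_check_matrix(adj, num_right=3)
    print(is_codeword(A, np.array([0, 0, 0, 0])))  # True: the zero word is always a codeword
    print(is_codeword(A, np.array([1, 0, 0, 0])))  # False: this small-support word violates a check

A word whose support is smaller than n/(10d) always violates the check at a unique-neighbor vertex, which is exactly the distance argument above.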
The above construction is a special case of a so-called LDPC (for Low Density Parity Check) code. This idea was first suggested by Gallager [Gal63] and has inspired (among many others) the works by Bassalygo, Pinsker, and Margulis [Pin73, BP73, Mar73], the first to explicitly define expander graphs and construct them. After being nearly dormant for about 20 years, LDPC codes regained prominence in the 90's and are now believed to give simultaneously the best coding parameters as well as the best algorithmic performance in various settings. For a survey of this fascinating field, see Richardson and Urbanke [RU].

Only fairly recently [CRVW02] did the art of explicit constructions of expanding graphs reach the level that makes the above simple argument feasible. It should also be mentioned that this construction not only yields codes with linear distance but also linear-time iterative decoding. We will review these "lossless expanders" in Section 10.

As in the previous application, the time complexity of constructing the magical graph dominates the time to construct the (parity check matrix of the) appropriate code. This is yet another reason to seek efficient algorithms to construct these graphs. The next application calls for an even more concise and efficient description of these graphs.