
The Chinese University of Hong Kong: Quantum Computing, course materials (lecture notes), Lecture 7: Quantum communication complexity 1



Quantum Computing (Fall 2013)
Instructor: Shengyu Zhang

Lecture 7. Quantum communication complexity: upper bounds

1 Classical communication complexity: a brief introduction

1.1 Models and definitions

In communication complexity, we consider the problem of computing a function whose input variables are distributed among two (or more) parties. Since each party holds only part of the input, the parties need to communicate in order to compute the function value. The communication complexity measures the minimum communication needed.

The most common communication models include
⚫ two-way: two parties, Alice and Bob, holding inputs x and y, respectively, talk to each other back and forth.
⚫ one-way: two parties, Alice and Bob, holding inputs x and y, respectively, but only Alice sends a message to Bob.
⚫ Simultaneous Message Passing (SMP): three parties, Alice, Bob and Referee. Alice and Bob hold inputs x and y, respectively, and each sends one message to Referee, who then outputs an answer.
⚫ multiparty: k parties, each holding some of the input variables, communicate in a certain way.

Some example functions are listed below (a small code sketch of these functions follows this subsection).
⚫ Equality (EQ_n): decide whether x = y, where x, y ∈ {0,1}^n.
⚫ Inner Product (IP_n): compute ⟨x, y⟩ = x_1 y_1 + ⋯ + x_n y_n mod 2, where x, y ∈ {0,1}^n.
⚫ Disjointness (DISJ_n): decide whether ∃i s.t. x_i = y_i = 1, where x, y ∈ {0,1}^n.

The computation modes include
⚫ deterministic: all messages are deterministically specified in the protocol. Namely, each message of any party depends only on her input and the previous messages that she saw.
⚫ randomized: the parties can use randomness to decide the messages they send.
⚫ quantum: the parties have quantum computers and they send quantum messages.
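To make the three example functions concrete, here is a minimal sketch of them as ordinary Python functions on bit strings. This is illustrative code, not part of the lecture notes; the function names, the list representation of inputs, and the output convention for DISJ_n (output 1 iff the sets intersect, following the phrasing above) are assumptions of the sketch.

```python
from typing import List

Bits = List[int]  # a string x in {0,1}^n represented as a list of 0/1 values

def EQ(x: Bits, y: Bits) -> int:
    """Equality: 1 iff x = y."""
    return int(x == y)

def IP(x: Bits, y: Bits) -> int:
    """Inner Product: <x, y> = x_1*y_1 + ... + x_n*y_n mod 2."""
    return sum(xi * yi for xi, yi in zip(x, y)) % 2

def DISJ(x: Bits, y: Bits) -> int:
    """Disjointness (as phrased above): 1 iff there is some i with x_i = y_i = 1."""
    return int(any(xi and yi for xi, yi in zip(x, y)))

if __name__ == "__main__":
    x, y = [1, 0, 1], [0, 1, 1]
    print(EQ(x, y), IP(x, y), DISJ(x, y))   # 0 1 1
```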


We say that a protocol computes the function if, on every possible input instance, the party who receives the last message knows the function value at the end of the protocol. (Sometimes in the two-way communication model we require that both parties know the function value. This doesn't change the complexity much, since for Boolean functions one party can always send the answer to the other with just one extra bit.) For randomized protocols, we sometimes allow a small error probability.

The cost of a protocol is measured by the number of communicated bits. The communication complexity of a function f is the minimum cost of any protocol that computes the function. We ignore the cost of local computation by assuming that all parties are computationally all-powerful. Although this abstracts away all computational factors and focuses only on communication, communication complexity turns out to be very useful for proving impossibility results in other areas of computational complexity.

1.2 An efficient deterministic protocol

Let's consider the basic case of two parties. Note that if x and y are both at most n bits long, then Alice can always send her entire input to Bob, who then computes f(x, y) since he has both inputs; finally he tells Alice the answer with 1 bit of communication. Thus the communication complexity of any 2n-bit function is at most n + 1.

What makes communication complexity interesting on its own is that some functions enjoy much more efficient protocols.

Example 1. Max. Alice and Bob hold x, y ∈ {0,1}^n, respectively. Interpret x as a subset of [n] = {1, 2, …, n}: the subset contains the positions i s.t. x_i = 1. Similarly for y. The task is to compute max(x, y), the maximum number in x ∪ y.

If Alice sends her entire input x, it takes n bits of communication. But she can do better: she merely needs to send the maximum number in her set x, which takes only log n bits. Bob then compares this value with the maximum number in his set y. Thus the protocol takes only log n bits of communication. (If we want both parties to know the answer, Bob can send it back to Alice, which takes an extra log n bits.) This is much more efficient than sending n bits.
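Here is a minimal simulation of the Max protocol from Example 1. This is a sketch rather than code from the notes; the helper structure, the bit counting, and the convention that an empty set is reported as 0 are assumptions made for illustration.

```python
import math

def max_protocol(x, y):
    """Simulate the one-round Max protocol on bit strings x, y (lists of 0/1).

    Returns (answer, bits_sent): Alice sends her maximum element, encoded in
    ceil(log2(n+1)) bits, and Bob compares it with his own maximum.
    """
    n = len(x)
    # Alice: largest position i (1-indexed) with x_i = 1; 0 if her set is empty.
    max_a = max((i + 1 for i in range(n) if x[i] == 1), default=0)
    bits_sent = max(1, math.ceil(math.log2(n + 1)))   # length of Alice's message

    # Bob: compare Alice's value with the maximum of his own set.
    max_b = max((i + 1 for i in range(n) if y[i] == 1), default=0)
    return max(max_a, max_b), bits_sent

if __name__ == "__main__":
    x = [0, 1, 0, 1, 0, 0, 0, 0]   # Alice's set {2, 4}
    y = [1, 0, 0, 0, 0, 1, 0, 0]   # Bob's set {1, 6}
    print(max_protocol(x, y))      # (6, 4): answer 6, using a 4-bit message
```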


This protocol seems trivial, but some others need more thought.

Example 2. Median. Alice and Bob again hold x, y ∈ {0,1}^n, respectively, interpreted as subsets of [n] as before. This time they wish to compute the median of the multi-set x ∪ y. Recall that the median of k numbers is defined as the ⌊k/2⌋-th smallest number.

An efficient protocol is as follows. At any point, the parties maintain an interval [l, u] which is guaranteed to contain the median. Initially l = 1 and u = n. They halve the interval in each round, so it takes only log n rounds to finish. How do they halve the interval? Very simply: Alice sends Bob the number of her elements that are at least m = ⌈(l + u)/2⌉, and the number of her elements that are less than m. This information is enough for Bob to determine whether or not the median is at least m. (Verify this!) Bob uses 1 bit to indicate the answer, and the protocol proceeds to the next round with the corresponding half of the interval ([m, u] or [l, m − 1]).

Since each round uses only O(log n) bits and there are only log n rounds, the total communication cost is O(log^2 n), much more efficient than the trivial protocol of cost n + 1. (A simulation sketch of this protocol appears after Theorem 2 below.)

1.3 Hardness for deterministic protocols, and savings from randomness

We've seen examples of efficient protocols. On the other hand, for some problems there isn't any protocol with communication less than n bits.

First, let's observe that a deterministic communication protocol partitions the communication matrix into monochromatic rectangles. Here, for a function f(x, y), the communication matrix M_f is defined as [f(x, y)]_{x,y}; namely, the rows are indexed by x, the columns are indexed by y, and the (x, y)-entry is f(x, y). A rectangle is a pair of subsets (S, T), where S is a set of rows and T is a set of columns. A rectangle (S, T) is monochromatic if M_f restricted to it is a submatrix containing only one value. (Namely, all entries in the submatrix have the same value.) For any value b in range(f), a b-rectangle is a monochromatic rectangle in which all entries of M_f have value b. By considering how the protocol proceeds round by round, we can observe the following fact.

Fact 1. A c-bit deterministic communication protocol for f partitions the communication matrix M_f into at most 2^c monochromatic rectangles.

Example 3. Equality.

The above fact enables us to show that there is no efficient deterministic protocol for the Equality function EQ_n. Indeed, the communication matrix of EQ_n is I_N, the identity matrix of size N = 2^n. Notice that for this matrix, every 1-rectangle has size 1×1. So no matter how one partitions the matrix into monochromatic rectangles, there are always at least N 1-rectangles. Since there must also be at least one 0-rectangle, we have 2^c > N = 2^n, so c > n, i.e., c ≥ n + 1. We have thus proved the following theorem.

Theorem 2. Any deterministic protocol computing EQ_n needs at least n + 1 bits of communication.
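As promised, here is a minimal simulation of the median protocol from Example 2. This is a sketch, not code from the notes; the exact bit accounting and the handling of small or empty sets are illustrative assumptions.

```python
import math

def median_protocol(x, y):
    """Simulate Example 2's protocol on bit strings x, y of length n.

    Returns (median, total_bits): the median is the floor(k/2)-th smallest
    element of the multiset x U y (k = |x| + |y|); total_bits counts two counts
    from Alice plus one answer bit from Bob per round.
    """
    n = len(x)
    A = [i + 1 for i in range(n) if x[i] == 1]   # Alice's set
    B = [i + 1 for i in range(n) if y[i] == 1]   # Bob's set
    k = len(A) + len(B)
    rank = max(1, k // 2)                        # position of the median
    lo, hi, bits = 1, n, 0
    count_bits = math.ceil(math.log2(n + 1))     # bits to send one count in [0, n]

    while lo < hi:
        m = (lo + hi + 1) // 2                   # m = ceil((lo + hi) / 2)
        # Alice -> Bob: how many of her elements are >= m, and how many are < m.
        a_ge, a_lt = sum(e >= m for e in A), sum(e < m for e in A)
        bits += 2 * count_bits
        # Bob: the median is >= m iff fewer than `rank` elements are < m.
        total_lt = a_lt + sum(e < m for e in B)
        bits += 1                                # Bob's 1-bit answer
        if total_lt < rank:
            lo = m                               # median lies in [m, hi]
        else:
            hi = m - 1                           # median lies in [lo, m - 1]
    return lo, bits

if __name__ == "__main__":
    x = [0, 1, 0, 1, 0, 0, 1, 0]   # Alice's set {2, 4, 7}
    y = [1, 0, 0, 0, 1, 1, 0, 0]   # Bob's set {1, 5, 6}
    print(median_protocol(x, y))   # median of {1,2,4,5,6,7} is its 3rd smallest: 4
```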


Despite the above negative result, we'll next see that the Equality function does enjoy a very efficient (and one-way!) protocol if we are allowed to use randomness. Let's assume that Alice and Bob have shared randomness; we call such protocols public-coin.

Theorem 3. There is a one-way public-coin randomized protocol computing EQ_n with only O(1) bits of communication. (On any input (x, y), the protocol outputs the correct f(x, y) with probability 99%.)

Let's first design a protocol with one-sided error probability 1/2, and boost the success probability later. The protocol is as follows.
⚫ Alice:
▪ Compute ⟨x, r⟩ = x_1 r_1 + ⋯ + x_n r_n mod 2, where r is a public random string of length n.
▪ Send the 1-bit result to Bob.
⚫ Bob:
▪ Compute ⟨y, r⟩ = y_1 r_1 + ⋯ + y_n r_n mod 2, where r is the same random string that Alice used.
▪ Compare ⟨x, r⟩ and ⟨y, r⟩. Output "x = y" if ⟨x, r⟩ = ⟨y, r⟩, and "x ≠ y" if ⟨x, r⟩ ≠ ⟨y, r⟩.

Note that the communication cost of the protocol is only 1 bit. Let's see what the protocol does. To compare x and y using little communication, Alice gives a very short summary of her input x and sends only this summary to Bob, who computes the same summary of his input y. The summary is only 1 bit, so it contains very little information about the input string. So ... could this possibly work? Next we'll see that, though small, the summary contains quite some information about the identity of the string. For this reason, the summary is usually called a fingerprint.

Let's analyze it case by case. If x = y, then of course ⟨x, r⟩ = ⟨y, r⟩, and the protocol is correct with certainty. If x ≠ y, we claim that ⟨x, r⟩ ≠ ⟨y, r⟩ with probability exactly 1/2! (Namely, if x ≠ y, then with probability one half the 1-bit summary catches the difference.) Indeed, when x ≠ y, they differ in at least one position; say it is position i. Then

⟨x, r⟩ − ⟨y, r⟩ = ⟨x − y, r⟩ = (x_1 − y_1) r_1 + ⋯ + (x_n − y_n) r_n mod 2
              = (x_i − y_i) r_i + ∑_{j≠i} (x_j − y_j) r_j mod 2
              = r_i + ∑_{j≠i} (x_j − y_j) r_j mod 2,

where the last equality holds because x_i − y_i = ±1 ≡ 1 (mod 2). Now notice that r_i takes each of the values 0 and 1 with probability 1/2, so regardless of the value of the second term, the sum equals 1 with probability exactly 1/2. Thus with probability 1/2 the protocol detects that x ≠ y.

To make the error probability smaller, one can simply repeat the above protocol k times (with independent random strings r), dropping the error probability to 1/2^k.
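Here is a minimal sketch of the public-coin fingerprint protocol with k independent repetitions; this is illustrative code, not from the notes, and modeling the shared public coins by a common seeded generator is an assumption of the sketch.

```python
import random

def public_coin_eq(x, y, k=10, seed=2024):
    """One-way public-coin protocol for EQ_n, repeated k times.

    x, y are bit lists of equal length n. The shared public randomness is
    modeled by a common seed. Alice sends k bits in total; the answer
    "x = y" can be wrong only when x != y, with probability 2^(-k).
    """
    shared = random.Random(seed)                 # public coins seen by both parties
    n = len(x)
    for _ in range(k):
        r = [shared.randrange(2) for _ in range(n)]
        a = sum(xi * ri for xi, ri in zip(x, r)) % 2   # Alice's 1-bit fingerprint <x, r>
        b = sum(yi * ri for yi, ri in zip(y, r)) % 2   # Bob's fingerprint <y, r>, same r
        if a != b:
            return "x != y"                      # a difference was caught: always correct
    return "x = y"                               # correct w.p. >= 1 - 2^(-k) when x != y

if __name__ == "__main__":
    x = [1, 0, 1, 1, 0, 0, 1, 0]
    print(public_coin_eq(x, x))                  # x = y
    y = list(x); y[3] ^= 1
    print(public_coin_eq(x, y))                  # x != y (except with probability 2^(-10))
```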


In the above protocol we assumed that Alice and Bob have shared randomness. A theorem of Newman says that in the two-way and one-way models, allowing public-coin randomness changes the complexity by at most an additive O(log n) term, compared to private-coin randomness, where each party has her own private random bits.

In the SMP model, however, things are different. The above procedure still works if Alice and Bob share public randomness r: they just send ⟨x, r⟩ and ⟨y, r⟩, respectively, to Referee, who then compares the two fingerprints. What if they do not share public randomness? It turns out that the communication complexity of EQ_n in the private-coin SMP model is Θ(√n). Both the upper and the lower bound are nontrivial, and we'll skip the proofs here. (It's in a reading project.)

2 Quantum communication complexity

2.1 Models

We define the two-party, two-way model; the other models are similar. A protocol proceeds as follows. Alice performs a unitary operation U_1 on her register A and the channel register C, and then sends the channel register C to Bob. Bob then performs a unitary operation V_1 on his register B and the channel register C, and sends the channel register C back to Alice. The protocol continues in this way. The communication cost is the number of qubits of C times the number of times C is sent (in either direction). The ε-error quantum communication complexity is the minimum communication cost of any quantum protocol that computes the function with error probability at most ε.

We will use Q_ε(f) to denote the ε-error quantum communication complexity of f. We usually omit the subscript ε when ε = 1/3.


2.2 Protocol for Disjointness

A classic result for DISJ_n is that its randomized communication complexity (bounded error, two-way communication) is Ω(n). In contrast, quantum protocols can do much better.

Theorem 4. Q(DISJ_n) = O(√n log n).

Proof. The protocol is actually pretty simple: just run Grover's search to look for a position i ∈ [n] s.t. x_i = y_i = 1. Grover's search is a query algorithm which uses a phase oracle

O_z: ∑_i α_i |i⟩ → ∑_i α_i (−1)^{z_i} |i⟩.

Note that DISJ_n is nothing but OR(z) where z = x ∧ y (the bitwise AND). So it is enough to simulate the oracle O_{x∧y} in order to invoke Grover's search. Can we simulate O_{x∧y} with a small amount of communication? Yes. Alice can perform

∑_i α_i |i⟩|0⟩ → ∑_i α_i |i⟩|x_i⟩

and send these registers (only ⌈log n⌉ + 1 qubits) to Bob. Since Bob knows y, he can apply the diagonal, and hence unitary, operation |i⟩|b⟩ → (−1)^{b∧y_i} |i⟩|b⟩, which turns the state into

∑_i α_i (−1)^{x_i∧y_i} |i⟩|x_i⟩.

He then sends the registers back to Alice, who erases the last qubit by repeating her first step:

∑_i α_i (−1)^{x_i∧y_i} |i⟩|x_i⟩ → ∑_i α_i (−1)^{x_i∧y_i} |i⟩|x_i ⊕ x_i⟩ = ∑_i α_i (−1)^{x_i∧y_i} |i⟩|0⟩.

Thus each oracle call costs O(log n) qubits of communication. Grover's search finds a position i with x_i ∧ y_i = 1 (if one exists) with bounded error using O(√n) oracle calls, so the total communication is O(√n log n).
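To check that the three message-passing steps above really simulate the phase oracle O_{x∧y}, here is a small numpy verification sketch. This is not from the notes; the register layout (index register tensor one work qubit) and the random inputs are illustrative.

```python
import numpy as np

# Verify that Alice's XOR step, Bob's phase step, and Alice's uncompute step
# together act on the index register exactly like the phase oracle O_{x AND y}.

n = 8                                        # number of input positions (illustrative)
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=n)               # Alice's input bits
y = rng.integers(0, 2, size=n)               # Bob's input bits

# An arbitrary query state sum_i alpha_i |i>|0>; state[i, b] is the amplitude of |i>|b>.
alpha = rng.normal(size=n) + 1j * rng.normal(size=n)
alpha /= np.linalg.norm(alpha)
state = np.kron(alpha, np.array([1.0, 0.0])).reshape(n, 2)

def alice_xor_x(s):
    """Alice: |i>|b> -> |i>|b XOR x_i> (swap the work-qubit amplitudes where x_i = 1)."""
    s = s.copy()
    s[x == 1] = s[x == 1][:, ::-1]
    return s

def bob_phase(s):
    """Bob: diagonal phase |i>|b> -> (-1)^(b AND y_i) |i>|b>."""
    s = s.copy()
    s[y == 1, 1] *= -1
    return s

state = alice_xor_x(state)    # Alice computes |x_i> and sends the registers to Bob
state = bob_phase(state)      # Bob applies the phase and sends the registers back
state = alice_xor_x(state)    # Alice uncomputes the work qubit back to |0>

ideal = alpha * np.where(x & y, -1, 1)       # the oracle O_{x AND y} on the index register
assert np.allclose(state[:, 1], 0)           # the work qubit is back to |0>
assert np.allclose(state[:, 0], ideal)       # the phases (-1)^{x_i AND y_i} were applied
print("distributed simulation matches the ideal phase oracle")
```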


3 SMP communication and quantum fingerprint

Recall that the communication complexity of EQ_n in the private-coin SMP model is Θ(√n). We'll show that with quantum communication, the complexity drops to O(log n).

First, let's review the notion of an error correcting code. Suppose that a sender A wishes to send an n-bit string x ∈ {0,1}^n to a receiver B through a noisy channel, and that the transmitted message is E(x). The channel flips each bit of E(x) independently with probability at most some small constant ε, so to cope with the noise A needs to add some redundancy. What is the minimum length m of the message E(x) such that B can recover the correct x with probability 99%? The answer is m = O(n). (Check: how does the constant depend on ε and on the required recovery probability?) Fix such an error correcting code E. The property we will use is that codewords of distinct strings differ in a constant fraction of their m positions.

We'll need another tool, called the Swap-Test, which measures how close two unknown states are. Suppose that we are given two states, |ψ⟩ in register A and |φ⟩ in register B, and we want to estimate ⟨ψ|φ⟩. Here is the test.

Swap-Test:
⚫ Attach an extra qubit |+⟩ = (|0⟩ + |1⟩)/√2 in register C.
⚫ Conditioned on |1⟩ in register C, swap the contents of A and B. Here swap is the operation |a⟩|b⟩ → |b⟩|a⟩.
⚫ Measure register C in the {|+⟩, |−⟩} basis; the test is considered to pass if the outcome is |+⟩.

To get some intuition for the test, let's consider a special case first. If |ψ⟩ = |φ⟩, then the second step has no effect, and thus the measurement in the third step passes with probability 1. In general, note that after Step 2 the state becomes

(1/√2)(|0⟩|ψ⟩|φ⟩ + |1⟩|φ⟩|ψ⟩)
  = (1/√2)( ((|+⟩ + |−⟩)/√2) |ψ⟩|φ⟩ + ((|+⟩ − |−⟩)/√2) |φ⟩|ψ⟩ )
  = |+⟩ (|ψ⟩|φ⟩ + |φ⟩|ψ⟩)/2 + |−⟩ (|ψ⟩|φ⟩ − |φ⟩|ψ⟩)/2.

So the measurement in Step 3 gives |−⟩ with probability

‖ (|ψ⟩|φ⟩ − |φ⟩|ψ⟩)/2 ‖^2 = (1/4)(⟨ψ|⟨φ| − ⟨φ|⟨ψ|)(|ψ⟩|φ⟩ − |φ⟩|ψ⟩) = (1/2)(1 − |⟨ψ|φ⟩|^2).

Now we give a quantum protocol for EQ_n in the private-coin SMP model. Actually, Alice and Bob do not even use any randomness.
⚫ Alice: Send |ψ_x⟩ = (1/√m) ∑_{i=1}^{m} |i⟩|E(x)_i⟩.
⚫ Bob: Send |ψ_y⟩ = (1/√m) ∑_{i=1}^{m} |i⟩|E(y)_i⟩.
⚫ Referee: Apply the Swap-Test to {|ψ_x⟩, |ψ_y⟩}. Output "x = y" if the test passes and "x ≠ y" if it does not.

The analysis of the protocol is easy. If x = y, then |ψ_x⟩ = |ψ_y⟩, the Swap-Test passes with probability 1, and Referee outputs "x = y". If x ≠ y, then not only is |ψ_x⟩ ≠ |ψ_y⟩, but the two states are far apart:

|⟨ψ_x|ψ_y⟩| = (1/m) |{i : E(x)_i = E(y)_i}| ≤ 1 − Ω(1),

as guaranteed by the error correcting code. Thus the Swap-Test fails with probability Ω(1). (As always, one can repeat the protocol to reduce the error probability exponentially.)
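Here is a small numpy sketch of the quantum fingerprint protocol. It is not from the notes: the code E is a stand-in random binary linear code with m = 3n (which has constant relative distance with high probability, not a code specified in the lecture), and the Swap-Test pass probability is computed directly from the |+⟩ branch of the post-swap state rather than by sampling measurements.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 8, 24
G = rng.integers(0, 2, size=(m, n))          # generator matrix of the stand-in code

def E(x):
    """Encode an n-bit string into an m-bit codeword of the stand-in linear code."""
    return (G @ x) % 2

def fingerprint(x):
    """|psi_x> = (1/sqrt(m)) sum_i |i>|E(x)_i>, as a real vector of length 2m."""
    c = E(x)
    psi = np.zeros(2 * m)
    psi[2 * np.arange(m) + c] = 1 / np.sqrt(m)
    return psi

def swap_test_pass_prob(psi, phi):
    """Probability that the Swap-Test on |psi>, |phi> measures |+> on register C."""
    joint = np.kron(psi, phi)                # |psi>_A |phi>_B
    swapped = np.kron(phi, psi)              # the swapped branch |phi>_A |psi>_B
    # After the controlled swap, the state is (|0>|psi,phi> + |1>|phi,psi>)/sqrt(2);
    # projecting C onto |+> leaves (|psi,phi> + |phi,psi>)/2, whose squared norm
    # is the pass probability (1 + <psi|phi>^2) / 2 for real states.
    plus_branch = (joint + swapped) / 2
    return float(np.sum(plus_branch ** 2))

x = rng.integers(0, 2, size=n)
y = x.copy()
y[0] ^= 1                                    # y differs from x in one bit

print(swap_test_pass_prob(fingerprint(x), fingerprint(x)))   # 1.0 when x == y
print(swap_test_pass_prob(fingerprint(x), fingerprint(y)))   # < 1 when the codewords differ
```

With a code of constant relative distance, the second probability is bounded away from 1 by a constant, which is exactly what the analysis above needs.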


The communication cost is O(log m) = O(log n) qubits. We have thus proved the following theorem.

Theorem 5. The quantum communication complexity of EQ_n in the private-coin SMP model is O(log n).

Note

Communication complexity is a very interesting subject in its own right and has numerous applications in other areas of algorithms and complexity theory. For a general introduction, see [KN97].

Reference

[KN97] Eyal Kushilevitz and Noam Nisan. Communication Complexity. Cambridge University Press, 1997.

Exercise

1.

2.
