University of Electronic Science and Technology of China: Bayesian Learning and Random Matrices with Applications in Wireless Communications (BI-RM-AWC), course material (lecture slides) 09: Sparse Signal Recovery

Document format: PDF, 91 pages.
3.1 Sparsity: Applications and Development
3.2 Sparsity Rendering Algorithms
3.3 EM
3.4 Variational Bayes
3.5 Sparse Signal Recovery: Performance Comparison
3.6 Other Applications for Bayes Methods
Chapter 3 Sparse Signal Recovery

3.1 Sparsity: Applications and Development

What is sparse?
1. Many data mining tasks can be represented using a vector or a matrix.
2. Sparsity implies that the vector or matrix contains many zeros.

3.1 Sparsity: Applications and Development

As seen in the last chapter, in linear regression we are actually solving this problem:

    y = Φw + n,    y: n×1,  Φ: n×p,  w: p×1,  with p >> n,

where n is noise and each row of Φ holds the basis-function values, i.e. y(x, w) = wᵀφ(x) + n. We have learned that if p > n there will be serious over-fitting. To suppress over-fitting, we can add a regularizer. Adding a sparse regularizer (LASSO) renders the target vector sparse, selecting a small number of basis functions.
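As a concrete illustration, here is a minimal NumPy sketch of LASSO-style sparse regression solved by iterative soft-thresholding (ISTA). The problem sizes, noise level, regularization weight `lam`, and iteration count are illustrative choices, not values from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 50, 200, 5                     # n observations, p basis functions (p >> n)

Phi = rng.standard_normal((n, p)) / np.sqrt(n)
w_true = np.zeros(p)
w_true[rng.choice(p, size=k, replace=False)] = rng.standard_normal(k)
y = Phi @ w_true + 0.01 * rng.standard_normal(n)

def ista(Phi, y, lam=0.01, n_iter=500):
    """Iterative soft-thresholding for min_w 0.5*||y - Phi w||^2 + lam*||w||_1."""
    L = np.linalg.norm(Phi, 2) ** 2      # step size from the gradient's Lipschitz constant
    w = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        g = w - Phi.T @ (Phi @ w - y) / L                      # gradient step on the data term
        w = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-threshold (L1 prox)
    return w

w_hat = ista(Phi, y)
print("non-zeros in the estimate:", np.count_nonzero(w_hat))
```

The soft-thresholding step sets small coefficients exactly to zero, which is why the L1 penalty yields a genuinely sparse estimate rather than merely a small one.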

3.1 Sparsity: Applications and Development

In image processing, to compress an image we first apply a transformation to the pixel matrix to render it sparse. Such transformations include:
1. Singular Value Decomposition (SVD)
2. Discrete Cosine Transform (DCT)
3. Wavelet Transform ...

[Figure: the original image, its SVD factors U, S, V (upper-left corner zoomed in), its DCT, and a 2-layer discrete cosine transform with a Haar wavelet basis. The black pixels indicate matrix values close to zero, which makes the matrix easy to compress.]
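The energy-compaction effect of the DCT can be demonstrated on a small synthetic image; the image, the number of coefficients kept, and the size 64×64 are arbitrary choices for the sketch:

```python
import numpy as np
from scipy.fft import dctn, idctn

# A smooth synthetic "image": pixel values vary slowly, as in natural photos.
x = np.linspace(0, 1, 64)
img = np.outer(np.sin(2 * np.pi * x), np.cos(2 * np.pi * x)) + x[None, :]

C = dctn(img, norm='ortho')                # 2-D DCT: energy concentrates in low frequencies
k = 64                                     # keep only the 64 largest coefficients (~1.5%)
thresh = np.sort(np.abs(C).ravel())[-k]
C_sparse = np.where(np.abs(C) >= thresh, C, 0.0)

img_rec = idctn(C_sparse, norm='ortho')    # reconstruct from the sparsified transform
err = np.linalg.norm(img - img_rec) / np.linalg.norm(img)
print(f"relative error with {k}/{img.size} coefficients: {err:.4f}")
```

Most entries of `C_sparse` are zero, yet the reconstruction error stays small: this is exactly the sparsity the slides exploit for compression.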

3.1 Sparsity: Applications and Development

Sometimes on Weibo, interesting news originates from certain users and is forwarded many times by other users. We know who forwards the messages and when the messages are forwarded. We now want to construct a relationship network (who follows whose Weibo) from this information. This can be abstracted as a topological graph, with an equivalent matrix representation.

Sparsity: each node is linked to a small number of neighbors.
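The equivalent matrix representation of such a graph is a sparse adjacency matrix; a tiny sketch with a hypothetical five-user forwarding graph (the edge list is invented for illustration):

```python
import numpy as np
from scipy.sparse import csr_matrix

# Hypothetical forwarding graph: edge (i, j) means user i forwards from user j.
edges = [(0, 1), (0, 2), (1, 2), (3, 0), (4, 3)]
rows, cols = zip(*edges)
A = csr_matrix((np.ones(len(edges)), (rows, cols)), shape=(5, 5))

# Only the existing links are stored: |E| values instead of n^2 matrix entries.
print("stored entries:", A.nnz, "of", A.shape[0] * A.shape[1])
```

Because each node has few neighbors, a compressed sparse-row format stores only the non-zero links, which is what makes large social graphs tractable.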

3.1 Sparsity: Applications and Development

Collaborative filtering:
1. Customers are asked to rank items.
2. Not all customers ranked all items, so most entries of the customer-by-item rating matrix are unknown.
3. Predict the missing rankings.

3.1 Sparsity: Applications and Development

The Netflix Prize:
1. About a million users and 25,000 movies.
2. Known rankings are sparsely distributed in the user-by-movie matrix.
3. Predict the unknown ratings.
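A standard way to attack such problems is low-rank matrix completion. The sketch below uses one simple approach, hard-imputation by truncated SVD, on a toy problem; it is not the method used by the Netflix Prize winners, and the sizes, rank, and sampling rate are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_users, n_movies, r = 60, 40, 2             # toy problem; true rank r

# Ground-truth low-rank rating matrix and a sparse set of observed entries.
M = rng.standard_normal((n_users, r)) @ rng.standard_normal((r, n_movies))
mask = rng.random((n_users, n_movies)) < 0.4  # ~40% of ratings are known

X = np.where(mask, M, 0.0)
for _ in range(300):                          # iterate: truncate to rank r, refill observed
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X = (U[:, :r] * s[:r]) @ Vt[:r]           # best rank-r approximation
    X[mask] = M[mask]                         # keep observed entries fixed

err = np.linalg.norm((X - M)[~mask]) / np.linalg.norm(M[~mask])
print(f"relative error on unobserved entries: {err:.4f}")
```

Because the number of free parameters of a rank-r matrix is far below the number of observed entries, the missing ratings are recovered accurately from the sparse observations.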

3.1 Sparsity: Applications and Development

In 2006, the monumental papers of compressive sensing were published:

Emmanuel Candès, Justin Romberg, and Terence Tao, "Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information," IEEE Trans. on Information Theory, 52(2), pp. 489-509, February 2006.

David Donoho, "Compressed sensing," IEEE Trans. on Information Theory, 52(4), pp. 1289-1306, April 2006.

Emmanuel Candès and Terence Tao, "Near-optimal signal recovery from random projections: Universal encoding strategies?" IEEE Trans. on Information Theory, 52(12), pp. 5406-5425, December 2006.

Donoho was later awarded the Shaw Prize.

3.2 Sparsity Rendering Algorithms

The central problem in compressive sensing is the following. Given a sparse w, we compress it as y = Φw, where Φ is an underdetermined matrix. The target is to recover w from y.

The bad news: Φ is underdetermined, and we know that, normally, y = Φw has infinitely many solutions.
The good news: we have the prior information that w is sparse.

3.2 Sparsity Rendering Algorithms

Here are two concerns:
1. How sparse should w be so that it can be accurately recovered?
2. Are there any requirements on Φ?

For question 1, we know that y = Φw has infinitely many solutions; thus we have to attach some condition to w to make the solution unique. As w is sparse, we should seek the sparsest solution of y = Φw.

For question 2, we have the following lemma. Suppose an m x n matrix Φ is such that every set of 2S columns is linearly independent. Then an S-sparse vector w (one with at most S non-zero elements) can be reconstructed uniquely from y = Φw.
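The lemma can be checked numerically on a tiny instance by brute force: enumerate every support of size S and count how many S-sparse vectors fit y exactly. The dimensions and the true support are arbitrary choices; a generic Gaussian Φ satisfies the 2S-column independence condition with probability one:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)
m, n, S = 4, 8, 2                       # 2S = 4 <= m: generic columns satisfy the lemma

Phi = rng.standard_normal((m, n))       # generic matrix: any 2S columns are independent
w_true = np.zeros(n)
w_true[[1, 5]] = [1.5, -2.0]            # an S-sparse vector
y = Phi @ w_true                        # noiseless compressed measurements

# Enumerate every support of size S and keep those that fit y exactly.
solutions = []
for supp in combinations(range(n), S):
    sub = Phi[:, list(supp)]
    coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
    if np.allclose(sub @ coef, y):
        w = np.zeros(n)
        w[list(supp)] = coef
        solutions.append(w)

print("number of S-sparse solutions:", len(solutions))
```

Exactly one support fits y, and it recovers w_true, matching the lemma. The exhaustive search is only feasible for toy sizes; practical algorithms (convex relaxation, greedy pursuit) avoid it.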
