Imperial College London > Talks@ee.imperial > ISN Seminar Series > Communications and Compression via Sparse Regression
Communications and Compression via Sparse Regression
If you have a question about this talk, please contact Patrick Kelly.

Designing low-complexity, rate-optimal codes for communication and compression has long been one of the important goals of information and coding theory. In the last two decades, significant progress has been made towards this goal with the development of turbo, LDPC, and polar codes. However, these codes are all designed for sources and channels with finite, discrete alphabets. In this talk, we discuss a new class of codes for compression of Gaussian sources and communication over Gaussian channels. These codes are designed using the framework of sparse linear regression, and have a structure that enables low-complexity encoding and decoding algorithms. We provide provable performance guarantees for these codes; in particular, we show that for Gaussian sources and channels, sparse regression codes with rates close to the optimal information-theoretic limits have probability of error that decays rapidly with block length. Finally, we describe how to construct sparse regression codes for multi-terminal communication and show that they attain the optimal rates in a variety of multi-terminal settings, including multiple access, broadcast, and compression with side information.

Joint work with Antony Joseph, Tuhin Sarkar, and Sekhar Tatikonda.

Ramji Venkataramanan is a University Lecturer in the Department of Engineering at the University of Cambridge, where he is also a Fellow of Trinity Hall. He received his Ph.D. in EE (Systems) from the University of Michigan, Ann Arbor in 2008, and his undergraduate degree from the Indian Institute of Technology, Madras in 2002. Before joining Cambridge in 2013, he held post-doctoral positions at Stanford University and Yale University. His research interests are broadly in communications and information processing for networks, and include network information theory, coding, statistical inference and learning.
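As a rough illustration of the sparse-regression framework mentioned in the abstract, the following NumPy sketch builds a sparse regression codeword as x = Aβ, where A is an n × ML Gaussian design matrix and β has exactly one nonzero entry in each of L sections of M columns. This is a minimal sketch of the standard construction; the variable names, dimensions, and power normalization are our illustrative choices, not details from the talk.

```python
import numpy as np

def sparc_encode(message, A, M, L, P):
    """Map a message (L indices, each in [0, M)) to the codeword A @ beta.

    beta is sparse: one nonzero per section of M columns, with value
    chosen so the codeword power E||A beta||^2 / n is approximately P.
    """
    n = A.shape[0]
    beta = np.zeros(M * L)
    for section, idx in enumerate(message):
        beta[section * M + idx] = np.sqrt(n * P / L)
    return A @ beta

# Toy usage: block length n = 64, M = 8 columns per section, L = 4 sections.
rng = np.random.default_rng(0)
n, M, L, P = 64, 8, 4, 1.0
A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, M * L))  # i.i.d. Gaussian design
message = [3, 0, 7, 2]            # one column index per section
x = sparc_encode(message, A, M, L, P)
rate = L * np.log2(M) / n         # rate in bits per channel use
```

The rate here is L log2(M) / n bits per channel use, so the code's rate is tuned by choosing M, L, and n; decoding (not shown) amounts to recovering the sparse β from a noisy observation of Aβ.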
This talk is part of the ISN Seminar Series.