Plenary Speakers

Shannon Lecture: Charles H. Bennett

Charles H. Bennett

Quantum Information's Birth, Growth, and Significance

Wednesday, 24 June, 09:00 - 10:00 (PDT)

Abstract

The information revolution is based on what a physicist would call a classical view of information. Quantum effects, though long known, were regarded mainly as another noise source to be managed by classical error correction. But only twenty years after Shannon's landmark paper, Wiesner noticed that they could be used to do some intriguingly nonclassical things, such as making impossible-to-counterfeit banknotes or multiplexing two messages into a single optical transmission from which the receiver could recover either one at will but not both. After a slow start, quantum information has developed into the most natural foundation for the mathematical theory of communication, extending Shannon's theory as Einstein's extends Newton's. We review quantum information theory, especially the uniquely strong and private kind of correlation known as entanglement. Aside from enabling new kinds of computation and communication, entanglement helps explain the origin of randomness, why the future is less certain than the past, and, paradoxically, the macroscopic world's superficially classical appearance, which allowed quantum laws to remain undiscovered until the 20th century.

Biography

Charles H. Bennett was born in 1943, the son of music teachers. He received his PhD from Harvard in 1971 under David Turnbull and did a postdoc at Argonne National Laboratory under Aneesur Rahman. Since coming to IBM Research in 1972, he has worked on various aspects of the relation between physics and information. In 1973, building on the work of IBM's Rolf Landauer, he showed that universal computation can be performed by a logically and thermodynamically reversible apparatus, which can operate with arbitrarily little energy dissipation per step because it avoids throwing away information about past logical states. Based on this he proposed the currently accepted resolution of the Maxwell's demon paradox, attributing the demon's inability to violate the second law to the thermodynamic cost of information destruction rather than acquisition. This was not a new discovery but rather a reaffirmation of Smoluchowski's correct 1914 analysis of the demon, which had been partly forgotten in the interim due to confusion over the different ways quantum mechanics and thermodynamics constrain measurement. In other early work Bennett introduced the complexity measure “logical depth” (the computation time needed to compute a digital object from a near-incompressible algorithmic description) and studied the role of dissipation in improving the copying of genetic information and in absolutely stabilizing states of locally interacting systems that, in the absence of dissipation, would be merely metastable.

In 1984, Bennett and Gilles Brassard of the Université de Montréal, building on the seminal insights of Stephen Wiesner, developed a practical system of quantum cryptography, allowing secure communication between parties who share no secret information initially, and with the help of their students built a working demonstration of it in 1989. In 1993, in collaboration with Claude Crépeau, Richard Jozsa, Asher Peres, and William Wootters, they discovered "quantum teleportation," in which the complete information in a system is decomposed into a classical message and quantum entanglement, then reassembled from these ingredients in a new location to produce an exact replica of the original quantum state, which is destroyed in the sending process. In subsequent years Bennett contributed to a comprehensive rebuilding of the theory of information processing on quantum foundations, including quantum error correction, the recognition of entanglement as an independent quantifiable resource, the multiple (e.g. classical, private, and quantum) capacities of quantum channels, and the “quantum reverse Shannon theorem” establishing the ability of any quantum or classical channel with nonzero classical capacity to efficiently simulate any other in the presence of a strong entanglement resource, or a combination of ordinary entanglement and classical back-communication.
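
As a concrete illustration of the teleportation recipe just described (decompose the state into a classical message plus entanglement, then reassemble it remotely), here is a minimal NumPy statevector simulation; the conventions and helper functions are illustrative assumptions, not taken from the 1993 paper.

```python
# Minimal statevector simulation of one-qubit teleportation (illustrative
# sketch; qubit 0 holds the state, qubits 1 and 2 share a Bell pair).
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.diag([1.0, -1.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def apply(gate, qubit, state, n=3):
    # Apply a single-qubit gate to the given qubit of an n-qubit state.
    ops = [gate if q == qubit else I for q in range(n)]
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full @ state

def cnot(control, target, state, n=3):
    # Permute basis amplitudes: flip the target bit where the control is 1.
    new = np.zeros_like(state)
    for i in range(len(state)):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        new[sum(b << (n - 1 - q) for q, b in enumerate(bits))] = state[i]
    return new

def measure(qubit, state, n=3):
    # Sample a measurement outcome, then collapse and renormalize.
    bit = [(i >> (n - 1 - qubit)) & 1 for i in range(len(state))]
    p1 = sum(abs(state[i]) ** 2 for i in range(len(state)) if bit[i])
    outcome = int(np.random.random() < p1)
    keep = [i for i in range(len(state)) if bit[i] == outcome]
    new = np.zeros_like(state)
    new[keep] = state[keep]
    return outcome, new / np.linalg.norm(new)

psi = np.array([0.6, 0.8j])                  # state to teleport, on qubit 0
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # shared entanglement, qubits 1-2
state = np.kron(psi, bell)
state = cnot(0, 1, state)                    # Alice's Bell measurement...
state = apply(H, 0, state)
m0, state = measure(0, state)
m1, state = measure(1, state)                # ...yields two classical bits
if m1: state = apply(X, 2, state)            # Bob's conditional corrections
if m0: state = apply(Z, 2, state)
bob = state.reshape(2, 2, 2)[m0, m1, :]      # Bob's qubit after correction
print(np.allclose(bob, psi))                 # True: the state is reassembled
```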

With IBM colleagues DiVincenzo, Linsker, Smolin, and Donkor he devised “time-bracketed authentication,” a method for protecting audio/visual and other recordings from falsification, even by an untrusted recording apparatus, using low-bandwidth bidirectional communication between the process being recorded and an outside world trusted to be beyond the control of would-be falsifiers. Incoming signals establish a prior time bracket by unpredictably influencing the process being recorded, while outgoing signals, e.g. hashed digests of the ongoing recording, establish a posterior time bracket.
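
A toy sketch of the bracketing mechanism (the message formats, function names, and use of SHA-256 below are illustrative assumptions, not the published scheme): incoming challenges pin down the earliest time the recording could have started, and published digests pin down the latest time each prefix could have existed.

```python
# Toy sketch of time-bracketed recording (illustrative assumptions only).
import hashlib
import os

def record_with_brackets(frames, get_incoming_challenge, publish_digest):
    # Prior bracket: each challenge unpredictably influences the recorded
    # process, so no prefix can predate the challenges it contains.
    # Posterior bracket: each published digest commits to the recording so
    # far, so no prefix can be altered after its digest goes out.
    transcript = b""
    for frame in frames:
        challenge = get_incoming_challenge()
        transcript += challenge + frame
        publish_digest(hashlib.sha256(transcript).hexdigest())
    return transcript

# Usage: random bytes stand in for the trusted outside world's challenges.
published = []
record_with_brackets(
    frames=[b"frame-%d" % i for i in range(3)],
    get_incoming_challenge=lambda: os.urandom(16),
    publish_digest=published.append,
)
```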

Recently he has become interested in the application of quantum information to cosmology, and in characterizing the conditions (including thermodynamic disequilibrium) that lead to the emergence of classical correlations and computationally complex structures from quantum laws.

Bennett is an IBM Fellow, a Fellow of the American Physical Society, and a member of the U.S. National Academy of Sciences. He is a recipient of the Rank, Harvey, Okawa, Wolf, and Micius Quantum Prizes. He has served as a Divisional Associate Editor for Physical Review Letters, and as both Secretary and Chair of the National Academy of Sciences Class III (Engineering and Applied Physical Sciences).

PLEN-1: Max Welling

Max Welling

Neural Augmentation In Wireless Communication

Monday, 22 June, 09:30 - 10:30 (PDT)

Abstract

It is highly likely that machine learning will play a key role in the next-generation technology standard for cellular networks: 5G+ML=6G. In this talk I will review a number of ways I have been involved in exploring and pushing the boundary of that technology, based on a principle which I call "Neural Augmentation" (NA). In NA we acknowledge that the "classical" solutions that have been developed in the signal processing community are an extremely strong baseline on which we should build. On the other hand, we know that deep learning is able to learn patterns that are difficult or impossible to detect by humans, provided it has access to a sufficiently large dataset and the domain is narrow enough that the acquired data can cover it. We argue that by combining classical engineering solutions with deep learning we can learn from smaller datasets and generalize better to out-of-domain situations. In NA we train a neural network to iteratively correct the classical solution. These corrections are hopefully small, and therefore easier to model. We apply this principle to three problems in wireless communication: error-correction decoding, MIMO demodulation, and channel estimation. We find that neural networks are indeed able to improve the state of the art when combined with the classical methods.
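
A minimal sketch of the neural-augmentation pattern on a toy channel-estimation task (the PyTorch model, the iterative update, and the synthetic data are illustrative assumptions, not the systems from the talk): the network learns only a residual correction on top of a classical baseline.

```python
# Illustrative neural-augmentation sketch (assumed toy setup, PyTorch).
import torch
import torch.nn as nn

class NeuralAugmentedEstimator(nn.Module):
    # Start from a classical estimate and iteratively add learned
    # corrections, which are hopefully small and hence easier to model.
    def __init__(self, dim, steps=3):
        super().__init__()
        self.steps = steps
        self.corrector = nn.Sequential(
            nn.Linear(2 * dim, 64), nn.ReLU(), nn.Linear(64, dim))

    def forward(self, y, baseline):
        h = baseline
        for _ in range(self.steps):
            h = h + self.corrector(torch.cat([h, y], dim=-1))
        return h

dim = 8
model = NeuralAugmentedEstimator(dim)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    h_true = torch.randn(32, dim)              # true channel realizations
    y = h_true + 0.3 * torch.randn(32, dim)    # noisy pilot observations
    baseline = y                               # trivial classical estimate
    loss = ((model(y, baseline) - h_true) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```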

Biography

Max Welling is a research chair in Machine Learning at the University of Amsterdam and a VP Technologies at Qualcomm. He has a secondary appointment as a senior fellow at the Canadian Institute for Advanced Research (CIFAR). He is co-founder of “Scyfer BV,” a university spin-off in deep learning, which was acquired by Qualcomm in the summer of 2017. In the past he held postdoctoral positions at Caltech (’98-’00), UCL (’00-’01) and the University of Toronto (’01-’03). He received his PhD in ’98 under the supervision of Nobel laureate Prof. G. 't Hooft. Max Welling served as associate editor-in-chief of IEEE TPAMI (impact factor 4.8) from 2011 to 2015. He has served on the board of the NIPS Foundation (NIPS being the largest conference in machine learning) since 2015, and was program chair and general chair of NIPS in 2013 and 2014, respectively. He was also program chair of AISTATS in 2009 and ECCV in 2016, and general chair of MIDL 2018. He has served on the editorial boards of JMLR and JML and was an associate editor for Neurocomputing, JCGS and TPAMI. He has received multiple grants from Google, Facebook, Yahoo, NSF, NIH, NWO and ONR-MURI, among which an NSF CAREER grant in 2005. He is a recipient of the ECCV Koenderink Prize in 2010. Welling is co-founder and board member of the Innovation Center for AI (ICAI) and the European Lab for Learning and Intelligent Systems (ELLIS). He directs the Amsterdam Machine Learning Lab (AMLAB) and co-directs the Qualcomm-UvA deep learning lab (QUVA) and the Bosch-UvA Deep Learning lab (DELTA). He has over 300 scientific publications in machine learning, computer vision, statistics and physics, and an h-index of 68.

PLEN-2: Venkat Anantharam

Venkat Anantharam

Gone Fishin'

Tuesday, 23 June, 09:00 - 10:00 (PDT)

Abstract

The Poisson process and point processes in general have much to bring to the toolbox of the information theorist, apart from arising naturally when building models in some information-theoretic problems of physical interest. This talk will attempt to reel in some of the times point processes have bitten the hook of researchers in the river of information theory.

The aim is to highlight some striking results and some intriguing techniques in this area.

Biography

Venkat Anantharam is on the faculty of the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley. He received his undergraduate degree from the Indian Institute of Technology Madras and his graduate degrees from UC Berkeley. From 1986 to 1994 he was on the faculty of the School of Electrical Engineering at Cornell University, before moving to UC Berkeley. His research work includes contributions to communication networking, stochastic control, game theory, and information theory.

PLEN-4: Olgica Milenkovic

Olgica Milenkovic

Coded String Reconstruction Problems

Thursday, 25 June, 09:00 - 10:00 (PDT)

Abstract

String reconstruction problems frequently arise in genomic data processing, molecular storage system implementations, and synthetic biology. In the most general setting, the problems may be summarized as follows: one is given one or more copies of a string. The copies are subsequently processed and transmitted through noise-inducing channels. The goal of the reconstruction method is to recover the original string, or an approximation thereof, using the noisy string information. Examples of string reconstruction questions include reconstruction from noisy traces, reconstruction from substrings and k-decks, and reconstruction from substring composition sets. We review the above and related problems and then proceed to describe coding methods that lead to collections of strings that can be more accurately and efficiently reconstructed than their uncoded counterparts.
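
As a toy instance of reconstruction from noisy traces (with substitution noise only, which is far easier than the deletion channels studied in practice; the setup below is an illustrative assumption), a coordinate-wise majority vote over the traces recovers the string with high probability:

```python
# Toy trace reconstruction under substitution noise (illustrative only).
import random
from collections import Counter

def noisy_trace(s, p):
    # Each symbol is replaced, with probability p, by a uniformly random
    # symbol (which may coincide with the original).
    return "".join(c if random.random() > p else random.choice("ACGT")
                   for c in s)

def majority_reconstruct(traces):
    # Majority vote in each coordinate across all traces.
    return "".join(Counter(col).most_common(1)[0][0] for col in zip(*traces))

original = "ACGTTGCAACGTTACG"
traces = [noisy_trace(original, 0.2) for _ in range(25)]
print(majority_reconstruct(traces) == original)  # True with high probability
```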

Biography

Olgica Milenkovic is a professor of Electrical and Computer Engineering at the University of Illinois, Urbana-Champaign (UIUC), and Research Professor at the Coordinated Science Laboratory. She obtained her Master’s degree in Mathematics in 2001 and her PhD in Electrical Engineering in 2002, both from the University of Michigan, Ann Arbor. Prof. Milenkovic heads a group focused on addressing unique interdisciplinary research challenges spanning the areas of algorithm design and computing, bioinformatics, coding theory, machine learning, and signal processing. Her scholarly contributions have been recognized by multiple awards, including the NSF Faculty Early Career Development (CAREER) Award, the DARPA Young Faculty Award, the Dean’s Excellence in Research Award, and several best paper awards. In 2013 she was named a Center for Advanced Study Associate and Willett Scholar, and in 2015 she became a Distinguished Lecturer of the Information Theory Society. She is an IEEE Fellow and has served as Associate Editor of the IEEE Transactions on Communications, the IEEE Transactions on Signal Processing, the IEEE Transactions on Information Theory, and the IEEE Transactions on Molecular, Biological and Multi-Scale Communications. In 2009 she was the Guest Editor-in-Chief of a special issue of the IEEE Transactions on Information Theory on Molecular Biology and Neuroscience, and in 2019 she served as Guest Editor-in-Chief of a special issue dedicated to the interdisciplinary work of V.I. Levenshtein.

PLEN-5: Pablo Parrilo

Pablo Parrilo

Sum of Squares — Where are We, and Where to Go?

Friday, 26 June, 09:00 - 10:00 (PDT)

Abstract

Over the past two decades, semidefinite programming and sum of squares methods have provided state-of-the-art results, both theoretical and practical, for a variety of problems in many areas, including combinatorial optimization, systems and control, and statistical estimation. In this talk we’ll provide a gentle introduction and survey of basic notions, algorithmic techniques, and future challenges. Particular emphasis will be given to newer developments, and applications to probability theory and quantum information.
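
As a small worked example of the basic computation behind these methods (the polynomial and the CVXPY formulation below are illustrative assumptions): certifying that p(x) = x^4 + 2x^3 + 3x^2 + 2x + 1 is a sum of squares reduces to finding a positive semidefinite Gram matrix Q with p(x) = m(x)^T Q m(x) for the monomial vector m(x) = (1, x, x^2), a semidefinite feasibility problem. Here p = (x^2 + x + 1)^2, so the problem is feasible.

```python
# SOS certification of p(x) = x^4 + 2x^3 + 3x^2 + 2x + 1 via a small SDP
# (illustrative example using CVXPY).
import cvxpy as cp

Q = cp.Variable((3, 3), symmetric=True)   # Gram matrix in basis (1, x, x^2)
constraints = [
    Q >> 0,                               # Q positive semidefinite
    Q[0, 0] == 1,                         # match coefficient of x^0
    2 * Q[0, 1] == 2,                     # x^1
    2 * Q[0, 2] + Q[1, 1] == 3,           # x^2
    2 * Q[1, 2] == 2,                     # x^3
    Q[2, 2] == 1,                         # x^4
]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print(prob.status)   # 'optimal' means a Gram matrix exists, so p is SOS
# Factoring Q = L L^T then expresses p as an explicit sum of squares.
```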

Biography

Pablo A. Parrilo is the Joseph F. and Nancy P. Keithley Professor of Electrical Engineering and Computer Science at MIT, with a joint appointment in Mathematics. He is affiliated with the Laboratory for Information and Decision Systems (LIDS) and the Operations Research Center (ORC). Past appointments include Assistant Professor at the Automatic Control Laboratory of the Swiss Federal Institute of Technology (ETH Zurich), and Visiting Associate Professor at the California Institute of Technology. He received an Electronics Engineering undergraduate degree from the University of Buenos Aires, Argentina, and a PhD in Control and Dynamical Systems from the California Institute of Technology. His research interests include mathematical optimization, machine learning, control and identification, robustness analysis and synthesis, and the development and application of computational tools based on convex optimization and algorithmic algebra to practically relevant engineering problems. Prof. Parrilo has received several distinctions, including the Donald P. Eckman Award of the American Automatic Control Council, the SIAM Activity Group on Control and Systems Theory (SIAG/CST) Prize, the IEEE Antonio Ruberti Young Researcher Prize, and the Farkas Prize of the INFORMS Optimization Society. He is an IEEE and SIAM Fellow.

Plan Ahead

IEEE ISIT 2021

2021 IEEE International Symposium on Information Theory

11-16 July 2021 | Melbourne, Victoria, Australia
