Boaz Barak and Tselil Schramm
Fridays 1:15pm-4:30pm, Room 301 in Pierce Hall (Pizza included, replacing the theory reading group fall 2018)
See this blog post of Tselil Schramm for links to all the blog posts of presentations in this seminar!
Notes:
If you are considering attending this seminar, please sign up for its Piazza forum. Non-Harvard participants: since the forum does not allow self sign-up with non-Harvard emails, please fill out this form and I will add you.
While in the course catalog this seminar is officially listed as taking place from 2:30 till 3:45, it will actually take place from 1:30 till 4:30, with pizza from 1:15 till 1:30.
We will start the course at Harvard, but if a sufficient number of MIT students enroll (and we can find an appropriate room at MIT), we might rotate it between MIT and Harvard. MIT students: See this page (see also here) for information on how to cross-register for Harvard courses.
See also this blog post, other related posts, and the blog post roundup.
In this graduate seminar we will explore some of the connections between theoretical computer science and physics. Some topics include:
Analyzing statistical-physics-inspired algorithms such as belief propagation, and understanding the physics predictions for hard and easy regimes via phase transitions. Connections to Markov Chain Monte Carlo.
Quantum information theory: quantum-inspired classical results, as well as classical algorithms for quantum problems.
The “conformal bootstrap”: exploring the space of possible physical theories using semidefinite programming.
Black holes, bulk/boundary correspondence, and computational complexity.
“Quantum superiority” - understanding the current proposals for demonstrating exponential speedups for quantum computers, and the evidence for their classical difficulty.
Quantum Hamiltonian Complexity - the quantum analog of constraint satisfaction problems, with questions such as the existence of a “Quantum PCP Theorem”.
All of these are topics that I personally find fascinating, but know very little about. I hope we can learn about them together. Each one of these could probably be the topic of a full semester-long course (and even then, assuming significant physics background). I hope this seminar will be like a “tasting menu” where we pick some of the juiciest and most interesting questions and results from all these areas, and attempt to understand and present them using the minimal amount of physics.
In this seminar, each team of students will pick one topic, which could be a single paper, 2-3 related works, or a survey, and present it in one or two three-hour talks. The team will also write a blog post about the topic. In my experience, physicists and mathematicians/theoretical computer scientists often tend to study the same objects under different names. Thus one of the important parts of each such blog post will be a “dictionary” that helps theoretical computer scientists parse the physics literature, by mapping the physics notation to the names that are more familiar to TCS researchers.
Each team should send me a draft blog post two weeks before their presentation, and then schedule a time to meet with me and go over their talk together. The course is aimed at graduate students in theoretical computer science, but advanced undergraduates might also enjoy it (I recommend undergraduate students take it if they already took a graduate theory course at Harvard or MIT).
Some more information about the topics: (based on my current, very partial, and likely incorrect understanding of them; hopefully I will know more by the time the term ends!)
Statistical physics is concerned with systems that have a large number of components, such as molecules inside a container. The key insight is that even though it is very complicated to understand the complete behavior of all the particles in such a system, we can still understand qualitative features of it, such as the temperature at which it undergoes a phase transition from liquid to gas, without modeling each component separately. Statistical physicists are also interested in the dynamics of such systems. In particular, while sometimes the state of the system is just a function of its temperature, sometimes it is a function of the history by which it arrived at this temperature. (For example, glass is obtained by heating sand to a high temperature and then cooling it quickly, so that instead of going back to being sand, it is “stuck” at the “local optimum” of a glassy state.)
Often in statistical physics we have a system of n particles, where neighboring particles exert forces on one another. The configuration the system settles into can often be obtained as the minimizer of some energy function. We can think of this function as summing up violations of local constraints. (E.g., if particle i wants to have a different spin from its neighbors, then that would be a constraint.) We can see that this is simply a constraint satisfaction problem. We also have a temperature parameter, which we can think of as trying to maximize the entropy of the system. Interestingly, the calculations of statistical physics are often the same ones as those of Bayesian statistics: the probability that an event happens is the product of its a priori probability (which corresponds to the entropy term) times the likelihood of our observations given this event (which corresponds to a quantity that decreases exponentially in the number of violated constraints).
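To make this concrete, here is a minimal Python sketch (the function names and the toy instance are my own, purely for illustration) of an antiferromagnetic Ising-type energy function viewed as counting violated constraints, together with the Gibbs/Boltzmann weight at inverse temperature beta:

```python
import numpy as np

def ising_energy(spins, edges):
    """Count violated constraints: each edge (i, j) wants its two
    endpoint spins to disagree (the antiferromagnetic case)."""
    return sum(1 for (i, j) in edges if spins[i] == spins[j])

def gibbs_weight(spins, edges, beta):
    """Unnormalized Gibbs/Boltzmann weight exp(-beta * energy): the
    'likelihood' term, decaying exponentially in violated constraints."""
    return np.exp(-beta * ising_energy(spins, edges))

# A triangle cannot satisfy all three "disagree" constraints at once,
# so every configuration has energy at least 1.
edges = [(0, 1), (1, 2), (0, 2)]
spins = {0: +1, 1: -1, 2: +1}
print(ising_energy(spins, edges), gibbs_weight(spins, edges, beta=1.0))
```

At high temperature (small beta) the entropy term dominates and all configurations have comparable weight; at low temperature the weight concentrates on the energy minimizers.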
Statistical physicists are often interested in certain local search algorithms that gradually update their state. They use these algorithms both to do calculations and to analyze their behavior as a way to get insight into the system. In particular, physicists have by now developed deep intuitions into using these algorithms to predict certain phase transitions of random constraint satisfaction problems, and how they evolve from satisfiable to unsatisfiable, as well as from easy to hard, as the parameters change.
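As one of the simplest examples of such local dynamics, here is a hedged sketch of single-site Glauber (heat-bath) updates for the toy model above; this is an illustration of the general flavor, not the specific algorithms analyzed in the papers below:

```python
import numpy as np

rng = np.random.default_rng(0)

def glauber_step(spins, neighbors, beta):
    """One local update: pick a uniformly random site and resample its
    spin from the Gibbs distribution conditioned on its neighbors."""
    i = rng.integers(len(spins))
    # Cost of each choice = number of neighbors it would agree with
    # (antiferromagnetic constraints, as in the energy function above).
    cost_plus = sum(1 for j in neighbors[i] if spins[j] == +1)
    cost_minus = sum(1 for j in neighbors[i] if spins[j] == -1)
    w_plus, w_minus = np.exp(-beta * cost_plus), np.exp(-beta * cost_minus)
    spins[i] = +1 if rng.random() < w_plus / (w_plus + w_minus) else -1

# Run the chain on a 4-cycle; at high temperature (small beta) such
# chains mix rapidly, while at low temperature they can get stuck.
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
spins = list(rng.choice([-1, +1], size=4))
for _ in range(1000):
    glauber_step(spins, neighbors, beta=0.5)
print(spins)
```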
Surveys/papers in this area:
Statistical physics of inference: Thresholds and algorithms / Lenka Zdeborová, Florent Krzakala
The Sherrington-Kirkpatrick model: an overview / Dmitry Panchenko
Message Passing Algorithms for Compressed Sensing / David L. Donoho, Arian Maleki, Andrea Montanari
How to calculate partition functions using convex programming hierarchies: provable bounds for variational methods / Andrej Risteski and Approximate maximum entropy principles via Goemans-Williamson with applications to provable variational methods / Yuanzhi Li, Andrej Risteski
Counting Independent Sets up to the Tree Threshold / Dror Weitz, see also Dror’s thesis and the paper Computational Transition at the Uniqueness Threshold / Sly.
As we’ve seen, constraint satisfaction problems are closely related to physical systems. In the quantum setting, the “energy function” we discussed above becomes a Hamiltonian (which is simply a positive semidefinite matrix; this corresponds to the fact that in statistical physics the energy function is often a non-negative polynomial such as a sum of squares), and the minimum-energy configuration corresponds to the eigenvector achieving the minimal eigenvalue of this Hamiltonian, often known as the ground state. This Hamiltonian is often exponential in size, but has a small polynomial description since it is a sum of local Hamiltonians (ones that only depend on a few neighboring particles/inputs), just like a constraint satisfaction problem is a sum of local constraints. Finding or approximating the ground state of a Hamiltonian is an important problem in physics, and understanding its complexity in both the quantum and classical setting is of great interest.
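As a small illustration (a minimal sketch, not how one handles realistic system sizes), the following builds the Hamiltonian of a transverse-field Ising chain as a sum of local terms and finds its ground-state energy by exact diagonalization. Note that this particular H is Hermitian rather than PSD, but adding a suitable multiple of the identity makes it PSD without changing the ground state:

```python
import numpy as np
from functools import reduce

# Single-qubit Pauli matrices and identity.
I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)

def local_term(ops, n):
    """Embed single-site operators {site: op} into the 2^n-dim space."""
    return reduce(np.kron, [ops.get(site, I) for site in range(n)])

def transverse_field_ising(n, g):
    """H = -sum_i Z_i Z_{i+1} - g * sum_i X_i: a 2^n x 2^n matrix, yet
    specified by O(n) local terms, just as a CSP is a short list of
    local constraints over exponentially many assignments."""
    H = sum(-local_term({i: Z, i + 1: Z}, n) for i in range(n - 1))
    H = H + sum(-g * local_term({i: X}, n) for i in range(n))
    return H

n = 6
H = transverse_field_ising(n, g=1.0)
print(np.linalg.eigvalsh(H)[0])  # minimal eigenvalue = ground-state energy
```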
Surveys/papers on this area:
Quantum mechanics can be thought of as an extension of classical probability theory, and quantum information theory is the extension of information theory to this realm. Like quantum mechanics itself, some intuitions carry over from probability, while some aspects change in subtle and interesting ways. It contains some beautiful math that is interesting in its own right, even if you don’t care about quantum computing or quantum mechanics. For example, entanglement is analogous but not identical to probabilistic correlation. One way in which entanglement differs is known as “monogamy of entanglement”: a qubit can be maximally entangled with at most one other qubit. This property in turn is used in various “area laws” that help in simulating and modeling quantum systems. The relation between classical and quantum information theory is loosely analogous to the relation between linear programming and semidefinite programming, and some such relations have been made more precise and useful in several works.
The central object in quantum information theory is the density matrix, which captures all the information that could in principle be measured about a quantum system. A density matrix is a positive semidefinite matrix of trace one. A classical probability distribution corresponds to a diagonal density matrix (in which all diagonal entries are non-negative and sum up to one, and hence are just probabilities). The crucial difference between diagonal and non-diagonal PSD matrices is that the latter do not necessarily commute.
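Here is a minimal numpy sketch of these definitions (my own toy example): a classical distribution as a diagonal density matrix, a pure quantum state as a non-diagonal one, and a check that the two need not commute:

```python
import numpy as np

# A classical distribution over {0, 1} as a diagonal density matrix.
rho_classical = np.diag([0.7, 0.3])

# The pure state |+> = (|0> + |1>)/sqrt(2) as a density matrix |+><+|.
plus = np.array([1, 1]) / np.sqrt(2)
rho_plus = np.outer(plus, plus)

for rho in (rho_classical, rho_plus):
    assert np.isclose(np.trace(rho), 1)               # trace one
    assert np.all(np.linalg.eigvalsh(rho) >= -1e-12)  # positive semidefinite

# Non-diagonal density matrices need not commute with diagonal ones.
print(np.allclose(rho_classical @ rho_plus, rho_plus @ rho_classical))  # False
```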
There are quantum-inspired results for classical questions, as well as classical algorithms for questions in quantum information theory such as the best separable state problem, where we are given a quantum measurement and want to find the separable (i.e., non-entangled) state that maximizes the probability of passing the measurement. Algorithms for this problem can be used to certify that certain systems (such as purported quantum computers) actually have entanglement.
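For illustration, here is a sketch of one folklore local-search heuristic (alternating or “seesaw” maximization) for the best separable state problem, restricted to real pure product states on two qubits; like any local search it can get stuck in local optima, consistent with the problem’s hardness, and it is not one of the provable algorithms from the literature:

```python
import numpy as np

rng = np.random.default_rng(1)

def seesaw(M, da, db, iters=100):
    """Alternately maximize <x (x) y| M |x (x) y> over unit vectors x, y,
    for a real symmetric measurement operator M on a da*db-dim space."""
    M = M.reshape(da, db, da, db)
    y = rng.normal(size=db); y /= np.linalg.norm(y)
    for _ in range(iters):
        A = np.einsum('ajbk,j,k->ab', M, y, y)  # fix y: quadratic form in x
        x = np.linalg.eigh(A)[1][:, -1]         # top eigenvector
        B = np.einsum('ajbk,a,b->jk', M, x, x)  # fix x: quadratic form in y
        y = np.linalg.eigh(B)[1][:, -1]
    return np.einsum('ajbk,a,j,b,k->', M, x, y, x, y)

# Toy measurement: the projector onto a maximally entangled state.
# The best product state passes with probability only 1/2.
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
M = np.outer(phi, phi)
print(seesaw(M, 2, 2))  # approximately 0.5
```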
Surveys/papers on this area:
Quantum Shannon Theory / John Preskill - lecture notes.
The Theory of Quantum Information / John Watrous - textbook, available online and on Amazon
Quantum Proofs for Classical Theorems / Andrew Drucker, Ronald de Wolf
(This is one of the parts I understand the least.) Physics has two very successful theories: general relativity and quantum mechanics. Unfortunately, they are mathematically inconsistent with one another. In my limited understanding, the main issue is that quantum mechanics does not account for gravity, in particular because it is based on unitary transformations and hence has the property that if we “scale up” or “scale down” a system by a certain factor then it behaves identically. In particular, this problem arises with black holes: according to general relativity they destroy information, but that would be a violation of quantum mechanics, where all operations are modeled by unitary transformations and hence are reversible. Interestingly, one of the ways physicists have proposed to resolve this contradiction is that while black holes do not destroy information, they “scramble” it in a way that would be computationally infeasible to reverse.
My understanding is that some approaches to resolving the tension between the two theories involve the idea that the amount of information a physical system can store is proportional not to its volume but rather to its surface area; this is known as the “holographic principle”. The idea is that if you have a (d+1)-dimensional physical system, then it can be described by understanding its boundary, which is a d-dimensional object. Things are cleanest if we make the (physically wrong but mathematically clean) assumption that the universe is contracting (also known as “anti-de Sitter space” or “AdS”), which means that it does have a well-defined boundary. Apparently there are suggestions for a nice “conformal field theory” on the boundary, and one of the suggestions is that the notion of time or distance in the bulk corresponds to complexity on the boundary.
Surveys/papers on this area:
Jerusalem Lectures on Black Holes and Quantum Information / Daniel Harlow
TASI Lectures on the Emergence of the Bulk in AdS/CFT / Daniel Harlow
The Second Law of Quantum Complexity / Adam R. Brown, Leonard Susskind
John Preskill’s Tutorial on Quantum Information and gravity, video of part 1, video of part 2
Quantum chaos and complexity:
Chaos and complexity by design / Daniel A. Roberts, Beni Yoshida
Chaos in quantum channels / Pavan Hosur, Xiao-Liang Qi, Daniel A. Roberts, Beni Yoshida
Searching for a physical theory that explains current observations and is mathematically “beautiful” or “simple” can itself be seen as an optimization problem. Recent works have aimed at using semidefinite programming to search through “theoryspace” by deriving constraints from certain symmetry properties. This is known as the “conformal bootstrap”. I personally don’t know much about this, but would be interested in finding out more.
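Purely as a schematic (these are not the actual bootstrap equations; cvxpy and random data stand in for real crossing/symmetry constraints), such a computation has the following shape: impose linear constraints on a positive semidefinite variable and ask an SDP solver whether that corner of “theoryspace” is feasible:

```python
import numpy as np
import cvxpy as cp

# Stand-in for "theoryspace": an unknown PSD matrix of theory parameters.
n = 4
X = cp.Variable((n, n), PSD=True)

# Stand-ins for constraints derived from symmetry/consistency conditions.
rng = np.random.default_rng(0)
A = rng.normal(size=(n, n)); A = A + A.T
constraints = [cp.trace(X) == 1, cp.trace(A @ X) >= 0.1]

prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print(prob.status)  # 'optimal' = feasible; 'infeasible' rules the region out
```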
Surveys/papers on this area:
Much of the current progress on building quantum computers is focused on “noisy intermediate-scale quantum” (NISQ) computers of 50-100 qubits or so, which are not error corrected and hence subject to a certain amount of noise. These computers will not be powerful enough to run Shor’s algorithm, but there might still exist a task that these computers can perform, and for which we believe classical computers would require an exponential amount of computation. In particular, people have looked at the task of taking a random quantum circuit C as input, and sampling from the output distribution of C on (say) the all-zeroes string. This distribution can be approximated by a noisy intermediate-scale quantum computer, but might be hard for classical computers to sample from. In fact, it has been shown that sampling from similar distributions exactly is hard unless the polynomial hierarchy collapses, since the probabilities involved are the permanents of a matrix related to the circuit C. But understanding the difficulty of approximate sampling is still an open problem.
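To illustrate what the classical side is up against, here is a sketch of brute-force simulation of random circuit sampling (a toy “brickwork” circuit of Haar-random two-qubit gates; the construction is my own, purely for illustration). It maintains the full 2^n-entry state vector, which is exactly why this naive approach becomes infeasible as n grows:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(d):
    """Haar-random unitary via QR decomposition of a complex Gaussian."""
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

def apply_gate(state, gate, i, n):
    """Apply a 4x4 gate to qubits i, i+1 of an n-qubit state vector."""
    psi = np.moveaxis(state.reshape([2] * n), [i, i + 1], [0, 1]).reshape(4, -1)
    psi = (gate @ psi).reshape([2] * n)
    return np.moveaxis(psi, [0, 1], [i, i + 1]).reshape(-1)

# Brute force: the state vector has 2^n complex entries, so time and
# memory grow exponentially with the number of qubits n.
n, depth = 8, 10
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                            # the all-zeroes input
for layer in range(depth):
    for i in range(layer % 2, n - 1, 2):  # brickwork pattern of gates
        state = apply_gate(state, random_unitary(4), i, n)
probs = np.abs(state) ** 2
probs /= probs.sum()                      # guard against float rounding
print(format(int(rng.choice(2 ** n, p=probs)), f'0{n}b'))
```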
There are other questions where the power of classical algorithms is still not fully understood. For example, once the world transitions to “post-quantum cryptography”, it could well be the case that the most compelling applications of quantum computers would be for quantum chemistry. Some of these problems are quantum analogs of the kind of statistical physics / constraint satisfaction problems we mentioned above, and it could well be that some of them have nontrivial classical algorithms as well.
Surveys/papers on this area: