The complexity of sampling from a weak quantum computer

Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2019

Bibliographic Details
Main Author: Mehraban, Saeed.
Other Authors: Scott J. Aaronson and Aram W. Harrow.
Format: Thesis
Language: eng
Published: Massachusetts Institute of Technology, 2020
Subjects: Electrical Engineering and Computer Science
Online Access: https://hdl.handle.net/1721.1/124065
Notes: Cataloged from the PDF version of the thesis. Some pages in the original document contain text that runs off the edge of the page (pages 114, 134, 141). Includes bibliographical references (pages 203-211).

Abstract: Quantum computers promise to revolutionize information technology, artificial intelligence, cryptography, and industries such as drug design. Up to this point, our understanding of quantum computing has been largely bound to theoretical speculation, which has successfully led to the design of algorithms and communication protocols that would dramatically change information technology, but which work only if we can experimentally build a reliable and large quantum computer, something that at the moment seems beyond the bounds of possibility. As of August 30, 2019, it appears that we are about to enter a new era in quantum technology. Over the next few years, intermediate-scale quantum computers with roughly 50-100 qubits are expected to become practical. These computers are too small to do error correction, and they may not even capture the full power of a universal quantum computer. Nevertheless, this is a significant step, and we inevitably encounter the question "what can we do with this new technology?" A major theoretical challenge is to understand the capabilities and limitations of these devices. To approach this challenge, quantum supremacy experiments have been proposed as a near-term milestone.
The objective of quantum supremacy is to find computational problems that are feasible on a small-scale quantum computer but hard to simulate classically. Even though a quantum supremacy experiment may have no practical applications, achieving it would demonstrate that quantum computers can significantly outperform classical ones on at least artificially designed computational problems. Among other proposals, two sampling-based quantum supremacy experiments have been put forward in recent years. (1) The first, known as Boson Sampling (Aaronson-Arkhipov '10), is based on linear-optical experiments. Its baseline conjecture is that it is #P-hard to approximate, with high probability, the permanent of a Gaussian matrix with zero mean and unit variance; this is known as the permanent-of-Gaussians conjecture. (2) The second is based on sampling from the output of a random circuit applied to a square grid of qubits. The Google quantum AI group is planning to implement this task on a processor composed of a few tens (~50-100) of superconducting qubits. To argue that this sampling task is hard, building on previous work of Aaronson, Arkhipov, and others, they conjectured that the output distribution of a low-depth circuit (depth scaling asymptotically as the square root of the number of qubits) is anti-concentrated, meaning that it has nearly maximal entropy; this is known as the anti-concentration conjecture. The first part of this thesis makes progress toward the permanent-of-Gaussians conjecture: it shows that the permanent of Gaussian matrices can indeed be approximated in quasi-polynomial time with high probability if, instead of zero mean, one considers a nonzero but vanishing mean (~1/poly log log in the size of the matrix).
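The permanent at the heart of this conjecture can be made concrete with a minimal sketch (illustrative only, not from the thesis): Ryser's inclusion-exclusion formula evaluates the permanent exactly in O(2^n n^2) time, which is why even small instances are already expensive and why approximation hardness matters. The function name and the choice of a complex Gaussian test matrix below are assumptions of this sketch.

```python
import itertools
import numpy as np

def permanent(a):
    """Exact permanent via Ryser's inclusion-exclusion formula, O(2^n * n^2)."""
    n = a.shape[0]
    total = 0j
    for r in range(1, n + 1):
        sign = (-1) ** r
        for cols in itertools.combinations(range(n), r):
            # sum_{j in S} a_ij for each row i, then the product over rows
            row_sums = a[:, list(cols)].sum(axis=1)
            total += sign * np.prod(row_sums)
    return (-1) ** n * total

# A complex Gaussian matrix with zero mean and unit variance per entry,
# as in the permanent-of-Gaussians conjecture (tiny n, for illustration).
rng = np.random.default_rng(0)
n = 8
g = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
print(permanent(g))
```

Even at n = 25 this exact computation is already on the order of 10^9 operations; the conjecture concerns the hardness of *approximating* such permanents with high probability over the Gaussian ensemble.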
This result gives, to the best of our knowledge, the first example of a natural counting problem that is #P-hard to compute exactly on average and #P-hard to approximate in the worst case, but which becomes easy when approximation and the average case are combined. This part is based on joint work with Lior Eldar. The second part of this thesis proves the anti-concentration conjecture for random quantum circuits. The proof is an immediate consequence of our main result, which settles a conjecture of Brandão-Harrow-Horodecki '12 that short-depth random circuits are pseudorandom. Such pseudorandom quantum processes have many applications in algorithms, communication, and cryptography, as well as in theoretical physics, e.g., the so-called black-hole information problem. This result is based on joint work with Aram Harrow. We furthermore study the speed of scrambling under random quantum circuits with different geometrical connectivity between qubits. We show that this speed depends crucially on how scrambling is defined and on the details of the connectivity. In particular, we show that entanglement and the so-called out-of-time-ordered correlators are inequivalent measures of scrambling when the gates of the random circuit are applied between nearest-neighbor qubits on a graph with a tight bottleneck, e.g., a binary tree. A major implication of this result is that the scrambling speed of black holes, previously believed to be the fastest scramblers of quantum information, may depend critically on how one defines scrambling. This result is based on joint work with Aram Harrow, Linghang Kong, Zi-Wen Liu, and Peter Shor.

by Saeed Mehraban. Ph.D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2019. 211 pages. https://hdl.handle.net/1721.1/124065

MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source, but further reproduction or distribution in any format is prohibited without written permission. http://dspace.mit.edu/handle/1721.1/7582
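As a numerical aside on the anti-concentration conjecture described in the abstract, here is a toy check in which Haar-random unitaries stand in for sufficiently deep random circuits (all function names and parameters are illustrative assumptions, not the thesis's construction). An anti-concentrated output distribution has collision probability within a constant factor of the uniform value 1/N; for Haar-random states the expected value is 2/(N+1), the Porter-Thomas prediction.

```python
import numpy as np

def haar_unitary(dim, rng):
    """Haar-random unitary: QR of a complex Ginibre matrix, with phase correction."""
    z = (rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    # Multiply column k of q by the phase of r[k, k] so the result is exactly Haar.
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases

rng = np.random.default_rng(1)
dim = 2 ** 6        # a 6-qubit Hilbert space
trials = 200
# Collision probability sum_x p(x)^2 of the output distribution p(x) = |<x|U|0>|^2.
coll = np.mean([np.sum(np.abs(haar_unitary(dim, rng)[:, 0]) ** 4)
                for _ in range(trials)])
print(f"collision probability {coll:.5f} vs uniform {1 / dim:.5f} "
      f"(Porter-Thomas predicts {2 / (dim + 1):.5f})")
```

The measured collision probability lands near 2/(N+1), i.e., within a factor of two of the maximally flat distribution; the conjecture settled in this thesis is that square-root-depth random circuits already achieve this regime.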