Machine-Learning-Based Qubit Allocation for Error Reduction in Quantum Circuits

Quantum computing is a rapidly growing field with great potential for future technology. Quantum computers in the current noisy intermediate-scale quantum (NISQ) era face two major limitations: 1) limited qubit count and 2) vulnerability to errors. Although quantum error correction methods exist, they are not applicable at the current scale of hardware: they require thousands of qubits, while current NISQ systems have hundreds at most. It is, therefore, imperative to make circuits as reliable as possible so that they are robust to the errors that will occur. One common approach is to adjust the compilation process of a circuit to produce a final circuit with improved reliability. However, many decisions made during compilation affect the final performance of the circuit; two of the most critical are the mapping of logical to physical qubits (the qubit allocation problem) and the movement of qubits to satisfy two-qubit gate adjacency requirements (the qubit routing problem). We focus on solving the qubit allocation problem and identifying initial layouts that reduce error. To identify these layouts, we combine reinforcement learning with a graph neural network (GNN)-based Q-network that analyzes both the connectivity and the error rates of the graph-like backend of superconducting quantum computers to make mapping decisions, creating a GNN-assisted compilation (GNAQC) strategy. We provide both the circuit and the properties of the target backend as input to guide the decision-making process. We use the IBM Qiskit application programming interface to compile and simulate our quantum circuits. We train the architecture on a set of four backends and six circuits and find that GNAQC generally outperforms preexisting qubit allocation algorithms, increasing final relative output fidelity by roughly 12.7%.

Bibliographic Details
Main Authors: Travis LeCompte, Fang Qi, Xu Yuan, Nian-Feng Tzeng, M. Hassan Najafi, Lu Peng
Format: Article
Language: English
Published: IEEE, 2023-01-01
Series: IEEE Transactions on Quantum Engineering
Subjects: Fidelity; graph neural networks (GNNs); quantum compilation; qubit allocation
Online Access: https://ieeexplore.ieee.org/document/10209261/
Collection: DOAJ (Directory of Open Access Journals)
ISSN: 2689-1808
DOAJ record ID: doaj.art-e7a43d8b9b6a4e778fd6c47bac819828
Author Details:
Travis LeCompte, Louisiana State University, Baton Rouge, LA, USA (ORCID: 0000-0002-6915-3545)
Fang Qi, Tulane University, New Orleans, LA, USA
Xu Yuan, University of Delaware, Newark, DE, USA (ORCID: 0000-0003-3775-3033)
Nian-Feng Tzeng, University of Louisiana at Lafayette, Lafayette, LA, USA
M. Hassan Najafi, University of Louisiana at Lafayette, Lafayette, LA, USA (ORCID: 0000-0002-4655-6229)
Lu Peng, Tulane University, New Orleans, LA, USA (ORCID: 0000-0003-3545-286X)
DOI: 10.1109/TQE.2023.3301899
Keywords: Fidelity; graph neural networks (GNNs); quantum compilation; qubit allocation