Gauging tensor networks with belief propagation

Effectively compressing and optimizing tensor networks requires reliable methods for fixing the latent degrees of freedom of the tensors, known as the gauge. Here we introduce a new algorithm for gauging tensor networks using belief propagation, a method that was originally formulated for performing statistical inference on graphical models and has recently found applications in tensor network algorithms. We show that this method is closely related to known tensor network gauging methods. It has the practical advantage, however, that existing belief propagation implementations can be repurposed for tensor network gauging, and that belief propagation is a very simple algorithm based on just tensor contractions, so it can be easier to implement, optimize, and generalize. We present numerical evidence and scaling arguments that this algorithm is faster than existing gauging algorithms, demonstrating its usage on structured, unstructured, and infinite tensor networks. Additionally, we apply this method to improve the accuracy of the widely used simple update gate evolution algorithm.
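
As an illustration of the kind of algorithm described above, the following is a minimal NumPy sketch (not the authors' reference implementation) of belief-propagation message passing on the norm network of a small ring of tensors, followed by a bond-by-bond gauge transformation assembled from the converged messages. Real-valued tensors, the ring geometry, and helper names such as bp_iterate and sqrt_factor are illustrative assumptions, and the gauging conventions used here are one standard construction that may differ in detail from the paper's.

```python
# A minimal, self-contained sketch (not the authors' reference implementation):
# belief propagation on the norm (double-layer) network of a ring of tensors,
# followed by a bond-by-bond gauge transformation built from the fixed-point
# messages. Real-valued tensors and a ring geometry are assumed for brevity.
import numpy as np

N, D, d = 6, 4, 2                                   # sites, bond dim, physical dim
rng = np.random.default_rng(0)
T = [rng.standard_normal((D, D, d)) for _ in range(N)]   # T[i][left, right, phys]

def bp_iterate(tensors, iters=500, tol=1e-12):
    """Iterate BP message updates; each message is a D x D matrix on a bond."""
    n = len(tensors)
    m_fwd = [np.eye(D) for _ in range(n)]           # message site i -> i+1 on bond i
    m_bwd = [np.eye(D) for _ in range(n)]           # message site i+1 -> i on bond i
    for _ in range(iters):
        new_fwd, new_bwd = [], []
        for i in range(n):
            # contract ket and bra copies of a tensor with its incoming message
            f = np.einsum('ab,axp,byp->xy', m_fwd[(i - 1) % n], tensors[i], tensors[i])
            b = np.einsum('ab,xap,ybp->xy', m_bwd[(i + 1) % n],
                          tensors[(i + 1) % n], tensors[(i + 1) % n])
            new_fwd.append(f / np.trace(f))
            new_bwd.append(b / np.trace(b))
        err = max(np.linalg.norm(x - y) for x, y in zip(new_fwd + new_bwd, m_fwd + m_bwd))
        m_fwd, m_bwd = new_fwd, new_bwd
        if err < tol:
            break
    return m_fwd, m_bwd

def sqrt_factor(M):
    """Return X such that X.T @ X = M (M assumed symmetric positive definite)."""
    w, W = np.linalg.eigh(M)
    return np.diag(np.sqrt(np.clip(w, 1e-15, None))) @ W.T

# Converge the messages, then gauge each bond: with m_fwd = X^T X and
# m_bwd = Y^T Y, take the SVD X Y^T = U S V^T and absorb X^{-1} U sqrt(S)
# into the left tensor and Y^{-1} V sqrt(S) into the right tensor.
m_fwd, m_bwd = bp_iterate(T)
for i in range(N):
    j = (i + 1) % N
    X, Y = sqrt_factor(m_fwd[i]), sqrt_factor(m_bwd[i])
    U, s, Vt = np.linalg.svd(X @ Y.T)
    g = np.linalg.inv(X) @ U @ np.diag(np.sqrt(s))
    h = np.linalg.inv(Y) @ Vt.T @ np.diag(np.sqrt(s))
    T[i] = np.einsum('lrp,rs->lsp', T[i], g)        # right leg of site i
    T[j] = np.einsum('lrp,ls->srp', T[j], h)        # left leg of site i+1

# In the gauged network the BP fixed-point messages should be diagonal.
m_fwd, m_bwd = bp_iterate(T)
off = max(np.linalg.norm(m - np.diag(np.diag(m))) for m in m_fwd + m_bwd)
print(f"largest off-diagonal message weight after gauging: {off:.2e}")
```

The final check re-runs belief propagation on the gauged tensors and reports how far the converged messages are from diagonal form; for generic random tensors this residual should be close to machine precision.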


Bibliographic Details
Main Authors: Joseph Tindall, Matt Fishman
Format: Article
Language: English
Published: SciPost, 2023-12-01
Series: SciPost Physics
ISSN: 2542-4653
DOI: 10.21468/SciPostPhys.15.6.222
Online Access: https://scipost.org/SciPostPhys.15.6.222