Variational Monte Carlo with large patched transformers

Abstract

Large language models, like transformers, have recently demonstrated immense power in text and image generation. This success is driven by their ability to capture long-range correlations between elements in a sequence. The same feature makes the transformer a powerful wavefunction ansatz th...
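
The abstract describes an autoregressive transformer used as a wavefunction ansatz over patches of qubits. Below is a minimal sketch of that idea in PyTorch, not the authors' implementation: all names, hyperparameters, and the positive-amplitude choice psi(s) = sqrt(p(s)) are illustrative assumptions.

    import torch
    import torch.nn as nn

    # Hypothetical sketch of a "patched" autoregressive transformer ansatz:
    # each token is one patch of qubits, so the model outputs a
    # 2^patch_size-way categorical conditional per patch.
    class PatchedTransformerAnsatz(nn.Module):
        def __init__(self, n_patches, patch_size, d_model=32, n_heads=4,
                     n_layers=2):
            super().__init__()
            self.n_patches = n_patches
            self.vocab = 2 ** patch_size                 # patch configurations
            self.embed = nn.Embedding(self.vocab + 1, d_model)  # +1: start token
            self.pos = nn.Parameter(torch.zeros(n_patches, d_model))
            layer = nn.TransformerEncoderLayer(d_model, n_heads, dropout=0.0,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, n_layers)
            self.head = nn.Linear(d_model, self.vocab)

        def _logits(self, tokens):
            # tokens: (batch, n_patches); shift right, prepend the start token
            start = torch.full((tokens.shape[0], 1), self.vocab,
                               dtype=torch.long, device=tokens.device)
            x = self.embed(torch.cat([start, tokens[:, :-1]], dim=1)) + self.pos
            mask = nn.Transformer.generate_square_subsequent_mask(self.n_patches)
            return self.head(self.encoder(x, mask=mask))  # causal conditionals

        def log_prob(self, tokens):
            logp = torch.log_softmax(self._logits(tokens), dim=-1)
            return logp.gather(-1, tokens.unsqueeze(-1)).squeeze(-1).sum(-1)

        @torch.no_grad()
        def sample(self, batch):
            # Autoregressive sampling, one patch at a time; the causal mask
            # guarantees position i only conditions on patches < i.
            tokens = torch.zeros(batch, self.n_patches, dtype=torch.long)
            for i in range(self.n_patches):
                probs = torch.softmax(self._logits(tokens)[:, i], dim=-1)
                tokens[:, i] = torch.multinomial(probs, 1).squeeze(-1)
            return tokens

    # Positive ansatz (an assumption here): psi(s) = sqrt(p(s)), so
    # |psi(s)|^2 = p(s) and VMC draws exact samples from |psi|^2,
    # with no Markov-chain autocorrelation.
    model = PatchedTransformerAnsatz(n_patches=16, patch_size=4)  # 64 qubits
    samples = model.sample(8)                                     # (8, 16) patches
    log_amp = 0.5 * model.log_prob(samples)                       # log|psi(s)|

Grouping several qubits into one patch token shortens the sequence the transformer must process, which is the source of the speedup the abstract attributes to the patched architecture.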

Bibliographic Details
Main Authors: Kyle Sprague, Stefanie Czischek
Format: Article
Language: English
Published: Nature Portfolio, 2024-03-01
Series: Communications Physics
Online Access: https://doi.org/10.1038/s42005-024-01584-y