Variational Monte Carlo with large patched transformers
Abstract: Large language models, like transformers, have recently demonstrated immense powers in text and image generation. This success is driven by the ability to capture long-range correlations between elements in a sequence. The same feature makes the transformer a powerful wavefunction ansatz th...
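The record itself contains no code, but as a rough illustration of the technique the title names, below is a minimal NumPy sketch of variational Monte Carlo with an autoregressive wavefunction ansatz. Everything specific in it is an assumption for illustration, not the paper's method: a toy logistic conditional (`conditional`, weights `W`) stands in for the patched transformer, and a 1D transverse-field Ising Hamiltonian (`local_energy`, field `h`) stands in for whatever system the paper studies.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 8    # number of spins (illustrative)
h = 1.0  # transverse field of the example Ising Hamiltonian (illustrative)

# Toy autoregressive model: a fixed logistic conditional. A patched
# transformer would replace `conditional` while keeping the same
# sample / log-probability interface.
W = 0.1 * rng.standard_normal((N, N))  # hypothetical toy weights

def conditional(s, i):
    """Probability that spin i is +1 given spins 0..i-1."""
    logit = W[i, :i] @ s[:i] if i > 0 else 0.0
    return 1.0 / (1.0 + np.exp(-logit))

def sample(n_samples):
    """Draw configurations s in {-1,+1}^N autoregressively, i.e. from |psi|^2."""
    s = np.zeros((n_samples, N))
    for k in range(n_samples):
        for i in range(N):
            up = rng.random() < conditional(s[k], i)
            s[k, i] = 1.0 if up else -1.0
    return s

def log_psi(s):
    """Real, positive ansatz: log psi(s) = 0.5 * log p(s)."""
    lp = 0.0
    for i in range(N):
        p1 = conditional(s, i)
        lp += np.log(p1 if s[i] > 0 else 1.0 - p1)
    return 0.5 * lp

def local_energy(s):
    """E_loc(s) for H = -sum_i sz_i sz_{i+1} - h sum_i sx_i."""
    e = -np.sum(s[:-1] * s[1:])   # diagonal ZZ term
    for i in range(N):            # off-diagonal X term: amplitude ratios
        s_flip = s.copy()
        s_flip[i] *= -1.0
        e -= h * np.exp(log_psi(s_flip) - log_psi(s))
    return e

samples = sample(200)
energies = np.array([local_energy(s) for s in samples])
err = energies.std(ddof=1) / np.sqrt(len(energies))
print(f"VMC energy estimate: {energies.mean():.3f} +/- {err:.3f}")
```

The estimator is the standard VMC average E ≈ (1/M) Σ_k E_loc(s_k) with samples drawn from |ψ|²; swapping in a transformer changes only how the conditional probabilities are produced, not the estimator itself.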
| Main Authors: | Kyle Sprague, Stefanie Czischek |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2024-03-01 |
| Series: | Communications Physics |
| Online Access: | https://doi.org/10.1038/s42005-024-01584-y |
Similar Items
- Sequential Monte Carlo with transformations, by Everitt, R. G., et al. Published: (2019)
- Monte Carlo variational auto-encoders, by Thin, A., et al. Published: (2021)
- Monte-Carlo planning in large POMDPs, by Silver, David, et al. Published: (2015)
- Electronic excited states in deep variational Monte Carlo, by Entwistle, M. T., et al. Published: (2023-01-01)
- Monte Carlo Variational Method and the Ground-State of Helium, by Doma, S. B., et al. Published: (2009-10-01)