Surrogate- and invariance-boosted contrastive learning for data-scarce applications in science
Deep learning techniques typically require large quantities of training data and can be difficult to apply when data are scarce. The authors propose a framework that combines contrastive and transfer learning, reducing the amount of training data required while maintaining prediction accuracy.
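The abstract names contrastive learning as a core ingredient of the framework. As a rough illustration only, the sketch below shows a standard SimCLR-style NT-Xent contrastive loss in PyTorch, where two augmented views of each sample are pulled together in embedding space and all other samples in the batch act as negatives. This is a generic contrastive objective, not the paper's surrogate- or invariance-boosted variant; the function name, temperature value, and toy data are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """Generic NT-Xent contrastive loss (illustrative, not the paper's method).

    z1, z2: (batch, dim) embeddings of two augmented views of the same inputs.
    Positive pairs are (z1[i], z2[i]); all other batch entries are negatives.
    """
    batch = z1.shape[0]
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2B, dim), unit norm
    sim = z @ z.t() / temperature                       # scaled cosine similarities
    # Mask self-similarity so a sample is never its own negative.
    sim.fill_diagonal_(float("-inf"))
    # The positive for row i is its counterpart in the other view.
    targets = torch.cat([torch.arange(batch, 2 * batch),
                         torch.arange(0, batch)])
    return F.cross_entropy(sim, targets)

# Toy usage: random "embeddings" of two views of a batch of 8 samples.
z1, z2 = torch.randn(8, 64), torch.randn(8, 64)
print(nt_xent_loss(z1, z2).item())
```

The temperature hyperparameter controls how sharply the loss concentrates on hard negatives; lower values penalize near-misses more aggressively.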
Main Authors: Charlotte Loh, Thomas Christensen, Rumen Dangovski, Samuel Kim, Marin Soljačić
Format: Article
Language: English
Published: Nature Portfolio, 2022-07-01
Series: Nature Communications
Online Access: https://doi.org/10.1038/s41467-022-31915-y
Similar Items
- Discovering conservation laws using optimal transport and manifold learning
  by: Peter Y. Lu, et al.
  Published: (2023-08-01)
- Shaping Long-lived Electron Wavepackets to Create Customizable Optical Spectra
  by: Dangovski, Rumen, et al.
  Published: (2021)
- Shaping long-lived electron wavepackets for customizable optical spectra
  by: Dangovski, Rumen, et al.
  Published: (2021)
- Representation Learning Through the Lens of Science: Symmetry, Language and Symbolic Inductive Biases
  by: Dangovski, Rumen Rumenov
  Published: (2023)