Surrogate- and invariance-boosted contrastive learning for data-scarce applications in science

Deep learning techniques typically require large quantities of training data, which makes them challenging to apply when data are scarce. The authors propose a framework that combines contrastive and transfer learning to reduce the amount of training data required while maintaining prediction accuracy.
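For orientation, below is a minimal illustrative sketch of the contrastive-pretraining idea the summary refers to: embeddings of two invariance-preserving views of the same input are pulled together with an NT-Xent loss. This is a generic stand-in written in PyTorch, not the authors' SIB-CL implementation; the encoder, batch size, and temperature are assumptions.

    # Minimal sketch of contrastive pretraining on invariance-based views.
    # Hypothetical stand-in for illustration, not the paper's SIB-CL code.
    import torch
    import torch.nn.functional as F

    def nt_xent_loss(z1, z2, temperature=0.5):
        """NT-Xent loss over two batches of paired embeddings.

        z1, z2: (batch, dim) embeddings of two invariance-preserving
        views (e.g. symmetry transforms) of the same inputs.
        """
        batch = z1.size(0)
        z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2B, D), unit norm
        sim = z @ z.T / temperature                         # cosine similarities
        sim.fill_diagonal_(float("-inf"))                   # exclude self-pairs
        # positive for sample i is its other view at index (i + B) mod 2B
        targets = torch.cat([torch.arange(batch) + batch,
                             torch.arange(batch)])
        return F.cross_entropy(sim, targets)

    # toy usage with random embeddings standing in for a trained encoder
    z1, z2 = torch.randn(8, 32), torch.randn(8, 32)
    print(nt_xent_loss(z1, z2).item())

In the data-scarce setting the summary describes, such pretraining would be done on cheap surrogate data, with the learned representation then transferred and fine-tuned on the small target dataset.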

Bibliographic Details
Main Authors: Charlotte Loh, Thomas Christensen, Rumen Dangovski, Samuel Kim, Marin Soljačić
Format: Article
Language: English
Published: Nature Portfolio 2022-07-01
Series: Nature Communications
Online Access: https://doi.org/10.1038/s41467-022-31915-y