Surrogate- and invariance-boosted contrastive learning for data-scarce applications in science
Deep learning techniques usually require large quantities of training data and can be challenging to apply when data are scarce. The authors propose a framework that combines contrastive and transfer learning, reducing the amount of training data required while maintaining prediction accuracy.
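The contrastive component of such frameworks typically builds on objectives of the InfoNCE/NT-Xent family, which pull embeddings of two views of the same sample together while pushing apart embeddings of different samples. A minimal sketch in PyTorch, for intuition only; the function name, temperature, and tensor shapes are illustrative assumptions, not the authors' implementation:

```python
# Illustrative InfoNCE-style contrastive loss (not the paper's code).
import torch
import torch.nn.functional as F

def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """Pull paired embeddings (z1[i], z2[i]) together; push apart all other pairs."""
    z1 = F.normalize(z1, dim=1)                            # unit-normalize each embedding
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.T / temperature                       # scaled cosine-similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)   # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

# Example: embeddings of two augmented views of the same batch
z_a = torch.randn(32, 128)  # view A: batch of 32, embedding dim 128
z_b = torch.randn(32, 128)  # view B
loss = info_nce_loss(z_a, z_b)
```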
Main Authors: | Charlotte Loh, Thomas Christensen, Rumen Dangovski, Samuel Kim, Marin Soljačić |
---|---|
Format: | Article |
Language: | English |
Published: | Nature Portfolio, 2022-07-01 |
Series: | Nature Communications |
Online Access: | https://doi.org/10.1038/s41467-022-31915-y |
author | Charlotte Loh; Thomas Christensen; Rumen Dangovski; Samuel Kim; Marin Soljačić |
---|---|
collection | DOAJ |
description | Deep learning techniques usually require large quantities of training data and can be challenging to apply when data are scarce. The authors propose a framework that combines contrastive and transfer learning, reducing the amount of training data required while maintaining prediction accuracy. |
format | Article |
id | doaj.art-07829645c7eb4f6cb3801a7892810430 |
institution | Directory Open Access Journal |
issn | 2041-1723 |
language | English |
publishDate | 2022-07-01 |
publisher | Nature Portfolio |
record_format | Article |
series | Nature Communications |
affiliations | Charlotte Loh: Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology; Thomas Christensen: Department of Physics, Massachusetts Institute of Technology; Rumen Dangovski: Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology; Samuel Kim: Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology; Marin Soljačić: Department of Physics, Massachusetts Institute of Technology |
title | Surrogate- and invariance-boosted contrastive learning for data-scarce applications in science |
url | https://doi.org/10.1038/s41467-022-31915-y |