MetaFun: Meta-Learning with Iterative Functional Updates
Main authors: , , , ,
Format: Journal article
Language: English
Published in: MLResearch Press, 2020
Abstract: We develop a functional encoder-decoder approach to supervised meta-learning, where labeled data is encoded into an infinite-dimensional functional representation rather than a finite-dimensional one. Furthermore, rather than directly producing the representation, we learn a neural update rule resembling functional gradient descent which iteratively improves the representation. The final representation is used to condition the decoder to make predictions on unlabeled data. Our approach is the first to demonstrate the success of encoder-decoder-style meta-learning methods, such as conditional neural processes, on large-scale few-shot classification benchmarks such as miniImageNet and tieredImageNet, where it achieves state-of-the-art performance.
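The iterative functional-update scheme described in the abstract can be sketched numerically. In the toy sketch below, the learned update network is replaced by a simple residual (label minus current representation) and the propagation of updates across inputs uses a fixed RBF kernel; both are illustrative stand-ins for the paper's learned components, and all function names and hyperparameters are hypothetical.

```python
import numpy as np

def rbf_kernel(a, b, scale=1.0):
    # Pairwise RBF kernel between rows of a and b
    # (a fixed stand-in for MetaFun's learned kernel/attention).
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * scale ** 2))

def metafun_step(r_context, x_context, y_context, x_query, lr=0.5):
    # One iterative functional update (simplified sketch):
    # compute a local update at each labeled context point from the
    # current representation, then propagate it to query inputs.
    u = y_context - r_context              # stand-in for the learned update network
    K = rbf_kernel(x_query, x_context)     # kernel-weighted propagation
    K = K / K.sum(axis=1, keepdims=True)   # normalize rows
    return lr * (K @ u)

# Toy 1-D regression: labeled context points drawn from sin(x).
x_c = np.linspace(-3, 3, 10)[:, None]
y_c = np.sin(x_c)
x_t = np.linspace(-3, 3, 50)[:, None]      # unlabeled target inputs

r_c = np.zeros_like(y_c)                   # functional representation at context inputs
r_t = np.zeros((50, 1))                    # ... and at target inputs
for _ in range(20):                        # T refinement iterations
    r_t += metafun_step(r_c, x_c, y_c, x_t)
    r_c += metafun_step(r_c, x_c, y_c, x_c)

# After refinement, the representation at target inputs should track sin(x);
# in the full method it would instead condition a decoder.
err = np.abs(r_t[:, 0] - np.sin(x_t[:, 0])).mean()
print(f"mean abs error vs. sin(x): {err:.3f}")
```

The iteration starts from a zero-initialized representation and refines it with shared update steps, mirroring the abstract's "iteratively improves the representation"; in the actual method the update rule and propagation are learned end-to-end and the final representation conditions a decoder rather than being read out directly.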