Multi-Source Domain Adaptation with Mixture of Experts
© 2018 Association for Computational Linguistics

We propose a mixture-of-experts approach for unsupervised domain adaptation from multiple sources. The key idea is to explicitly capture the relationship between a target example and different source domains. This relationship, expressed by a point-to...
Main Authors: Guo, Jiang; Shah, Darsh; Barzilay, Regina
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Format: Article
Language: English
Published: Association for Computational Linguistics (ACL), 2021
Online Access: https://hdl.handle.net/1721.1/137419
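The abstract describes weighting per-domain experts by how close a target example is to each source domain's set of examples. A minimal sketch of that idea (all names, the mean-Euclidean point-to-set distance, and the softmax weighting are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def moe_predict(target, source_sets, expert_predict, temperature=1.0):
    """Combine per-source expert predictions, weighting each expert by
    the negative distance from the target example to that source's
    example set -- a simple stand-in for a learned point-to-set metric."""
    # Distance from the target to each source domain: here, the mean
    # Euclidean distance to that domain's examples (an assumption).
    dists = np.array([
        np.mean(np.linalg.norm(examples - target, axis=1))
        for examples in source_sets
    ])
    # Softmax over negative distances: closer domains get larger weights.
    logits = -dists / temperature
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()
    # Weighted mixture of the experts' predictions.
    preds = np.array([expert_predict(i, target) for i in range(len(source_sets))])
    return weights @ preds, weights
```

Under this sketch, a source domain whose examples lie near the target example dominates the mixture, which matches the stated goal of explicitly capturing the target-to-source relationship.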
Similar Items
- Multi-source domain adaptation with mixture of experts
  by: Shah, Darsh J. (Darsh Jaidip)
  Published: (2019)
- Nutri-bullets Hybrid: Consensual Multi-document Summarization
  by: Shah, Darsh, et al.
  Published: (2022)
- Aspect-augmented Adversarial Networks for Domain Adaptation
  by: Zhang, Yuan, et al.
  Published: (2021)
- Automatic online multi-source domain adaptation
  by: Xie, Renchunzi, et al.
  Published: (2022)
- Domain consistency regularization for unsupervised multi-source domain adaptive classification
  by: Luo, Zhipeng, et al.
  Published: (2023)