Multi-Source Domain Adaptation with Mixture of Experts

We propose a mixture-of-experts approach for unsupervised domain adaptation from multiple sources. The key idea is to explicitly capture the relationship between a target example and different source domains. This relationship, expressed by a point-to...
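The abstract describes weighting domain-specific predictors by the relationship between a target example and each source domain. As a rough illustration only, not the authors' implementation, the following minimal PyTorch sketch combines one expert classifier per source domain using weights derived from a simple point-to-set distance, here taken as the distance from the target example's encoding to the mean encoding of each source domain's examples; all names and the specific choice of distance are assumptions.

    import torch
    import torch.nn as nn

    class MixtureOfDomainExperts(nn.Module):
        """Illustrative sketch: per-domain experts combined with weights
        derived from a point-to-set distance between a target example's
        encoding and each source domain's encoded examples (assumed form)."""

        def __init__(self, input_dim, hidden_dim, num_classes, num_domains):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
            # One expert classifier per source domain.
            self.experts = nn.ModuleList(
                [nn.Linear(hidden_dim, num_classes) for _ in range(num_domains)]
            )

        def forward(self, x, domain_supports):
            # x: (batch, input_dim); domain_supports: list of (n_k, input_dim) tensors,
            # one set of example features per source domain.
            h = self.encoder(x)                                    # (batch, hidden)
            # Point-to-set distance, simplified here to the distance from each
            # target encoding to the mean encoding of each domain's examples.
            centers = torch.stack(
                [self.encoder(s).mean(dim=0) for s in domain_supports]
            )                                                      # (K, hidden)
            dists = torch.cdist(h, centers)                        # (batch, K)
            alpha = torch.softmax(-dists, dim=1)                   # mixture weights
            # Each expert predicts; combine predictions with the mixture weights.
            probs = torch.stack(
                [torch.softmax(e(h), dim=1) for e in self.experts], dim=1
            )                                                      # (batch, K, C)
            return (alpha.unsqueeze(-1) * probs).sum(dim=1)        # (batch, C)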


Bibliographic Details
Main Authors: Guo, Jiang; Shah, Darsh; Barzilay, Regina
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Format: Article
Language: English
Published: Association for Computational Linguistics (ACL), 2021
Online Access: https://hdl.handle.net/1721.1/137419