Minimising disparity in distribution for unsupervised domain adaptation by preserving the local spatial arrangement of data

Bibliographic Details
Main Authors: Suranjana Samanta, Sukhendu Das
Format: Article
Language: English
Published: Wiley 2016-08-01
Series: IET Computer Vision
Online Access: https://doi.org/10.1049/iet-cvi.2015.0322
Description
Summary: Domain adaptation is used in machine learning tasks when the distribution of the training set (obtained from the source domain) differs from that of the testing set (referred to as the target domain). In the work presented in this study, the problem of unsupervised domain adaptation is solved using a novel optimisation function that minimises the global and local discrepancies between the transformed source domain and the target domain. The dissimilarity in data distributions is the major contributor to the global discrepancy between the two domains. The authors propose two techniques to preserve the local structural information of the source domain: (i) identify the closest pairs of instances in the source domain and minimise the distances between these pairs after transformation; (ii) preserve the naturally occurring clusters of the source domain during transformation. The resulting cost function and constraints yield a non-linear optimisation problem, which is used to estimate the weight matrix. An iterative framework solves the optimisation problem, providing a sub-optimal solution. Next, using an orthogonality constraint, an optimisation task is formulated on the Stiefel manifold. Performance analysis on real-world datasets shows that the proposed methods perform better than a few recently published state-of-the-art methods.
ISSN: 1751-9632, 1751-9640
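
To make the kind of cost function described in the summary concrete, the following is a minimal NumPy sketch, not the authors' implementation: the difference of the domain means (a linear-kernel stand-in for a distribution-discrepancy measure) plays the role of the global term, the nearest-pair distances of technique (i) play the role of the local term, and plain gradient descent stands in for the paper's iterative framework. The clustering constraint (ii) and the Stiefel-manifold formulation with the orthogonality constraint are not reproduced; the function names, the trade-off weight lam, and the synthetic data are illustrative assumptions.

# Illustrative sketch only (not the authors' method): fit a linear map W so that
# the transformed source mean moves towards the target mean (a linear-kernel
# stand-in for a distribution discrepancy) while keeping each source instance
# close to its nearest source neighbour after transformation, as in technique (i).
import numpy as np

rng = np.random.default_rng(0)

def nearest_pairs(X):
    # Index of the closest other source instance for every source sample.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    return d2.argmin(axis=1)

def objective(W, Xs, Xt, pairs, lam):
    Z = Xs @ W                                           # transformed source
    global_term = np.sum((Z.mean(0) - Xt.mean(0)) ** 2)  # domain-mean mismatch
    local_term = np.mean(np.sum((Z - Z[pairs]) ** 2, axis=1))
    return global_term + lam * local_term

def gradient(W, Xs, Xt, pairs, lam):
    # Closed-form gradient of the objective above with respect to W.
    Z = Xs @ W
    n = Xs.shape[0]
    g_global = 2.0 * np.outer(Xs.mean(0), Z.mean(0) - Xt.mean(0))
    Dx = Xs - Xs[pairs]
    g_local = (2.0 / n) * (Dx.T @ (Z - Z[pairs]))
    return g_global + lam * g_local

# Synthetic source and target domains with shifted means (toy data).
d = 5
Xs = rng.normal(1.5, 1.0, size=(150, d))
Xt = rng.normal(-1.0, 1.0, size=(180, d))

pairs = nearest_pairs(Xs)
W, step, lam = np.eye(d), 0.05, 0.1
for _ in range(300):                                # plain gradient descent stands in
    W -= step * gradient(W, Xs, Xt, pairs, lam)     # for the paper's iterative framework

print("objective at W = identity:", round(objective(np.eye(d), Xs, Xt, pairs, lam), 3))
print("objective after fitting W:", round(objective(W, Xs, Xt, pairs, lam), 3))

A faithful implementation would replace the mean-difference term with the paper's distribution-discrepancy measure, add the cluster-preservation constraint of technique (ii), and optimise the orthogonality-constrained variant on the Stiefel manifold.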