A Novel Orthogonality Loss for Deep Hierarchical Multi-Task Learning

In this paper, a novel loss function is proposed to measure the correlation among different learning tasks and to select useful feature components for each classification task. First, the proposed knowledge map is used to organize the affiliation relationships between objects in the natural world. Second, a novel loss function, the orthogonality loss, is proposed to make the deep features more discriminative by removing useless feature components. Furthermore, to prevent the extracted feature maps from becoming too divergent and causing over-fitting, which would reduce network performance, an orthogonal distribution regularization term is added to constrain the distribution of the network parameters. Finally, the proposed orthogonality loss is applied in a multi-task network structure to learn more discriminative deep features and to evaluate the validity of the proposed loss function. The results show that, compared with a traditional deep convolutional neural network and a multi-task network without orthogonality loss, the multi-task network with orthogonality loss performs significantly better on image classification.

Bibliographic Details
Main Authors: Guiqing He, Yincheng Huo, Mingyao He, Haixi Zhang, Jianping Fan
Format: Article
Language: English
Published: IEEE, 2020-01-01
Series: IEEE Access
Subjects: Orthogonality loss; multi-task learning; orthogonal distribution regularization
Online Access: https://ieeexplore.ieee.org/document/9057591/
_version_ 1818427959164272640
author Guiqing He
Yincheng Huo
Mingyao He
Haixi Zhang
Jianping Fan
author_facet Guiqing He
Yincheng Huo
Mingyao He
Haixi Zhang
Jianping Fan
author_sort Guiqing He
collection DOAJ
description In this paper, a novel loss function is proposed to measure the correlation among different learning tasks and to select useful feature components for each classification task. First, the proposed knowledge map is used to organize the affiliation relationships between objects in the natural world. Second, a novel loss function, the orthogonality loss, is proposed to make the deep features more discriminative by removing useless feature components. Furthermore, to prevent the extracted feature maps from becoming too divergent and causing over-fitting, which would reduce network performance, an orthogonal distribution regularization term is added to constrain the distribution of the network parameters. Finally, the proposed orthogonality loss is applied in a multi-task network structure to learn more discriminative deep features and to evaluate the validity of the proposed loss function. The results show that, compared with a traditional deep convolutional neural network and a multi-task network without orthogonality loss, the multi-task network with orthogonality loss performs significantly better on image classification.
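Note: the record does not include the paper's actual formulas, so the following is only a minimal illustrative sketch of how an orthogonality loss between task-specific features and an orthogonal distribution regularizer on network weights might be implemented. The function names, the soft-orthogonality form of the regularizer, and the two-task (coarse/fine) setup are assumptions for illustration, not the authors' exact method.

```python
# Illustrative sketch only (PyTorch). Assumes:
#  (a) the "orthogonality loss" penalizes correlation between the feature
#      components used by two related classification tasks, and
#  (b) the "orthogonal distribution regularization" is a soft orthogonality
#      penalty ||W W^T - I||_F^2 on a layer's weights.
# All names here are hypothetical.
import torch
import torch.nn.functional as F


def orthogonality_loss(feat_a: torch.Tensor, feat_b: torch.Tensor) -> torch.Tensor:
    """Penalize cross-correlation between two tasks' features (batch x dim)."""
    a = F.normalize(feat_a, dim=1)
    b = F.normalize(feat_b, dim=1)
    # (dim x dim) cross-correlation; driving it toward zero encourages the
    # two tasks to rely on orthogonal feature components.
    cross = a.t() @ b / a.size(0)
    return cross.pow(2).sum()


def orthogonal_distribution_reg(weight: torch.Tensor) -> torch.Tensor:
    """Soft orthogonality constraint on a weight matrix (out_dim x in_dim)."""
    w = weight.view(weight.size(0), -1)
    gram = w @ w.t()
    eye = torch.eye(gram.size(0), device=weight.device)
    return (gram - eye).pow(2).sum()


def total_loss(logits_coarse, logits_fine, y_coarse, y_fine,
               feat_coarse, feat_fine, shared_weight,
               lam_orth=0.1, lam_reg=1e-4):
    """Combine per-task cross-entropy with the two penalties above."""
    ce = F.cross_entropy(logits_coarse, y_coarse) + F.cross_entropy(logits_fine, y_fine)
    return (ce
            + lam_orth * orthogonality_loss(feat_coarse, feat_fine)
            + lam_reg * orthogonal_distribution_reg(shared_weight))
```

In this sketch the coarse and fine classification heads of a hierarchical multi-task network would each produce logits and a feature vector, and the shared backbone weight passed to the regularizer is whichever layer one chooses to constrain; the weighting factors lam_orth and lam_reg are placeholders, not values from the paper.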
first_indexed 2024-12-14T14:54:00Z
format Article
id doaj.art-2ce564ac69f54fc2bcf07964442e80dc
institution Directory Open Access Journal
issn 2169-3536
language English
last_indexed 2024-12-14T14:54:00Z
publishDate 2020-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling doaj.art-2ce564ac69f54fc2bcf07964442e80dc (2022-12-21T22:57:03Z)
Language: eng; Publisher: IEEE; Series: IEEE Access; ISSN: 2169-3536
Published: 2020-01-01; Vol. 8, pp. 67735-67744; DOI: 10.1109/ACCESS.2020.2985991; IEEE article no. 9057591
Title: A Novel Orthogonality Loss for Deep Hierarchical Multi-Task Learning
Authors: Guiqing He, Yincheng Huo, Mingyao He, Haixi Zhang (School of Electronics and Information, Northwestern Polytechnical University, Xi’an, China); Jianping Fan (Department of Computer Science, University of North Carolina at Charlotte, Charlotte, NC, USA; ORCID: https://orcid.org/0000-0003-2693-1172)
Online Access: https://ieeexplore.ieee.org/document/9057591/
Keywords: Orthogonality loss; multi-task learning; orthogonal distribution regularization
spellingShingle Guiqing He
Yincheng Huo
Mingyao He
Haixi Zhang
Jianping Fan
A Novel Orthogonality Loss for Deep Hierarchical Multi-Task Learning
IEEE Access
Orthogonality loss
multi-task learning
orthogonal distribution regularization
title A Novel Orthogonality Loss for Deep Hierarchical Multi-Task Learning
title_full A Novel Orthogonality Loss for Deep Hierarchical Multi-Task Learning
title_fullStr A Novel Orthogonality Loss for Deep Hierarchical Multi-Task Learning
title_full_unstemmed A Novel Orthogonality Loss for Deep Hierarchical Multi-Task Learning
title_short A Novel Orthogonality Loss for Deep Hierarchical Multi-Task Learning
title_sort novel orthogonality loss for deep hierarchical multi task learning
topic Orthogonality loss
multi-task learning
orthogonal distribution regularization
url https://ieeexplore.ieee.org/document/9057591/
work_keys_str_mv AT guiqinghe anovelorthogonalitylossfordeephierarchicalmultitasklearning
AT yinchenghuo anovelorthogonalitylossfordeephierarchicalmultitasklearning
AT mingyaohe anovelorthogonalitylossfordeephierarchicalmultitasklearning
AT haixizhang anovelorthogonalitylossfordeephierarchicalmultitasklearning
AT jianpingfan anovelorthogonalitylossfordeephierarchicalmultitasklearning
AT guiqinghe novelorthogonalitylossfordeephierarchicalmultitasklearning
AT yinchenghuo novelorthogonalitylossfordeephierarchicalmultitasklearning
AT mingyaohe novelorthogonalitylossfordeephierarchicalmultitasklearning
AT haixizhang novelorthogonalitylossfordeephierarchicalmultitasklearning
AT jianpingfan novelorthogonalitylossfordeephierarchicalmultitasklearning