Integrating Pretrained Encoders for Generalized Face Frontalization
In the field of face frontalization, a model trained on a particular dataset often underperforms on other datasets. This paper presents the Pre-trained Feature Transformation GAN (PFT-GAN), which is designed to fully utilize the diverse facial feature information available from pre-trained face recognition networks.
Main Authors: | Wonyoung Choi, Gi Pyo Nam, Junghyun Cho, Ig-Jae Kim, Hyeong-Seok Ko |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2024-01-01 |
Series: | IEEE Access |
Subjects: | Face frontalization; face pose normalization; face recognition; generative modeling |
Online Access: | https://ieeexplore.ieee.org/document/10472503/ |
_version_ | 1797236544129466368 |
---|---|
author | Wonyoung Choi; Gi Pyo Nam; Junghyun Cho; Ig-Jae Kim; Hyeong-Seok Ko |
author_facet | Wonyoung Choi; Gi Pyo Nam; Junghyun Cho; Ig-Jae Kim; Hyeong-Seok Ko |
author_sort | Wonyoung Choi |
collection | DOAJ |
description | In the field of face frontalization, a model trained on a particular dataset often underperforms on other datasets. This paper presents the Pre-trained Feature Transformation GAN (PFT-GAN), which is designed to fully utilize the diverse facial feature information available from pre-trained face recognition networks. For that purpose, we propose the feature attention transformation (FAT) module, which effectively transfers low-level facial features to the facial generator. In addition, to reduce dependency on any single pre-trained encoder, we propose a new FAT module organization that accommodates the features from all pre-trained face recognition networks employed. This paper evaluates the proposed work using an “independent critic” as well as a “dependent critic”, which enables objective judgment. Experimental results show that the proposed method significantly improves face frontalization performance and helps overcome the bias associated with each pre-trained face recognition network employed. |
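The abstract's core mechanism — attention-weighting low-level features from a pre-trained recognition encoder and projecting them into the generator's feature space — can be illustrated with a minimal sketch. This is an illustrative assumption, not the paper's actual FAT implementation: the function names, shapes, and the per-channel sigmoid gating are all hypothetical, and a real version would use learned convolutional layers rather than fixed weight arrays.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fat_transform(enc_feat, w_attn, w_proj):
    """Hypothetical feature-attention-transformation sketch.

    enc_feat: (C_in, H, W) low-level feature map from a pre-trained encoder
    w_attn:   (C_in,) per-channel attention logits
    w_proj:   (C_out, C_in) 1x1 projection into the generator's channel space
    """
    # Channel attention: gate each encoder channel with a value in (0, 1).
    gate = sigmoid(w_attn)[:, None, None]        # (C_in, 1, 1)
    attended = enc_feat * gate                   # (C_in, H, W)
    # A 1x1 convolution is a matrix product over the channel axis.
    return np.einsum("oc,chw->ohw", w_proj, attended)

rng = np.random.default_rng(0)
feat = rng.standard_normal((64, 28, 28))         # assumed encoder output
out = fat_transform(feat, rng.standard_normal(64),
                    rng.standard_normal((32, 64)))
print(out.shape)  # (32, 28, 28)
```

Under this reading, the multi-encoder organization the abstract mentions would concatenate or sum several such transformed maps (one per pre-trained network) before feeding the generator, so no single encoder's biases dominate.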
first_indexed | 2024-04-24T17:05:32Z |
format | Article |
id | doaj.art-0623a985be3a4ed7bf903cb30e0b5e75 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-04-24T17:05:32Z |
publishDate | 2024-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-0623a985be3a4ed7bf903cb30e0b5e75; indexed 2024-03-28T23:00:33Z; eng; IEEE; IEEE Access; ISSN 2169-3536; published 2024-01-01; vol. 12, pp. 43530–43539; doi:10.1109/ACCESS.2024.3377220; IEEE document 10472503. Integrating Pretrained Encoders for Generalized Face Frontalization. Authors: Wonyoung Choi (ORCID 0009-0001-2438-6855; Seoul National University, Seoul, Republic of Korea); Gi Pyo Nam (ORCID 0000-0002-3383-7806; Korea Institute of Science and Technology, Seoul, Republic of Korea); Junghyun Cho (ORCID 0000-0003-1913-8037; Korea Institute of Science and Technology, Seoul, Republic of Korea); Ig-Jae Kim (ORCID 0000-0002-2741-7047; Korea Institute of Science and Technology, Seoul, Republic of Korea); Hyeong-Seok Ko (Seoul National University, Seoul, Republic of Korea). Abstract as in the description field above. https://ieeexplore.ieee.org/document/10472503/ Keywords: face frontalization; face pose normalization; face recognition; generative modeling |
spellingShingle | Wonyoung Choi; Gi Pyo Nam; Junghyun Cho; Ig-Jae Kim; Hyeong-Seok Ko. Integrating Pretrained Encoders for Generalized Face Frontalization. IEEE Access. Face frontalization; face pose normalization; face recognition; generative modeling |
title | Integrating Pretrained Encoders for Generalized Face Frontalization |
title_full | Integrating Pretrained Encoders for Generalized Face Frontalization |
title_fullStr | Integrating Pretrained Encoders for Generalized Face Frontalization |
title_full_unstemmed | Integrating Pretrained Encoders for Generalized Face Frontalization |
title_short | Integrating Pretrained Encoders for Generalized Face Frontalization |
title_sort | integrating pretrained encoders for generalized face frontalization |
topic | Face frontalization; face pose normalization; face recognition; generative modeling |
url | https://ieeexplore.ieee.org/document/10472503/ |
work_keys_str_mv | AT wonyoungchoi integratingpretrainedencodersforgeneralizedfacefrontalization AT gipyonam integratingpretrainedencodersforgeneralizedfacefrontalization AT junghyuncho integratingpretrainedencodersforgeneralizedfacefrontalization AT igjaekim integratingpretrainedencodersforgeneralizedfacefrontalization AT hyeongseokko integratingpretrainedencodersforgeneralizedfacefrontalization |