Rewriting a Deep Generative Model

© 2020, Springer Nature Switzerland AG. A deep generative model such as a GAN learns to model a rich set of semantic and physical rules about the target distribution, but until now it has been unclear how such rules are encoded in the network, or how a rule could be changed. In this paper, we introduce a new problem setting: manipulation of specific rules encoded by a deep generative model. To address the problem, we propose a formulation in which the desired rule is changed by manipulating a layer of a deep network as a linear associative memory. We derive an algorithm for modifying one entry of the associative memory, and we demonstrate that several interesting structural rules can be located and modified within the layers of state-of-the-art generative models. We present a user interface that enables users to interactively change the rules of a generative model to achieve desired effects, and we show several proof-of-concept applications. Finally, results on multiple datasets demonstrate the advantage of our method over standard fine-tuning methods and edit-transfer algorithms.
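
The abstract's central mechanism can be made concrete with a small numerical sketch. The snippet below is an illustrative reconstruction, not the authors' released implementation: it treats a layer's weight matrix W as a linear associative memory and computes the minimal rank-one change, weighted by an estimated key covariance C, that forces a chosen key k_star to map to a target value v_star. All variable names and the covariance estimate are assumptions made here for clarity.

import numpy as np

def rewrite_one_entry(W, C, k_star, v_star):
    # Return W' that satisfies W' @ k_star == v_star while changing the
    # memory as little as possible under the key covariance C.
    #   W      : (out_dim, in_dim) layer weights viewed as an associative memory
    #   C      : (in_dim, in_dim)  covariance of the keys the layer typically sees
    #   k_star : (in_dim,)         key whose stored value should be overwritten
    #   v_star : (out_dim,)        desired value for that key
    d = np.linalg.solve(C, k_star)                # update direction d = C^{-1} k*
    error = v_star - W @ k_star                   # residual at the new key
    delta = np.outer(error, d) / (k_star @ d)     # rank-one correction
    return W + delta

# Tiny usage example with random data.
rng = np.random.default_rng(0)
keys = rng.normal(size=(64, 1000))                # 1000 sample keys of dimension 64
C = keys @ keys.T / keys.shape[1]                 # empirical key covariance
W = rng.normal(size=(32, 64))
k_star, v_star = rng.normal(size=64), rng.normal(size=32)
W_new = rewrite_one_entry(W, C, k_star, v_star)
assert np.allclose(W_new @ k_star, v_star)        # the new association holds exactly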

Bibliographic Details
Main Authors: Bau, D, Liu, S, Wang, T, Zhu, JY, Torralba, A
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Format: Article (Conference Paper)
Language: English
Published: Springer International Publishing, 2021
Published in: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 12346 LNCS
Citation: Bau, D, Liu, S, Wang, T, Zhu, JY and Torralba, A. 2020. "Rewriting a Deep Generative Model."
DOI: 10.1007/978-3-030-58452-8_21
Rights: Creative Commons Attribution-Noncommercial-Share Alike, http://creativecommons.org/licenses/by-nc-sa/4.0/
Online Access: https://hdl.handle.net/1721.1/137596