Improved texture networks: Maximizing quality and diversity in feed-forward stylization and texture synthesis
The recent work of Gatys et al., who characterized the style of an image by the statistics of convolutional neural network filters, ignited a renewed interest in the texture generation and image stylization problems. While their image generation technique uses a slow optimization process, recently several authors have proposed to learn generator neural networks that can produce similar outputs in one quick forward pass. While generator networks are promising, they are still inferior in visual quality and diversity compared to generation-by-optimization. In this work, we advance them in two significant ways. First, we introduce an instance normalization module to replace batch normalization with significant improvements to the quality of image stylization. Second, we improve diversity by introducing a new learning formulation that encourages generators to sample unbiasedly from the Julesz texture ensemble, which is the equivalence class of all images characterized by certain filter responses. Together, these two improvements take feed-forward texture synthesis and image stylization much closer to the quality of generation-via-optimization, while retaining the speed advantage.
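As a rough illustration of the instance normalization the abstract describes, the minimal NumPy sketch below (a hypothetical helper, not the authors' implementation; the learnable scale and shift used in practice are omitted) normalizes each channel of each image over its own spatial dimensions, rather than over the whole batch as batch normalization does.

```python
import numpy as np

def instance_norm(x, eps=1e-5):
    # Illustrative sketch only. x: activations of shape (batch, channels, height, width).
    # Statistics are computed per (sample, channel) pair, over the spatial axes only;
    # batch normalization would instead pool them over the batch axis as well.
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)
```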
Main Authors: | Ulyanov, D; Vedaldi, A; Lempitsky, V |
---|---|
Format: | Conference item |
Published: | Institute of Electrical and Electronics Engineers, 2017 |
Institution: | University of Oxford |
Record ID: | oxford-uuid:ac8980a6-8bb7-400c-8291-b3d83c336289 |