Texture networks: Feed-forward synthesis of textures and stylized images
Gatys et al. recently demonstrated that deep networks can generate beautiful textures and stylized images from a single texture example. However, their methods require a slow and memory-consuming optimization process. We propose here an alternative approach that moves the computational burden to a learning stage. […]
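The abstract's key idea is that synthesis cost is paid once, at training time: a compact feed-forward convolutional generator is trained for a texture, after which new samples come from a single forward pass instead of a per-image optimization. The snippet below is a minimal sketch of that idea, not the authors' actual multi-scale architecture or training loss; the class name, layer sizes, and noise-channel count are illustrative assumptions only.

```python
# Minimal sketch (NOT the paper's exact texture network): a small feed-forward
# generator maps a noise tensor to an RGB image in one pass, so generating a new
# texture sample requires no iterative optimization at test time.
import torch
import torch.nn as nn

class TextureGenerator(nn.Module):  # hypothetical name, for illustration
    def __init__(self, noise_channels: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(noise_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1), nn.Sigmoid(),  # RGB in [0, 1]
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)

# Because the network is fully convolutional, the spatial size of the noise
# input controls the size of the synthesized texture ("arbitrary size").
g = TextureGenerator()
z = torch.rand(1, 8, 256, 256)   # random noise input
sample = g(z)                     # shape: (1, 3, 256, 256), one forward pass
```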
Main authors: Ulyanov, D., Lebedev, V., Vedaldi, A., Lempitsky, V.
Format: Conference item
Published: Association for Computing Machinery, 2016
Similar documents
- Improved texture networks: Maximizing quality and diversity in feed-forward stylization and texture synthesis
  by: Ulyanov, D., et al.
  Published: (2017)
- Video Texture Synthesis Based on Flow-Like Stylization Painting
  by: Qian Wenhua, et al.
  Published: (2014-01-01)
- Deep image prior
  by: Ulyanov, D., et al.
  Published: (2018)
- Deep image prior
  by: Ulyanov, D., et al.
  Published: (2020)
- It takes (only) two: adversarial generator-encoder networks
  by: Ulyanov, D., et al.
  Published: (2018)