Texture networks: Feed-forward synthesis of textures and stylized images
Gatys et al. recently demonstrated that deep networks can generate beautiful textures and stylized images from a single texture example. However, their methods require a slow and memory-consuming optimization process. We propose here an alternative approach that moves the computational burden to a le...
| Main authors: | Ulyanov, D, Lebedev, V, Vedaldi, A, Lempitsky, V |
| --- | --- |
| Format: | Conference item |
| Published: | Association for Computing Machinery, 2016 |
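The truncated abstract describes the core idea of the paper: replace the slow, memory-consuming per-image optimization of Gatys et al. with a compact generator network that is trained once and then synthesizes a texture or stylized image in a single forward pass. The snippet below is a minimal illustrative sketch of that feed-forward pattern, assuming a PyTorch-style setup; the `TextureGenerator` class, its layer sizes, and the noise-input shape are hypothetical choices for illustration, not the published architecture, and the training stage (fitting the generator against a texture/perceptual loss) is omitted.

```python
# Minimal sketch of feed-forward texture synthesis (illustrative, not the
# authors' code): a small convolutional generator maps a random noise field
# to an RGB texture in one forward pass, instead of iteratively optimizing
# image pixels as in Gatys et al.
import torch
import torch.nn as nn


class TextureGenerator(nn.Module):
    """Hypothetical compact generator: noise field in, RGB texture out."""

    def __init__(self, noise_channels: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(noise_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 3, kernel_size=1),
            nn.Sigmoid(),  # map output to the [0, 1] image range
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)


# Once trained, synthesis is a single forward pass, which is why it is fast
# and light on memory compared with per-image optimization.
generator = TextureGenerator()
noise = torch.rand(1, 8, 256, 256)   # random input field
texture = generator(noise)           # (1, 3, 256, 256) synthesized texture
```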
Similar records
- Improved texture networks: Maximizing quality and diversity in feed-forward stylization and texture synthesis
  by: Ulyanov, D, et al.
  Published: (2017)
- Video Texture Synthesis Based on Flow-Like Stylization Painting
  by: Qian Wenhua, et al.
  Published: (2014-01-01)
- Deep image prior
  by: Ulyanov, D, et al.
  Published: (2018)
- Deep image prior
  by: Ulyanov, D, et al.
  Published: (2020)
- It takes (only) two: adversarial generator-encoder networks
  by: Ulyanov, D, et al.
  Published: (2018)