Interpolating convolutional neural networks using batch normalization

Perceiving a visual concept as a mixture of learned ones is natural for humans, aiding them to grasp new concepts and strengthening old ones. For all their power and recent success, deep convolutional networks do not have this ability. Inspired by recent work on universal representations for neural networks, we propose a simple emulation of this mechanism by purposing batch normalization layers to discriminate visual classes, and formulating a way to combine them to solve new tasks. We show that this can be applied for 2-way few-shot learning where we obtain between 4% and 17% better accuracy compared to straightforward full fine-tuning, and demonstrate that it can also be extended to the orthogonal application of style transfer.

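For a concrete picture of the mechanism the abstract describes, the following is a minimal PyTorch sketch, not the authors' implementation: convolutional weights are shared while each visual class keeps its own batch-normalization parameters, and a new task is handled by linearly blending two classes' parameters. The class name InterpolatedBN, the mixing weight alpha, and the toy layer sizes are illustrative assumptions.

```python
# Sketch: interpolate per-class batch-normalization parameters over a shared conv layer.
import torch
import torch.nn as nn


class InterpolatedBN(nn.Module):
    """Blend the affine parameters and running statistics of two per-class BN layers."""

    def __init__(self, bn_a: nn.BatchNorm2d, bn_b: nn.BatchNorm2d, alpha: float):
        super().__init__()
        self.bn_a, self.bn_b, self.alpha = bn_a, bn_b, alpha

    def forward(self, x):
        a, b, t = self.bn_a, self.bn_b, self.alpha
        weight = torch.lerp(a.weight, b.weight, t)            # gamma
        bias = torch.lerp(a.bias, b.bias, t)                  # beta
        mean = torch.lerp(a.running_mean, b.running_mean, t)  # running mean
        var = torch.lerp(a.running_var, b.running_var, t)     # running variance
        return nn.functional.batch_norm(x, mean, var, weight, bias,
                                        training=False, eps=a.eps)


# Shared convolution; one BN per "class", assumed trained separately (training loop omitted).
conv = nn.Conv2d(3, 16, 3, padding=1)
bn_class_a = nn.BatchNorm2d(16)
bn_class_b = nn.BatchNorm2d(16)

x = torch.randn(4, 3, 32, 32)
mixed_bn = InterpolatedBN(bn_class_a, bn_class_b, alpha=0.5)
features = torch.relu(mixed_bn(conv(x)))  # features for the interpolated "task"
print(features.shape)                     # torch.Size([4, 16, 32, 32])
```
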
Bibliographic Details
Main Authors: Data, G, Ngu, K, Murray, D, Prisacariu, V
Format: Conference item
Published: Springer 2018
Collection: OXFORD
ID: oxford-uuid:3be8efd8-cb06-458e-9f4b-62dd2d0693a9
Institution: University of Oxford