Deep Convolutional Networks are Hierarchical Kernel Machines
We extend i-theory to incorporate not only pooling but also rectifying nonlinearities in an extended HW module (eHW) designed for supervised learning. The two operations roughly correspond to invariance and selectivity, respectively. Under the assumption of normalized inputs, we show that appropriat...
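As a rough illustration of the eHW module described in the abstract, here is a minimal sketch (not the authors' implementation): it assumes a finite transformation group given by circular shifts, a ReLU rectifier, and simple average pooling, and the function name `ehw_signature` is hypothetical. It rectifies the dot products of a normalized input with transformed templates (selectivity) and pools them over transformations (invariance).

```python
import numpy as np

def ehw_signature(x, template, biases, n_shifts=8):
    """Hypothetical sketch of an extended HW (eHW) module response.

    Rectifies the dot products <x, g.t> of a normalized input x with a
    template t over a finite set of transformations g (circular shifts),
    then pools over transformations.
    """
    x = x / np.linalg.norm(x)                      # normalized input, as assumed in the abstract
    t = template / np.linalg.norm(template)
    responses = []
    for shift in range(n_shifts):                  # finite transformation group: circular shifts
        g_t = np.roll(t, shift)
        # rectifying nonlinearity (ReLU) over a set of bias/threshold values
        responses.append(np.maximum(np.dot(x, g_t) + biases, 0.0))
    # pooling over transformations yields an (approximately) shift-invariant signature
    return np.mean(np.stack(responses), axis=0)

# usage: one template, a few thresholds, a random input
rng = np.random.default_rng(0)
x = rng.standard_normal(32)
t = rng.standard_normal(32)
print(ehw_signature(x, t, biases=np.linspace(-0.5, 0.5, 5)))
```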
| Main Authors: | Anselmi, Fabio; Rosasco, Lorenzo; Tan, Cheston; Poggio, Tomaso |
|---|---|
| Format: | Technical Report |
| Language: | en_US |
| Published: | Center for Brains, Minds and Machines (CBMM), arXiv, 2015 |
| Online Access: | http://hdl.handle.net/1721.1/100200 |
Similar Items
- A Deep Representation for Invariance And Music Classification, by: Zhang, Chiyuan, et al. Published: (2015)
- I-theory on depth vs width: hierarchical function composition, by: Poggio, Tomaso, et al. Published: (2015)
- Symmetry Regularization, by: Anselmi, Fabio, et al. Published: (2017)
- On Invariance and Selectivity in Representation Learning, by: Anselmi, Fabio, et al. Published: (2015)
- The Invariance Hypothesis Implies Domain-Specific Regions in Visual Cortex, by: Leibo, Joel Z, et al. Published: (2015)