How important is weight symmetry in backpropagation?

Gradient backpropagation (BP) requires symmetric feedforward and feedback connections: the same weights must be used for the forward and backward passes. This "weight transport problem" (Grossberg 1987) is thought to be one of the main reasons to doubt BP's biological plausibility. Using 15 different classification datasets, we systematically investigate to what extent BP really depends on weight symmetry. In a study that turned out to be surprisingly similar in spirit to Lillicrap et al.'s demonstration (Lillicrap et al. 2014) but orthogonal in its results, our experiments indicate that: (1) the magnitudes of feedback weights do not matter to performance; (2) the signs of feedback weights do matter: the more concordant the signs between feedforward connections and their corresponding feedback connections, the better; (3) with feedback weights having random magnitudes and 100% concordant signs, we were able to achieve the same or even better performance than SGD; and (4) some normalization/stabilization is indispensable for such asymmetric BP to work, namely Batch Normalization (BN) (Ioffe and Szegedy 2015) and/or a "Batch Manhattan" (BM) update rule.
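
Finding (3) lends itself to a small illustration. The NumPy sketch below is ours, not the authors' code (layer sizes, names, and learning rate are arbitrary): it trains a one-hidden-layer softmax classifier with asymmetric BP, where the backward pass replaces the transpose of the feedforward weight matrix with a feedback matrix that shares its signs but has random magnitudes. For brevity the feedback matrix is frozen at initialization; in the paper the feedback signs are kept concordant with the feedforward weights as training proceeds.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy sizes: 10 inputs, 32 hidden units, 3 classes.
    n_in, n_hid, n_out = 10, 32, 3
    W1 = rng.normal(0.0, 0.1, (n_hid, n_in))   # feedforward weights, layer 1
    W2 = rng.normal(0.0, 0.1, (n_out, n_hid))  # feedforward weights, layer 2

    # Feedback matrix: W2's signs, random magnitudes (100% sign concordance).
    V2 = np.sign(W2) * np.abs(rng.normal(0.0, 0.1, W2.shape))

    def train_step(x, y_onehot, lr=0.01):
        """One step of asymmetric BP on a one-hidden-layer classifier."""
        global W1, W2
        h = np.maximum(0.0, W1 @ x)        # ReLU hidden activity
        logits = W2 @ h
        p = np.exp(logits - logits.max())
        p /= p.sum()                       # softmax probabilities
        d_out = p - y_onehot               # dLoss/dlogits for cross-entropy
        # Exact BP would propagate the error with W2.T; asymmetric BP
        # substitutes the sign-concordant feedback matrix V2.T instead.
        d_hid = (V2.T @ d_out) * (h > 0.0)
        W2 -= lr * np.outer(d_out, h)
        W1 -= lr * np.outer(d_hid, x)

Substituting V2 for W2.T is the only departure from standard backpropagation here; per finding (4), such training is only expected to be stable together with Batch Normalization and/or the Batch Manhattan update.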

Bibliographic Details
Main Authors: Liao, Qianli; Leibo, Joel Z; Poggio, Tomaso A
Other Authors: Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences
Format: Article
Published: Association for the Advancement of Artificial Intelligence 2017
Online Access: http://hdl.handle.net/1721.1/112304
https://orcid.org/0000-0003-0076-621X
https://orcid.org/0000-0002-3153-916X
https://orcid.org/0000-0002-3944-0455
author Liao, Qianli
Leibo, Joel Z
Poggio, Tomaso A
author2 Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences
collection MIT
description Gradient backpropagation (BP) requires symmetric feedforward and feedback connections: the same weights must be used for the forward and backward passes. This "weight transport problem" (Grossberg 1987) is thought to be one of the main reasons to doubt BP's biological plausibility. Using 15 different classification datasets, we systematically investigate to what extent BP really depends on weight symmetry. In a study that turned out to be surprisingly similar in spirit to Lillicrap et al.'s demonstration (Lillicrap et al. 2014) but orthogonal in its results, our experiments indicate that: (1) the magnitudes of feedback weights do not matter to performance; (2) the signs of feedback weights do matter: the more concordant the signs between feedforward connections and their corresponding feedback connections, the better; (3) with feedback weights having random magnitudes and 100% concordant signs, we were able to achieve the same or even better performance than SGD; and (4) some normalization/stabilization is indispensable for such asymmetric BP to work, namely Batch Normalization (BN) (Ioffe and Szegedy 2015) and/or a "Batch Manhattan" (BM) update rule.
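
A minimal sketch of the "Batch Manhattan" (BM) rule named above, under our reading of the abstract: the update steps in the sign of the mini-batch gradient and discards its magnitude. The momentum smoothing (and every name below) is an assumption of this sketch, not a detail given here.

    import numpy as np

    def batch_manhattan_step(w, batch_grad, velocity, lr=0.01, momentum=0.9):
        # Smooth the mini-batch gradient with momentum (our assumption;
        # set momentum=0.0 for the plain sign-of-gradient update).
        velocity = momentum * velocity + batch_grad
        # Keep only the sign: every parameter moves by exactly lr,
        # regardless of how badly scaled the (asymmetric) gradient is.
        return w - lr * np.sign(velocity), velocity

    # Example: one update on a 3-parameter weight vector.
    w, v = np.zeros(3), np.zeros(3)
    w, v = batch_manhattan_step(w, np.array([0.3, -2.0, 1e-4]), v)
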
format Article
id mit-1721.1/112304
institution Massachusetts Institute of Technology
publishDate 2017
publisher Association for the Advancement of Artificial Intelligence
record_format dspace
record_id mit-1721.1/112304
record_updated 2022-10-01T20:46:00Z
other_contributors Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; McGovern Institute for Brain Research at MIT
funding National Science Foundation (U.S.) (STC Award CCF 1231216)
dates 2017-11-28T18:13:54Z (accessioned); 2017-11-28T18:13:54Z (available); 2016-02 (issued); 2017-11-17T17:55:47Z
type Article; http://purl.org/eprint/type/ConferencePaper
citation Liao, Qianli, Joel Z. Leibo and Tomaso Poggio. "How Important is Weight Symmetry in Backpropagation." Thirtieth AAAI Conference on Artificial Intelligence, February 12-17, 2016, Phoenix, Arizona, Association for the Advancement of Artificial Intelligence, February 2016. © 2016 Association for the Advancement of Artificial Intelligence
conference Thirtieth AAAI Conference on Artificial Intelligence
conference_url https://www.aaai.org/ocs/index.php/AAAI/AAAI16/paper/view/12325
rights Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/)
file_format application/pdf
source arXiv
title How important is weight symmetry in backpropagation?
url http://hdl.handle.net/1721.1/112304
https://orcid.org/0000-0003-0076-621X
https://orcid.org/0000-0002-3153-916X
https://orcid.org/0000-0002-3944-0455