Brain-inspired Predictive Coding Improves the Performance of Machine Challenging Tasks


Bibliographic Details
Main Authors: Jangho Lee, Jeonghee Jo, Byounghwa Lee, Jung-Hoon Lee, Sungroh Yoon
Format: Article
Language: English
Published: Frontiers Media S.A., 2022-11-01
Series: Frontiers in Computational Neuroscience
Subjects: brain-inspired learning; biologically plausible learning; deep learning; backpropagation; predictive coding
Online Access: https://www.frontiersin.org/articles/10.3389/fncom.2022.1062678/full
Collection: DOAJ
Description:
Backpropagation has been regarded as the most favorable algorithm for training artificial neural networks. However, it has been criticized as biologically implausible because its learning mechanism is inconsistent with how the human brain learns. Although backpropagation has achieved super-human performance in various machine learning applications, it often shows limited performance on specific tasks. We collectively refer to such tasks as machine-challenging tasks (MCTs) and aim to investigate methods to enhance machine learning for MCTs. Specifically, we start with a natural question: can a learning mechanism that mimics the human brain improve performance on MCTs? We hypothesize that a learning mechanism replicating the human brain is effective for tasks that machine intelligence finds difficult. Using predictive coding, a learning algorithm that is more biologically plausible than backpropagation, we performed multiple experiments on specific types of MCTs where machine intelligence has room for improvement. This study regards incremental learning, long-tailed recognition, and few-shot recognition as representative MCTs. Through extensive experiments, we examine the effectiveness of predictive coding, which robustly outperformed backpropagation-trained networks on the MCTs. We demonstrate that predictive coding-based incremental learning alleviates catastrophic forgetting, that predictive coding-based learning mitigates the classification bias in long-tailed recognition, and that a network trained with predictive coding can correctly predict corresponding targets from few samples. We analyze the experimental results by drawing analogies between the properties of predictive coding networks and those of the human brain, and we discuss the potential of predictive coding networks in general machine learning.
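For context, the predictive coding learning scheme the abstract refers to can be sketched as follows. This is a minimal, generic illustration in the style of the common supervised formulation (e.g., Whittington and Bogacz, 2017), not the authors' implementation; the toy AND task, layer sizes, step sizes, and learning rate are all illustrative assumptions.

```python
# Minimal predictive-coding network sketch (illustrative; not the paper's code).
# Each layer holds value nodes x_l and prediction errors e_l = x_l - W_l f(x_{l-1}).
# Inference: with input and target clamped, hidden activities relax to minimize the
# total squared error. Learning: weights are then updated from purely local errors.
import numpy as np

rng = np.random.default_rng(0)
f = np.tanh
df = lambda a: 1.0 - np.tanh(a) ** 2        # derivative of tanh

sizes = [2, 8, 1]                            # toy network: 2 inputs -> 8 hidden -> 1 output
W = [rng.normal(0, 0.5, (sizes[l + 1], sizes[l])) for l in range(2)]

def infer(x_in, target, n_steps=50, dt=0.1):
    """Clamp input and target, relax hidden activities, return activities and errors."""
    x = [x_in, W[0] @ f(x_in), target]       # hidden initialized at its feedforward prediction
    for _ in range(n_steps):
        e1 = x[1] - W[0] @ f(x[0])           # prediction error at hidden layer
        e2 = x[2] - W[1] @ f(x[1])           # prediction error at output layer
        # gradient descent on the energy: reduce own error, help explain the error above
        x[1] = x[1] + dt * (-e1 + df(x[1]) * (W[1].T @ e2))
    return x, [e1, e2]

def train_step(x_in, target, lr=0.05):
    x, e = infer(x_in, target)
    for l in range(2):                       # local Hebbian-style weight update
        W[l] += lr * np.outer(e[l], f(x[l]))

# Toy task: learn the AND function
data = [(np.array([a, b], float), np.array([float(a and b)])) for a in (0, 1) for b in (0, 1)]
for epoch in range(2000):
    for x_in, t in data:
        train_step(x_in, t)

# Test-time prediction is a plain feedforward pass through the generative model
preds = [(W[1] @ f(W[0] @ f(x_in)))[0] for x_in, _ in data]
```

Note that, unlike backpropagation, no global error signal is transported backward: each weight update uses only the error node and activity adjacent to it, which is the locality property usually cited as the biologically plausible ingredient.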
Record ID: doaj.art-5b55854b1d004f1faf56f2d1ab46e63b
ISSN: 1662-5188
DOI: 10.3389/fncom.2022.1062678
Author Affiliations:
Jangho Lee: Department of Electrical and Computer Engineering, Seoul National University, Seoul, South Korea
Jeonghee Jo: Institute of New Media and Communications, Seoul National University, Seoul, South Korea
Byounghwa Lee: CybreBrain Research Section, Electronics and Telecommunications Research Institute (ETRI), Daejeon, South Korea
Jung-Hoon Lee: CybreBrain Research Section, Electronics and Telecommunications Research Institute (ETRI), Daejeon, South Korea
Sungroh Yoon: Department of Electrical and Computer Engineering, Seoul National University, Seoul, South Korea; Interdisciplinary Program in Artificial Intelligence, Seoul National University, Seoul, South Korea