Energy-based analog neural network framework

Over the past decade, a body of work has emerged showing the disruptive potential of neuromorphic systems across a broad range of studies, often combining novel machine learning models and nanotechnologies. Still, the scope of investigations often remains limited to simple problems, since the process of building, training, and evaluating mixed-signal neural models is slow and laborious. In this paper, we introduce an open-source framework called EBANA that provides a unified, modularized, and extensible infrastructure, similar to conventional machine learning pipelines, for building and validating analog neural networks (ANNs). It uses Python as its interface language, with a syntax similar to Keras, while hiding the complexity of the underlying analog simulations. It already includes the most common building blocks and maintains sufficient modularity and extensibility to easily incorporate new concepts as well as new electrical and technological models. These features make EBANA suitable for researchers and practitioners who want to experiment with different design topologies and explore the various tradeoffs of the design space. We illustrate the framework's capabilities with the increasingly popular Energy-Based Models (EBMs), used in conjunction with the local Equilibrium Propagation (EP) training algorithm. Our experiments cover three datasets of up to 60,000 entries and explore network topologies that generate circuits in excess of 1,000 electrical nodes, which can be benchmarked extensively, with ease and in reasonable time, thanks to EBANA's native parallelization capability.

Bibliographic Details
Main Authors: Mohamed Watfa, Alberto Garcia-Ortiz, Gilles Sassatelli
Format: Article
Language: English
Published: Frontiers Media S.A., 2023-03-01
Series: Frontiers in Computational Neuroscience
Subjects: neural networks; energy-based models; equilibrium propagation; framework; analog; mixed-signal
Online Access: https://www.frontiersin.org/articles/10.3389/fncom.2023.1114651/full
ISSN: 1662-5188
Author affiliations: Mohamed Watfa (LIRMM, University of Montpellier, CNRS, Montpellier, France; ITEM, University of Bremen, Bremen, Germany); Alberto Garcia-Ortiz (ITEM, University of Bremen, Bremen, Germany); Gilles Sassatelli (LIRMM, University of Montpellier, CNRS, Montpellier, France)
DOI: 10.3389/fncom.2023.1114651 (Frontiers in Computational Neuroscience, vol. 17, article 1114651)
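The article trains Energy-Based Models with the local Equilibrium Propagation (EP) algorithm: the network first relaxes freely to an energy minimum with the inputs clamped, then relaxes again with a small "nudging" force pulling the outputs toward the target, and each weight is updated from the contrast between the two settled states, using only the activities of the two units it connects. The following is a minimal, generic NumPy sketch of that procedure on a toy two-layer network, not EBANA's actual API; the network shape, hard-sigmoid activation, and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rho(s):               # hard-sigmoid activation
    return np.clip(s, 0.0, 1.0)

def drho(s):              # its derivative: 1 inside [0, 1], else 0
    return ((s >= 0.0) & (s <= 1.0)).astype(float)

n_in, n_hid, n_out = 2, 8, 1
W1 = rng.normal(0.0, 0.1, (n_in, n_hid))    # symmetric input<->hidden weights
W2 = rng.normal(0.0, 0.1, (n_hid, n_out))   # symmetric hidden<->output weights

x = np.array([1.0, 0.0])  # clamped input
y = np.array([1.0])       # target, used only during the nudged phase

def relax(h, o, beta, steps=100, dt=0.2):
    """Settle the state by gradient descent on the network energy.

    beta = 0 is the free phase; beta > 0 adds the nudging force
    beta * (y - o) pulling the output toward the target.
    """
    for _ in range(steps):
        dh = -h + drho(h) * (rho(x) @ W1 + W2 @ rho(o))
        do = -o + drho(o) * (rho(h) @ W2) + beta * (y - o)
        h, o = h + dt * dh, o + dt * do
    return h, o

def free_output():
    _, o = relax(np.zeros(n_hid), np.zeros(n_out), beta=0.0)
    return rho(o)

beta, lr = 0.5, 0.2
err_before = abs(free_output()[0] - y[0])

for _ in range(300):
    # Free phase: relax with only the input clamped.
    h_f, o_f = relax(np.zeros(n_hid), np.zeros(n_out), beta=0.0)
    # Nudged phase: keep relaxing from the free fixed point with the target force on.
    h_n, o_n = relax(h_f, o_f, beta=beta)
    # Local contrastive updates: each weight sees only its two endpoint activities.
    W1 += (lr / beta) * (np.outer(rho(x), rho(h_n)) - np.outer(rho(x), rho(h_f)))
    W2 += (lr / beta) * (np.outer(rho(h_n), rho(o_n)) - np.outer(rho(h_f), rho(o_f)))

err_after = abs(free_output()[0] - y[0])
print(f"output error before training: {err_before:.3f}, after: {err_after:.3f}")
```

The locality of the update rule is what makes EP attractive for the analog networks the article targets: the energy relaxation can be carried out by the physics of the circuit itself, and each weight update needs only measurements at the two nodes the weight connects, in the free and nudged phases.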