Exploring Parameter and Hyper-Parameter Spaces of Neuroscience Models on High Performance Computers With Learning to Learn

Neuroscience models commonly have a high number of degrees of freedom, and only specific regions within the parameter space produce dynamics of interest. Developing tools and strategies to find these regions efficiently is therefore of high importance for advancing brain research. Exploring high-dimensional parameter spaces using numerical simulations has become a frequently used technique in recent years in many areas of computational neuroscience. Today, high performance computing (HPC) provides a powerful infrastructure to speed up explorations and increase our general understanding of a model's behavior in reasonable time. Learning to learn (L2L) is a well-known concept in machine learning (ML) and a specific method for acquiring constraints to improve learning performance. This concept can be decomposed into a two-loop optimization process, where the target of the optimization can be any program: an artificial neural network, a spiking network, a single-cell model, or a whole-brain simulation.

In this work, we present L2L as an easy-to-use and flexible framework for performing parameter and hyper-parameter space exploration of neuroscience models on HPC infrastructure. Learning to learn is an implementation of the L2L concept written in Python. This open-source software allows several instances of an optimization target to be executed with different parameters in an embarrassingly parallel fashion on HPC. L2L provides a set of built-in optimizer algorithms that make adaptive and efficient exploration of parameter spaces possible. Unlike other optimization toolboxes, L2L provides maximum flexibility in the way the optimization target can be executed. In this paper, we show a variety of examples of neuroscience models being optimized within the L2L framework to execute different types of tasks. The tasks used to illustrate the concept range from reproducing empirical data to learning how to solve a problem in a dynamic environment. We particularly focus on simulations with models ranging from the single cell to the whole brain, using a variety of simulation engines such as NEST, Arbor, TVB, OpenAI Gym, and NetLogo.
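
To make the two-loop decomposition concrete, the following minimal sketch (in Python, the language L2L itself is written in) mimics the structure described above: an outer loop proposes parameter sets, and an inner loop evaluates one instance of the optimization target per parameter set, embarrassingly parallel across processes. This is an illustration under stated assumptions, not the L2L library's actual API: the names run_target, propose, and two_loop_optimize and the toy quadratic target are hypothetical, and a real inner loop would launch a simulator such as NEST or Arbor and score its output against data.

# A minimal sketch of the two-loop L2L idea; NOT the published L2L API.
# Outer loop: an optimizer proposes parameter sets ("individuals").
# Inner loop: each individual runs the optimization target independently
# and returns a fitness value (embarrassingly parallel across processes).

from concurrent.futures import ProcessPoolExecutor
import random


def run_target(params):
    # Inner loop: evaluate one instance of the optimization target.
    # The "model" here is a toy quadratic with optimum at (1, -2); in
    # practice this could launch a NEST, Arbor, or TVB simulation and
    # score its output against empirical data. Higher fitness is better.
    x, y = params["x"], params["y"]
    return -((x - 1.0) ** 2 + (y + 2.0) ** 2)


def propose(parent, sigma, n):
    # Outer loop step: perturb the current best parameter set with
    # Gaussian noise to generate a new population of candidates.
    return [{k: v + random.gauss(0.0, sigma) for k, v in parent.items()}
            for _ in range(n)]


def two_loop_optimize(n_generations=20, pop_size=8, sigma=0.5):
    best = {"x": random.uniform(-5, 5), "y": random.uniform(-5, 5)}
    best_fitness = run_target(best)
    with ProcessPoolExecutor() as pool:
        for gen in range(n_generations):
            population = propose(best, sigma, pop_size)
            # Embarrassingly parallel inner loop: one worker per individual.
            fitnesses = list(pool.map(run_target, population))
            for params, fitness in zip(population, fitnesses):
                if fitness > best_fitness:
                    best, best_fitness = params, fitness
            print(f"generation {gen:2d}: best fitness {best_fitness:.4f}")
    return best


if __name__ == "__main__":
    print("best parameters found:", two_loop_optimize())

The outer loop here is a naive perturb-and-select rule used only for illustration; in the framework described by the paper, one of the built-in optimizer algorithms would take its place, while the parallel evaluation of target instances in the inner loop stays structurally the same.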

Bibliographic Details
Main Authors: Alper Yegenoglu, Anand Subramoney, Thorsten Hater, Cristian Jimenez-Romero, Wouter Klijn, Aarón Pérez Martín, Michiel van der Vlag, Michael Herty, Abigail Morrison, Sandra Diaz-Pier
Format: Article
Language: English
Published: Frontiers Media S.A., 2022-05-01
Series: Frontiers in Computational Neuroscience, Vol. 16 (2022)
ISSN: 1662-5188
DOI: 10.3389/fncom.2022.885207
Subjects: simulation, meta learning, hyper-parameter optimization, high performance computing, connectivity generation, parameter exploration
Online Access: https://www.frontiersin.org/articles/10.3389/fncom.2022.885207/full

Author Affiliations:
Alper Yegenoglu: Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany; Department of Mathematics, Institute of Geometry and Applied Mathematics, RWTH Aachen University, Aachen, Germany
Anand Subramoney: Institute of Neural Computation, Ruhr University Bochum, Bochum, Germany
Thorsten Hater: Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany
Cristian Jimenez-Romero: Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany
Wouter Klijn: Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany
Aarón Pérez Martín: Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany
Michiel van der Vlag: Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany
Michael Herty: Department of Mathematics, Institute of Geometry and Applied Mathematics, RWTH Aachen University, Aachen, Germany
Abigail Morrison: Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany; Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Computer Science 3 - Software Engineering, RWTH Aachen University, Aachen, Germany
Sandra Diaz-Pier: Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany