A Positioning Method Based on Place Cells and Head-Direction Cells for Inertial/Visual Brain-Inspired Navigation System
In nature, mammals rely on vision and self-motion information to distinguish directions and navigate accurately and stably. Inspired by the way neurons in the mammalian brain represent the spatial environment, a brain-inspired positioning method based on multi-sensor input is proposed to address accurate navigation in the absence of satellite signals. In applied brain-inspired engineering research, it is uncommon to fuse information from multiple sensors to improve positioning accuracy and to decode navigation parameters from the encoded output of a brain-inspired model. This paper therefore establishes an application-oriented head-direction cell model and place cell model based on continuous attractor neural networks (CANNs) to encode visual and inertial inputs, and then decodes direction and position from the population firing response of the neurons. Experimental results confirm that the brain-inspired navigation model integrates multiple sources of information, outputs more accurate and stable navigation parameters, and generates motion paths. The proposed model supports the further development of brain-inspired navigation research.
Main Authors: | Yudi Chen, Zhi Xiong, Jianye Liu, Chuang Yang, Lijun Chao, Yang Peng |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2021-11-01 |
Series: | Sensors |
Subjects: | brain-inspired navigation; place cells; head-direction cells; continuous attractor neural networks (CANNs); population neuron decoding |
Online Access: | https://www.mdpi.com/1424-8220/21/23/7988 |
ISSN: | 1424-8220 |
DOI: | 10.3390/s21237988 |
Citation: | Sensors, Vol. 21, Iss. 23, Article 7988, 2021-11-01, MDPI AG |
Author Affiliations: | Yudi Chen, Zhi Xiong, Jianye Liu, Chuang Yang, Lijun Chao: Navigation Research Center, College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China; Yang Peng: Shanghai Aerospace Control Technology Institute, Shanghai 201108, China |
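The abstract describes encoding inertial (self-motion) and visual inputs with continuous attractor neural network (CANN) models of head-direction and place cells, then decoding heading and position from the population firing response. Below is a minimal, illustrative sketch of the head-direction part of that idea, not the authors' implementation: a one-dimensional ring attractor whose activity bump is shifted by an angular-rate input (as a gyro would provide) and whose represented heading is read out with a population vector. All parameter values, function names, and the specific rate dynamics are assumptions chosen for demonstration.

```python
# Illustrative ring-attractor sketch of a head-direction cell network.
# Parameters and dynamics are assumptions for demonstration, not taken from the paper.
import numpy as np

N   = 128                                              # number of head-direction cells
x   = np.linspace(-np.pi, np.pi, N, endpoint=False)    # preferred directions (rad)
rho = N / (2 * np.pi)                                  # neuron density on the ring
a, J0, k, tau, dt = 0.4, 1.0, 0.5, 0.1, 0.001          # assumed model constants

# Gaussian recurrent connectivity over the wrapped angular distance
d = np.angle(np.exp(1j * (x[:, None] - x[None, :])))
J = J0 / (np.sqrt(2 * np.pi) * a) * np.exp(-d**2 / (2 * a**2))

def rates(u):
    """Divisive normalization converting synaptic input u into firing rates."""
    q = np.maximum(u, 0.0) ** 2
    return q / (1.0 + k * rho * np.sum(q) * (2 * np.pi / N))

def step(u, omega):
    """One Euler step; omega (rad/s) is the angular-rate (inertial) input."""
    r = rates(u)
    recurrent = rho * (J @ r) * (2 * np.pi / N)        # recurrent drive from the ring
    shift = -tau * omega * np.gradient(u, x)           # velocity term that advects the bump
    return u + dt / tau * (-u + recurrent + shift)

def decode_heading(r):
    """Population-vector readout of the heading represented by the activity bump."""
    return np.angle(np.sum(r * np.exp(1j * x)))

# Initialise a bump at heading 0, then integrate a constant 1 rad/s turn for 1 s.
u = np.exp(-x**2 / (2 * a**2))
for _ in range(1000):
    u = step(u, omega=1.0)
print("decoded heading (rad):", decode_heading(rates(u)))  # should be close to 1 rad
```

In the same spirit, a two-dimensional CANN sheet could play the role of the place cell model, with position read out from the two-dimensional population response; the sketch above only covers the one-dimensional heading case.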