A Feature Selection and Classification Method for Activity Recognition Based on an Inertial Sensing Unit

Bibliographic Details
Main Authors: Shurui Fan, Yating Jia, Congyue Jia
Format: Article
Language: English
Published: MDPI AG, 2019-09-01
Series: Information
Online Access: https://www.mdpi.com/2078-2489/10/10/290
Description
Summary: The purpose of activity recognition is to identify activities through a series of observations of a subject's behavior and the surrounding environmental conditions. In this study, we used feature selection algorithms to investigate the effect of a large number of features on human activity recognition (HAR) assisted by an inertial measurement unit (IMU), with the aim of applying the results to future smartphones. We considered 585 features calculated from tri-axial accelerometer and tri-axial gyroscope data, and comprehensively analyzed both the signal features and the classification methods. Three feature selection algorithms were considered, and the combined effect of the features was used to select a feature subset with a significant influence on activity classification, which reduced classifier complexity and improved classification accuracy. We used five classification methods (support vector machine [SVM], decision tree, linear regression, Gaussian process, and threshold selection) to verify the classification accuracy. The proposed activity recognition method recognized six basic activities (BAs): standing, going upstairs, going downstairs, walking, lying, and sitting, as well as six postural transitions (PTs): stand-to-sit, sit-to-stand, stand-to-lie, lie-to-stand, sit-to-lie, and lie-to-sit, with an average accuracy of 96.4%.
ISSN: 2078-2489
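
The summary above describes a pipeline of windowed IMU feature extraction, feature selection, and classification. The record does not include the authors' code; the following is a minimal Python/scikit-learn sketch of that kind of pipeline, assuming the 585 features per window have already been computed. The synthetic data, the mutual-information selection criterion, the value k=60, and the SVM hyperparameters are illustrative assumptions, not the authors' exact method.

```python
# Minimal sketch (not the authors' code): feature selection + SVM classification
# on windowed IMU features. Synthetic data stands in for the real accelerometer
# and gyroscope features described in the abstract.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in for the 585 features per window (time- and frequency-domain statistics
# computed from tri-axial accelerometer and tri-axial gyroscope signals).
n_windows, n_features = 1000, 585
X = rng.normal(size=(n_windows, n_features))
# Stand-in labels: 6 basic activities + 6 postural transitions = 12 classes.
y = rng.integers(0, 12, size=n_windows)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

# Pipeline: scale features, keep the k most informative ones (mutual information
# is one possible criterion; the paper compares three selection algorithms),
# then classify with an RBF-kernel SVM (one of the five classifiers evaluated).
clf = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(score_func=mutual_info_classif, k=60)),
    ("svm", SVC(kernel="rbf", C=10.0, gamma="scale")),
])
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")
```

On random synthetic data the printed accuracy will be near chance; with real windowed accelerometer/gyroscope features, the same structure (scaling, selection of an informative feature subset, classification) applies.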