Stable Hybrid Feature Selection Method for Compressor Fault Diagnosis

Bibliographic Details
Main Authors: Solichin Mochammad, Young-Jin Kang, Yoojeong Noh, Sunhwa Park, Byeongha Ahn
Format: Article
Language: English
Published: IEEE 2021-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/9466109/
Description
Summary: Faulty compressors must be detected in advance to speed up the quality control process for compressor performance. Machine learning models have recently been used as fault classification models to distinguish between normal and abnormal compressors, facilitating more sophisticated fault detection methods than those used in the past. However, very few studies have addressed accurate and efficient feature selection, despite its high importance. Therefore, this study proposes a new hybrid method that combines the merits of the existing feature selection approaches, the filter and wrapper methods, to obtain a stable, accurate, and efficient fault classification model. Three filter methods with different characteristics, namely the chi-square test, the extra trees classifier, and the correlation matrix, are first used to derive the high-ranked features, whose union forms a powerful candidate feature set. Subsequently, the wrapper method selects the combination of features with the highest classification accuracy among all combinations of features in the union set. Using two experimental examples and one numerical example with different types and amounts of data, the robustness and accuracy of the proposed method were verified through comparison with existing filter methods in combination with three classification models: support vector machine, k-nearest neighbor, and multi-layer perceptron.
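
The hybrid filter-plus-wrapper scheme described in the summary can be sketched in a few lines of scikit-learn code. The snippet below is an illustrative approximation, not the authors' implementation: the dataset, the number of features kept per filter (k), and the SVM used inside the wrapper step are assumptions chosen only for demonstration.

from itertools import combinations

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import chi2
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

# Placeholder dataset; the paper uses compressor test data instead.
X, y = load_breast_cancer(return_X_y=True)
X = MinMaxScaler().fit_transform(X)   # chi-square requires non-negative inputs
k = 4                                 # features kept per filter (assumed value)

# Filter 1: chi-square scores between each feature and the class label.
chi_scores, _ = chi2(X, y)
chi_top = np.argsort(chi_scores)[-k:]

# Filter 2: extra trees classifier feature importances.
et = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X, y)
et_top = np.argsort(et.feature_importances_)[-k:]

# Filter 3: absolute correlation of each feature with the class label.
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
corr_top = np.argsort(corr)[-k:]

# Union of the high-ranked features from the three filters.
union = sorted(set(chi_top) | set(et_top) | set(corr_top))

# Wrapper step: exhaustively evaluate every subset of the union set and keep
# the one with the highest cross-validated accuracy (exponential in the size
# of the union set, so only practical for small candidate sets).
best_subset, best_score = None, -np.inf
for r in range(1, len(union) + 1):
    for subset in combinations(union, r):
        score = cross_val_score(SVC(), X[:, list(subset)], y, cv=5).mean()
        if score > best_score:
            best_subset, best_score = subset, score

print("selected features:", best_subset, "cv accuracy:", round(best_score, 3))

In this sketch the final model is a support vector machine, but the same union set could be searched with a k-nearest neighbor or multi-layer perceptron classifier, mirroring the three classification models mentioned in the summary.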
ISSN:2169-3536