Multi-State Online Estimation of Lithium-Ion Batteries Based on Multi-Task Learning
Deep learning-based state estimation of lithium-ion batteries is widely used in battery management system (BMS) design. However, because on-board computing resources are limited, deploying multiple single-state estimation models in practice is difficult. Therefore, this paper proposes a multi-task learning network (MTL) for the joint estimation of the state of charge (SOC) and state of energy (SOE) of lithium-ion batteries.
Main Authors: | Xiang Bao, Yuefeng Liu, Bo Liu, Haofeng Liu, Yue Wang |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2023-03-01 |
Series: | Energies |
Subjects: | deep learning; lithium-ion batteries; multi-state estimation of batteries; multi-task learning |
Online Access: | https://www.mdpi.com/1996-1073/16/7/3002 |
author | Xiang Bao; Yuefeng Liu; Bo Liu; Haofeng Liu; Yue Wang |
---|---|
collection | DOAJ |
description | Deep learning-based state estimation of lithium-ion batteries is widely used in battery management system (BMS) design. However, because on-board computing resources are limited, deploying multiple single-state estimation models in practice is difficult. Therefore, this paper proposes a multi-task learning network (MTL) that combines a multi-layer feature extraction structure with separated expert layers for the joint estimation of the state of charge (SOC) and state of energy (SOE) of Li-ion batteries. The MTL network extracts features with a multi-layer structure that separates shared parameters from task-specific parameters. An underlying LSTM first extracts time-series features. The separated expert layer, consisting of task-specific and shared experts, extracts features specific to each task as well as features shared across tasks. A gate structure fuses the information extracted by the different experts, so each task is processed using both its task-specific and the shared information. The tasks are trained simultaneously and improve performance by sharing learned knowledge with each other. SOC and SOE are estimated on the Panasonic dataset, and generalization performance is tested on the LG dataset. The Mean Absolute Error (MAE) values for the two tasks are 1.01% and 0.59%, and the Root Mean Square Error (RMSE) values are 1.29% and 0.77%, respectively. For SOE estimation, the MAE and RMSE are reduced by 0.096% and 0.087%, respectively, compared with single-task learning models, and by up to 0.818% and 0.938%, respectively, compared with other multi-task learning models. For SOC estimation, the MAE and RMSE are reduced by 0.051% and 0.078%, respectively, compared with single-task learning models, and by up to 0.398% and 0.578%, respectively, compared with other multi-task learning models. In simulated online prediction, the MTL model consumes 4.93 ms, which is less than the combined time of multiple single-task learning models and almost the same as that of other multi-task learning models. The results show the effectiveness and superiority of this method. An illustrative sketch of this architecture appears after the record fields below. |
format | Article |
id | doaj.art-e1dd97bdb8384c66854888c11647e823 |
institution | Directory Open Access Journal |
issn | 1996-1073 |
language | English |
publishDate | 2023-03-01 |
publisher | MDPI AG |
record_format | Article |
series | Energies |
spelling | MDPI AG; Energies (ISSN 1996-1073); 2023-03-01; vol. 16, no. 7, art. 3002; doi:10.3390/en16073002; Multi-State Online Estimation of Lithium-Ion Batteries Based on Multi-Task Learning; Xiang Bao, Yuefeng Liu, Bo Liu, Haofeng Liu, Yue Wang (School of Information Engineering, Inner Mongolia University of Science and Technology, Baotou 014010, China); https://www.mdpi.com/1996-1073/16/7/3002 |
title | Multi-State Online Estimation of Lithium-Ion Batteries Based on Multi-Task Learning |
topic | deep learning; lithium-ion batteries; multi-state estimation of batteries; multi-task learning |
url | https://www.mdpi.com/1996-1073/16/7/3002 |
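For readers who want a concrete picture of the architecture described in the abstract, the following is a minimal, hypothetical PyTorch sketch, not the authors' code: an LSTM backbone feeds a separated expert layer made of shared and task-specific experts, per-task gates fuse the expert outputs, and two small towers output SOC and SOE. The number of experts, layer sizes, and the assumed input features (e.g., voltage, current, temperature) are illustrative assumptions.

```python
# Hypothetical sketch of an MTL network with an LSTM backbone, separated
# (shared + task-specific) experts, per-task gate fusion, and SOC/SOE towers.
# All dimensions and the 3 input features are assumptions, not paper values.
import torch
import torch.nn as nn


class Expert(nn.Module):
    """A small feed-forward expert operating on the LSTM features."""
    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())

    def forward(self, x):
        return self.net(x)


class SeparatedExpertMTL(nn.Module):
    def __init__(self, n_features=3, lstm_hidden=64, expert_dim=32,
                 n_shared=2, n_specific=2, tasks=("soc", "soe")):
        super().__init__()
        self.tasks = tasks
        # Bottom layer: LSTM extracts time-series features from the input window.
        self.lstm = nn.LSTM(n_features, lstm_hidden, batch_first=True)
        # Shared experts serve every task; specific experts serve a single task.
        self.shared = nn.ModuleList(Expert(lstm_hidden, expert_dim) for _ in range(n_shared))
        self.specific = nn.ModuleDict({
            t: nn.ModuleList(Expert(lstm_hidden, expert_dim) for _ in range(n_specific))
            for t in tasks
        })
        # One gate per task: softmax weights over that task's experts + the shared experts.
        self.gates = nn.ModuleDict({
            t: nn.Linear(lstm_hidden, n_specific + n_shared) for t in tasks
        })
        # One output tower per task, predicting a state value in [0, 1].
        self.towers = nn.ModuleDict({
            t: nn.Sequential(nn.Linear(expert_dim, 16), nn.ReLU(),
                             nn.Linear(16, 1), nn.Sigmoid())
            for t in tasks
        })

    def forward(self, x):
        # x: (batch, seq_len, n_features); use the last LSTM step as the feature vector.
        seq_out, _ = self.lstm(x)
        h = seq_out[:, -1, :]                               # (batch, lstm_hidden)
        shared_out = [e(h) for e in self.shared]            # each (batch, expert_dim)
        preds = {}
        for t in self.tasks:
            experts = [e(h) for e in self.specific[t]] + shared_out
            stacked = torch.stack(experts, dim=1)           # (batch, n_experts, expert_dim)
            w = torch.softmax(self.gates[t](h), dim=-1)     # (batch, n_experts)
            fused = (w.unsqueeze(-1) * stacked).sum(dim=1)  # gate-weighted fusion
            preds[t] = self.towers[t](fused).squeeze(-1)    # (batch,)
        return preds


if __name__ == "__main__":
    model = SeparatedExpertMTL()
    window = torch.randn(8, 100, 3)   # 8 samples, 100 time steps, 3 assumed features
    out = model(window)
    print(out["soc"].shape, out["soe"].shape)  # torch.Size([8]) torch.Size([8])
```

In a joint training setup of this kind, the SOC and SOE losses (e.g., MSE per task) would typically be summed and back-propagated through the shared LSTM and shared experts, which is how simultaneous training lets the two tasks share learned knowledge.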