Real-Time Energy Management for Plug-in Hybrid Electric Vehicles via Incorporating Double-Delay Q-Learning and Model Prediction Control

Plug-in hybrid electric vehicles (PHEVs) have been validated as a preferable solution for transportation owing to their advantages in improving fuel economy, reducing harmful emissions and mitigating mileage anxiety. However, designing an effective energy management strategy to allocate power between the battery and the engine is critical to improving the performance of the PHEV powertrain. To this end, a real-time energy management strategy is proposed by incorporating double-delay Q-learning into model predictive control (MPC). First, the energy management of the PHEV is formulated as a nonlinear optimal control problem, and a vehicle speed predictor based on a convolutional neural network is built to forecast the vehicle speed within the MPC framework. Then, based on the predicted vehicle speed, the double-delay Q-learning algorithm is applied to solve the receding-horizon optimization problem in the MPC module. Simulations are conducted to verify the performance of the proposed strategy, and the results show that incorporating double-delay Q-learning into MPC effectively improves the adaptability of the energy management to dynamic environments while attaining fuel consumption close to that of the offline stochastic dynamic programming based strategy. In addition, the single-step computation time of the proposed strategy is less than 23 milliseconds, highlighting its significant potential for online implementation.
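The abstract above only describes the control architecture in words. The following minimal Python sketch illustrates how such a scheme could be wired together, assuming a 1-D convolutional speed predictor, a small discrete set of candidate engine powers, a toy powertrain and fuel-cost model, and a tabular double Q-learning update that bootstraps from delayed (periodically frozen) target tables. Every name (SpeedCNN, step_model, mpc_step) and every numerical constant is a placeholder for illustration; none of it reproduces the authors' implementation.

# Minimal illustrative sketch (not the authors' code): a receding-horizon energy
# management step in which a CNN predicts the upcoming speed profile and a
# tabular double Q-learning rule with delayed target copies picks the engine
# power split. Constants, the demand/fuel model and the state grid are toy assumptions.
import numpy as np
import torch
import torch.nn as nn

H = 10                                   # prediction horizon length (steps), assumed
ACTIONS = np.linspace(0.0, 40.0, 21)     # candidate engine powers in kW, assumed
ALPHA, GAMMA, DELAY = 0.1, 0.95, 5       # learning rate, discount, target-refresh period

class SpeedCNN(nn.Module):
    """1-D CNN mapping the last `hist_len` speeds to the next H speeds."""
    def __init__(self, hist_len=30, horizon=H):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * hist_len, horizon),
        )
    def forward(self, x):                # x: (batch, 1, hist_len)
        return self.net(x)               # (batch, horizon)

def step_model(soc, v, p_eng):
    """Toy powertrain model: demand grows with speed, the battery covers the
    remainder, and the fuel rate grows quadratically with engine power."""
    p_dem = 0.5 * v + 0.02 * v ** 2                    # demanded power (kW)
    p_bat = p_dem - p_eng                              # battery share (kW)
    soc_next = float(np.clip(soc - 1e-4 * p_bat, 0.2, 0.9))
    fuel = 1e-3 * (0.5 * p_eng + 0.01 * p_eng ** 2)    # fuel cost proxy
    cost = fuel + 50.0 * (soc_next - soc) ** 2         # fuel + SOC-drift penalty
    return soc_next, cost

def discretize(soc, v):
    return (int(soc * 50), int(v // 5))                # coarse (SOC, speed) grid

Q1, Q2, Q1_tgt, Q2_tgt = {}, {}, {}, {}                # online tables and delayed copies
updates = 0

def q(table, s, a):
    return table.get((s, a), 0.0)

def mpc_step(soc, speed_history, cnn, epsilon=0.05):
    """One receding-horizon step: predict speeds, learn along the horizon with
    double Q-learning (delayed targets), return the first engine power to apply."""
    global updates
    with torch.no_grad():
        v_pred = cnn(speed_history.view(1, 1, -1)).squeeze(0).numpy()
    s, first_action = discretize(soc, v_pred[0]), None
    for k in range(H):
        # epsilon-greedy over the sum of both online tables (cost minimisation)
        if np.random.rand() < epsilon:
            a = np.random.randint(len(ACTIONS))
        else:
            a = int(np.argmin([q(Q1, s, i) + q(Q2, s, i) for i in range(len(ACTIONS))]))
        first_action = ACTIONS[a] if first_action is None else first_action
        soc_next, cost = step_model(soc, v_pred[k], ACTIONS[a])
        s_next = discretize(soc_next, v_pred[min(k + 1, H - 1)])
        # double Q update: one delayed table selects the greedy next action,
        # the other delayed table evaluates it; the updated pair is chosen at random
        if np.random.rand() < 0.5:
            a_star = int(np.argmin([q(Q1_tgt, s_next, i) for i in range(len(ACTIONS))]))
            Q1[(s, a)] = q(Q1, s, a) + ALPHA * (cost + GAMMA * q(Q2_tgt, s_next, a_star) - q(Q1, s, a))
        else:
            a_star = int(np.argmin([q(Q2_tgt, s_next, i) for i in range(len(ACTIONS))]))
            Q2[(s, a)] = q(Q2, s, a) + ALPHA * (cost + GAMMA * q(Q1_tgt, s_next, a_star) - q(Q2, s, a))
        updates += 1
        if updates % DELAY == 0:                       # delayed target refresh
            Q1_tgt.clear(); Q1_tgt.update(Q1)
            Q2_tgt.clear(); Q2_tgt.update(Q2)
        soc, s = soc_next, s_next
    return first_action

# Example call with an untrained predictor and a flat 15 m/s speed history:
# mpc_step(0.6, torch.full((30,), 15.0), SpeedCNN())

In this sketch the two online tables reduce the value-estimation bias of ordinary Q-learning, while the periodic target refresh keeps the bootstrapped targets stable over the short receding horizon; both are assumptions about what "double-delay" denotes, not a statement of the paper's exact algorithm.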

Bibliographic Details
Main Authors: Shiquan Shen, Shun Gao, Yonggang Liu, Yuanjian Zhang, Jiangwei Shen, Zheng Chen, Zhenzhen Lei
Format: Article
Language: English
Published: IEEE, 2022-01-01
Series: IEEE Access
ISSN: 2169-3536
Collection: DOAJ (Directory of Open Access Journals)
Subjects: Model predictive control; double delayed Q-learning; energy management strategy (EMS); convolutional neural network; velocity prediction
Online Access: https://ieeexplore.ieee.org/document/9987509/
Citation: IEEE Access, vol. 10, pp. 131076-131089, 2022. DOI: 10.1109/ACCESS.2022.3229468. IEEE Xplore document 9987509.
Author affiliations:
Shiquan Shen - Faculty of Transportation Engineering, Kunming University of Science and Technology, Kunming, China
Shun Gao - Faculty of Transportation Engineering, Kunming University of Science and Technology, Kunming, China
Yonggang Liu (ORCID: 0000-0001-9768-328X) - State Key Laboratory of Mechanical Transmissions, College of Mechanical and Vehicle Engineering, Chongqing University, Chongqing, China
Yuanjian Zhang - Department of Aeronautical and Automotive Engineering, Loughborough University, Loughborough, U.K.
Jiangwei Shen - Faculty of Transportation Engineering, Kunming University of Science and Technology, Kunming, China
Zheng Chen (ORCID: 0000-0002-1634-7231) - Faculty of Transportation Engineering, Kunming University of Science and Technology, Kunming, China
Zhenzhen Lei (ORCID: 0000-0002-0783-0475) - School of Mechanical and Power Engineering, Chongqing University of Science and Technology, Chongqing, China