Multi-Gear Bandits, Partial Conservation Laws, and Indexability

This paper considers what we propose to call multi-gear bandits, which are Markov decision processes modeling a generic dynamic and stochastic project fueled by a single resource and which admit multiple actions representing gears of operation naturally ordered...
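As a rough illustration of the model the abstract describes, the sketch below sets up a toy multi-gear bandit: a finite Markov decision process in which each action is a "gear" ordered by increasing consumption of the single fueling resource. The state and gear counts, the random dynamics, and the policy-evaluation step are all illustrative assumptions, not the paper's construction or its index computation.

```python
import numpy as np

# Hypothetical toy multi-gear bandit (illustrative assumptions only):
# gears g = 0, 1, ..., G-1 are ordered so that a higher gear uses more
# of the single fueling resource.
rng = np.random.default_rng(0)

n_states, n_gears = 4, 3                     # arbitrary toy sizes
P = rng.dirichlet(np.ones(n_states),         # P[g, s, :] = transition law under gear g in state s
                  size=(n_gears, n_states))
r = rng.uniform(size=(n_gears, n_states))    # r[g, s] = one-step reward
c = np.arange(n_gears, dtype=float)          # resource usage increases with the gear index

def policy_value(policy, gamma=0.95):
    """Discounted value of a stationary policy mapping each state to a gear."""
    Pp = np.stack([P[policy[s], s] for s in range(n_states)])  # policy transition matrix
    rp = np.array([r[policy[s], s] for s in range(n_states)])  # policy reward vector
    # Solve (I - gamma * Pp) v = rp for the value vector v.
    return np.linalg.solve(np.eye(n_states) - gamma * Pp, rp)

# Example: always operate the project in the highest gear.
print(policy_value(np.full(n_states, n_gears - 1)))
```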


Bibliographic Details
Main Author: José Niño-Mora
Format: Article
Language: English
Published: MDPI AG 2022-07-01
Series: Mathematics
Online Access: https://www.mdpi.com/2227-7390/10/14/2497

Similar Items