Causally abstracted multi-armed bandits
Multi-armed bandits (MAB) and causal MABs (CMAB) are established frameworks for decision-making problems. Prior work typically studies and solves individual MAB and CMAB instances in isolation for a given problem and its associated data. However, decision-makers are often faced with multiple relat...
Main Authors: Zennaro, FM; Bishop, N; Dyer, J; Felekis, Y; Calinescu, A; Wooldridge, M; Damoulas, T
Format: Conference item
Language: English
Published: PMLR, 2024
Similar Items
- Interventionally consistent surrogates for complex simulation models
  by: Dyer, J, et al.
  Published: (2024)
- Multi-armed linear bandits with latent biases
  by: Kang, Qiyu, et al.
  Published: (2024)
- Multi-Armed Bandits in Brain-Computer Interfaces
  by: Frida Heskebeck, et al.
  Published: (2022-07-01)
- Multi-arm bandit-led clustering in federated learning
  by: Zhao, Joe Chen Xuan
  Published: (2024)
- Stochastic control approach to the multi-armed bandit problems
  by: Treetanthiploet, T
  Published: (2021)