Approximate dynamic programming using model-free Bellman Residual Elimination
This paper presents a modification to the method of Bellman Residual Elimination (BRE) for approximate dynamic programming. While prior work on BRE has focused on learning an approximate policy for an underlying Markov Decision Process (MDP) when the state transition model of the MDP is known, this...
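The core idea behind BRE is to drive the Bellman residual to zero at a set of sample states. As a minimal illustration (not the paper's model-free BRE algorithm, which does not require the transition model), the sketch below shows policy evaluation on a small hypothetical Markov chain: setting the Bellman residual V - (r + γPV) to zero at every state reduces to solving the linear system (I - γP)V = r. All numbers here are invented for illustration.

```python
import numpy as np

# Hypothetical 3-state Markov chain under a fixed policy
# (illustrative only; not the MDP studied in the paper).
P = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.7, 0.2],
              [0.0, 0.3, 0.7]])   # transition probabilities
r = np.array([1.0, 0.0, 2.0])    # per-state rewards
gamma = 0.9                      # discount factor

# Forcing the Bellman residual V - (r + gamma * P @ V) to zero at
# every state is equivalent to solving (I - gamma * P) V = r.
V = np.linalg.solve(np.eye(3) - gamma * P, r)

residual = V - (r + gamma * P @ V)
print(np.max(np.abs(residual)))  # numerically zero at all sample states
```

The model-free setting addressed by the paper replaces the exact expectation P @ V with quantities estimated from sampled trajectories, so the residual is driven to zero without access to P.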
Main authors: | Bethke, Brett M., How, Jonathan P. |
---|---|
Other contributors: | Massachusetts Institute of Technology. Aerospace Controls Laboratory |
Material type: | Article |
Language: | en_US |
Published: | Institute of Electrical and Electronics Engineers / American Automatic Control Council, 2011 |
Links: | http://hdl.handle.net/1721.1/66203 https://orcid.org/0000-0001-8576-1930 |
See also
- Approximate Dynamic Programming Using Bellman Residual Elimination and Gaussian Process Regression
  Author: How, Jonathan P., et al.
  Published: (2010)
- Approximate Dynamic Programming Using Bellman Residual Elimination and Gaussian Process Regression
  Author: Bethke, Brett M., et al.
  Published: (2010)
- Kernel-based approximate dynamic programming using Bellman residual elimination
  Author: Bethke, Brett (Brett M.)
  Published: (2010)
- Agent capability in persistent mission planning using approximate dynamic programming
  Author: Bethke, Brett M., et al.
  Published: (2010)
- Piecewise constant policy approximations to Hamilton–Jacobi–Bellman equations
  Author: Reisinger, C, et al.
  Published: (2016)