Handbook of Markov Decision Processes: Methods and Applications

Eugene A. Feinberg, Adam Shwartz (Editors). This volume deals with the theory of Markov Decision Processes (MDPs) and their applications. Each chapter was written by a leading expert in the respective area. The papers cover major research areas and methodologies, and discuss open questions and future research...


Bibliographic Details
Other Authors: Feinberg, Eugene A. (Editor), Shwartz, Adam (Editor)
Format: eBook
Language: English
Published: New York, NY: Springer US, 2002
Edition: 1st ed., 2002
Series: International Series in Operations Research & Management Science
Collection: Springer Book Archives (–2004); collection details: MPG.ReNa
Table of Contents:
  • 1 Introduction
  • I Finite State and Action Models
  • 2 Finite State and Action MDPs
  • 3 Bias Optimality
  • 4 Singular Perturbations of Markov Chains and Decision Processes
  • II Infinite State Models
  • 5 Average Reward Optimization Theory for Denumerable State Spaces
  • 6 Total Reward Criteria
  • 7 Mixed Criteria
  • 8 Blackwell Optimality
  • 9 The Poisson Equation for Countable Markov Chains: Probabilistic Methods and Interpretations
  • 10 Stability, Performance Evaluation, and Optimization
  • 11 Convex Analytic Methods in Markov Decision Processes
  • 12 The Linear Programming Approach
  • 13 Invariant Gambling Problems and Markov Decision Processes
  • III Applications
  • 14 Neuro-Dynamic Programming: Overview and Recent Trends
  • 15 Markov Decision Processes in Finance and Dynamic Options
  • 16 Applications of Markov Decision Processes in Communication Networks
  • 17 Water Reservoir Applications of Markov Decision Processes