Controlled Markov Processes and Viscosity Solutions

This book is intended as an introduction to optimal stochastic control for continuous time Markov processes and to the theory of viscosity solutions. Stochastic control problems are treated using the dynamic programming approach.


Bibliographic Details
Main Authors: Fleming, Wendell H., Soner, Halil Mete (Author)
Format: eBook
Language: English
Published: New York, NY: Springer New York, 2006
Edition: 2nd ed. 2006
Series:Stochastic Modelling and Applied Probability
Collection: Springer eBooks 2005- (collection details: see MPG.ReNa)
LEADER 03246nmm a2200421 u 4500
001 EB000354743
003 EBX01000000000000000207795
005 00000000000000.0
007 cr|||||||||||||||||||||
008 130626 ||| eng
020 |a 9780387310718 
100 1 |a Fleming, Wendell H. 
245 0 0 |a Controlled Markov Processes and Viscosity Solutions  |h Elektronische Ressource  |c by Wendell H. Fleming, Halil Mete Soner 
250 |a 2nd ed. 2006 
260 |a New York, NY  |b Springer New York  |c 2006, 2006 
300 |a XVII, 429 p  |b online resource 
505 0 |a Deterministic Optimal Control -- Viscosity Solutions -- Optimal Control of Markov Processes: Classical Solutions -- Controlled Markov Diffusions in ℝn -- Viscosity Solutions: Second-Order Case -- Logarithmic Transformations and Risk Sensitivity -- Singular Perturbations -- Singular Stochastic Control -- Finite Difference Numerical Approximations -- Applications to Finance -- Differential Games 
653 |a Mathematics in Business, Economics and Finance 
653 |a Operations research 
653 |a Control, Robotics, Automation 
653 |a Control theory 
653 |a Systems Theory, Control 
653 |a Probability Theory 
653 |a System theory 
653 |a Social sciences / Mathematics 
653 |a Control engineering 
653 |a Robotics 
653 |a Automation 
653 |a Operations Research and Decision Theory 
653 |a Probabilities 
700 1 |a Soner, Halil Mete  |e [author] 
041 0 7 |a eng  |2 ISO 639-2 
989 |b Springer  |a Springer eBooks 2005- 
490 0 |a Stochastic Modelling and Applied Probability 
028 5 0 |a 10.1007/0-387-31071-1 
856 4 0 |u https://doi.org/10.1007/0-387-31071-1?nosfx=y  |x Verlag  |3 Volltext 
082 0 |a 519.2 
520 |a This book is intended as an introduction to optimal stochastic control for continuous time Markov processes and to the theory of viscosity solutions. Stochastic control problems are treated using the dynamic programming approach. The fundamental equation of dynamic programming is a nonlinear evolution equation for the value function. For controlled Markov diffusion processes, this becomes a nonlinear partial differential equation of second order, called a Hamilton-Jacobi-Bellman (HJB) equation. Typically, the value function is not smooth enough to satisfy the HJB equation in a classical sense. Viscosity solutions provide a framework in which to study HJB equations, and to prove continuous dependence of solutions on problem data. The theory is illustrated by applications from engineering, management science, and financial economics. In this second edition, new material on applications to mathematical finance has been added. Concise introductions to risk-sensitive control theory, nonlinear H-infinity control and differential games are also included. Review of the earlier edition: "This book is highly recommended to anyone who wishes to learn the dynamic programming principle applied to optimal stochastic control for diffusion processes. Without any doubt, this is a fine book and most likely it is going to become a classic in the area... ." SIAM Review, 1994
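For orientation, the HJB equation mentioned in the abstract has, for a controlled diffusion, roughly the following general form. This is an illustrative sketch in generic notation (drift b, diffusion coefficient σ, running cost f, terminal cost ψ, control set A), not the book's own notation or sign conventions:

```latex
% Controlled diffusion:  dX_s = b(X_s, a_s)\,ds + \sigma(X_s, a_s)\,dW_s
% Value function V(t,x); maximization convention assumed for illustration.
\[
  \frac{\partial V}{\partial t}(t,x)
  + \sup_{a \in A} \Big\{ \, b(x,a) \cdot D_x V(t,x)
  + \tfrac{1}{2} \operatorname{tr}\!\big( \sigma(x,a)\,\sigma(x,a)^{\top} D_x^{2} V(t,x) \big)
  + f(x,a) \Big\} = 0,
  \qquad V(T,x) = \psi(x).
\]
```

Because V is typically not twice differentiable, the equation is interpreted in the viscosity sense, which is the framework the book develops.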