Mathematical Theory of Control Systems Design

Give, and it shall be given unto you. ST. LUKE, VI, 38. The book is based on several courses of lectures on control theory and applications which were delivered by the authors for a number of years at Moscow Electronics and Mathematics University. The book, originally written in Russian, was fir...


Bibliographic Details
Main Authors: Afanasiev, V.N., Kolmanovskii, V., Nosov, V.R.
Format: eBook
Language: English
Published: Dordrecht: Springer Netherlands, 1996
Edition: 1st ed. 1996
Series:Mathematics and Its Applications
Collection: Springer Book Archives –2004 (collection details: see MPG.ReNa)
Table of Contents:
  • I. Continuous and Discrete Deterministic Systems
  • II. Stability of Stochastic Systems
  • III. Description of Control Problems
  • IV. The Classical Calculus of Variations and Optimal Control
  • V. The Maximum Principle
  • VI. Linear Control Systems
  • VII. Dynamic Programming Approach. Sufficient Conditions for Optimal Control
  • VIII. Some Additional Topics of Optimal Control Theory
  • IX. Control of Stochastic Systems. Problem Statements and Investigation Techniques
  • X. Optimal Control on a Time Interval of Random Duration
  • XI. Optimal Estimation of the State of the System
  • XII. Optimal Control of the Observation Process
  • XIII. Linear Time-Invariant Control Systems
  • XIV. Numerical Methods for the Investigation of Nonlinear Control Systems
  • XV. Numerical Design of Optimal Control Systems
  • General References