Table of Contents:
  • 1 Basic concepts and statement of problems in control theory
  • 1.1 Initial Premises
  • 1.2 Basic concepts of control theory
  • 1.3 Modelling of control objects and their general characteristics
  • 1.4 Refining the statement of the control problem
  • 2 Finite time period control
  • 2.1 Dynamic programming
  • 2.2 Stochastic control systems
  • 2.3 Stochastic dynamic programming
  • 2.4 Bayesian control strategy
  • 2.5 Linear quadratic Gaussian problem
  • 2.A Appendix
  • 2.P Proofs of lemmas and theorems
  • 3 Infinite time period control
  • 3.1 Stabilization of dynamic systems using Liapunov's method
  • 3.2 Discrete form for analytical design of regulators
  • 3.3 Transfer function method in the linear optimization problem
  • 3.4 Limiting optimal control of stochastic processes
  • 3.5 Minimax control
  • 3.A Appendix
  • 3.P Proofs of lemmas and theorems
  • 4 Adaptive linear control systems with bounded noise
  • 4.1 Fundamentals of adaptive control
  • 4.2 Existence of adaptive control strategy in a minimax control problem
  • 4.3 Self-tuning systems
  • 4.P Proofs of lemmas and theorems
  • 5 The problem of dynamic system identification
  • 5.1 Optimal recursive estimation
  • 5.2 The Kalman-Bucy filter for tracking the parameter drift in dynamic systems
  • 5.3 Recursive estimation
  • 5.4 Identification of a linear control object in the presence of correlated noise
  • 5.5 Identification of control objects using test signals
  • 5.P Proofs of lemmas and theorems
  • 6 Adaptive control of stochastic systems
  • 6.1 Dual control
  • 6.2 Initial synthesis of adaptive control strategy in the presence of correlated noise
  • 6.3 Design of adaptive minimax control
  • 6.P Proofs of lemmas and theorems
  • Comments
  • References
  • Operators and notational conventions