Stochastic Control with Complete Observations on an Infinite Horizon

Jan H. van Schuppen*

*Corresponding author for this work

Research output: Chapter in Book/Conference proceedings/Edited volume › Chapter (Scientific)

Abstract

Optimal stochastic control problems with complete observations on an infinite horizon are considered. Control theory for both the average cost and the discounted cost function is treated. The dynamic programming approach is formulated as a procedure to determine the value and the value function; from the value function, one can derive the optimal control law. Stochastic controllability is in general needed to prove that a control law with a finite average cost exists in the case of positive costs. Two special cases are treated in depth: Gaussian stochastic control systems and finite stochastic control systems.
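As a rough illustration of the dynamic programming procedure mentioned in the abstract, the sketch below runs value iteration for a discounted-cost finite stochastic control system. The 2-state, 2-action system, its costs, and the discount factor are hypothetical examples, not taken from the chapter; the iteration of the Bellman operator and the extraction of an optimal control law from the value function follow the standard approach the abstract describes.

```python
import numpy as np

# Hypothetical finite stochastic control system (2 states, 2 actions).
# P[a, s, s'] is the transition probability from s to s' under action a;
# c[s, a] is the stage cost; beta is the discount factor.
P = np.array([[[0.9, 0.1],
               [0.2, 0.8]],
              [[0.5, 0.5],
               [0.6, 0.4]]])
c = np.array([[1.0, 2.0],
              [3.0, 0.5]])
beta = 0.9

def value_iteration(P, c, beta, tol=1e-10, max_iter=10_000):
    """Iterate the Bellman operator V <- min_a [ c(., a) + beta * P_a V ]
    until the value function converges; then read off the optimal
    stationary control law as the minimizing action in each state."""
    n_states = c.shape[0]
    V = np.zeros(n_states)
    for _ in range(max_iter):
        # Q[s, a] = c(s, a) + beta * sum_{s'} P(s' | s, a) V(s')
        Q = c + beta * np.einsum('aij,j->ia', P, V)
        V_new = Q.min(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            V = V_new
            break
        V = V_new
    policy = Q.argmin(axis=1)  # optimal control law derived from V
    return V, policy

V, policy = value_iteration(P, c, beta)
```

For the average-cost problem the abstract also covers, the discounted operator above would be replaced by a relative value iteration; the discounted case is shown here only because its contraction property makes the sketch shortest.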

Original language: English
Title of host publication: Control and System Theory of Discrete-Time Stochastic Systems
Publisher: Springer
Pages: 493-546
Number of pages: 54
ISBN (Electronic): 978-3-030-66952-2
DOIs
Publication status: Published - 2021

Publication series

Name: Communications and Control Engineering
ISSN (Print): 0178-5354
ISSN (Electronic): 2197-7119

Keywords

  • Complete observations
  • Infinite horizon
  • Stochastic control
