Continuous Time Dynamical Systems: State Estimation and Optimal Control with Orthogonal Functions
- Length: 247 pages
- Edition: 1
- Language: English
- Publisher: CRC Press
- Publication Date: 2012-10-24
- ISBN-10: 1466517298
- ISBN-13: 9781466517295
Optimal control deals with finding a control law for a given system such that a specified optimality criterion is achieved. The solution of an optimal control problem is characterized by a set of differential equations describing the paths of the control variables that minimize the cost functional.
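For concreteness, a typical cost functional of the linear-quadratic type treated in this book can be written as follows (the notation is generic and assumed here, not quoted from the text):

\[
J = \tfrac{1}{2}\,x^{T}(t_f)\,S\,x(t_f) + \tfrac{1}{2}\int_{t_0}^{t_f}\bigl[x^{T}(t)\,Q\,x(t) + u^{T}(t)\,R\,u(t)\bigr]\,dt
\]

subject to the state equation \(\dot{x}(t) = A\,x(t) + B\,u(t)\), \(x(t_0) = x_0\), where \(S \ge 0\), \(Q \ge 0\), and \(R > 0\) are weighting matrices; the optimal control law is the \(u(t)\) that minimizes \(J\) along the trajectories of this system.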
This book, Continuous Time Dynamical Systems: State Estimation and Optimal Control with Orthogonal Functions, considers different classes of systems with quadratic performance criteria. It then develops the optimal control law for each class of systems using orthogonal functions that optimize the given performance criterion.
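The key device behind the orthogonal-function approach is the operational matrix of integration, which turns integration of a signal into a matrix multiplication on its coefficient vector. The short Python sketch below illustrates the mechanism for block-pulse functions; it is a rough demonstration under assumed choices (the test signal sin(t), horizon T = 1, and m = 64 pulses are arbitrary), not code from the book.

```python
import numpy as np

# Block-pulse approximation on [0, T) with m pulses of width h = T/m.
# A signal f(t) is represented by the vector of its approximate averages
# over each subinterval; the operational matrix of integration P then maps
# that vector to the coefficient vector of the running integral of f.
T, m = 1.0, 64
h = T / m
edges = np.linspace(0.0, T, m + 1)
mid = 0.5 * (edges[:-1] + edges[1:])   # subinterval midpoints

coef = np.sin(mid)                     # block-pulse coefficients of f(t) = sin(t)

# Standard block-pulse operational matrix of integration:
#   P = (h/2) * (I + 2U),  U strictly upper-triangular ones,
# so that  integral_0^t f(tau) dtau  is approximated by  (P^T coef) . Phi(t).
P = (h / 2.0) * (np.eye(m) + 2.0 * np.triu(np.ones((m, m)), k=1))

approx = P.T @ coef                    # coefficients of the running integral
exact = 1.0 - np.cos(mid)              # exact integral of sin, sampled at midpoints

print("max abs error:", np.max(np.abs(approx - exact)))
```

Applying the same substitution to a state equation replaces the differential constraint with a set of algebraic equations in the expansion coefficients, which is the reduction the following chapters exploit for state estimation and optimal control.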
Illustrated throughout with detailed examples, the book covers topics including:
- Block-pulse functions and shifted Legendre polynomials
- State estimation of linear time-invariant systems
- Linear optimal control systems incorporating observers
- Optimal control of systems described by integro-differential equations
- Linear-quadratic-Gaussian control
- Optimal control of singular systems
- Optimal control of time-delay systems with and without reverse time terms
- Optimal control of second-order nonlinear systems
- Hierarchical control of linear time-invariant and time-varying systems
Table of Contents
1 Introduction
2 Orthogonal Functions and Their Properties
3 State Estimation
4 Linear Optimal Control Systems Incorporating Observers
5 Optimal Control of Systems Described by Integro-Differential Equations
6 Linear-Quadratic-Gaussian Control
7 Optimal Control of Singular Systems
8 Optimal Control of Time-Delay Systems
9 Optimal Control of Nonlinear Systems
10 Hierarchical Control of Linear Systems
11 Epilogue