ECE 515 Course outline
I. Modeling and Analysis of Control Systems
- Introduction and classification
- State space models in both discrete and continuous time
- Linear and nonlinear systems
- Discretization and linearization
- Transfer function descriptions of linear systems; relationship with state space models
- Minimal realizations; controllable and observable canonical forms
- Vector spaces and linear transformations
- Review of linear algebra; the Cayley-Hamilton theorem
- State transition matrix and solutions of linear state equations
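The last two items can be illustrated numerically. Below is a minimal sketch, assuming numpy and scipy are available; the harmonic-oscillator system is a hypothetical example chosen because its state transition matrix has a simple closed form (a planar rotation), not a system taken from the course notes.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical example system: undamped harmonic oscillator, x' = A x
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

# State transition matrix Phi(t) = e^{A t}; for this A it equals
# [[cos t, sin t], [-sin t, cos t]]
t = np.pi / 2
Phi = expm(A * t)

# Solution of the homogeneous linear state equation: x(t) = Phi(t) x(0)
x0 = np.array([1.0, 0.0])
x_t = Phi @ x0
```

At t = pi/2 the initial position has rotated entirely into negative velocity, matching the closed-form solution; the group property Phi(t) Phi(-t) = I also holds.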
II. Structural Properties of Control Systems
- Stability (Lyapunov and input-output)
- Stability tests for linear systems; stability subspaces
- Stability tests for nonlinear systems
- Controllability; controllable subspaces
- Observability; unobservable subspaces
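The Kalman rank conditions behind the controllability and observability items above can be checked in a few lines. This is a minimal sketch, assuming numpy; the double integrator with a position measurement is a hypothetical example, and `ctrb`/`obsv` are helper functions defined here, not a course-supplied API.

```python
import numpy as np

# Hypothetical example: double integrator with position output
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])

def ctrb(A, B):
    # Controllability matrix [B, AB, ..., A^{n-1} B]
    n = A.shape[0]
    return np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])

def obsv(A, C):
    # Observability matrix [C; CA; ...; C A^{n-1}]
    n = A.shape[0]
    return np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])

# Kalman rank tests: full rank n means controllable / observable
controllable = np.linalg.matrix_rank(ctrb(A, B)) == A.shape[0]
observable = np.linalg.matrix_rank(obsv(A, C)) == A.shape[0]
```

The same rank computations also expose the controllable and unobservable subspaces: the column space of the controllability matrix and the null space of the observability matrix, respectively.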
III. Feedback Controller Design
- Pole placement with state feedback
- Observers and observer-based designs
- Tracking and disturbance rejection
- Performance issues; robustness and sensitivity
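Pole placement with state feedback, the first design item above, can be sketched with scipy's `place_poles`. The double integrator and the desired pole locations are hypothetical choices for illustration, not values from the course.

```python
import numpy as np
from scipy.signal import place_poles

# Hypothetical plant: double integrator
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Desired closed-loop poles (example values)
desired = [-2.0, -3.0]

# State-feedback gain K so that u = -K x places the poles of A - B K
K = place_poles(A, B, desired).gain_matrix
A_cl = A - B @ K
```

For this plant the characteristic polynomial of A - BK is s^2 + k2 s + k1, so matching (s+2)(s+3) gives K = [6, 5]; the eigenvalues of `A_cl` land at the requested locations.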
IV. Optimal Control
- Dynamic programming for both discrete-time and continuous-time systems; the Hamilton-Jacobi-Bellman (HJB) equation; relationship between open-loop and closed-loop controllers
- Linear-quadratic (LQ) optimal control problem and design of optimum regulators
- The matrix Riccati differential equation and some of its properties
- The infinite-horizon case: Time-invariant optimal controllers and the algebraic Riccati equation
- The minimum principle
- Time-optimal control of continuous-time linear systems
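The infinite-horizon LQ items above come together in a short computation: solving the algebraic Riccati equation and forming the time-invariant optimal regulator gain. This is a minimal sketch assuming scipy; the plant and the Q, R weights are hypothetical example values.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical plant: double integrator
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Example LQ weights: quadratic state and input costs
Q = np.eye(2)
R = np.array([[1.0]])

# Stabilizing solution P of the algebraic Riccati equation
#   A'P + P A - P B R^{-1} B' P + Q = 0
P = solve_continuous_are(A, B, Q, R)

# Optimal time-invariant state feedback u = -K x, with K = R^{-1} B' P
K = np.linalg.solve(R, B.T @ P)
```

The resulting closed-loop matrix A - BK is Hurwitz, which is the infinite-horizon counterpart of the matrix Riccati differential equation converging to this algebraic solution.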