ECE 515 Course outline

I. Modeling and Analysis of Control Systems

  1. Introduction and classification
  2. State space models in both discrete and continuous time
  3. Linear and nonlinear systems
  4. Discretization and linearization
  5. Transfer function descriptions of linear systems and their relationship to state space models
  6. Minimal realizations; controllable and observable canonical forms
  7. Vector spaces and linear transformations
  8. Review of linear algebra; the Cayley-Hamilton theorem
  9. State transition matrix and solutions of linear state equations
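
Two of the topics above, discretization and the state transition matrix, can be illustrated together: under zero-order hold the discretized matrices are obtained from a matrix exponential, and for a linear time-invariant system the state transition matrix is Phi(t) = e^{At}. The sketch below uses the double integrator as an illustrative example (not a system specified in the outline) and a truncated Taylor series for the exponential, which is exact here because the augmented matrix is nilpotent.

```python
import numpy as np

def expm_series(M, terms=30):
    """Matrix exponential via truncated Taylor series (adequate for small,
    well-scaled matrices; exact here because the argument is nilpotent)."""
    result = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        result = result + term
    return result

# Continuous-time double integrator: xdot = A x + B u
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
T = 0.1  # sampling period

# Zero-order-hold discretization via the augmented-matrix identity:
# expm([[A, B], [0, 0]] * T) = [[Ad, Bd], [0, I]]
n, m = A.shape[0], B.shape[1]
M = np.zeros((n + m, n + m))
M[:n, :n] = A
M[:n, n:] = B
E = expm_series(M * T)
Ad, Bd = E[:n, :n], E[:n, n:]

# Ad is also the state transition matrix Phi(T) = e^{AT} = [[1, T], [0, 1]]
print(Ad)  # [[1.   0.1 ], [0.   1.  ]]
print(Bd)  # [[0.005], [0.1  ]]
```

For this system the series answer matches the closed-form zero-order-hold result Ad = [[1, T], [0, 1]], Bd = [[T^2/2], [T]].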

II. Structural Properties of Control Systems

  1. Stability (Lyapunov, Input-Output)
  2. Stability tests for linear systems; stability subspaces
  3. Stability tests for nonlinear systems
  4. Controllability; controllable subspaces
  5. Observability; unobservable subspaces
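
The controllability and observability topics both reduce, for linear time-invariant systems, to Kalman rank conditions. A minimal sketch, again using the double integrator with a position-only measurement as an assumed example:

```python
import numpy as np

def ctrb(A, B):
    """Kalman controllability matrix [B, AB, ..., A^(n-1) B]."""
    n = A.shape[0]
    cols = [B]
    for _ in range(n - 1):
        cols.append(A @ cols[-1])
    return np.hstack(cols)

def obsv(A, C):
    """Observability matrix [C; CA; ...; C A^(n-1)]."""
    n = A.shape[0]
    rows = [C]
    for _ in range(n - 1):
        rows.append(rows[-1] @ A)
    return np.vstack(rows)

A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])  # measure position only

rc = np.linalg.matrix_rank(ctrb(A, B))
ro = np.linalg.matrix_rank(obsv(A, C))
print(rc)  # 2: full rank, so the pair (A, B) is controllable
print(ro)  # 2: full rank, so the pair (A, C) is observable
```

A rank deficit in either matrix would instead identify a nontrivial uncontrollable or unobservable subspace, the objects studied in items 4 and 5 above.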

III. Feedback Controller Design

  1. Pole placement with state feedback
  2. Observers and observer-based designs
  3. Tracking and disturbance rejection
  4. Performance issues; robustness and sensitivity
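
Pole placement with state feedback is especially transparent in controllable canonical form, where the feedback gains appear directly as coefficients of the closed-loop characteristic polynomial. A sketch, with the double integrator and the target poles chosen purely for illustration:

```python
import numpy as np

# Double integrator in controllable canonical form: xdot = A x + B u
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])

# With u = -K x and K = [k1, k2], the closed loop is [[0, 1], [-k1, -k2]],
# whose characteristic polynomial is s^2 + k2 s + k1.  To place poles at
# -1 +/- j (desired polynomial s^2 + 2 s + 2), match coefficients:
K = np.array([[2.0, 2.0]])

Acl = A - B @ K
poles = np.linalg.eigvals(Acl)
print(np.sort_complex(poles))  # approximately [-1-1j, -1+1j]
```

For a plant not already in canonical form, the same matching is done after a similarity transformation (Bass-Gura / Ackermann), which exists exactly when the pair (A, B) is controllable.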

IV. Optimal Control

  1. Dynamic programming for both discrete-time and continuous-time systems; the Hamilton-Jacobi-Bellman (HJB) equation; relationship between open-loop and closed-loop controllers
  2. Linear-quadratic (LQ) optimal control problem and design of optimum regulators
  3. The matrix Riccati differential equation and some of its properties
  4. The infinite-horizon case: Time-invariant optimal controllers and the algebraic Riccati equation
  5. The minimum principle
  6. Time-optimal control of continuous-time linear systems
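
Items 2-4 above connect concretely in even the scalar case: the optimal LQ regulator gain comes from the algebraic Riccati equation, which is the steady state of the matrix Riccati differential equation. A minimal sketch, assuming an illustrative scalar plant (not one specified in the outline) and using forward-Euler integration of the Riccati ODE to reach the steady state:

```python
# Scalar plant xdot = a x + b u, cost = integral of (q x^2 + r u^2) dt.
a, b, q, r = 1.0, 1.0, 1.0, 1.0

# Algebraic Riccati equation: 2 a p - (b^2 / r) p^2 + q = 0.
# Reach its stabilizing root by integrating the Riccati ODE
# pdot = 2 a p - (b^2 / r) p^2 + q to steady state with forward Euler.
p, h = 0.0, 0.01
for _ in range(5000):
    p += h * (2 * a * p - (b * b / r) * p * p + q)

K = b * p / r        # optimal state feedback u = -K x
pole = a - b * K     # closed-loop pole
print(p)             # approximately 1 + sqrt(2) = 2.4142...
print(pole)          # approximately -sqrt(2): open-loop unstable plant stabilized
```

The quadratic here has roots 1 +/- sqrt(2); the iteration converges to the positive (stabilizing) root, mirroring the general fact that the infinite-horizon LQ problem selects the stabilizing solution of the algebraic Riccati equation.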