332:505 Control Theory I

Syllabus

Chapter and section numbers correspond to the text, Systems and Control by S. Zak, Oxford University Press, 2003.

  1. Analysis of Modeling Equations   (one to two lectures)
    1. State-Plane (Phase-Plane) Analysis
      1. Examples of Phase Portraits
      2. The Method of Isoclines
    2. Numerical Techniques
      1. The Method of Taylor Series
      2. Euler's Methods
      3. Predictor-Corrector Method
      4. Runge's Method
      5. Runge-Kutta Method
    3. Principles of Linearization
    4. Linearizing Differential Equations
    5. Describing Function Method
      1. Scalar Product of Functions
      2. Fourier Series
      3. Describing Function in the Analysis of Nonlinear Systems
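To give a flavor of the numerical techniques in Section 1.2, here is a minimal Python sketch comparing forward Euler with the classical fourth-order Runge-Kutta method. The test system x' = -x, the step count, and the function names are illustrative, not from the text:

```python
import math

def euler(f, x0, t0, tf, n):
    # Forward Euler: x_{k+1} = x_k + h*f(t_k, x_k), fixed step h
    h = (tf - t0) / n
    t, x = t0, x0
    for _ in range(n):
        x += h * f(t, x)
        t += h
    return x

def rk4(f, x0, t0, tf, n):
    # Classical fourth-order Runge-Kutta method
    h = (tf - t0) / n
    t, x = t0, x0
    for _ in range(n):
        k1 = f(t, x)
        k2 = f(t + h / 2, x + h / 2 * k1)
        k3 = f(t + h / 2, x + h / 2 * k2)
        k4 = f(t + h, x + h * k3)
        x += (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return x

f = lambda t, x: -x        # test system x' = -x, x(0) = 1
exact = math.exp(-1.0)     # true value x(1) = e^(-1)
print(abs(euler(f, 1.0, 0.0, 1.0, 100) - exact))  # first-order global error
print(abs(rk4(f, 1.0, 0.0, 1.0, 100) - exact))    # fourth-order: far smaller
```

With the same step size, RK4's error is several orders of magnitude below Euler's, which is the practical point of the higher-order methods covered in this section.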

  2. Linear Systems   (one to two lectures)
    1. Reachability and Controllability
    2. Observability and Constructibility
    3. Companion Forms
      1. Controller Form
      2. Observer Form
    4. Linear State-Feedback Control
    5. State Estimators
      1. Full-Order Estimator
      2. Reduced-Order Estimator
    6. Combined Controller-Estimator Compensator
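As a preview of the controllability test of Section 2.1, the rank condition on [b, Ab, ..., A^(n-1)b] for a single-input pair (A, b) can be sketched in pure Python. The helper names and the naive Gaussian-elimination rank computation are illustrative only:

```python
def mat_vec(A, v):
    # Matrix-vector product for A stored as a list of rows
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def rank(rows, tol=1e-9):
    # Naive row-reduction rank; adequate for small hand examples
    rows = [r[:] for r in rows]
    rnk, m = 0, len(rows[0])
    for col in range(m):
        piv = next((i for i in range(rnk, len(rows))
                    if abs(rows[i][col]) > tol), None)
        if piv is None:
            continue
        rows[rnk], rows[piv] = rows[piv], rows[rnk]
        for i in range(rnk + 1, len(rows)):
            f = rows[i][col] / rows[rnk][col]
            rows[i] = [x - f * y for x, y in zip(rows[i], rows[rnk])]
        rnk += 1
    return rnk

def is_controllable(A, b):
    # Single-input pair (A, b) is controllable iff
    # rank [b, Ab, ..., A^(n-1) b] = n
    n = len(A)
    cols, v = [], b[:]
    for _ in range(n):
        cols.append(v)
        v = mat_vec(A, v)
    return rank(cols) == n   # rank of the transpose equals the rank

A = [[0.0, 1.0], [0.0, 0.0]]   # double integrator
b = [0.0, 1.0]
print(is_controllable(A, b))   # True
```

For the double integrator the columns b and Ab span the plane, so the pair is controllable; repeating the check with b = [1, 0] and A = I gives rank 1 and fails.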

  3. Stability   (two to three lectures)
    1. Informal Introduction to Stability
    2. Basic Definitions of Stability
    3. Stability of Linear Systems
    4. Evaluating Quadratic Indices
    5. Discrete-Time Lyapunov Equation
    6. Constructing Robust Linear Controllers
    7. Hurwitz and Routh Stability Criteria
    8. Stability of Nonlinear Systems
    9. Lyapunov's Indirect Method
    10. Discontinuous Robust Controllers
    11. Uniform Ultimate Boundedness
    12. Lyapunov-Like Analysis
    13. LaSalle's Invariance Principle
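The Routh criterion of Section 3.7 lends itself to a short sketch: build the Routh array and check the sign of its first column. This assumes a positive leading coefficient and no zero pivots (the generic case); the function name is illustrative:

```python
def is_hurwitz(coeffs):
    # Routh test: True iff all roots of the polynomial with the given
    # coefficients (highest power first) lie in the open left half-plane.
    # Assumes coeffs[0] > 0 and that no first-column zeros arise.
    r1 = list(coeffs[0::2])
    r2 = list(coeffs[1::2])
    width = len(r1)
    r2 += [0.0] * (width - len(r2))
    table = [r1, r2]
    for _ in range(len(coeffs) - 2):
        prev, cur = table[-2], table[-1]
        new = [(cur[0] * prev[j + 1] - prev[0] * cur[j + 1]) / cur[0]
               for j in range(width - 1)]
        new.append(0.0)
        table.append(new)
    return all(row[0] > 0 for row in table)

print(is_hurwitz([1.0, 2.0, 3.0, 1.0]))   # s^3 + 2s^2 + 3s + 1: True
print(is_hurwitz([1.0, 1.0, 1.0, 3.0]))   # s^3 + s^2 + s + 3: False
```

The second polynomial fails because 2 x 2 pivot a2*a1 - a3*a0 = 1 - 3 < 0 produces a sign change in the first column, i.e. a root in the right half-plane.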

Midterm Exam (Lecture 7, March 5, 2003) will cover lectures 1 through 6.

  1. Optimal Control   (four lectures)
    1. Performance Indices
    2. A Glimpse at the Calculus of Variations
      1. Variation and Its Properties
      2. Euler-Lagrange Equation
    3. Linear Quadratic Regulator
      1. Algebraic Riccati Equation (ARE)
      2. Solving the ARE Using the Eigenvector Method
      3. Optimal Systems with Prescribed Poles
      4. Optimal Saturating Controllers
      5. Linear Quadratic Regulator for Discrete Systems on an Infinite Time Interval
    4. Dynamic Programming
      1. Discrete-Time Systems
      2. Discrete Linear Quadratic Regulator Problem
      3. Continuous Minimum Time Regulator Problem
      4. The Hamilton-Jacobi-Bellman Equation
    5. Pontryagin's Minimum Principle
      1. Optimal Control with Constraints on Inputs
      2. A Two-Point Boundary-Value Problem
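The LQR material of Section 3 reduces, for a scalar plant, to a quadratic equation that can be solved by hand. A minimal sketch, with the plant x' = a*x + b*u and cost integral of q*x^2 + r*u^2 (all numbers and the function name chosen for illustration):

```python
import math

def scalar_lqr(a, b, q, r):
    # Scalar algebraic Riccati equation:
    #   2*a*p - (b**2/r)*p**2 + q = 0
    # Take the stabilizing (positive) root p; the optimal
    # state feedback is u = -k*x with k = (b/r)*p.
    p = r * (a + math.sqrt(a * a + q * b * b / r)) / (b * b)
    k = b * p / r
    return p, k

p, k = scalar_lqr(a=1.0, b=1.0, q=1.0, r=1.0)
print(p, k, 1.0 - k)   # closed-loop pole a - b*k = -sqrt(a^2 + q*b^2/r)
```

Here p = 1 + sqrt(2), and the closed-loop pole lands at -sqrt(2): even though the open-loop plant is unstable (a = 1 > 0), the stabilizing ARE root always yields a stable closed loop.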

  2. Sliding Modes   (one to three lectures if time permits)
    1. Simple Variable Structure Systems
    2. Sliding Mode Definition
    3. A Simple Sliding Mode Controller
    4. Sliding in Multi-Input Systems
    5. Sliding Modes and System Zeros
    6. Nonideal Sliding Mode
    7. Sliding Surface Design
    8. State Estimation of Uncertain Systems
      1. Discontinuous Estimators
      2. Boundary Layer Estimators
    9. Sliding Modes in Solving Optimization Problems
      1. Optimization Problem Statement
      2. Penalty Function Method
      3. Dynamical Gradient Circuit Analysis
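A simple sliding mode controller in the spirit of Section 2.3 can be previewed on the double integrator with a matched disturbance. The gains, the disturbance, and the simulation horizon below are assumptions made for illustration, not values from the text:

```python
import math

def simulate_smc(c=1.0, K=2.0, h=1e-3, T=10.0):
    # Double integrator x1' = x2, x2' = u + d with matched disturbance d.
    # Sliding surface s = c*x1 + x2; control u = -c*x2 - K*sgn(s).
    # Gains c, K and the disturbance are assumed; K exceeds the
    # disturbance bound, so s reaches zero in finite time.
    x1, x2, t = 1.0, 0.0, 0.0
    while t < T:
        d = 0.5 * math.sin(5.0 * t)        # bounded: |d| <= 0.5 < K
        s = c * x1 + x2
        sgn = 1.0 if s > 0 else (-1.0 if s < 0 else 0.0)
        u = -c * x2 - K * sgn
        x1 += h * x2                       # forward-Euler integration
        x2 += h * (u + d)
        t += h
    return x1, x2

x1, x2 = simulate_smc()
print(x1, x2)   # both small: the state is held near the sliding surface
```

On the surface s = 0 the dynamics collapse to x1' = -c*x1 regardless of d, which is the disturbance-rejection property that motivates sliding mode control; the discrete-time sign function also exhibits the chattering discussed under "Nonideal Sliding Mode."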

Final Exam will cover lectures 8 through 14.

Page last modified 07/18/07.