We present an algorithm for computing the ℓ∞-induced norm of linear systems described by time-varying difference equations with uncertain system matrices. The approach is based on ideas from convex analysis. The convergence of the algorithm is studied, and vertex-type results related to the stabilization and ℓ1-control of uncertain discrete-time systems are established.
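The paper's algorithm targets time-varying systems with uncertain matrices; as background, the sketch below illustrates only the classical, well-known special case: for a stable linear time-invariant discrete-time system (A, B, C, D), the ℓ∞-induced norm equals the ℓ1 norm of the impulse response, |D| + Σₖ |C Aᵏ B|, approximated here by truncating the sum. The function name and truncation length are illustrative choices, not from the paper.

```python
import numpy as np

def linf_induced_norm(A, B, C, D, n_terms=500):
    """Truncated l1 norm of the impulse response of a stable SISO
    LTI discrete-time system x[k+1] = A x[k] + B u[k], y = C x + D u.
    This is the textbook LTI formula, not the paper's algorithm for
    time-varying uncertain systems."""
    A = np.atleast_2d(np.asarray(A, dtype=float))
    B = np.asarray(B, dtype=float).reshape(-1, 1)
    C = np.asarray(C, dtype=float).reshape(1, -1)
    total = abs(float(D))          # direct-feedthrough term |D|
    Ak = np.eye(A.shape[0])        # A^0
    for _ in range(n_terms):
        total += abs(float(C @ Ak @ B))  # |C A^k B|
        Ak = Ak @ A
    return total

# Example: A = 0.5, B = C = 1, D = 0 gives sum_k 0.5^k = 2.
norm = linf_induced_norm([[0.5]], [1.0], [1.0], 0.0)
```

For the scalar example the geometric series converges to 2, so the truncated sum is accurate to machine precision for a modest number of terms.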
Original language: English (US)
Number of pages: 5
Journal: Proceedings of the American Control Conference
State: Published - 1995