One of the primary challenges of system identification is determining how much data is necessary to adequately fit a model. Non-asymptotic characterizations of the performance of system identification methods provide this knowledge, and such characterizations are available for several open-loop identification algorithms. Often, however, data is collected in closed loop, and applying open-loop identification methods to closed-loop data can result in biased estimates. One method to eliminate these biases first fits a long-horizon autoregressive model and then performs model reduction. The asymptotic behavior of such algorithms is well characterized, but their non-asymptotic behavior is not. This work provides a non-asymptotic characterization of one particular variant of these algorithms. More specifically, we provide non-asymptotic upper bounds on the generalization error of the produced model, as well as high-probability bounds on the difference between the produced model and the finite-horizon Kalman filter.
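The first stage of the pipeline the abstract describes, fitting a long-horizon autoregressive model to closed-loop data by least squares, can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the simulated plant, feedback gain, noise level, and horizon `p` are all assumptions chosen for the example, and the model-reduction stage is only indicated in a comment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy SISO plant y_t = 0.8 y_{t-1} + u_{t-1} + w_t, operated in closed
# loop with feedback u_t = -0.5 y_t + r_t (r_t is an excitation signal).
# All of these numbers are illustrative assumptions.
T = 2000
y = np.zeros(T)
u = np.zeros(T)
for t in range(1, T):
    y[t] = 0.8 * y[t - 1] + u[t - 1] + 0.1 * rng.standard_normal()
    u[t] = -0.5 * y[t] + rng.standard_normal()

# Stage 1: fit a long-horizon ARX model
#   y_t ~ sum_{i=1..p} a_i y_{t-i} + b_i u_{t-i}
# by ordinary least squares. Because the one-step-ahead noise is
# independent of past data, this fit avoids the bias that plagues
# direct open-loop methods on closed-loop data.
p = 10  # horizon, deliberately long relative to the true system order
rows, targets = [], []
for t in range(p, T):
    past = np.concatenate([y[t - p:t][::-1], u[t - p:t][::-1]])
    rows.append(past)
    targets.append(y[t])
theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
a_hat, b_hat = theta[:p], theta[p:]

# Stage 2 (not shown): compress the high-order ARX model to a low-order
# state-space model, e.g. via a realization of the implied Markov
# parameters; this is the model-reduction step the abstract refers to.
print(a_hat[0], b_hat[0])  # leading coefficients, near 0.8 and 1.0
```

With enough data the leading ARX coefficients recover the plant's true one-step dynamics despite the feedback loop, which is the property that makes the long-horizon fit a useful first stage.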
Original language: English (US)
Title of host publication: 2020 59th IEEE Conference on Decision and Control, CDC 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
State: Published - Dec 14, 2020
Event: 59th IEEE Conference on Decision and Control, CDC 2020 - Virtual, Jeju Island, Korea, Republic of
Duration: Dec 14, 2020 → Dec 18, 2020
Name: Proceedings of the IEEE Conference on Decision and Control
Conference: 59th IEEE Conference on Decision and Control, CDC 2020
Country: Korea, Republic of
City: Virtual, Jeju Island
Period: 12/14/20 → 12/18/20
Bibliographical note (Funding Information): This work was supported in part by NSF CMMI-1727096. B.L. is a graduate student at the University of Pennsylvania. A.L. is with the Department of Electrical and Computer Engineering, University of Minnesota, Minneapolis, MN 55455, USA.
© 2020 IEEE.