TY - GEN
T1 - Lasso-Kalman smoother for tracking sparse signals
AU - Angelosante, Daniele
AU - Roumeliotis, Stergios I.
AU - Giannakis, Georgios B.
PY - 2009
Y1 - 2009
N2 - Fixed-interval smoothing of time-varying vector processes is an estimation approach with well-documented merits for tracking applications. The optimal performance in the linear Gauss-Markov model is achieved by the Kalman smoother (KS), which also admits an efficient recursive implementation. The present paper deals with vector processes for which it is known a priori that many of their entries are equal to zero. In this context, the process to be tracked is sparse, and the performance of sparsity-agnostic KS schemes degrades considerably. On the other hand, it is shown here that a sparsity-aware KS exhibits complexity that grows exponentially in the vector dimension. To obtain a tractable alternative, the KS cost is regularized with the sparsity-promoting l1-norm of the vector process, a relaxation also used in linear regression problems to obtain the least-absolute shrinkage and selection operator (Lasso). The Lasso (L)KS derived in this work is not only capable of tracking sparse time-varying vector processes, but can also afford an efficient recursive implementation based on the alternating direction method of multipliers (ADMoM). Finally, a weighted (W)-LKS is also introduced to cope with the bias of the LKS, and simulations are provided to validate the performance of the novel algorithms.
UR - http://www.scopus.com/inward/record.url?scp=77953862866&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=77953862866&partnerID=8YFLogxK
U2 - 10.1109/ACSSC.2009.5470133
DO - 10.1109/ACSSC.2009.5470133
M3 - Conference contribution
AN - SCOPUS:77953862866
SN - 9781424458271
T3 - Conference Record - Asilomar Conference on Signals, Systems and Computers
SP - 181
EP - 185
BT - Conference Record - 43rd Asilomar Conference on Signals, Systems and Computers
T2 - 43rd Asilomar Conference on Signals, Systems and Computers
Y2 - 1 November 2009 through 4 November 2009
ER -