We consider the problem of modeling data matrices with locally low rank (LLR) structure, a generalization of the popular low rank structure widely used in real-world application domains ranging from medical imaging to recommendation systems. While LLR modeling has been found promising in such domains, limited progress has been made on the design of scalable algorithms for these structures. In this paper, we consider a convex relaxation of the LLR structure and propose an efficient algorithm based on dual projected gradient descent (D-PGD) for computing the proximal operator. Although the original problem is non-smooth, so that primal (sub)gradient algorithms will be slow, we show that the proposed D-PGD algorithm has a geometric convergence rate. We present several practical ways to further speed up the computations, including acceleration and approximate SVD computations. With experiments on both synthetic and real data from MRI (magnetic resonance imaging) denoising, we illustrate the superior performance of the proposed D-PGD algorithm compared to several baselines.
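The key subroutine in a dual approach to such proximal operators is projection onto a spectral-norm ball, which by Moreau decomposition recovers the nuclear-norm prox (singular value soft-thresholding). The paper's D-PGD handles sums of nuclear norms over overlapping local blocks; the single-block case below is a minimal sketch of that building block, with function names chosen for illustration (not taken from the paper):

```python
import numpy as np

def project_spectral_ball(Z, lam):
    # Euclidean projection of Z onto {W : ||W||_2 <= lam}:
    # clip every singular value of Z at lam.
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ (np.minimum(s, lam)[:, None] * Vt)

def nuclear_prox(Y, lam):
    # Moreau decomposition: prox of lam * ||.||_* at Y equals
    # Y minus the projection of Y onto the dual (spectral) ball,
    # i.e. soft-thresholding of the singular values of Y by lam.
    return Y - project_spectral_ball(Y, lam)
```

For example, `nuclear_prox(np.diag([3.0, 1.0]), 2.0)` shrinks the singular values 3 and 1 by 2, yielding `diag([1.0, 0.0])`. In the overlapping-block setting, D-PGD iterates a gradient step on the dual variables followed by one such spectral-ball projection per block, which is where the acceleration and approximate SVD speedups discussed in the abstract apply.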
| Original language | English (US) |
| Title of host publication | Proceedings - 17th IEEE International Conference on Data Mining, ICDM 2017 |
| Editors | George Karypis, Srinivas Aluru, Vijay Raghavan, Xindong Wu, Lucio Miele |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Number of pages | 10 |
| State | Published - Dec 15 2017 |
| Event | 17th IEEE International Conference on Data Mining, ICDM 2017 - New Orleans, United States (Nov 18 2017 → Nov 21 2017) |
| Publication series | Proceedings - IEEE International Conference on Data Mining, ICDM |
Bibliographical note (Funding Information):
ACKNOWLEDGMENTS The research was supported by NSF grants IIS-1447566, IIS-1422557, CCF-1451986, CNS-1314560, IIS-0953274, IIS-1029711, NASA grant NNX12AQ39A, NSF CCF:CIF:Small:1318347 and “Mayo Clinic Discovery Translation Program”.
© 2017 IEEE.