An accelerated IRNN-Iteratively Reweighted Nuclear Norm algorithm for nonconvex nonsmooth low-rank minimization problems

Abstract

Low-rank minimization problems arise in numerous important applications such as recommendation systems, machine learning, and network analysis. These problems, however, typically consist of minimizing the sum of a smooth function and nonconvex nonsmooth composite functions, and solving them remains a major challenge. In this paper, we take inspiration from Nesterov's acceleration technique to accelerate an iteratively reweighted nuclear norm algorithm for the considered problems, ensuring that every limit point is a critical point. Our algorithm iteratively computes the proximal operator of a reweighted nuclear norm, which admits a closed-form solution obtained by performing the SVD of a smaller matrix instead of the full SVD. This distinguishes our work from recent accelerated proximal gradient algorithms, which require an expensive computation of the proximal operator of nonconvex nonsmooth composite functions. We also investigate the convergence rate under the Kurdyka–Łojasiewicz assumption. Numerical experiments are performed to demonstrate the efficiency of our algorithm and its superiority over well-known methods.
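To make the main computational step concrete, the sketch below shows one possible accelerated IRNN-style iteration in NumPy: a Nesterov-type extrapolation followed by the proximal operator of a reweighted nuclear norm, which reduces to weighted singular value thresholding. This is a minimal illustration, not the authors' exact method: the names (weighted_svt, accelerated_irnn_sketch, penalty_grad) are hypothetical, and a plain thin SVD is used for simplicity, whereas the paper's algorithm obtains the same closed form from the SVD of a smaller matrix.

```python
import numpy as np

def weighted_svt(Y, weights, step):
    """Proximal operator of a weighted nuclear norm:
    soft-thresholds each singular value of Y by step * weights[i]."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_new = np.maximum(s - step * weights[: len(s)], 0.0)
    return (U * s_new) @ Vt, s_new

def accelerated_irnn_sketch(grad_f, L, X0, penalty_grad, max_iter=200):
    """Illustrative accelerated IRNN-type loop (assumed structure):
    Nesterov-style extrapolation, then a weighted SVT step whose
    weights are rebuilt from the previous singular values."""
    X_prev, X = X0.copy(), X0.copy()
    t_prev, t = 1.0, 1.0
    s = np.linalg.svd(X0, compute_uv=False)
    for _ in range(max_iter):
        # extrapolated point (Nesterov momentum)
        Z = X + ((t_prev - 1.0) / t) * (X - X_prev)
        # reweighting: weights from the derivative of the concave penalty
        w = penalty_grad(s)
        # proximal gradient step on the smooth part, step size 1/L
        X_prev = X
        X, s = weighted_svt(Z - grad_f(Z) / L, w, 1.0 / L)
        # standard momentum parameter update
        t_prev, t = t, 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    return X
```

Here grad_f is the gradient of the smooth term with Lipschitz constant L, and penalty_grad maps the current singular values to nonnegative weights (e.g., the derivative of a concave surrogate of the rank); both are placeholders for the problem-specific choices made in the paper.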

Publication
Journal of Computational and Applied Mathematics, 396