On L0 Bregman-Relaxations for Kullback-Leibler Sparse Regression
Abstract
The resolution of optimization problems involving the ℓ0 pseudo-norm has proven important in signal processing and machine learning applications for selecting relevant variables. Among the vast class of existing approaches dealing with the intrinsic NP-hardness of such problems, continuous (possibly non-convex) relaxations have been increasingly considered in recent years. The notion of ℓ0-Bregman relaxation (B-rex) was recently introduced to construct effective relaxations of ℓ0-regularized objectives with general data terms. These relaxations are termed exact in the sense that they preserve the global minimizers of the original problem while removing some of its local minimizers. In this study, we develop this idea further for ℓ0-regularized Kullback-Leibler regression problems, designing a tailored B-rex. Compared to other relaxations, it further reduces the number of local minimizers of the original problem through suitable analytical and geometrical modeling. To better exploit the geometry of the relaxed problem, we deploy a dedicated Bregman proximal gradient algorithm for its minimization.
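The abstract gives no formulas, so the following is a minimal, illustrative sketch only. It assumes a Poisson-type KL data term D_KL(y, Ax) = Σ_i [y_i log(y_i/(Ax)_i) - y_i + (Ax)_i], pairs it with the Burg entropy kernel h(x) = -Σ_j log x_j commonly used in Bregman proximal gradient (NoLips-type) schemes for such objectives, and uses plain Euclidean hard thresholding as a hypothetical stand-in for the paper's actual B-rex proximal map. All names and parameter choices below are assumptions, not the authors' implementation.

```python
import numpy as np

def kl_grad(A, x, y, eps=1e-12):
    """Gradient of the KL data term D_KL(y, Ax) = sum_i [y_i*log(y_i/(Ax)_i) - y_i + (Ax)_i]."""
    Ax = A @ x
    return A.T @ (1.0 - y / np.maximum(Ax, eps))

def bregman_step(A, x, y, gamma, lam):
    """One Bregman proximal gradient step with the Burg entropy kernel h(x) = -sum(log x).

    The hard threshold at the end is a hypothetical placeholder: the paper
    instead applies the proximal map of its tailored B-rex relaxation.
    """
    g = kl_grad(A, x, y)
    # Mirror step: grad h(u) = -1/u, so solving -1/x_new = -1/x - gamma*g
    # gives the multiplicative update below; zero coordinates stay at zero.
    denom = 1.0 + gamma * x * g
    x_new = np.where(denom > 0, x / np.maximum(denom, 1e-12), x)
    # Euclidean l0 prox (hard threshold at sqrt(2*gamma*lam)), used here
    # only as a stand-in for the B-rex Bregman proximal map.
    x_new[x_new < np.sqrt(2.0 * gamma * lam)] = 0.0
    return x_new

# Toy usage on a synthetic nonnegative problem (illustrative only).
rng = np.random.default_rng(0)
A = rng.random((50, 20))
x_true = np.zeros(20)
x_true[[2, 7, 15]] = 1.0
y = A @ x_true + 1e-3      # strictly positive data
gamma = 0.9 / y.sum()      # KL is ||y||_1-smooth relative to Burg entropy
x = np.ones(20)            # strictly positive init (domain of h)
for _ in range(500):
    x = bregman_step(A, x, y, gamma=gamma, lam=0.5)
print(np.nonzero(x)[0])    # support of the final iterate
```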