Online Hyperparameter Search Interleaved with Proximal Parameter Updates
Chapter
Accepted version
Permanent link
https://hdl.handle.net/11250/3131545
Date of issue
2020
Original version
Lopez-Ramos, L. M. & Beferull-Lozano, B. (2020). Online Hyperparameter Search Interleaved with Proximal Parameter Updates. European Signal Processing Conference, 2085-2089. https://doi.org/10.23919/Eusipco47968.2020.9287537
Abstract
There is a clear need for efficient hyperparameter optimization (HO) algorithms for statistical learning, since commonly applied search methods (such as grid search with N-fold cross-validation) are inefficient and/or approximate. Previously existing gradient-based HO algorithms that rely on the smoothness of the cost function cannot be applied to problems such as Lasso regression. In this contribution, we develop an HO method that exploits the structure of proximal gradient methods and does not require a smooth cost function. The method is applied to leave-one-out (LOO)-validated Lasso and Group Lasso, and an online variant is proposed. Numerical experiments corroborate the convergence of the proposed methods to stationary points of the LOO validation error curve, as well as the improved efficiency and stability of the online algorithm.
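For context on the structure the abstract refers to: proximal gradient methods handle the non-smooth Lasso penalty through its proximal operator (soft-thresholding) rather than a gradient. The following is a minimal illustrative sketch of the standard ISTA iteration for Lasso, not the paper's HO algorithm; the function names and step-size choice are this sketch's own assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_lasso(A, y, lam, n_iter=500):
    """Proximal gradient (ISTA) for min_w 0.5*||Aw - y||^2 + lam*||w||_1."""
    # Step size 1/L, with L the Lipschitz constant of the smooth part's gradient.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    w = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ w - y)                      # gradient of the smooth term
        w = soft_threshold(w - step * grad, step * lam)  # proximal step on the l1 term
    return w
```

The paper's contribution is to differentiate through iterations of this kind with respect to the hyperparameter `lam` so that the LOO validation error can be minimized by gradient steps interleaved with the parameter updates.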
Description
Author's accepted manuscript.
© 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.