Fast Kernel Methods for Generic Lipschitz Losses via p-Sparsified Sketches - Equipe Signal, Statistique et Apprentissage
Preprint / Working paper, Year: 2023

Fast Kernel Methods for Generic Lipschitz Losses via p-Sparsified Sketches

Abstract

Kernel methods are learning algorithms that enjoy solid theoretical foundations but suffer from important computational limitations. Sketching, which consists in searching for solutions within a subspace of reduced dimension, is a well-studied approach to alleviate these computational burdens. However, statistically accurate sketches, such as the Gaussian one, usually contain few null entries, so that applying them to kernel methods and their non-sparse Gram matrices remains slow in practice. In this paper, we show that sparsified Gaussian (and Rademacher) sketches still produce theoretically valid approximations while allowing for important time and space savings thanks to an efficient decomposition trick. To support our method, we derive excess risk bounds for both single- and multiple-output kernel problems with generic Lipschitz losses, thereby providing new guarantees for a wide range of applications, from robust regression to multiple quantile regression. Our theoretical results are complemented with experiments showing the empirical superiority of our approach over state-of-the-art sketching methods.
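
As a concrete illustration of the idea summarised above, here is a minimal Python/NumPy sketch of one plausible way to build a p-sparsified Gaussian sketch matrix and to exploit its column sparsity when multiplying it against a Gram matrix. The function names (p_sparsified_gaussian_sketch, sketched_gram, kernel_cols), the 1/sqrt(s*p) normalization, and the column-selection shortcut are illustrative assumptions, not the authors' exact construction or decomposition trick.

```python
import numpy as np

def p_sparsified_gaussian_sketch(s, n, p, seed=None):
    """Draw an s x n p-sparsified Gaussian sketch: each entry is a standard
    Gaussian kept with probability p and rescaled by 1/sqrt(s*p), so that
    S.T @ S equals the identity in expectation (assumed normalization)."""
    rng = np.random.default_rng(seed)
    mask = rng.random((s, n)) < p           # Bernoulli(p) support pattern
    gauss = rng.standard_normal((s, n))     # dense Gaussian entries
    return (mask * gauss) / np.sqrt(s * p)

def sketched_gram(kernel_cols, S):
    """Compute K @ S.T by evaluating only the kernel columns that meet a
    non-zero column of S (a simple instance of a decomposition trick)."""
    active = np.flatnonzero(np.any(S != 0.0, axis=0))  # columns of S with support
    K_active = kernel_cols(active)                     # n x len(active) block of K
    return K_active @ S[:, active].T                   # n x s, equals K @ S.T

# Toy usage with a Gaussian kernel on random data (illustrative only).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 5))       # n = 500 points in dimension 5
    gamma = 0.5

    def kernel_cols(idx):
        # Gaussian-kernel columns K[:, idx] without forming the full Gram matrix.
        d2 = ((X[:, None, :] - X[None, idx, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    S = p_sparsified_gaussian_sketch(s=50, n=500, p=0.01, seed=1)
    KS_T = sketched_gram(kernel_cols, S)    # 500 x 50 sketched Gram block
    print(KS_T.shape, "active kernel columns:", np.any(S != 0.0, axis=0).sum())
```

Because a p-sparsified sketch has many all-zero columns when p is small, only the kernel columns hit by at least one non-zero entry of S need to be evaluated, which is, roughly, where the time and space savings mentioned in the abstract come from.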
Main file: Sketching_Kernels_Preprint.pdf (1.02 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04001887 , version 1 (23-02-2023)
hal-04001887 , version 2 (13-11-2023)

Identifiers

Cite

Tamim El Ahmad, Pierre Laforgue, Florence d'Alché-Buc. Fast Kernel Methods for Generic Lipschitz Losses via p-Sparsified Sketches. 2023. ⟨hal-04001887v1⟩

