
Publication

Two‐level preconditioning for Ridge Regression

Journal contribution - e-publication

Solving linear systems is often the computational bottleneck in real‐life problems. Iterative solvers are the only option due to the complexity of direct algorithms or because the system matrix is not explicitly known. Here, we develop a two‐level preconditioner for regularized least squares linear systems involving a feature or data matrix. Variants of this linear system may appear in machine learning applications, such as ridge regression, logistic regression, support vector machines and Bayesian regression. We use clustering algorithms to create a coarser level that preserves the principal components of the covariance or Gram matrix. This coarser level approximates the dominant eigenvectors and is used to build a subspace preconditioner accelerating the Conjugate Gradient method. We observed speed‐ups for artificial and real‐life data.
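The abstract outlines the method at a high level: a coarse space built by clustering, used as a subspace preconditioner for the Conjugate Gradient method on a regularized least-squares system. The snippet below is a minimal sketch of that idea, not the paper's actual implementation: it assembles the ridge-regression normal equations, forms a coarse basis Z from k-means cluster indicators of the feature columns (an assumed stand-in for the clustering step described in the abstract), and combines the coarse-grid correction with a Jacobi smoother into an additive two-level preconditioner for SciPy's CG. The problem sizes, the regularization value, and the additive coarse-plus-Jacobi combination are illustrative assumptions.

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator
from sklearn.cluster import KMeans

# Ridge regression normal equations: (X^T X + lambda * I) w = X^T y
rng = np.random.default_rng(0)
n_samples, n_features, lam, k = 500, 200, 1e-2, 20
X = rng.standard_normal((n_samples, n_features)) * np.linspace(1.0, 50.0, n_features)
y = rng.standard_normal(n_samples)
A = X.T @ X + lam * np.eye(n_features)      # SPD system matrix
b = X.T @ y

# Coarse level (assumed construction): cluster the feature columns with k-means
# and use the cluster-indicator matrix Z as the coarse basis, so that range(Z)
# is intended to capture the dominant eigenvectors of the covariance X^T X.
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X.T)
Z = np.zeros((n_features, k))
Z[np.arange(n_features), labels] = 1.0

A_coarse = Z.T @ A @ Z                      # k x k coarse operator
A_coarse_inv = np.linalg.inv(A_coarse)
diag_A = np.diag(A)                         # Jacobi smoother on the fine level

def apply_preconditioner(r):
    # Additive two-level preconditioner: coarse-space correction + Jacobi smoothing.
    coarse_correction = Z @ (A_coarse_inv @ (Z.T @ r))
    return coarse_correction + r / diag_A

M = LinearOperator((n_features, n_features), matvec=apply_preconditioner)

iterations = []
w, info = cg(A, b, M=M, callback=lambda xk: iterations.append(1))
print(f"CG converged: {info == 0}, iterations: {len(iterations)}")
```

Comparing the iteration count against an unpreconditioned `cg(A, b)` run illustrates the kind of speed-up the abstract refers to; the gains reported in the paper depend on its specific coarse-space construction rather than on this assumed k-means indicator basis.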
Journal: Numerical Linear Algebra With Applications
ISSN: 1070-5325
Issue: 4
Volume: 28
Year of publication: 2021
BOF keylabel: yes
IOF keylabel: yes
BOF publication weight: 3
CSS citation score: 2
Authors from: Higher Education
Accessibility: Closed