Publication

Multiple Kernel Learning via Multi-Epochs SVRG

Book Contribution - Book Chapter / Conference Contribution

This work proposes a multiple kernel learning (MKL) descent strategy based on multiple epochs of stochastic variance reduced gradient (i.e., multi-epoch SVRG). The proposed strategy uses a constant learning step that is coupled to the evolution of the kernel combination coefficients and is therefore corrected between epochs. This descent regime yields an improved MKL bound that exhibits a linear dependency on the number of samples n, and a sub-linear one on both the number of kernels F and the precision e of the solution. In particular, for an Lp-norm MKL, the proposed method finds an e-accurate solution with complexity O(F^(1/q) n log(1/e)). This matches the optimal convergence rate reported for (non-accelerated) strongly convex objectives and improves over other state-of-the-art MKL solutions.
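For readers unfamiliar with the building block, a minimal sketch of multi-epoch SVRG with a constant step size is given below. It optimizes a plain ridge-regression objective rather than the paper's kernel-combination objective, and the step size, epoch count, and inner-loop length are illustrative assumptions; the full-gradient snapshot recomputed at each epoch is the between-epoch correction the abstract refers to.

```python
import numpy as np

def svrg_ridge(X, y, lam=0.1, step=0.05, epochs=30, inner=None, seed=0):
    """Multi-epoch SVRG for min_w (1/n) sum_i (x_i.w - y_i)^2 + lam*||w||^2.

    Each epoch recomputes the full gradient at a snapshot w_snap, then runs
    an inner loop of variance-reduced stochastic steps with a CONSTANT step
    size (SVRG converges linearly without step-size decay).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    inner = inner if inner is not None else 2 * n  # common heuristic

    def grad_i(v, i):
        # Gradient of the i-th component function.
        return 2.0 * (X[i] @ v - y[i]) * X[i] + 2.0 * lam * v

    w = np.zeros(d)
    for _ in range(epochs):
        w_snap = w.copy()
        # Full gradient at the snapshot (the per-epoch correction).
        full_grad = 2.0 * X.T @ (X @ w_snap - y) / n + 2.0 * lam * w_snap
        for _ in range(inner):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient.
            g = grad_i(w, i) - grad_i(w_snap, i) + full_grad
            w = w - step * g
    return w
```

Because the correction term cancels the variance of the stochastic gradient near the optimum, the iterates converge to the exact minimizer, which can be checked against the closed-form ridge solution.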
Book: 9th NIPS Workshop on Optimization for Machine Learning
Volume: 12
Number of pages: 5
Publication year: 2016
Accessibility: Open