
Publication

Iterative nearest neighbors for classification and dimensionality reduction

Book contribution - Book chapter / Conference contribution

Representing data in terms of a set of selected samples is of interest for various machine learning applications, e.g. dimensionality reduction and classification. The best-known techniques are probably still k-Nearest Neighbors (kNN) and its variants. Recently, richer representations have become popular. Examples are methods based on l1-regularized least squares (Sparse Representation (SR)), l2-regularized least squares (Collaborative Representation (CR)), or l1-constrained least squares (Local Linear Embedding (LLE)). We propose Iterative Nearest Neighbors (INN), a novel sparse representation that combines the power of SR and LLE with the computational simplicity of kNN. We test our method on dimensionality reduction and classification, using standard benchmarks such as faces (AR), traffic signs (GTSRB), and PASCAL VOC 2007. INN performs better than NN and comparably to CR and SR, while being orders of magnitude faster than the latter.
Book: Proceedings CVPR 2012
Pages: 2456 - 2463
ISBN: 978-1-4673-1228-8
Year of publication: 2012
BOF key label: yes
IOF key label: yes
Authors from: Higher Education
Accessibility: Closed