Adaptive metric dimensionality reduction

Lee-Ad Gottlieb, Aryeh Kontorovich, Robert Krauthgamer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We study data-adaptive dimensionality reduction in the context of supervised learning in general metric spaces. Our main statistical contribution is a generalization bound for Lipschitz functions in metric spaces that are doubling, or nearly doubling, which yields a new theoretical explanation for empirically reported improvements gained by preprocessing Euclidean data by PCA (Principal Components Analysis) prior to constructing a linear classifier. On the algorithmic front, we describe an analogue of PCA for metric spaces, namely an efficient procedure that approximates the data's intrinsic dimension, which is often much lower than the ambient dimension. Our approach thus leverages the dual benefits of low dimensionality: (1) more efficient algorithms, e.g., for proximity search, and (2) more optimistic generalization bounds.
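The bound speaks to a familiar pipeline: project Euclidean data onto its leading principal components, then train a linear classifier in the reduced space. Below is a minimal illustrative sketch of that pipeline, not the paper's metric-space procedure; it assumes NumPy and scikit-learn are available, and the synthetic dataset and all parameter choices are hypothetical choices for the demo.

```python
# Illustrative sketch: PCA preprocessing before a linear classifier,
# the Euclidean pipeline whose empirical gains the paper's
# generalization bound helps explain. Not the paper's algorithm.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic data whose intrinsic dimension (5) is far below the
# ambient dimension (200): points lie near a random 5-dim subspace.
n, ambient_dim, intrinsic_dim = 1000, 200, 5
basis = rng.standard_normal((intrinsic_dim, ambient_dim))
latent = rng.standard_normal((n, intrinsic_dim))
X = latent @ basis + 0.01 * rng.standard_normal((n, ambient_dim))
y = (latent[:, 0] > 0).astype(int)  # label set by one latent coordinate

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Linear classifier fit directly in the 200-dimensional ambient space.
raw = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("raw accuracy:", raw.score(X_test, y_test))

# Same classifier after projecting to the top 5 principal components;
# the reduced space matches the data's intrinsic dimension, so the
# classifier has far fewer effective parameters to fit.
reduced = make_pipeline(PCA(n_components=intrinsic_dim),
                        LogisticRegression(max_iter=1000))
reduced.fit(X_train, y_train)
print("PCA-reduced accuracy:", reduced.score(X_test, y_test))
```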

Original language: English
Title of host publication: Algorithmic Learning Theory - 24th International Conference, ALT 2013, Proceedings
Pages: 279-293
Number of pages: 15
ISBN (Electronic): 9783642409356
DOIs
State: Published - 2013
Event: 24th International Conference on Algorithmic Learning Theory, ALT 2013 - Singapore, Singapore
Duration: 6 Oct 2013 - 9 Oct 2013

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 8139 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 24th International Conference on Algorithmic Learning Theory, ALT 2013
Country/Territory: Singapore
City: Singapore
Period: 6/10/13 - 9/10/13
