TY - JOUR
T1 - Adaptive metric dimensionality reduction
AU - Gottlieb, Lee-Ad
AU - Kontorovich, Aryeh
AU - Krauthgamer, Robert
N1 - Publisher Copyright:
© 2015 Elsevier B.V.
PY - 2016/3/21
Y1 - 2016/3/21
AB - We study adaptive data-dependent dimensionality reduction in the context of supervised learning in general metric spaces. Our main statistical contribution is a generalization bound for Lipschitz functions in metric spaces that are doubling, or nearly doubling. On the algorithmic front, we describe an analogue of PCA for metric spaces: namely an efficient procedure that approximates the data's intrinsic dimension, which is often much lower than the ambient dimension. Our approach thus leverages the dual benefits of low dimensionality: (1) more efficient algorithms, e.g., for proximity search, and (2) more optimistic generalization bounds.
KW - Dimensionality reduction
KW - Doubling dimension
KW - Metric space
KW - PCA
KW - Rademacher complexity
UR - http://www.scopus.com/inward/record.url?scp=84958259647&partnerID=8YFLogxK
U2 - 10.1016/j.tcs.2015.10.040
DO - 10.1016/j.tcs.2015.10.040
M3 - Article
AN - SCOPUS:84958259647
SN - 0304-3975
VL - 620
SP - 105
EP - 118
JO - Theoretical Computer Science
JF - Theoretical Computer Science
ER -