Nearest-Neighbor sample compression: Efficiency, consistency, infinite dimensions

Aryeh Kontorovich, Sivan Sabato, Roi Weiss

Research output: Contribution to journal › Conference article › Peer-review

19 Citations (Scopus)

Abstract

We examine the Bayes-consistency of a recently proposed 1-nearest-neighbor-based multiclass learning algorithm. This algorithm is derived from sample compression bounds and enjoys the statistical advantages of tight, fully empirical generalization bounds, as well as the algorithmic advantages of a faster runtime and memory savings. We prove that this algorithm is strongly Bayes-consistent in metric spaces with finite doubling dimension - the first consistency result for an efficient nearest-neighbor sample compression scheme. Rather surprisingly, we discover that this algorithm continues to be Bayes-consistent even in a certain infinite-dimensional setting, in which the basic measure-theoretic conditions on which classic consistency proofs hinge are violated. This is all the more surprising, since it is known that k-NN is not Bayes-consistent in this setting. We pose several challenging open problems for future research.
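The algorithm in the paper builds a compressed 1-NN classifier from a small subset of the sample. As a rough illustration of the general idea of nearest-neighbor sample compression (this is classic Hart-style condensing, not the authors' margin/net-based construction; the function name and data are made up for the example), one can greedily keep only the points needed for the 1-NN rule to classify the whole training set correctly:

```python
import math

def condense_1nn(points, labels):
    """Greedy 1-NN condensing (Hart-style), for illustration only.

    Returns indices S of a subset of the training set such that every
    training point is labeled correctly by its nearest neighbor in S.
    The 1-NN classifier over S then agrees with the full 1-NN rule on
    the training sample while storing far fewer points.
    """
    S = [0]  # seed the prototype set with the first training point
    changed = True
    while changed:
        changed = False
        for i, (p, y) in enumerate(zip(points, labels)):
            # nearest prototype to p among the currently kept indices
            j = min(S, key=lambda k: math.dist(p, points[k]))
            if labels[j] != y:
                S.append(i)  # p is misclassified: promote it to a prototype
                changed = True
    return S
```

On two well-separated clusters, e.g. `condense_1nn([(0, 0), (0.1, 0), (1, 0), (1.1, 0)], [0, 0, 1, 1])`, a single prototype per class suffices. Note that, unlike the compression scheme analyzed in the paper, this greedy heuristic comes with no size or consistency guarantees; it only conveys the flavor of trading sample size for a consistent subset.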

Original language: English
Pages (from-to): 1574-1584
Number of pages: 11
Journal: Advances in Neural Information Processing Systems
Volume: 2017-December
Publication status: Published - 2017
Published externally: Yes
Event: 31st Annual Conference on Neural Information Processing Systems, NIPS 2017 - Long Beach, United States
Duration: 4 December 2017 - 9 December 2017

