TY - GEN

T1 - Universal Bayes Consistency in Metric Spaces

AU - Hanneke, Steve

AU - Kontorovich, Aryeh

AU - Sabato, Sivan

AU - Weiss, Roi

N1 - Publisher Copyright:
© 2020 IEEE.

PY - 2020/2/2

Y1 - 2020/2/2

N2 - We show that a recently proposed 1-nearest-neighbor-based multiclass learning algorithm is universally strongly Bayes consistent in all metric spaces where such Bayes consistency is possible, making it an "optimistically universal" Bayes-consistent learner. This is the first learning algorithm known to enjoy this property; by comparison, k-NN and its variants are not generally universally Bayes consistent, except under additional structural assumptions, such as an inner product, a norm, finite doubling dimension, or a Besicovitch-type property. The metric spaces in which universal Bayes consistency is possible are the "essentially separable" ones, a new notion that we define, which is more general than standard separability. The existence of metric spaces that are not essentially separable is independent of the ZFC axioms of set theory. We prove that essential separability exactly characterizes the existence of a universal Bayes-consistent learner for the given metric space. In particular, this yields the first impossibility result for universal Bayes consistency. Taken together, these positive and negative results resolve the open problems posed in Kontorovich, Sabato, Weiss (2017).

AB - We show that a recently proposed 1-nearest-neighbor-based multiclass learning algorithm is universally strongly Bayes consistent in all metric spaces where such Bayes consistency is possible, making it an "optimistically universal" Bayes-consistent learner. This is the first learning algorithm known to enjoy this property; by comparison, k-NN and its variants are not generally universally Bayes consistent, except under additional structural assumptions, such as an inner product, a norm, finite doubling dimension, or a Besicovitch-type property. The metric spaces in which universal Bayes consistency is possible are the "essentially separable" ones, a new notion that we define, which is more general than standard separability. The existence of metric spaces that are not essentially separable is independent of the ZFC axioms of set theory. We prove that essential separability exactly characterizes the existence of a universal Bayes-consistent learner for the given metric space. In particular, this yields the first impossibility result for universal Bayes consistency. Taken together, these positive and negative results resolve the open problems posed in Kontorovich, Sabato, Weiss (2017).

KW - Bayes consistency

KW - classification

KW - metric space

KW - nearest neighbor

UR - http://www.scopus.com/inward/record.url?scp=85097351635&partnerID=8YFLogxK

U2 - 10.1109/ITA50056.2020.9244988

DO - 10.1109/ITA50056.2020.9244988

M3 - Conference contribution

AN - SCOPUS:85097351635

T3 - 2020 Information Theory and Applications Workshop, ITA 2020

BT - 2020 Information Theory and Applications Workshop, ITA 2020

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 2020 Information Theory and Applications Workshop, ITA 2020

Y2 - 2 February 2020 through 7 February 2020

ER -