TY - JOUR
T1 - The Empirical Mean is Minimax Optimal for Local Glivenko-Cantelli
AU - Cohen, Doron
AU - Kontorovich, Aryeh
AU - Weiss, Roi
N1 - Publisher Copyright:
© 2025 by the author(s).
PY - 2025
Y1 - 2025
N2 - We revisit the recently introduced Local Glivenko-Cantelli setting, which studies distribution-dependent uniform convergence rates of the Empirical Mean Estimator (EME). In this work, we investigate generalizations of this setting where arbitrary estimators are allowed rather than just the EME. Can a strictly larger class of measures be learned? Can better risk decay rates be obtained? We provide exhaustive answers to these questions — which are both negative, provided the learner is barred from exploiting some infinite-dimensional pathologies. On the other hand, allowing such exploits does lead to a strictly larger class of learnable measures.
AB - We revisit the recently introduced Local Glivenko-Cantelli setting, which studies distribution-dependent uniform convergence rates of the Empirical Mean Estimator (EME). In this work, we investigate generalizations of this setting where arbitrary estimators are allowed rather than just the EME. Can a strictly larger class of measures be learned? Can better risk decay rates be obtained? We provide exhaustive answers to these questions — which are both negative, provided the learner is barred from exploiting some infinite-dimensional pathologies. On the other hand, allowing such exploits does lead to a strictly larger class of learnable measures.
UR - https://www.scopus.com/pages/publications/105023565158
M3 - Conference article
AN - SCOPUS:105023565158
SN - 2640-3498
VL - 267
SP - 11173
EP - 11184
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
T2 - 42nd International Conference on Machine Learning, ICML 2025
Y2 - 13 July 2025 through 19 July 2025
ER -