The Empirical Mean is Minimax Optimal for Local Glivenko-Cantelli

Research output: Contribution to journal › Conference article › Peer-review

Abstract

We revisit the recently introduced Local Glivenko-Cantelli setting, which studies distribution-dependent uniform convergence rates of the Empirical Mean Estimator (EME). In this work, we investigate generalizations of this setting where arbitrary estimators are allowed rather than just the EME. Can a strictly larger class of measures be learned? Can better risk decay rates be obtained? We provide exhaustive answers to these questions, both of which are negative, provided the learner is barred from exploiting some infinite-dimensional pathologies. On the other hand, allowing such exploits does lead to a strictly larger class of learnable measures.
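To make the object of study concrete, the following is a minimal illustrative sketch (not taken from the paper) of the Empirical Mean Estimator in a high-dimensional setting: it estimates the coordinate-wise means of a product of Bernoulli measures and reports the sup-norm deviation, the kind of uniform (Glivenko-Cantelli-style) error whose distribution-dependent decay rate the setting studies. The decaying mean sequence `p` is an arbitrary hypothetical choice for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 1000                       # number of coordinates
p = 1.0 / np.arange(1, d + 1)  # hypothetical decaying coordinate means

def eme_sup_deviation(n):
    """Draw n i.i.d. samples X_i in {0,1}^d and return the uniform
    (sup-norm) deviation of the empirical mean from the true means."""
    X = rng.binomial(1, p, size=(n, d))  # each row is one sample
    emp_mean = X.mean(axis=0)            # the EME, coordinate-wise
    return np.max(np.abs(emp_mean - p))  # sup over coordinates

for n in (100, 1000, 10000):
    print(n, eme_sup_deviation(n))
```

As the sample size grows, the sup-norm deviation shrinks at a rate that depends on the measure itself (here, on how fast the means `p_j` decay), which is the distribution-dependent behavior the Local Glivenko-Cantelli setting quantifies.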

Original language: English
Pages (from-to): 11173-11184
Number of pages: 12
Journal: Proceedings of Machine Learning Research
Volume: 267
State: Published - 2025
Event: 42nd International Conference on Machine Learning, ICML 2025 - Vancouver, Canada
Duration: 13 Jul 2025 - 19 Jul 2025

