Abstract
We provide an analysis and interpretation of total variation (TV) minimization for semi-supervised learning from partially-labeled network-structured data. Our approach exploits an intrinsic duality between TV minimization and network flow problems. In particular, we use Fenchel duality to establish a precise equivalence of TV minimization and a minimum cost flow problem. This provides a link between modern convex optimization methods for non-smooth Lasso-type problems and maximum flow algorithms. We show how a primal-dual method for TV minimization can be interpreted as distributed network optimization. Moreover, we derive a condition on the network structure and available label information that ensures that TV minimization accurately learns (approximately) piecewise-constant graph signals. This condition depends on the existence of sufficiently large network flows between labeled data points. We verify our analysis in numerical experiments.
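The primal-dual approach described in the abstract can be illustrated with a minimal sketch. The toy problem below (a chain graph, two labeled nodes, unit edge weights, and the step sizes are all illustrative assumptions, not the paper's exact setup) runs Chambolle-Pock-style primal-dual iterations for TV minimization: the dual variables live on edges and are clipped to the edge capacities, which is exactly the flow-variable interpretation the abstract mentions.

```python
import numpy as np

# Illustrative sketch: primal-dual iterations for TV minimization on a chain
# graph with two labeled nodes. The graph, weights, and step sizes are
# assumptions for this toy example, not the paper's exact configuration.

n = 6
edges = [(i, i + 1) for i in range(n - 1)]  # chain graph on 6 nodes
w = np.ones(len(edges))                     # edge weights / flow capacities
labeled = {0: 1.0, 5: 0.0}                  # labeled nodes -> label values

# Graph incidence operator D: (D x)_e = x_i - x_j for edge e = (i, j)
D = np.zeros((len(edges), n))
for e, (i, j) in enumerate(edges):
    D[e, i], D[e, j] = 1.0, -1.0

x = np.zeros(n)           # primal variable: graph signal on nodes
u = np.zeros(len(edges))  # dual variable: "flow" on edges
tau = sigma = 0.4         # step sizes (tau * sigma * ||D||^2 < 1 on a chain)
x_bar = x.copy()

for _ in range(5000):
    # Dual ascent, then projection onto the box |u_e| <= w_e (edge capacity)
    u = np.clip(u + sigma * D @ x_bar, -w, w)
    # Primal descent, then projection onto the label constraints
    x_new = x - tau * D.T @ u
    for i, y in labeled.items():
        x_new[i] = y
    x_bar = 2 * x_new - x  # over-relaxation step
    x = x_new

print(np.round(x, 2))  # recovered graph signal interpolating the labels
```

At convergence the signal matches the labels at the labeled nodes and its total variation equals the min-cut value between them (here 1, since one unit jump from label 1 to label 0 is unavoidable), reflecting the TV/min-cost-flow duality the abstract establishes.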
Original language | English |
---|---|
Article number | 8902040 |
Pages (from-to) | 6256-6269 |
Number of pages | 14 |
Journal | IEEE Transactions on Signal Processing |
Volume | 67 |
Issue number | 24 |
DOIs | |
Publication status | Published - 15 Dec 2019 |
Externally published | Yes |