Semi-supervised learning in network-structured data via total variation minimization

Alexander Jung, Alfred O. Hero, Alexandru Cristian Mara, Saeed Jahromi, Ayelet Heimowitz, Yonina C. Eldar

Research output: Contribution to journal › Article › peer-review

32 Citations (Scopus)

Abstract

We provide an analysis and interpretation of total variation (TV) minimization for semi-supervised learning from partially labeled network-structured data. Our approach exploits an intrinsic duality between TV minimization and network flow problems. In particular, we use Fenchel duality to establish a precise equivalence of TV minimization and a minimum cost flow problem. This provides a link between modern convex optimization methods for non-smooth Lasso-type problems and maximum flow algorithms. We show how a primal-dual method for TV minimization can be interpreted as distributed network optimization. Moreover, we derive a condition on the network structure and available label information that ensures that TV minimization accurately learns (approximately) piecewise constant graph signals. This condition depends on the existence of sufficiently large network flows between labeled data points. We verify our analysis in numerical experiments.
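To illustrate the type of problem described in the abstract, the following is a minimal sketch of graph TV minimization for semi-supervised labeling, solved with a generic primal-dual (Chambolle-Pock style) iteration. It is not the authors' algorithm or code; the toy graph, the regularization weight lam, the step sizes, and the squared-error fidelity on labeled nodes are all assumptions made only for this example. The dual variable p lives on the edges and can loosely be read as a network flow, echoing the flow interpretation discussed above.

```python
# Hypothetical sketch: TV minimization on a small two-cluster graph with one
# labeled node per cluster, solved by primal-dual (Chambolle-Pock) iterations.
# All parameter choices here are illustrative assumptions, not the paper's.
import numpy as np

# Two densely connected clusters (5-node cliques) joined by a single bridge edge.
n = 10
cluster_a, cluster_b = range(0, 5), range(5, 10)
edges = [(i, j) for i in cluster_a for j in cluster_a if i < j]
edges += [(i, j) for i in cluster_b for j in cluster_b if i < j]
edges += [(4, 5)]                                  # bridge edge between the clusters

x_true = np.array([1.0] * 5 + [-1.0] * 5)          # piecewise constant over the clusters
labeled = np.zeros(n, dtype=bool)
labeled[[0, 9]] = True                             # one label per cluster
y = np.where(labeled, x_true, 0.0)                 # observed labels (zero elsewhere)

# Graph incidence matrix D: (D x)_e = x_i - x_j for edge e = (i, j),
# so ||D x||_1 is the total variation of the graph signal x.
D = np.zeros((len(edges), n))
for e, (i, j) in enumerate(edges):
    D[e, i], D[e, j] = 1.0, -1.0

lam = 0.1                                          # TV regularization weight (assumed)
L = np.linalg.norm(D, 2)                           # operator norm of D
tau = sigma = 0.9 / L                              # step sizes with tau * sigma * L**2 < 1

x = np.zeros(n)                                    # primal variable (graph signal estimate)
x_bar = x.copy()
p = np.zeros(len(edges))                           # dual variable, one value per edge ("flow")

for _ in range(1000):
    # Dual step: gradient ascent followed by projection onto the box [-lam, lam].
    p = np.clip(p + sigma * (D @ x_bar), -lam, lam)
    # Primal step: gradient descent followed by the prox of the labeled-node
    # squared-error term (1/2) * sum_{i labeled} (x_i - y_i)^2.
    v = x - tau * (D.T @ p)
    x_new = np.where(labeled, (v + tau * y) / (1.0 + tau), v)
    x_bar = 2.0 * x_new - x                        # over-relaxation step
    x = x_new

print(np.round(x, 2))   # approx +0.9 on the first cluster, -0.9 on the second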

Original language: English
Article number: 8902040
Pages (from-to): 6256-6269
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Volume: 67
Issue number: 24
DOIs
Publication status: Published - 15 Dec 2019
Externally published: Yes
