Semi-supervised learning in network-structured data via total variation minimization

Alexander Jung, Alfred O. Hero, Alexandru Cristian Mara, Saeed Jahromi, Ayelet Heimowitz, Yonina C. Eldar

Research output: Contribution to journal › Article › peer-review

32 Scopus citations

Abstract

We provide an analysis and interpretation of total variation (TV) minimization for semi-supervised learning from partially labeled network-structured data. Our approach exploits an intrinsic duality between TV minimization and network flow problems. In particular, we use Fenchel duality to establish a precise equivalence between TV minimization and a minimum-cost flow problem. This provides a link between modern convex optimization methods for non-smooth Lasso-type problems and maximum flow algorithms. We show how a primal-dual method for TV minimization can be interpreted as distributed network optimization. Moreover, we derive a condition on the network structure and available label information that ensures that TV minimization accurately learns (approximately) piecewise constant graph signals. This condition depends on the existence of sufficiently large network flows between labeled data points. We verify our analysis in numerical experiments.
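The abstract describes recovering a piecewise constant graph signal from a few labels by minimizing the weighted total variation, solved with a primal-dual method. The sketch below illustrates that idea on a toy two-cluster graph using a standard Chambolle-Pock primal-dual iteration; the graph, edge weights, labels, and step sizes are all illustrative assumptions, not the paper's actual experimental setup or algorithm details.

```python
import numpy as np

# Toy two-cluster graph (illustrative, not from the paper): nodes
# {0,1,2} and {3,4,5} form triangles with strong edges (weight 2.0),
# joined by one weak "bridge" edge (2,3) with weight 0.1.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
weights = np.array([2.0, 2.0, 2.0, 2.0, 2.0, 2.0, 0.1])
n_nodes, n_edges = 6, len(edges)

# Weighted incidence (graph gradient) operator D with (D x)_e = w_e (x_i - x_j),
# so ||D x||_1 is the weighted total variation of the graph signal x.
D = np.zeros((n_edges, n_nodes))
for e, (i, j) in enumerate(edges):
    D[e, i], D[e, j] = weights[e], -weights[e]

# Partial label information: one labeled node per cluster.
labeled = np.array([0, 5])
labels = np.array([0.0, 1.0])

# Chambolle-Pock primal-dual iteration for
#     min_x ||D x||_1   subject to   x[labeled] = labels,
# with step sizes chosen so that tau * sigma * ||D||^2 < 1.
tau = sigma = 0.24
x = np.zeros(n_nodes)
x_bar = x.copy()
y = np.zeros(n_edges)

for _ in range(5000):
    # Dual step: gradient ascent, then projection onto the unit
    # L-infinity ball (the prox of the conjugate of the L1 norm).
    y = np.clip(y + sigma * D @ x_bar, -1.0, 1.0)
    # Primal step: gradient descent, then projection onto the
    # label constraints (labeled nodes are clamped to their labels).
    x_new = x - tau * D.T @ y
    x_new[labeled] = labels
    # Over-relaxation step.
    x_bar = 2.0 * x_new - x
    x = x_new

print(np.round(x, 3))  # approximately piecewise constant: ~[0 0 0 1 1 1]
```

Because the bridge edge carries far less weight than the intra-cluster edges, the TV minimizer is constant on each cluster and jumps only across the bridge, matching the abstract's intuition that recovery hinges on the flow capacity between labeled nodes.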

Original language: English
Article number: 8902040
Pages (from-to): 6256-6269
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Volume: 67
Issue number: 24
State: Published - 15 Dec 2019
Externally published: Yes

Keywords

  • Machine learning
  • big data applications
  • network theory (graphs)
  • optimization
  • semisupervised learning

