Abstract
We provide an analysis and interpretation of total variation (TV) minimization for semi-supervised learning from partially labeled network-structured data. Our approach exploits an intrinsic duality between TV minimization and network flow problems. In particular, we use Fenchel duality to establish a precise equivalence between TV minimization and a minimum cost flow problem. This provides a link between modern convex optimization methods for non-smooth Lasso-type problems and maximum flow algorithms. We show how a primal-dual method for TV minimization can be interpreted as distributed network optimization. Moreover, we derive a condition on the network structure and the available label information that ensures that TV minimization accurately learns (approximately) piecewise constant graph signals. This condition depends on the existence of sufficiently large network flows between labeled data points. We verify our analysis in numerical experiments.
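The following is a minimal sketch, not the authors' algorithm from the paper: it illustrates how a primal-dual (Chambolle-Pock-style) method for graph TV minimization splits into per-edge dual updates and per-node primal updates, which is the structure behind the distributed-network-optimization reading in the abstract. The concrete objective used here, min_x (1/2) Σ_{i∈L} (x_i − y_i)² + λ Σ_{(i,j)∈E} |x_i − x_j| with a quadratic label-fit term, and the function name `tv_ssl_primal_dual` are illustrative assumptions; the paper's exact formulation may differ. Note that each edge carries a dual variable capped at λ, which is the flow-like quantity underlying the min-cost-flow duality the abstract refers to.

```python
import numpy as np

def tv_ssl_primal_dual(n, edges, labels, lam=1.0, n_iter=500):
    """Sketch of a primal-dual (Chambolle-Pock) method for graph TV
    minimization in semi-supervised learning (illustrative, not the
    paper's exact algorithm).

    Solves  min_x  (1/2) * sum_{i labeled} (x_i - y_i)^2
                   + lam * sum_{(i,j) in edges} |x_i - x_j|.

    edges  : list of (i, j) pairs (0-indexed node ids)
    labels : dict {node_id: label_value} for the labeled nodes
    """
    m = len(edges)
    # Signed incidence matrix D (m x n): row e has +1 at node i, -1 at node j,
    # so (D @ x)[e] = x_i - x_j is the signal difference across edge e.
    D = np.zeros((m, n))
    for e, (i, j) in enumerate(edges):
        D[e, i], D[e, j] = 1.0, -1.0

    # Step sizes must satisfy tau * sigma * ||D||^2 <= 1.
    L = np.linalg.norm(D, 2)
    tau = sigma = 1.0 / L

    x = np.zeros(n)        # primal variable: the graph signal
    u = np.zeros(m)        # dual variable: one "flow" value per edge
    x_bar = x.copy()
    idx = np.array(list(labels.keys()))
    y = np.array(list(labels.values()))

    for _ in range(n_iter):
        # Dual ascent followed by projection onto the box [-lam, lam]^m;
        # this caps the flow on each edge at lam.
        u = np.clip(u + sigma * (D @ x_bar), -lam, lam)
        # Primal descent step ...
        x_prev = x.copy()
        x = x - tau * (D.T @ u)
        # ... then the prox of the quadratic label-fit term, which only
        # touches the labeled nodes.
        x[idx] = (x[idx] + tau * y) / (1.0 + tau)
        # Over-relaxation step.
        x_bar = 2.0 * x - x_prev
    return x

# Toy usage: a chain of 6 nodes with one labeled node per cluster.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]
x_hat = tv_ssl_primal_dual(6, edges, labels={0: 0.0, 5: 1.0}, lam=0.1)
```

Both update steps are local: the dual update for an edge only reads the two incident nodes, and the primal update for a node only reads its incident edges, which is what allows a message-passing, distributed implementation.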
| Original language | English |
| --- | --- |
| Article number | 8902040 |
| Pages (from-to) | 6256-6269 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Signal Processing |
| Volume | 67 |
| Issue number | 24 |
| DOIs | |
| State | Published - 15 Dec 2019 |
| Externally published | Yes |
Keywords
- Machine learning
- big data applications
- network theory (graphs)
- optimization
- semisupervised learning