ENHANCING NEURAL TRAINING VIA A CORRELATED DYNAMICS MODEL

Jonathan Brokman, Roy Betser, Rotem Turjeman, Tom Berkov, Ido Cohen, Guy Gilboa

Research output: Contribution to conference › Lecture › Peer-reviewed

2 citations (Scopus)

Abstract

As neural networks grow in scale, their training becomes both computationally demanding and rich in dynamics. Amidst the flourishing interest in these training dynamics, we present a novel observation: Parameters during training exhibit intrinsic correlations over time. Capitalizing on this, we introduce correlation mode decomposition (CMD). This algorithm clusters the parameter space into groups, termed modes, that display synchronized behavior across epochs. This enables CMD to efficiently represent the training dynamics of complex networks, like ResNets and Transformers, using only a few modes. Moreover, test set generalization is enhanced. We introduce an efficient CMD variant, designed to run concurrently with training. Our experiments indicate that CMD surpasses the state-of-the-art method for compactly modeled dynamics on image classification. Our modeling can improve training efficiency and lower communication overhead, as shown by our preliminary experiments in the context of federated learning.
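The abstract describes CMD at a high level: parameter trajectories that are correlated over time are grouped into a few "modes", and each parameter's trajectory is then summarized relative to its mode. As a rough illustration only, not the authors' implementation, the NumPy sketch below clusters synthetic parameter trajectories by Pearson correlation and fits each one as an affine function of its mode's reference trajectory. The function name cmd_sketch, the alternating assignment loop, and the synthetic data are all assumptions made for the example.

```python
import numpy as np

def cmd_sketch(W, num_modes=3, n_iters=10, seed=0):
    """Cluster parameter trajectories into correlated 'modes' and fit an
    affine map per parameter: W[i] ~= a[i] * refs[modes[i]] + b[i].

    W : (P, T) array of P parameter trajectories over T checkpoints.
    Returns (modes, a, b, refs).
    """
    rng = np.random.default_rng(seed)
    P, T = W.shape

    # Center and normalize each trajectory so that a dot product between
    # two rows equals their Pearson correlation.
    Wc = W - W.mean(axis=1, keepdims=True)
    norms = np.maximum(np.linalg.norm(Wc, axis=1, keepdims=True), 1e-12)
    Wn = Wc / norms

    # Initialize mode references with random trajectories, then alternate:
    # assign each parameter to its most correlated reference (sign-agnostic),
    # and recompute each reference as the sign-aligned cluster mean.
    refs = Wn[rng.choice(P, size=num_modes, replace=False)]
    for _ in range(n_iters):
        corr = Wn @ refs.T                    # (P, K) correlation matrix
        modes = np.abs(corr).argmax(axis=1)   # mode assignment per parameter
        for k in range(num_modes):
            mask = modes == k
            if mask.any():
                signs = np.sign(corr[mask, k])[:, None]
                r = (Wn[mask] * signs).mean(axis=0)
                refs[k] = r / max(np.linalg.norm(r), 1e-12)

    # Least-squares affine coefficients; refs are unit-norm, so the slope
    # reduces to a dot product. The tuple (modes, a, b, refs) is the
    # compact description of the dynamics.
    a = np.einsum('pt,pt->p', Wc, refs[modes])
    b = W.mean(axis=1) - a * refs[modes].mean(axis=1)
    return modes, a, b, refs

# Synthetic usage: 1000 trajectories driven by 3 shared latent drifts.
rng = np.random.default_rng(1)
P, T, K = 1000, 50, 3
latent = np.cumsum(rng.standard_normal((K, T)), axis=1)
idx = rng.integers(K, size=P)
W = (rng.standard_normal((P, 1)) * latent[idx]
     + rng.standard_normal((P, 1))
     + 0.05 * rng.standard_normal((P, T)))
modes, a, b, refs = cmd_sketch(W, num_modes=K)
W_hat = a[:, None] * refs[modes] + b[:, None]
print("relative reconstruction error:",
      np.linalg.norm(W - W_hat) / np.linalg.norm(W))
```

On this synthetic data, the per-mode affine reconstruction recovers the full 1000 × 50 trajectory matrix from only K reference trajectories plus two scalars per parameter, which is the flavor of compact modeling the abstract attributes to CMD.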

Original language: English
Publication status: Published - 2024
Event: 12th International Conference on Learning Representations, ICLR 2024 - Hybrid, Vienna, Austria
Duration: 7 May 2024 → 11 May 2024

Conference

Conference: 12th International Conference on Learning Representations, ICLR 2024
Country/Territory: Austria
City: Hybrid, Vienna
Period: 7/05/24 → 11/05/24
