[Seminar] From Gradient-Free Federation to Leveraging Deep Learning Geometry

Wednesday, March 11, 2026, 3:00 PM to 4:30 PM
Seminar Room L4E48

Description

Assistant Professor Mirko Polato
University of Turin, Department of Computer Science

Abstract: Federated learning is typically built around gradient exchange and parameter averaging, yet collaboration does not have to rely on gradients alone. In the first part of this talk, I explore gradient-free approaches to federation, including federated boosting and Support Vector Federation, where models are aggregated in function space or through perturbed support vectors rather than shared weights. I also discuss margin-promoting objectives that reshape local optimization to reduce client drift and improve stability under non-i.i.d. data.
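The gradient-and-averaging baseline that these gradient-free approaches depart from can be sketched as weighted parameter averaging in the style of FedAvg. This is an illustrative sketch, not the speaker's method; the function name and flat-vector representation are assumptions for clarity.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """FedAvg-style weighted parameter averaging (illustrative sketch).

    client_weights: list of flat parameter vectors, one per client.
    client_sizes:   number of local training examples per client.
    """
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()          # weight each client by its data share
    stacked = np.stack(client_weights)    # shape: (n_clients, n_params)
    return coeffs @ stacked               # size-weighted average of parameters
```

Gradient-free federation replaces this weight-space average with aggregation in function space (e.g., combining boosted weak learners or perturbed support vectors), so no raw gradients or weights need to be exchanged.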


The second (very short) part shifts to deep representation learning and focuses on Neural Collapse, a geometric regularity emerging late in training. Rather than treating it as a byproduct of optimization, we use NC-related metrics as training-time signals to identify when and where networks can be simplified without loss of performance.
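One commonly used Neural Collapse signal (often called NC1) compares within-class to between-class feature variability: as collapse sets in, features of each class concentrate around their class mean and the ratio shrinks. The sketch below is a simplified trace-ratio version under assumed inputs, not the exact metric used in the talk.

```python
import numpy as np

def nc1_variability(features, labels):
    """Within- vs between-class variability ratio (simplified NC1-style metric).

    features: array of shape (n_samples, dim), e.g., penultimate-layer activations.
    labels:   array of shape (n_samples,) with integer class labels.
    Smaller values indicate stronger collapse of features onto class means.
    """
    global_mean = features.mean(axis=0)
    within, between = 0.0, 0.0
    for c in np.unique(labels):
        fc = features[labels == c]
        mu_c = fc.mean(axis=0)
        within += ((fc - mu_c) ** 2).sum()                       # spread inside class c
        between += len(fc) * ((mu_c - global_mean) ** 2).sum()   # class-mean separation
    return within / between
```

Tracking such a ratio per layer during training is one way a network could be probed for where representations have already collapsed, and hence where it might be simplified.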
