The Challenge: Modern machine learning requires training on massive datasets across distributed systems, where the cost of communicating gradients and model updates often dominates the cost of computation and limits scalability.
My Approach: I develop methods that combine communication compression with error compensation (error feedback) mechanisms, mitigating communication bottlenecks while preserving convergence guarantees.
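To make the mechanism concrete, below is a minimal sketch of EF21-style error feedback with a Top-K compressor: each worker transmits only a compressed difference between its fresh gradient and a locally maintained estimate, so the server's search direction stays accurate despite lossy communication. This is an illustrative sketch, not the papers' implementation: the function and variable names (`topk`, `ef21`, `grads`) are mine, and a real system would send the sparse updates over a network rather than updating shared arrays.

```python
import numpy as np

def topk(v, k):
    """Top-K sparsifier: keep the k largest-magnitude entries, zero the rest."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def ef21(grads, x0, lr, k, steps):
    """Sketch of the EF21 update rule (Richtárik, Sokolov, Fatkhullin, 2021).

    grads: list of per-worker gradient oracles, grads[i](x) -> ndarray.
    Each worker i keeps an estimate g[i] of its own gradient and communicates
    only topk(grads[i](x) - g[i]); worker and server apply the same additive
    update, so their copies of g[i] never diverge.
    """
    x = x0.copy()
    g = [gi(x) for gi in grads]  # one exact gradient at initialization
    for _ in range(steps):
        x -= lr * np.mean(g, axis=0)       # server step: average of the estimates
        for i, gi in enumerate(grads):
            c = topk(gi(x) - g[i], k)      # the only message worker i sends
            g[i] += c                      # estimate update on both sides
    return x

# Toy run (illustrative): 4 workers with f_i(x) = 0.5 * ||x - a_i||^2, so the
# minimizer of the average loss is the mean of the a_i.
rng = np.random.default_rng(0)
a = rng.normal(size=(4, 10))
x = ef21([lambda x, ai=ai: x - ai for ai in a], np.zeros(10), lr=0.2, k=3, steps=500)
print(np.linalg.norm(x - a.mean(axis=0)))  # ~0: converges despite sending 3/10 coordinates
```

The key design point, relative to classic error feedback, is that the compressor is applied to gradient *differences* rather than to gradients themselves, which is what allows the estimates to converge to the true gradients.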
Key Contributions:
EF21: A New, Simpler, Theoretically Better, and Practically Faster Error Feedback. with P. Richtárik, I. Sokolov. NeurIPS (Oral Presentation, top 1%), 2021.
EF21 with Bells & Whistles: Six Algorithmic Extensions of Modern Error Feedback. with I. Sokolov, E. Gorbunov, Z. Li, P. Richtárik. Journal of Machine Learning Research, 2025.
Momentum Provably Improves Error Feedback! with A. Tyurin, P. Richtárik. NeurIPS, 2023.
Safe-EF: Error Feedback for Nonsmooth Constrained Optimization. with R. Islamov, Y. As. ICML, 2025.
Impact: The EF21 line of work is now a foundation for communication-efficient federated learning systems: the algorithm and its extensions serve as building blocks for large-scale distributed training that retains both theoretical guarantees and practical performance.