4 papers accepted to NeurIPS 2020 Workshops, 1 Spotlight and 2 Oral Presentations

Our paper Optimal Client Sampling for Federated Learning, joint work with Wenlin Chen and Peter Richtarik, was accepted to the Privacy Preserving Machine Learning workshop. In addition, our work Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization, joint work with Michael I. Jordan, Lihua Lei, and Peter Richtarik, was accepted to the Optimization for Machine Learning workshop as a Spotlight talk. Lastly, A Better Alternative to Error Feedback for Communication-Efficient Distributed Learning, joint work with Peter Richtarik, and On Biased Compression for Distributed Learning, joint work with Aleksandr Beznosikov, Mher Safaryan, and Peter Richtarik, were selected as contributed talks at the Workshop on Scalability, Privacy, and Security in Federated Learning.


UPDATE

A Better Alternative to Error Feedback for Communication-Efficient Distributed Learning won the Best Paper Award at SpicyFL 2020, the NeurIPS-20 Workshop on Scalability, Privacy, and Security in Federated Learning.

Samuel Horvath
PhD Student in Machine Learning and Optimization