Nicolas Loizou is an Assistant Professor in the Department of Applied Mathematics and Statistics and the Mathematical Institute for Data Science at Johns Hopkins University, with secondary appointments in the Department of Computer Science and the Department of Electrical and Computer Engineering. He is affiliated with the JHU Machine Learning Group, Data Science and AI Institute, and Ralph O’Connor Sustainable Energy Institute. Loizou earned his PhD in Optimization and Operational Research from the School of Mathematics at the University of Edinburgh in 2019, advised by Peter Richtárik, an MSc in Computing from Imperial College London in 2015, and a BSc in Mathematics from the National and Kapodistrian University of Athens in 2014. Prior to joining JHU in January 2022, he was an IVADO Postdoctoral Research Fellow at Mila – Quebec Artificial Intelligence Institute and Université de Montréal from September 2019 to December 2021, hosted by Simon Lacoste-Julien and Ioannis Mitliagkas. He also interned at Meta Fundamental AI Research in Montreal in 2018.
His research interests encompass large-scale optimization, machine learning, randomized numerical linear algebra, distributed and decentralized algorithms, algorithmic game theory, multi-agent learning, and federated learning. Loizou directs the Optimization and Machine Learning Lab at JHU, which focuses on the theory and applications of convex and non-convex optimization in machine learning and data science.

His honors include the 2020 COAP Best Paper Prize for "Momentum and Stochastic Momentum for Stochastic Gradient, Newton, Proximal Point and Subspace Descent Methods," runner-up for the 2019 OR Society's Doctoral Award, an IVADO Fellowship, and the 2025 Johns Hopkins Catalyst Award.

Key publications include "Unified Analysis of Stochastic Gradient Methods for Composite Convex and Smooth Optimization" (2023, Journal of Optimization Theory and Applications), "Locally Adaptive Federated Learning" (2024, Transactions on Machine Learning Research), "Revisiting Randomized Gossip Algorithms: General Framework, Convergence Rates and Novel Block and Accelerated Protocols" (2021, IEEE Transactions on Information Theory), "Convergence Analysis of Inexact Randomized Iterative Methods" (2020, SIAM Journal on Scientific Computing), and "Stochastic Polyak Step-sizes and Momentum: Convergence Guarantees and Practical Performance" (2025, ICLR).

He serves as an Associate Editor for Information and Inference: A Journal of the IMA and as an Area Chair for ICML 2024–2025 and AISTATS 2025. He has organized mini-symposia on optimization for machine learning at SIAM and ISMP conferences, chaired JHU committees, and reviewed grant proposals for the NSF and DOE.