Biography
Peter Richtárik obtained an Mgr. in Mathematics ('01) at Comenius University in his native Slovakia and received his Ph.D. in Operations Research from Cornell University, U.S., in 2007. In 2009 he joined the University of Edinburgh, U.K., as an Assistant Professor at the university's School of Mathematics, before moving to KAUST in 2017.
At KAUST, he is a Professor of Computer Science affiliated with the Visual Computing Center and the Extreme Computing Research Center.
Dr. Richtárik has received a number of honors and awards, including the EUSA Award for Best Research or Dissertation Supervisor (Second Prize), 2016; a Turing Fellow Award from the Alan Turing Institute, 2016; and an EPSRC Fellowship in Mathematical Sciences, 2016. Before joining KAUST, he was nominated for the Chancellor's Rising Star Award from the University of Edinburgh in 2014, the Microsoft Research Faculty Fellowship in 2013, and the Innovative Teaching Award from the University of Edinburgh in 2011 and 2012.
Several of his papers have attracted international awards, including the SIAM SIGEST Best Paper Award (joint award with Professor Olivier Fercoq); the IMA Leslie Fox Prize (Second Prize: M. Takáč 2013, O. Fercoq 2015, and R. M. Gower 2017); and the INFORMS Computing Society Best Student Paper Award (sole runner-up: M. Takáč). Richtárik is the founder and organizer of the "Optimization and Big Data" workshop series. He has given more than 150 research talks at conferences, workshops, and seminars worldwide.
He was an Area Chair for ICML 2019 and a Senior Program Committee Member for IJCAI 2019. He is an Associate Editor of Optimization Methods and Software and a Handling Editor of the Journal of Nonsmooth Analysis and Optimization.
Research Interests
Professor Richtárik’s research interests lie at the intersection of mathematics, computer science, machine learning, optimization, numerical linear algebra, high-performance computing and applied probability.
His recent work on randomized optimization algorithms—such as randomized coordinate descent methods, stochastic gradient descent methods, and their numerous extensions, improvements, and variants—has contributed to the foundations and advancement of big data optimization, randomized numerical linear algebra, and machine learning.
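To give a flavor of the randomized methods mentioned above, the following is a minimal illustrative sketch (not Professor Richtárik's own code) of randomized coordinate descent applied to a toy least-squares problem; the problem dimensions, random seed, and iteration count are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: minimize f(x) = 0.5 * ||A x - b||^2.
# Dimensions are arbitrary assumptions for this demo.
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)

# Coordinate-wise Lipschitz constants of the gradient: L_i = ||A[:, i]||^2.
L = (A ** 2).sum(axis=0)

x = np.zeros(10)
for _ in range(2000):
    i = rng.integers(10)              # pick one coordinate uniformly at random
    g_i = A[:, i] @ (A @ x - b)       # partial derivative of f with respect to x_i
    x[i] -= g_i / L[i]                # update only that coordinate

# Compare against a direct least-squares solution.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x - x_star))
```

Each iteration touches a single column of A, which is what makes such methods attractive for very large problems: the per-step cost is a small fraction of a full gradient computation.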