The lecture will be held online.
Abstract of talk:
We take a brief look at recent advances in computational tools for data science, especially mathematical optimization. In particular, we review the highly successful methodology of incremental optimization, which has become an indispensable element of large-scale data analysis. We present a unifying framework for extending a certain family of incremental techniques, known by the common term "stochastic averaging", to non-differentiable and/or constrained problems, and present recent theoretical findings in this regard. We also present the application of this framework to the celebrated vector clustering problem, and further point out its potential in distributed optimization problems, especially distributed machine learning.
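For readers unfamiliar with the stochastic-averaging family mentioned in the abstract, the sketch below illustrates one well-known member, the SAGA algorithm, on a smooth least-squares toy problem. This is only a minimal illustration of the basic idea (keeping a table of per-sample gradients and using their running average to reduce variance); the talk's extensions to non-differentiable and constrained problems are not shown, and all names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def saga(grad_i, n, x0, step, iters, seed=0):
    """Minimal SAGA sketch (illustrative, not the speaker's framework).

    grad_i(x, i) returns the gradient of the i-th component function.
    A table of the last-seen gradient for each sample is maintained;
    the update uses the fresh gradient corrected by the table average,
    which reduces the variance of the stochastic step.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    table = np.array([grad_i(x, i) for i in range(n)])  # stored gradients
    avg = table.mean(axis=0)                            # running average
    for _ in range(iters):
        i = rng.integers(n)
        g = grad_i(x, i)
        x = x - step * (g - table[i] + avg)  # variance-reduced step
        avg = avg + (g - table[i]) / n       # update average incrementally
        table[i] = g
    return x

# Toy problem: minimize (1/n) * sum_i (a_i^T x - b_i)^2 / 2
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true

grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]
step = 1.0 / (3.0 * np.max(np.sum(A**2, axis=1)))  # safe step from data
x_hat = saga(grad_i, n=50, x0=np.zeros(3), step=step, iters=5000)
```

On this noiseless problem `x_hat` recovers `x_true` to high accuracy; the data-dependent step size follows the usual rule of thumb of scaling inversely with the largest per-sample gradient Lipschitz constant.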
Ashkan Panahi is an assistant professor at the Computer Science and Engineering Department at Chalmers University of Technology, Sweden. He received his BSc and MSc degrees in electrical and communication systems engineering from Iran University of Science and Technology (IUST) in 2008 and Chalmers University of Technology in 2010, and his PhD degree in Signal Processing from the Electrical Engineering Department at Chalmers in 2015. He has held several other research positions, including visiting research student at the California Institute of Technology (Caltech, 2014), U.S. National Research Council Research Associate (2016-2018), and Postdoctoral Researcher at North Carolina State University (2018-2019). His research interests span a broad range of topics in machine learning and data science, including optimization algorithms, statistical analysis and probabilistic methods, compressed sensing, and statistical detection and estimation theory.
To attend the lecture, join via the Zoom link above.
26 May 2020, 10:30–11:30