News
Here is some recent news about me, both important and miscellaneous.
24.05 New paper on minimax optimization and fixed-point problems
Excited to announce that our paper Optimal Acceleration for Minimax and Fixed-Point Problems is Not Unique has been accepted to ICML 2024!
We discover novel algorithms for fixed-point problems (equivalent to proximal algorithms for monotone inclusion) achieving the exact optimal complexity. We establish a new duality theory between fixed-point algorithms (the H-duality) and exhibit an analogous phenomenon in minimax optimization, together with a continuous-time analysis bridging the two setups.
23.12 NeurIPS 2023
I am attending NeurIPS to present our paper Censored Sampling of Diffusion Models Using 3 Minutes of Human Feedback.
23.08 Talk at WYMK 2023
I will give an invited talk on sampling algorithms and diffusion models at the Workshop for Young Mathematicians in Korea (WYMK 2023).
23.07 New paper on diffusion models with KRAFTON
Our new paper Censored Sampling of Diffusion Models Using 3 Minutes of Human Feedback has been announced.
This is joint work with the KRAFTON AI Research Center. The paper proposes a methodology for aligning diffusion models' sample generation with specified practical requirements, and notably, it is my most empirical paper to date.
23.06 New ICML Workshop paper
Our paper Diffusion Probabilistic Models Generalize when They Fail to Memorize has been accepted to the ICML 2023 Workshop on Structured Probabilistic Inference & Generative Modeling.
23.06 Talk at SIAM OP23
I will give a talk at the 2023 SIAM Conference on Optimization on my paper Accelerated Minimax Algorithms Flock Together.
22.10 Talk at INFORMS
I will give a talk at the 2022 INFORMS Annual Meeting on my papers on minimax optimization.
22.07 Youlchon Fellowship
I have been selected as a recipient of the Youlchon AI Star Fellowship. Many thanks!
22.05 New paper on minimax optimization
Our new paper Accelerated Minimax Algorithms Flock Together has been announced.
It establishes a new connection, the merging path property, among order-optimal algorithms for minimax optimization and order-optimal algorithms for fixed-point problems, all of which are based on the mechanism of anchoring ($\approx$ Halpern iteration).
The paper is currently under revision at SIAM Journal on Optimization.
22.02 AISTATS paper
Our paper Robust Probabilistic Time Series Forecasting has been accepted for publication at AISTATS 2022.
This work is based on my internship project at AWS AI Labs, where I worked with Youngsuk Park and Bernie Wang.
21.05 ~ 21.08 Internship: AWS AI Labs
I will be working at AWS AI Labs as an applied scientist intern.
21.02 ICML Papers
Two papers have been accepted for publication at ICML 2021.
Accelerated Algorithms for Smooth Convex-Concave Minimax Problems with $\mathcal{O}(1/k^2)$ Rate on Squared Gradient Norm was selected for a long talk (top 166/5513 $\approx$ 3% of papers).
It introduces the Extra Anchored Gradient algorithm, which reduces the gradient norm of smooth convex-concave objectives with optimal complexity.
WGAN with an Infinitely Wide Generator Has No Spurious Stationary Points analyzes setups in which a WGAN objective is favorable to training via alternating gradient ascent-descent.