Global Optimization with Parametric Function Approximation



Chong Liu (UC Santa Barbara)

Chong Liu is a Ph.D. candidate in Computer Science at the University of California, Santa Barbara. His research interests include machine learning and AI for science, with emphasis on global optimization, bandits, active learning, and experimental design. He is an editorial board reviewer of JMLR and serves on program committees of several conferences, including AAAI, AISTATS, ICML, KDD, and NeurIPS. Part of his research has been deployed at Amazon.



Short Abstract: To design tough materials, scientists must sequentially select candidate configurations and then conduct expensive experiments to measure their formation energies. To tune deep learning models, engineers must carefully choose hyperparameters before training. In both cases, the performance of unselected configurations cannot be observed, and each evaluation can be hugely expensive. These two challenges hinder new materials design and hyperparameter tuning alike, and call for better methods. Existing work usually models this kind of problem as black-box optimization and relies on Gaussian processes or other non-parametric families, which suffer from the curse of dimensionality. In this talk, I will present my research on solving black-box optimization with parametric function approximation, where the parametric function can be a deep neural network. Under a realizability assumption and a few other mild geometric conditions, the new GO-UCB algorithm achieves sublinear cumulative regret. At the core of GO-UCB is a carefully designed gradient-based uncertainty set over parameters that enables optimistic exploration. Synthetic and real-world experiments show that GO-UCB outperforms existing approaches in high dimensions, even when the model is misspecified. I'll also discuss some future directions at the end. Reference: https://arxiv.org/pdf/2211.09100.pdf.
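
For a concrete picture of the optimistic-exploration loop described above, below is a minimal Python/NumPy sketch. It is not the paper's implementation: the tanh model, the confidence width beta, the regularizer lam, the random candidate set, and the plain gradient-descent refit are all illustrative assumptions standing in for GO-UCB's general differentiable model, calibrated parameter uncertainty set, and regression oracle.

    import numpy as np

    # Hedged sketch of a GO-UCB-style loop. Names such as `beta`, `lam`,
    # and the tanh model are illustrative assumptions, not the paper's
    # exact choices.
    rng = np.random.default_rng(0)

    def f(x, w):
        # Parametric model f(x; w); a simple differentiable stand-in for
        # the deep networks the talk mentions.
        return np.tanh(x @ w)

    def grad_w(x, w):
        # Gradient of f(x; w) with respect to the parameters w.
        return (1.0 - np.tanh(x @ w) ** 2) * x

    d = 5                          # input / parameter dimension
    w_true = rng.normal(size=d)    # unknown ground-truth parameters
    lam, beta = 1.0, 2.0           # regularizer and confidence width (assumed)

    X_hist, y_hist = [], []
    Sigma = lam * np.eye(d)        # gradient-based covariance of the uncertainty set
    w_hat = np.zeros(d)            # current parameter estimate

    for t in range(50):
        # Random candidates stand in for maximizing over a continuous domain.
        cands = rng.uniform(-1.0, 1.0, size=(256, d))

        # Optimistic score: model value plus a gradient-weighted bonus,
        # a first-order surrogate for maximizing f(x; w) over the
        # parameter uncertainty set.
        Sigma_inv = np.linalg.inv(Sigma)
        grads = np.array([grad_w(x, w_hat) for x in cands])
        quad = np.einsum("ij,jk,ik->i", grads, Sigma_inv, grads)
        bonus = np.sqrt(beta) * np.sqrt(np.maximum(quad, 0.0))
        x_t = cands[np.argmax(f(cands, w_hat) + bonus)]

        # Query the expensive black box (noisy evaluation).
        y_t = f(x_t, w_true) + 0.01 * rng.normal()
        X_hist.append(x_t)
        y_hist.append(y_t)

        # Re-fit w_hat on all observations by gradient descent on squared
        # loss, then grow the covariance with the new gradient direction.
        for _ in range(100):
            g = sum((f(xi, w_hat) - yi) * grad_w(xi, w_hat)
                    for xi, yi in zip(X_hist, y_hist))
            w_hat = w_hat - 0.1 * g / len(X_hist)
        Sigma += np.outer(grad_w(x_t, w_hat), grad_w(x_t, w_hat))

    print("best observed value:", max(y_hist))

The key design choice this sketch mirrors is that uncertainty is measured in parameter space via gradients of the model, rather than in function space as with Gaussian processes, which is what lets the approach avoid non-parametric scaling with the input dimension.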