Practical Approaches for Efficient Hyperparameter Optimization


Mar 16, 10:00 AM PDT
  • Virtual
  • 244 RSVP
Description
Hyperparameters, the tuning knobs of machine learning algorithms, are instrumental in producing high-performing models. The tedious task of hyperparameter optimization (HPO) is nonetheless often reduced to manual tuning, humorously called ‘graduate student descent’, or to unsophisticated grid search and random search [1], which often yields results that are highly sensitive to the chosen hyperparameters.
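For readers unfamiliar with these baselines, the sketch below contrasts grid search and random search on a toy two-dimensional search space. The objective function, value ranges, and evaluation budget are illustrative assumptions, not material from the talk.

```python
import itertools
import random

# Toy objective: a noisy quadratic in two hyperparameters, standing in for the
# validation metric produced by a real training run (assumed for illustration).
def evaluate(lr, weight_decay):
    return (lr - 0.01) ** 2 + (weight_decay - 1e-4) ** 2 + random.gauss(0, 1e-6)

# Grid search: exhaustively evaluate a fixed Cartesian product of values.
lr_grid = [1e-4, 1e-3, 1e-2, 1e-1]
wd_grid = [0.0, 1e-5, 1e-4, 1e-3]
grid_best = min(
    ((evaluate(lr, wd), lr, wd) for lr, wd in itertools.product(lr_grid, wd_grid)),
    key=lambda t: t[0],
)

# Random search: spend the same budget (16 evaluations) on configurations
# sampled from the space, covering each dimension more densely than a grid [1].
random.seed(0)
samples = [(10 ** random.uniform(-4, -1), 10 ** random.uniform(-6, -3)) for _ in range(16)]
random_best = min(
    ((evaluate(lr, wd), lr, wd) for lr, wd in samples),
    key=lambda t: t[0],
)

print("grid search best  :", grid_best)
print("random search best:", random_best)
```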

In this talk, we will walk machine learning practitioners through guidelines for efficient hyperparameter optimization based on Oríon, an open-source HPO framework. We will start with practical approaches for designing the search space, then provide guidelines for selecting hyperparameter optimization algorithms, and finally demonstrate how to leverage Oríon's pioneering Experiment Version Control for greater efficiency.
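As a taste of the workflow the talk covers, here is a minimal sketch using the Python client described in the Oríon documentation (orion.readthedocs.io). The experiment name, search-space priors, and toy training function are illustrative assumptions; consult the documentation for the exact API of the version you install.

```python
from orion.client import build_experiment

# Search-space design: priors are written as strings, e.g. a log-uniform prior
# for the learning rate and a categorical choice for the optimizer.
experiment = build_experiment(
    "hpo-demo",  # hypothetical experiment name
    space={
        "lr": "loguniform(1e-5, 1e-1)",
        "optimizer": "choices(['adam', 'sgd'])",
    },
)

def train(lr, optimizer):
    # Stand-in for a real training run; returns the validation loss in the
    # list-of-results format Oríon expects.
    loss = (lr - 1e-3) ** 2 + (0.0 if optimizer == "adam" else 0.1)
    return [{"name": "loss", "type": "objective", "value": loss}]

# Let Oríon suggest trials and record their results until the budget is spent.
experiment.workon(train, max_trials=20)
print(experiment.stats)
```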

Speakers: Yonggang Hu (IBM), Xavier Bouthillier (Mila)

Yonggang Hu
IBM Distinguished Engineer, Applied HPC, AI and Quantum. He has worked on distributed computing, HPC, grid, cloud, and big data analytics for the past 20 years. He is Chief Architect of the Spectrum Computing products and of IBM's high-performance AI runtime, Watson Machine Learning Accelerator.

Xavier Bouthillier
Xavier is a PhD student advised by Pascal Vincent at Université de Montréal and Mila. He also works as a research developer in the Innovation, Development and Technology team at Mila, a world-renowned artificial intelligence research institute that brings together more than 500 researchers specializing in deep learning. He is leading the development of the hyperparameter optimization framework Oríon (orion.readthedocs.io).

This event has ended. A recording is available on YouTube (the link opens the YouTube page).