Deep learning has seen an explosion of interest since 2012, and deep networks have since grown more complex and more specialized. How do researchers know which neural network to use for a given dataset? Neural Architecture Search (NAS), a subset of hyperparameter optimization, is the process of automating the search for the best neural architecture for a given dataset.
In this webinar, we will give a survey of the NAS literature, including details for some of the most popular techniques such as reinforcement learning and Bayesian optimization.
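To make the premise concrete before the talk, here is a minimal sketch of the simplest NAS baseline, random search over a discrete architecture space. The search space, the `proxy_score` stand-in for training and validation, and all names below are illustrative assumptions, not material from the webinar:

```python
import random

# Hypothetical search space: each architecture is a choice of depth,
# width, and activation function (values are illustrative only).
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [64, 128, 256],
    "activation": ["relu", "tanh"],
}

def sample_architecture(rng):
    # Draw one candidate uniformly at random from the search space.
    return {key: rng.choice(options) for key, options in SEARCH_SPACE.items()}

def proxy_score(arch):
    # Stand-in for "train the network and measure validation accuracy";
    # a real NAS loop would train on the target dataset here.
    return arch["depth"] * arch["width"]

def random_search(n_trials=10, seed=0):
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = random_search()
    print(arch, score)
```

The techniques covered in the talk, such as reinforcement learning and Bayesian optimization, replace the uniform sampling above with a strategy that learns from previously evaluated architectures.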
Colin has been a research scientist at RealityEngines.AI since 2019. He has published in top machine learning conferences such as NeurIPS, COLT, and AISTATS. Colin received his Ph.D. in computer science from Carnegie Mellon University in 2018, where he was advised by Nina Balcan.