Deep learning has seen an explosion of interest since 2012, and deep networks have since grown more complex and more specialized. How do researchers know which neural network to use for a given dataset? Neural Architecture Search (NAS), a subset of hyperparameter optimization, automates the search for the best neural architecture for a given dataset.
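To make the idea concrete, here is a minimal sketch of the NAS loop using the simplest baseline, random search. The search space, scoring function, and all names below are illustrative assumptions, not part of any particular NAS system; in practice, evaluating a candidate means training it on the dataset and measuring validation accuracy.

```python
import random

# Hypothetical toy search space: each architecture is a choice of
# depth, layer width, and activation (assumption, for illustration only).
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "width": [64, 128, 256],
    "activation": ["relu", "tanh"],
}

def sample_architecture(rng):
    """Draw one candidate architecture from the search space."""
    return {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for 'train the candidate and measure validation accuracy'.
    This toy score just rewards depth and width, purely for illustration."""
    return arch["num_layers"] * 0.05 + arch["width"] / 1000

def random_search(num_trials=20, seed=0):
    """The simplest NAS baseline: sample candidates, keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(num_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, score)
```

More sophisticated NAS methods replace the `sample_architecture` step with a learned proposal mechanism, for example a reinforcement-learning controller or a Bayesian surrogate model.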
In this webinar, we will survey the NAS literature, including details of some of the most popular techniques, such as reinforcement learning and Bayesian optimization.