Evolutionary Neural AutoML for Deep Learning (2019)
Deep neural networks (DNNs) have produced state-of-the-art results in many benchmarks and problem domains. However, the success of DNNs depends on the proper configuration of their architectures and hyperparameters. Such configuration is difficult, and as a result, DNNs are often not used to their full potential. In addition, DNNs in commercial applications often need to satisfy real-world design constraints such as size or number of parameters. To make configuration easier, automatic machine learning (AutoML) systems for deep learning have been developed, focusing mostly on hyperparameter optimization. This paper takes AutoML a step further. It introduces an evolutionary AutoML framework called LEAF that optimizes not only hyperparameters but also network architectures and network size. LEAF makes use of both state-of-the-art evolutionary algorithms (EAs) and distributed computing frameworks. Experimental results on medical image classification and natural language analysis show that the framework can be used to achieve state-of-the-art performance. In particular, LEAF demonstrates that architecture optimization provides a significant boost over hyperparameter optimization alone, and that networks can be minimized at the same time with little drop in performance. LEAF therefore forms a foundation for democratizing and improving AI, as well as making AI practical in future applications.
In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2019), 401–409, 2019.
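To give a flavor of the evolutionary search underlying a system like LEAF, the sketch below shows a minimal generational loop over hyperparameter genomes. All names (`SEARCH_SPACE`, `mutate`, `fitness`) are illustrative assumptions: the actual LEAF system evolves full network graphs and evaluates candidates by distributed training, whereas here a cheap surrogate function stands in for validation accuracy.

```python
import random

# Toy sketch of an evolutionary hyperparameter search loop.
# Hypothetical search space; a real system would also evolve the
# network architecture and train each candidate to measure fitness.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "num_layers": [2, 4, 8],
    "hidden_units": [32, 64, 128],
}

def random_genome(rng):
    """Sample one value per hyperparameter."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(genome, rng):
    """Resample a single randomly chosen hyperparameter."""
    child = dict(genome)
    key = rng.choice(list(SEARCH_SPACE))
    child[key] = rng.choice(SEARCH_SPACE[key])
    return child

def fitness(genome):
    # Cheap surrogate standing in for validation accuracy after training.
    return (genome["hidden_units"] / 128.0
            + genome["num_layers"] / 8.0
            - abs(genome["learning_rate"] - 1e-3) * 100)

def evolve(generations=20, pop_size=10, seed=0):
    rng = random.Random(seed)
    population = [random_genome(rng) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]          # truncation selection
        children = [mutate(rng.choice(parents), rng)   # mutated offspring
                    for _ in parents]
        population = parents + children                # elitist replacement
    return max(population, key=fitness)

best = evolve()
```

Because parents survive unchanged each generation, the best fitness never decreases; in LEAF the same loop additionally applies multi-objective selection so that network size can be minimized alongside performance.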

Babak Hodjat Collaborator babak [at] cognizant com
Jason Zhi Liang Ph.D. Alumni jasonzliang [at] utexas edu
Elliot Meyerson Ph.D. Alumni ekm [at] cs utexas edu
Risto Miikkulainen Faculty risto [at] cs utexas edu