Regularized Evolutionary Population-Based Training (2021)
Metalearning of deep neural network (DNN) architectures and hyperparameters has become an increasingly important area of research. At the same time, network regularization has been recognized as a crucial dimension of effective DNN training. However, the role of metalearning in establishing effective regularization has not yet been fully explored. There is recent evidence that loss-function optimization could play this role; however, it is computationally impractical as an outer loop to full training. This paper presents an algorithm called Evolutionary Population-Based Training (EPBT) that interleaves the training of a DNN's weights with the metalearning of loss functions. The loss functions are parameterized using multivariate Taylor expansions that EPBT can directly optimize. Such simultaneous adaptation of weights and loss functions can be deceptive, and therefore EPBT uses a quality-diversity heuristic called Novelty Pulsation as well as knowledge distillation to prevent overfitting during training. On the CIFAR-10 and SVHN image classification benchmarks, EPBT results in faster, more accurate learning. The discovered hyperparameters adapt to the training process and serve to regularize the learning task by discouraging overfitting to the labels. EPBT thus demonstrates a practical instantiation of regularization metalearning based on simultaneous training.
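The central technical idea is the loss-function parameterization: each candidate loss is a multivariate Taylor polynomial whose coefficients the outer evolutionary loop optimizes alongside the network weights. The sketch below illustrates one plausible form of such a parameterization, assuming a third-order bivariate expansion in the predicted probability and the target label; the coefficient layout, expansion point, and function name are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def taylor_loss(y_pred, y_true, theta, center=(0.5, 0.5)):
    """Hypothetical Taylor-expansion loss (third order, bivariate).

    y_pred : array of predicted class probabilities (e.g., softmax output)
    y_true : array of one-hot target labels
    theta  : 8 free coefficients, evolved by the outer loop
    center : expansion point (a, b); fixed here, but could also be evolved
    """
    a, b = center
    dx = y_pred - a  # deviation of predictions from the expansion point
    dy = y_true - b  # deviation of labels from the expansion point
    # Third-order multivariate Taylor polynomial in (dx, dy). The constant
    # term and terms depending only on the labels are omitted, since they
    # contribute nothing to the gradient with respect to the predictions.
    f = (theta[0] * dx + theta[1] * dy
         + theta[2] * dx**2 + theta[3] * dx * dy + theta[4] * dy**2
         + theta[5] * dx**3 + theta[6] * dx**2 * dy + theta[7] * dx * dy**2)
    return -np.mean(f)

# Example: evaluating one evolved candidate on a single prediction.
theta = np.random.randn(8) * 0.1        # a candidate coefficient vector
y_pred = np.array([0.7, 0.2, 0.1])      # softmax output
y_true = np.array([1.0, 0.0, 0.0])      # one-hot label
loss = taylor_loss(y_pred, y_true, theta)
```

In EPBT, a population of (weights, theta) pairs would each be trained for a short interval per generation, with the theta vectors selected and mutated based on validation fitness; the Novelty Pulsation and knowledge-distillation components described in the paper are omitted from this sketch.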
Citation: In Proceedings of the Genetic and Evolutionary Computation Conference, 323-331, 2021.
Santiago Gonzalez, Ph.D. Alumni, slgonzalez [at] utexas edu
Jason Zhi Liang, Ph.D. Alumni, jasonzliang [at] utexas edu
Risto Miikkulainen, Faculty, risto [at] cs utexas edu
Hormoz Shahrzad, Masters Alumni, hormoz [at] cognizant com