Population-Based Training for Loss Function Optimization (2020)
Metalearning of deep neural network (DNN) architectures and hyperparameters has become an increasingly important area of research. Loss functions are a type of metaknowledge that is crucial to effective training of DNNs, yet their potential role in metalearning has not been fully explored. This paper presents an algorithm called Enhanced Population-Based Training (EPBT) that interleaves the training of a DNN's weights with the metalearning of optimal hyperparameters and loss functions. Loss functions use a TaylorGLO parameterization, based on multivariate Taylor expansions, that EPBT can directly optimize. On the CIFAR-10 and SVHN image classification benchmarks, EPBT discovers loss function schedules that enable faster, more accurate learning. The discovered functions adapt to the training process and serve to regularize the learning task by discouraging overfitting to the labels. EPBT thus demonstrates a promising synergy of simultaneous training and metalearning.
To appear in arXiv:2002.04225, February 2020.

Santiago Gonzalez, Ph.D. Student, slgonzalez@utexas.edu
Jason Zhi Liang, Ph.D. Alumnus, jasonzliang@utexas.edu
Risto Miikkulainen, Faculty, risto@cs.utexas.edu
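
To make the TaylorGLO parameterization mentioned in the abstract concrete, the following is a minimal sketch of the idea: a loss function expressed as a low-order bivariate polynomial in the prediction and the label, whose coefficients (`theta` below) become directly optimizable parameters. The exact term set and coefficient count here are illustrative assumptions, not the paper's precise formulation.

```python
import numpy as np

def taylor_loss(y_true, y_pred, theta):
    """Loss parameterized as a third-order bivariate polynomial in
    (y_pred, y_true). theta holds the free coefficients, which a
    method like EPBT could evolve during training. Illustrative
    sketch only; the term basis is an assumption."""
    a, b = np.asarray(y_pred, dtype=float), np.asarray(y_true, dtype=float)
    # Monomial basis up to third order in the two variables.
    terms = np.stack([
        np.ones_like(a), a, b, a * b, a**2, b**2,
        a**2 * b, a * b**2, a**3, b**3,
    ], axis=-1)
    return float(np.mean(terms @ np.asarray(theta, dtype=float)))

# One point in this parameter space recovers mean squared error,
# since (a - b)^2 = a^2 - 2ab + b^2:
theta_mse = [0, 0, 0, -2, 1, 1, 0, 0, 0, 0]
```

Because the loss is just a coefficient vector, a population-based method can perturb and select these coefficients between training intervals, yielding the loss-function schedules described in the abstract.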