Neural Networks Research Group
Evolutionary Optimization of Deep Learning Activation Functions (2020)
Garrett Bingham, William Macke, and Risto Miikkulainen
The choice of activation function can have a large effect on the performance of a neural network. While there have been some attempts to hand-engineer novel activation functions, the Rectified Linear Unit (ReLU) remains the most commonly used activation in practice. This paper shows that evolutionary algorithms can discover novel activation functions that outperform ReLU. A tree-based search space of candidate activation functions is defined and explored with mutation, crossover, and exhaustive search. Experiments on training wide residual networks on the CIFAR-10 and CIFAR-100 image datasets show that this approach is effective. Replacing ReLU with evolved activation functions results in statistically significant increases in network accuracy. Optimal performance is achieved when evolution is allowed to customize activation functions to a particular task; however, these novel activation functions are shown to generalize, achieving high performance across tasks. Evolutionary optimization of activation functions is therefore a promising new dimension of metalearning in neural networks.
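To make the search concrete, below is a minimal sketch (in Python) of evolving tree-structured activation functions with mutation and crossover. The fixed two-level tree shape binary(unary1(x), unary2(x)), the specific operator sets, and the toy fitness function (closeness to tanh on a grid) are illustrative assumptions, not the paper's exact search space or evaluation procedure.

import math
import random

# Illustrative operator sets; the exact sets used in the paper are
# an assumption here, not taken from the abstract.
UNARY = {
    "identity": lambda x: x,
    "neg": lambda x: -x,
    "tanh": math.tanh,
    "relu": lambda x: max(0.0, x),
    "sigmoid": lambda x: 1.0 / (1.0 + math.exp(-x)),
}
BINARY = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
    "max": max,
    "min": min,
}

def random_tree():
    """Sample a candidate of the assumed form binary(unary1(x), unary2(x))."""
    return (random.choice(list(BINARY)),
            random.choice(list(UNARY)),
            random.choice(list(UNARY)))

def compile_tree(tree):
    """Turn a (binary, unary1, unary2) triple into a callable activation."""
    b, u1, u2 = tree
    return lambda x: BINARY[b](UNARY[u1](x), UNARY[u2](x))

def mutate(tree):
    """Replace one randomly chosen tree node with a new operator."""
    b, u1, u2 = tree
    slot = random.randrange(3)
    if slot == 0:
        b = random.choice(list(BINARY))
    elif slot == 1:
        u1 = random.choice(list(UNARY))
    else:
        u2 = random.choice(list(UNARY))
    return (b, u1, u2)

def crossover(t1, t2):
    """Mix nodes from two parent trees, one coin flip per slot."""
    return tuple(random.choice(pair) for pair in zip(t1, t2))

def fitness(tree):
    """Toy fitness: negative squared error against tanh on a grid.
    This stands in for the real objective; in the paper, candidates
    are evaluated via wide residual networks trained on CIFAR."""
    f = compile_tree(tree)
    xs = [i / 10.0 for i in range(-30, 31)]
    return -sum((f(x) - math.tanh(x)) ** 2 for x in xs)

# Simple generational loop: keep the best half, breed the rest.
# With operator sets this small, the space could also be enumerated
# exhaustively; mutation and crossover matter as the space grows.
population = [random_tree() for _ in range(20)]
for gen in range(30):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(10)]
    population = parents + children

print("best candidate:", max(population, key=fitness))

In the actual experiments, each candidate would instead be scored by training a wide residual network with that activation on CIFAR-10 or CIFAR-100, so fitness evaluation dominates the cost of the search; the toy objective above keeps the sketch self-contained.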
View: PDF
Citation: In Genetic and Evolutionary Computation Conference (GECCO '20), 289-296, Cancun, Mexico, 2020.
Bibtex:
@inproceedings{bingham:gecco20,
  title={Evolutionary Optimization of Deep Learning Activation Functions},
  author={Garrett Bingham and William Macke and Risto Miikkulainen},
  booktitle={Genetic and Evolutionary Computation Conference (GECCO '20)},
  address={Cancun, Mexico},
  pages={289--296},
  url={http://nn.cs.utexas.edu/?bingham:gecco20},
  year={2020}
}
Presentation: Video
People
Garrett Bingham
Ph.D. Student
bingham [at] cs utexas edu
Risto Miikkulainen
Faculty
risto [at] cs utexas edu
Areas of Interest
Supervised Learning
Evolutionary Computation
Neuroevolution