neural networks research group
Optimizing Neural Networks through Activation Function Discovery and Automatic Weight Initialization (2023)
Garrett Bingham
Automated machine learning (AutoML) methods improve upon existing models by optimizing various aspects of their design. While present methods focus on hyperparameters and neural network topologies, other aspects of neural network design can be optimized as well. To further the state of the art in AutoML, this dissertation introduces techniques for discovering more powerful activation functions and establishing more robust weight initialization for neural networks. These contributions not only improve performance but also provide new perspectives on neural network optimization. First, the dissertation demonstrates that discovering solutions specialized to specific architectures and tasks gives better performance than reusing general approaches. Second, it shows that jointly optimizing different components of neural networks is synergistic, resulting in better performance than optimizing individual components alone. Third, it demonstrates that learned representations are easier to optimize than hard-coded ones, creating further opportunities for AutoML. The dissertation thus makes concrete progress towards fully automatic machine learning.
View:
PDF
Citation:
PhD Thesis, Department of Computer Science, The University of Texas at Austin, May 2023.
Bibtex:
@phdthesis{bingham:diss23,
  title={Optimizing Neural Networks through Activation Function Discovery and Automatic Weight Initialization},
  author={Garrett Bingham},
  school={Department of Computer Science, The University of Texas at Austin},
  month={May},
  year={2023},
  url={http://nn.cs.utexas.edu/?bingham:diss23}
}
People
Garrett Bingham
Ph.D. Student
bingham [at] cs utexas edu
Areas of Interest
Machine Learning