neural networks research group
Improving Neural Network Learning Through Dual Variable Learning Rates (2021)
Elizabeth Liner,
Risto Miikkulainen
This paper introduces and evaluates a novel training method for neural networks: Dual Variable Learning Rates (DVLR). Building on insights from behavioral psychology, the dual learning rates emphasize correct and incorrect responses differently, making the feedback to the network more specific. Further, the learning rates are varied as a function of the network's performance, making training more efficient. DVLR was implemented on three types of networks (feedforward, convolutional, and residual) and in two domains (MNIST and CIFAR-10). The results show consistently improved accuracy, demonstrating that DVLR is a promising, psychologically motivated technique for training neural network models.
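The core idea, two learning rates that treat correct and incorrect responses differently and shrink as performance improves, can be sketched as follows. This is a minimal illustrative example on logistic regression, not the paper's implementation; the rate values and the accuracy-based schedule (`dvlr_rates`) are assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data: label is 1 when x0 + x1 > 0.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def dvlr_rates(accuracy, base_correct=0.5, base_incorrect=2.0):
    # Hypothetical schedule: incorrect responses get a larger rate,
    # and both rates shrink as accuracy improves.
    scale = 1.0 - 0.5 * accuracy
    return base_correct * scale, base_incorrect * scale

W = np.zeros(2)
acc = 0.5
for _ in range(100):
    lr_correct, lr_incorrect = dvlr_rates(acc)
    probs = 1.0 / (1.0 + np.exp(-(X @ W)))     # sigmoid outputs
    preds = (probs >= 0.5).astype(float)
    correct = preds == y
    acc = correct.mean()
    grad = X * (probs - y)[:, None]            # per-example BCE gradient
    # Apply the dual rates per example, depending on response correctness.
    lr = np.where(correct, lr_correct, lr_incorrect)[:, None]
    W -= (lr * grad).mean(axis=0)

final_acc = (((X @ W) > 0).astype(float) == y).mean()
print(final_acc)
```

On this separable toy problem the dual-rate update converges to high accuracy; the point of the sketch is only the mechanism of selecting a learning rate per example based on whether its response was correct.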
Citation:
In Proceedings of the International Joint Conference on Neural Networks, 2021.
Bibtex:
@inproceedings{liner:ijcnn21,
  title={Improving Neural Network Learning Through Dual Variable Learning Rates},
  author={Elizabeth Liner and Risto Miikkulainen},
  booktitle={Proceedings of the International Joint Conference on Neural Networks},
  url={http://nn.cs.utexas.edu/?liner:ijcnn21},
  year={2021}
}
People
Risto Miikkulainen
Faculty
risto [at] cs utexas edu
Areas of Interest
Supervised Learning
Cognitive Science