Date of Graduation
5-2018
Document Type
Dissertation
Degree Name
Doctor of Philosophy in Computer Science (PhD)
Degree Level
Graduate
Department
Computer Science & Computer Engineering
Advisor/Mentor
Gashler, Michael S.
Committee Member
Li, Wing Ning
Second Committee Member
Wu, Xintao
Third Committee Member
Petris, Giovanni G.
Keywords
Activation function; Deep learning; Forecasting; Machine learning; Neural network; Parametric function
Abstract
The nonlinear activation functions applied by each neuron in a neural network are essential to making neural networks powerful representational models. If these are omitted, even deep neural networks reduce to simple linear regression, because a linear combination of linear combinations is still a linear combination (illustrated after the contribution list below). In much of the existing literature on neural networks, just one or two activation functions are selected for the entire network, even though the use of heterogeneous activation functions has been shown to produce superior results in some cases. Even less often employed are activation functions that can adapt their nonlinearities as network parameters, trained alongside the standard weights and biases. This dissertation presents a collection of papers that advance the state of the art in heterogeneous and parameterized activation functions. Contributions of this dissertation include
- three novel parametric activation functions and applications of each (a generic parametric activation is sketched after this list),
- a study evaluating the utility of the parameters in parametric activation functions,
- an aggregated activation approach to modeling time-series data as an alternative to recurrent neural networks, and
- an improvement upon existing work that aggregates neuron inputs using a product instead of a sum (contrasted in a sketch below).
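To make the linear-collapse claim in the abstract concrete, here is the standard one-line identity (an illustrative derivation, not text from the dissertation): composing two affine layers without an intervening nonlinearity yields a single affine map, so depth alone adds no representational power.

```latex
% Two affine layers with no nonlinearity between them collapse
% to a single affine map W'x + b':
\[
y = W_2\,(W_1 x + b_1) + b_2
  = \underbrace{(W_2 W_1)}_{W'}\,x + \underbrace{(W_2 b_1 + b_2)}_{b'}
  = W' x + b' .
\]
```

For readers unfamiliar with parametric activations, the sketch below shows the general idea using a PReLU-style function whose slope is learned along with the ordinary weights and biases. This is a generic stand-in under assumed names (`ParametricActivation`, `init_a`), not one of the three novel functions the dissertation introduces.

```python
import torch
import torch.nn as nn

class ParametricActivation(nn.Module):
    """PReLU-style activation: f(x) = x for x >= 0, a * x otherwise.

    The slope `a` is a learnable parameter, so the optimizer shapes
    the nonlinearity together with the weights and biases.
    (Illustrative stand-in, not one of the dissertation's functions.)
    """
    def __init__(self, init_a: float = 0.25):
        super().__init__()
        self.a = nn.Parameter(torch.tensor(init_a))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.where(x >= 0, x, self.a * x)

# A tiny network in which each layer gets its own adaptable
# nonlinearity -- one way to realize heterogeneous activations.
model = nn.Sequential(
    nn.Linear(8, 16), ParametricActivation(),
    nn.Linear(16, 1), ParametricActivation(init_a=0.1),
)
```

Finally, a minimal sketch of the sum-versus-product aggregation contrast named in the last bullet. The product form follows the classic product-unit formulation (each input raised to its weight); whether the dissertation uses exactly this formulation is an assumption, and the function names are hypothetical.

```python
import torch

def sum_neuron(x: torch.Tensor, w: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Conventional neuron input: weighted sum of inputs, x . w + b."""
    return x @ w + b

def product_neuron(x: torch.Tensor, w: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Product aggregation: inputs combined multiplicatively, each
    raised to its weight (classic product-unit form, which assumes
    positive inputs so the powers are well defined)."""
    return torch.prod(x ** w, dim=-1) + b

# Example: x of shape (batch, features), w of shape (features,).
x = torch.rand(4, 3) + 0.1   # keep inputs positive for the product unit
w = torch.randn(3)
b = torch.zeros(())
print(sum_neuron(x, w, b).shape, product_neuron(x, w, b).shape)
```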
Citation
Godfrey, L. B. (2018). Parameterizing and Aggregating Activation Functions in Deep Neural Networks. Graduate Theses and Dissertations. Retrieved from https://scholarworks.uark.edu/etd/2655