Date of Graduation
Doctor of Philosophy in Computer Science (PhD)
Computer Science & Computer Engineering
Michael S. Gashler
Wing Ning Li
Second Committee Member
Third Committee Member
Activation function, Deep learning, Forecasting, Machine learning, Neural network, Parametric function
The nonlinear activation functions applied by each neuron in a neural network are essential for making neural networks powerful representational models. If these are omitted, even deep neural networks reduce to simple linear regression, because a linear combination of linear combinations is still a linear combination. In much of the existing literature on neural networks, just one or two activation functions are selected for the entire network, even though the use of heterogeneous activation functions has been shown to produce superior results in some cases. Even less often employed are activation functions that can adapt their nonlinearities as network parameters, trained alongside the standard weights and biases. This dissertation presents a collection of papers that advance the state of heterogeneous and parameterized activation functions. Contributions of this dissertation include
- three novel parametric activation functions and applications of each,
- a study evaluating the utility of the parameters in parametric activation functions,
- an aggregated activation approach to modeling time-series data as an alternative to recurrent neural networks, and
- an improvement upon existing work that aggregates neuron inputs by product instead of sum.
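The claim above that a network without nonlinear activations collapses to linear regression can be verified directly. The sketch below (a minimal illustration, not code from the dissertation; layer sizes and weights are arbitrary) composes two affine layers with no activation function and shows that the result equals a single affine map with weights W2·W1 and bias W2·b1 + b2:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation function: y = W2 @ (W1 @ x + b1) + b2
W1, b1 = rng.normal(size=(5, 3)), rng.normal(size=5)
W2, b2 = rng.normal(size=(2, 5)), rng.normal(size=2)

x = rng.normal(size=3)
deep_out = W2 @ (W1 @ x + b1) + b2

# The composition collapses to one affine map: W = W2 @ W1, b = W2 @ b1 + b2
W, b = W2 @ W1, W2 @ b1 + b2
single_out = W @ x + b

print(np.allclose(deep_out, single_out))  # True
```

Inserting any nonlinearity between the two layers breaks this factorization, which is why the choice (and, in this dissertation, the parameterization) of activation functions matters.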
Godfrey, Luke Benjamin, "Parameterizing and Aggregating Activation Functions in Deep Neural Networks" (2018). Theses and Dissertations. 2655.