Date of Graduation


Document Type

Dissertation

Degree Name

Doctor of Philosophy in Computer Science (PhD)

Degree Level

Graduate

Department

Computer Science & Computer Engineering


Committee Chair

Michael S. Gashler

Committee Member

Wing Ning Li

Second Committee Member

Xintao Wu

Third Committee Member

Giovanni Petris


Keywords

Activation function, Deep learning, Forecasting, Machine learning, Neural network, Parametric function


Abstract

The nonlinear activation functions applied by each neuron are essential to the representational power of neural networks. If they are omitted, even a deep neural network reduces to simple linear regression, because a linear combination of linear combinations is itself a linear combination. Much of the existing literature selects just one or two activation functions for an entire network, even though heterogeneous activation functions have been shown to produce superior results in some cases. Rarer still are activation functions that can adapt their nonlinearities as network parameters alongside the standard weights and biases. This dissertation presents a collection of papers that advance the state of heterogeneous and parameterized activation functions. Contributions of this dissertation include

  • three novel parametric activation functions and applications of each,
  • a study evaluating the utility of the parameters in parametric activation functions,
  • an aggregated activation approach to modeling time-series data as an alternative to recurrent neural networks, and
  • an improvement upon existing work that aggregates neuron inputs using a product instead of a sum.
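The claim above that activation-free networks collapse to linear regression can be verified directly. The sketch below (illustrative only, not code from the dissertation) composes two bias-carrying linear layers and shows they equal a single linear map; it also includes a PReLU-style parametric activation, a simple stand-in for the idea of a learnable nonlinearity, not one of the dissertation's three novel functions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with weights and biases but no activation function.
W1, b1 = rng.standard_normal((4, 3)), rng.standard_normal(4)
W2, b2 = rng.standard_normal((2, 4)), rng.standard_normal(2)

x = rng.standard_normal(3)

# Forward pass through both layers, with no nonlinearity between them.
deep = W2 @ (W1 @ x + b1) + b2

# The equivalent single linear layer: a linear combination of
# linear combinations is itself a linear combination.
W = W2 @ W1
b = W2 @ b1 + b2
shallow = W @ x + b

assert np.allclose(deep, shallow)

def prelu(z, a):
    """PReLU-style parametric activation: the negative-side slope `a`
    is a parameter that could be trained alongside weights and biases.
    (Illustrative example only; not from the dissertation.)"""
    return np.where(z > 0, z, a * z)
```

Inserting any nonzero-curvature `prelu` between the two layers breaks the collapse, which is precisely why adaptive nonlinearities add representational power rather than just reparameterizing a linear model.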