Date of Graduation
12-2015
Document Type
Thesis
Degree Name
Master of Science in Computer Science (MS)
Degree Level
Graduate
Department
Computer Science & Computer Engineering
Advisor
Michael Gashler
Committee Member
John Gauch
Second Committee Member
Wing Ning Li
Keywords
Applied sciences, Forward bipartite alignment, Machine learning, Neural network
Abstract
We present Forward Bipartite Alignment (FBA), a method that aligns the topological structures of two neural networks. Neural networks are often regarded as black boxes because their weights combine attributes non-linearly, producing a complex model surface. Two networks that make similar predictions on training data may still generalize differently. FBA enables a diversity of applications, including visualization and canonicalization of neural networks, ensembles, and cross-over between unrelated neural networks in evolutionary optimization. We describe the FBA algorithm and present implementations for three applications: genetic algorithms, visualization, and ensembles. We demonstrate FBA's usefulness by comparing a bag of neural networks to a bag of FBA-aligned neural networks. We also show that aligning and then combining two neural networks incurs no appreciable loss in accuracy, which indicates that Forward Bipartite Alignment aligns neural networks in a meaningful way.
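The abstract does not reproduce the FBA procedure itself, but the general idea of aligning hidden units before combining two networks can be illustrated with a minimal sketch. The function names, the use of Euclidean distance between incoming weight vectors, and the Hungarian matching via scipy.optimize.linear_sum_assignment are assumptions made for illustration; they are not taken from the thesis and do not represent its actual FBA algorithm.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def align_hidden_units(W1_a, W1_b):
    """Illustrative sketch (not the thesis's FBA algorithm): match hidden
    units of network B to those of network A by minimizing the distance
    between their incoming weight vectors with a bipartite assignment."""
    # cost[i, j] = Euclidean distance between unit i of A and unit j of B
    cost = np.linalg.norm(W1_a[:, None, :] - W1_b[None, :, :], axis=2)
    row_ind, col_ind = linear_sum_assignment(cost)
    return col_ind  # permutation: A's hidden unit i <-> B's hidden unit col_ind[i]

def average_aligned(W1_a, W2_a, W1_b, W2_b):
    """Average two single-hidden-layer networks (biases omitted for brevity)
    after permuting B's hidden units to line up with A's."""
    perm = align_hidden_units(W1_a, W1_b)
    W1_b_aligned = W1_b[perm, :]   # permute rows of input->hidden weights
    W2_b_aligned = W2_b[:, perm]   # permute columns of hidden->output weights
    return 0.5 * (W1_a + W1_b_aligned), 0.5 * (W2_a + W2_b_aligned)
```

In this sketch, averaging the raw weights of two unaligned networks would mix unrelated hidden units; permuting one network's hidden layer first is what makes the combination meaningful, which is the intuition behind the ensemble comparison described in the abstract.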
Citation
Ashmore, S. C. (2015). Evaluating the Intrinsic Similarity between Neural Networks. Graduate Theses and Dissertations. Retrieved from https://scholarworks.uark.edu/etd/1395