Congratulations to Ruiyuan Lin for Passing Her Defense
Let us hear what she has to say about her defense, along with an abstract of her thesis.
Neural networks have been shown to be effective in many applications. To better explain the behavior of neural networks, we examine their properties experimentally and analytically in this research.
In the first part, we conduct experiments on convolutional neural networks (CNNs), observe their behavior, and offer insights and conjectures about CNNs. First, we demonstrate how accuracy changes with the size of the convolutional layers. Second, we develop a design that determines the size of the convolutional layers based on SSL. Third, as a case study, we analyze SqueezeNet, which achieves AlexNet-level accuracy with 50x fewer parameters, by studying the evolution of cross-entropy values across layers and through visualization. Fourth, we offer insights into co-training-based deep semi-supervised learning.
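For readers curious what a layer-wise cross-entropy study of SqueezeNet might look like in practice, here is a minimal sketch. It attaches a linear softmax probe to the features produced at each depth of a torchvision SqueezeNet and reports the probe's cross-entropy; the probe setup, input size, and toy labels are illustrative assumptions, not the thesis's exact protocol.

import torch
import torch.nn as nn
from torchvision.models import squeezenet1_0

def features_at_depth(model, x, depth):
    """Run the input through the first `depth` blocks of model.features."""
    with torch.no_grad():
        for block in model.features[:depth]:
            x = block(x)
    # Global-average-pool spatially so the probe stays small.
    return x.mean(dim=(2, 3))

def probe_cross_entropy(feats, labels, num_classes, steps=200):
    """Fit a linear softmax probe on fixed features; return its final CE."""
    probe = nn.Linear(feats.shape[1], num_classes)
    opt = torch.optim.Adam(probe.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(probe(feats), labels)
        loss.backward()
        opt.step()
    return loss.item()

model = squeezenet1_0(weights="DEFAULT").eval()  # pretrained ImageNet weights
x = torch.randn(64, 3, 224, 224)                 # stand-in for a real labeled batch
y = torch.randint(0, 10, (64,))                  # toy labels for illustration
for depth in range(1, len(model.features) + 1):
    feats = features_at_depth(model, x, depth)
    ce = probe_cross_entropy(feats, y, num_classes=10)
    print(f"after block {depth:2d}: probe cross-entropy = {ce:.3f}")

On real labeled data, watching how the probe's cross-entropy falls from shallow to deep blocks gives one concrete picture of how discriminative information evolves across the layers.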
In the second part, we propose new angles from which to understand and interpret neural networks. To understand the behavior of multilayer perceptrons (MLPs) as classifiers, we interpret an MLP as a generalization of a two-class LDA system, so that it can handle input composed of multiple Gaussian modalities belonging to multiple classes. An MLP design with two hidden layers, which also specifies the filter weights, is proposed. To understand the behavior of MLPs as regressors, we construct an MLP as a piecewise low-order polynomial approximator using a signal-processing approach. The constructed MLP contains one input layer, one intermediate layer, and one output layer, and its construction includes the specification of the neuron numbers and all filter weights. Through the construction, a one-to-one correspondence between the approximation of an MLP and that of a piecewise low-order polynomial is established.
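To give a concrete feel for this kind of construction, here is a minimal sketch of its first-order (piecewise-linear) special case: a one-hidden-layer ReLU network whose weights are written down in closed form, rather than trained, so that it interpolates a target function at chosen knots. The target function, knot placement, and naming are illustrative assumptions, not the thesis's design.

import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def build_piecewise_linear_mlp(f, knots):
    """Specify (hidden-layer knots, output weights, output bias) of a
    one-hidden-layer ReLU MLP that interpolates f at the sorted knots."""
    y = f(knots)
    slopes = np.diff(y) / np.diff(knots)        # slope on each segment
    # One hidden neuron per segment start: relu(x - k_i). The output layer
    # weights each neuron by the *change* in slope at its knot.
    coeffs = np.concatenate(([slopes[0]], np.diff(slopes)))
    return knots[:-1], coeffs, y[0]

def mlp_forward(x, hidden_knots, coeffs, bias):
    """Evaluate the constructed network: bias + sum_i c_i * relu(x - k_i)."""
    hidden = relu(x[:, None] - hidden_knots[None, :])   # hidden activations
    return bias + hidden @ coeffs                        # linear output layer

f = np.sin
knots = np.linspace(0.0, np.pi, 9)              # 8 linear pieces
k, c, b = build_piecewise_linear_mlp(f, knots)
x = np.linspace(0.0, np.pi, 1000)
max_err = np.max(np.abs(mlp_forward(x, k, c, b) - f(x)))
print(f"max approximation error with 8 pieces: {max_err:.4f}")

Refining the knot grid shrinks the error, which mirrors the one-to-one correspondence between the constructed MLP and its piecewise approximant.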