Abstract
This paper introduces the fully recursive perceptron network (FRPN) architecture as an alternative to multilayer perceptron (MLP) networks with multiple hidden layers, popularly known as deep neural networks. The FRPN consists of an input layer, an output layer, and a single hidden layer in which the hidden neurons are fully connected to one another through algebraic (instantaneous) connections rather than delayed connections. The FRPN is particularly attractive as an alternative to deep MLPs because it eliminates the need to determine the number of hidden layers and the number of neurons per hidden layer. Some insight into the operational mechanism of the FRPN is obtained through an application to a practical learning problem, viz., handwritten digit recognition.
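
To make the architecture description concrete, the following is a minimal sketch of what a forward pass through such a network might look like. It assumes the algebraic (instantaneous) hidden-layer connections are resolved by fixed-point iteration, a common way to evaluate non-delayed recurrent equations; the function name `frpn_forward`, the weight names (`W_in`, `W_hh`, `W_out`), the `tanh` activation, and the iteration count are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def frpn_forward(x, W_in, W_hh, W_out, b_h, b_o, n_iters=30):
    """Hypothetical FRPN forward pass.

    The single hidden layer's fully recursive connections are
    algebraic (instantaneous), so the hidden state h must satisfy
    h = tanh(W_in @ x + W_hh @ h + b_h). Here that implicit
    equation is solved approximately by fixed-point iteration.
    """
    h = np.zeros(W_hh.shape[0])
    for _ in range(n_iters):
        # Hidden activations depend on the input AND on themselves
        # through the non-delayed recurrent weights W_hh.
        h = np.tanh(W_in @ x + W_hh @ h + b_h)
    return W_out @ h + b_o

# Illustrative usage on random weights, sized for the handwritten
# digit task mentioned in the abstract (28x28 inputs, 10 classes).
rng = np.random.default_rng(0)
D, H, C = 784, 64, 10
x = rng.standard_normal(D)
y = frpn_forward(x,
                 W_in=0.1 * rng.standard_normal((H, D)),
                 W_hh=0.1 * rng.standard_normal((H, H)),  # small scale keeps the iteration contractive
                 W_out=0.1 * rng.standard_normal((C, H)),
                 b_h=np.zeros(H), b_o=np.zeros(C))
```

Under this reading, the repeated application of the hidden layer plays a role analogous to depth in an MLP, which is consistent with the abstract's claim that the FRPN removes the need to choose a number of hidden layers.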