
A fully recursive perceptron network architecture

Conference Paper


Abstract


  • This paper introduces the fully recursive perceptron network (FRPN) architecture as an alternative to the multilayer perceptron (MLP) with multiple hidden layers, popularly known as a deep neural network. The FRPN consists of an input layer, an output layer, and a single hidden layer in which the hidden-layer neurons are fully interconnected through algebraic (instantaneous) connections rather than delayed connections. The FRPN is particularly attractive as an alternative to the deep MLP because it eliminates the need to determine the number of hidden layers and the number of neurons per hidden layer. Some insight into the operational mechanisms of the FRPN is obtained through an application to a practical learning problem, viz. the handwritten digit recognition problem.

Publication Date


  • 2018

Citation


  • Hagenbuchner, M., Tsoi, A. C., Scarselli, F., & Zhang, S. J. (2018). A fully recursive perceptron network architecture. In 2017 IEEE Symposium Series on Computational Intelligence, SSCI 2017 - Proceedings Vol. 2018-January (pp. 1-8). doi:10.1109/SSCI.2017.8285325

Scopus Eid


  • 2-s2.0-85046093862

Web Of Science Accession Number


Start Page


  • 1

End Page


  • 8

Volume


  • 2018-January
