Vol: 48(62) No: 1 / March 2003      

A Parallel Algorithm for Neural Networks
S. Babii
Department of Computers, Faculty of Automatics and Computers, "Politehnica" University of Timisoara, V.Parvan no. 2, Timisoara, Romania, phone: (+40) 256-404059, e-mail: sorin@cs.utt.ro, web: http://www.cs.utt.ro/~sorin
V. Cretu
Department of Computers, Faculty of Automatics and Computers, "Politehnica" University of Timisoara, V.Parvan no. 2, Timisoara, Romania, phone: (+40) 256-403255, e-mail: vcretu@cs.utt.ro, web: http://www.cs.utt.ro/~vcretu


Keywords: neural networks, training algorithms, parallel computations, back-propagation.

Abstract
This article presents the results of experiments in parallelizing the training phase of a feed-forward artificial neural network. More specifically, we develop and analyze a strategy for parallelizing the widely used back-propagation learning algorithm. We implemented this strategy on several LANs, which allowed us to evaluate and analyze its performance based on the results of actual runs. We were interested in the qualitative aspects of the analysis, in order to gain a fair understanding of the factors that determine the behavior of such parallel algorithms, and in identifying and dealing with the specific circumstances that must be considered when a parallelized neural-network learning algorithm is implemented on a set of workstations in a LAN. Part of our purpose is to investigate whether the computational resources of such a set of workstations can be exploited effectively.
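
The abstract does not spell out the parallelization strategy in detail; the sketch below illustrates one common approach, training-set (data) parallelism, in which each workstation computes the error gradients over its own share of the training patterns and a master sums those gradients before each synchronous weight update. Everything in it (the two-layer network, NumPy, and Python's multiprocessing standing in for separate LAN workstations) is an illustrative assumption, not the authors' implementation.

import numpy as np
from multiprocessing import Pool

def init_net(n_in, n_hidden, n_out, seed=0):
    rng = np.random.default_rng(seed)
    return {"W1": rng.normal(0.0, 0.5, (n_in, n_hidden)),
            "b1": np.zeros(n_hidden),
            "W2": rng.normal(0.0, 0.5, (n_hidden, n_out)),
            "b2": np.zeros(n_out)}

def shard_gradients(args):
    # Forward and backward pass over one worker's shard of the training set.
    net, X, T = args
    H = np.tanh(X @ net["W1"] + net["b1"])                   # hidden activations
    Y = 1.0 / (1.0 + np.exp(-(H @ net["W2"] + net["b2"])))   # sigmoid outputs
    dY = (Y - T) * Y * (1.0 - Y)                              # output-layer deltas
    dH = (dY @ net["W2"].T) * (1.0 - H ** 2)                  # back-propagated hidden deltas
    return {"W1": X.T @ dH, "b1": dH.sum(axis=0),
            "W2": H.T @ dY, "b2": dY.sum(axis=0)}

def train(X, T, n_hidden=8, epochs=2000, lr=0.5, n_workers=2):
    net = init_net(X.shape[1], n_hidden, T.shape[1])
    shards = list(zip(np.array_split(X, n_workers), np.array_split(T, n_workers)))
    with Pool(n_workers) as pool:
        for _ in range(epochs):
            # Each "workstation" computes gradients on its own shard in parallel;
            # the master sums them and performs one synchronous weight update.
            parts = pool.map(shard_gradients, [(net, Xs, Ts) for Xs, Ts in shards])
            for key in net:
                net[key] -= lr * sum(p[key] for p in parts)
    return net

if __name__ == "__main__":
    # Tiny XOR data set, only to show the call pattern.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    net = train(X, T)
    H = np.tanh(X @ net["W1"] + net["b1"])
    print(1.0 / (1.0 + np.exp(-(H @ net["W2"] + net["b2"]))))

On an actual LAN the process pool would be replaced by message passing between machines (for example sockets or MPI), and the balance between per-epoch computation and the cost of exchanging gradients over the network is presumably among the factors the experiments examine.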
