Title :
Neural network training algorithms on parallel architectures for finance applications
Author :
Thulasiram, Ruppa K. ; Rahman, Rashedur M. ; Thulasiraman, Parimala
Author_Institution :
Dept. of Comput. Sci., Univ. of Manitoba, Winnipeg, Man., Canada
Abstract :
We focus on the neural network training problem, which could be used for price forecasting or other purposes in finance. We design and develop four different parallel and multithreaded backpropagation neural network algorithms: neuron and training-set parallelism on a distributed memory architecture using MPI, and loop-level (fine-grain) and coarse-grained parallelism on a shared memory architecture using OpenMP. We conducted various experiments to study the performance of these algorithms and compared our results with a traditional autoregression model to establish the accuracy of our results. The comparison between our MPI and OpenMP results suggests that training-set parallelism performs better than all the other types of parallelism considered in the study.
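The paper's source code is not included in this record; the following is a minimal, illustrative MPI sketch of the training-set parallelism described in the abstract: each rank processes a block of the training samples, accumulates local gradients, and the partial gradients are summed with MPI_Allreduce before a synchronous weight update. The single-neuron network, synthetic data, and all constants are assumptions made for illustration only, not the authors' implementation.

/* Illustrative sketch only: training-set parallelism for a single
 * sigmoid neuron trained by batch gradient descent.  Each MPI rank
 * processes a slice of the training set, accumulates local gradients,
 * and the gradients are summed with MPI_Allreduce before every
 * synchronous weight update.  Data, network size, and constants are
 * placeholders, not the configuration used in the paper. */
#include <math.h>
#include <stdio.h>
#include <mpi.h>

#define N_SAMPLES 64   /* total training samples (assumed) */
#define N_IN      4    /* inputs per sample (assumed)      */
#define EPOCHS    200
#define LRATE     0.1

static double sigmoid(double z) { return 1.0 / (1.0 + exp(-z)); }

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Synthetic training data, identical on every rank. */
    double x[N_SAMPLES][N_IN], t[N_SAMPLES];
    for (int s = 0; s < N_SAMPLES; s++) {
        double sum = 0.0;
        for (int j = 0; j < N_IN; j++) {
            x[s][j] = ((s * 7 + j * 13) % 10) / 10.0;
            sum += x[s][j];
        }
        t[s] = sum > 2.0 ? 1.0 : 0.0;   /* simple separable target */
    }

    double w[N_IN] = {0.0}, b = 0.0;

    /* Block distribution of the training set over the ranks. */
    int chunk = (N_SAMPLES + size - 1) / size;
    int lo = rank * chunk;
    int hi = lo + chunk < N_SAMPLES ? lo + chunk : N_SAMPLES;

    for (int e = 0; e < EPOCHS; e++) {
        double g_local[N_IN + 1] = {0.0};   /* weight grads + bias grad */

        for (int s = lo; s < hi; s++) {
            double z = b;
            for (int j = 0; j < N_IN; j++) z += w[j] * x[s][j];
            double y = sigmoid(z);
            double delta = (y - t[s]) * y * (1.0 - y);  /* backprop error */
            for (int j = 0; j < N_IN; j++) g_local[j] += delta * x[s][j];
            g_local[N_IN] += delta;
        }

        /* Combine the partial gradients from all ranks. */
        double g[N_IN + 1];
        MPI_Allreduce(g_local, g, N_IN + 1, MPI_DOUBLE, MPI_SUM,
                      MPI_COMM_WORLD);

        for (int j = 0; j < N_IN; j++) w[j] -= LRATE * g[j];
        b -= LRATE * g[N_IN];
    }

    if (rank == 0)
        printf("trained weights: %f %f %f %f  bias: %f\n",
               w[0], w[1], w[2], w[3], b);

    MPI_Finalize();
    return 0;
}

With this decomposition only one collective communication is needed per weight update, which is consistent with the abstract's finding that training-set parallelism outperformed the finer-grained neuron, loop-level, and coarse-grained decompositions.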
Keywords :
backpropagation; costing; distributed shared memory systems; financial data processing; message passing; multi-threading; neural nets; parallel algorithms; parallel architectures; MPI; OpenMP; autoregression model; coarse grained parallelism; distributed memory architecture; finance applications; loop-level parallelism; multithreaded backpropagation neural network algorithms; neural network training algorithms; neuron parallelism; parallel backpropagation neural network algorithms; price forecasting; shared memory architecture; training set parallelism; Algorithm design and analysis; Application software; Backpropagation algorithms; Economic forecasting; Finance; Memory architecture; Neural networks; Neurons; Parallel architectures; Parallel processing;
Conference_Title :
Proceedings of the 2003 International Conference on Parallel Processing Workshops (ICPPW 2003)
Print_ISBN :
0-7695-2018-9
DOI :
10.1109/ICPPW.2003.1240376