Title :
HMPI: towards a message-passing library for heterogeneous networks of computers
Author :
Lastovetsky, Alexey ; Reddy, Ravi
Author_Institution :
Dept. of Comput. Sci., Univ. Coll. Dublin, Ireland
Abstract :
The paper presents Heterogeneous MPI (HMPI), an extension of MPI for programming high-performance computations on heterogeneous networks of computers. HMPI allows the application programmer to describe the performance model of the implemented algorithm. The model captures the main features of the underlying parallel algorithm that affect its execution performance: the total number of parallel processes, the total volume of computations to be performed by each process, the total volume of data to be transferred between each pair of processes, and how the processes interact during the execution of the algorithm. Given this description, HMPI creates a group of processes that executes the algorithm faster than any other group of processes. The principal extensions to MPI are presented. Parallel simulation of the interaction of electric and magnetic fields and parallel matrix multiplication are used to demonstrate the features of the library.
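The sketch below illustrates, in C with MPI, the programming style the abstract describes: the host process supplies performance-model parameters, HMPI selects a group of processes expected to run the algorithm fastest, and group members then run ordinary MPI code. The HMPI names used here (HMPI_Init, HMPI_Group_create, HMPI_Is_host, HMPI_Is_free, HMPI_Is_member, HMPI_Get_comm, HMPI_Group_free, HMPI_Finalize) and their signatures are recalled from the HMPI papers and should be read as illustrative assumptions rather than the library's exact API; the performance-model handle MPC_NetType_grid and its parameters are hypothetical.

    /* Illustrative sketch only: HMPI names and signatures are assumptions based on
     * the HMPI papers; MPC_NetType_grid and its parameters are hypothetical. */
    #include <mpi.h>
    #include <hmpi.h>                     /* assumed HMPI header */

    int main(int argc, char **argv)
    {
        HMPI_Group gid;
        int model_params[2] = { 4, 4 };   /* hypothetical parameters of the performance model */

        HMPI_Init(&argc, &argv);          /* start up the HMPI runtime on every process */

        /* The host process passes the model parameters; HMPI maps the algorithm's
         * processes onto the computers expected to execute it fastest. */
        if (HMPI_Is_host())
            HMPI_Group_create(&gid, &MPC_NetType_grid, model_params, 2);
        if (HMPI_Is_free())
            HMPI_Group_create(&gid, &MPC_NetType_grid, NULL, 0);

        /* Members of the created group obtain a plain MPI communicator and
         * proceed with standard MPI code. */
        if (HMPI_Is_member(&gid)) {
            MPI_Comm *comm = (MPI_Comm *)HMPI_Get_comm(&gid);
            int rank;
            MPI_Comm_rank(*comm, &rank);
            /* ... algorithm implemented with ordinary MPI calls on *comm ... */
            HMPI_Group_free(&gid);
        }

        HMPI_Finalize(0);
        return 0;
    }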
Keywords :
application program interfaces; digital simulation; message passing; parallel algorithms; workstation clusters; HMPI; application programmer; electric fields; heterogeneous MPI; heterogeneous networks of computers; high-performance computations; magnetic fields; message-passing library; parallel algorithm; parallel matrix multiplication; parallel simulation; performance model; Application software; Communication networks; Computer networks; Concurrent computing; Distributed computing; Fault tolerance; Libraries; Programming profession; Protocols; Workstations;
Conference_Title :
International Parallel and Distributed Processing Symposium, 2003. Proceedings.
Print_ISBN :
0-7695-1926-1
DOI :
10.1109/IPDPS.2003.1213210