DocumentCode :
1574356
Title :
Data-Flow vs Control-Flow for Extreme Level Computing
Author :
Evripidou, Paraskevas ; Kyriacou, Costas
Author_Institution :
Dept. of Comput. Sci., Univ. of Cyprus, Nicosia, Cyprus
fYear :
2013
Firstpage :
9
Lastpage :
13
Abstract :
This paper challenges the current thinking in building High Performance Computing (HPC) systems, which is presently based on the sequential, or von Neumann, model of computation, by proposing novel systems based on the Dynamic Data-Flow model of computation. The switch to multi-core chips has brought parallel processing into the mainstream. The computing industry and research community were forced into this switch after hitting the power and memory walls. Will the same happen with HPC? In 2007, the United States, through its DARPA agency, commissioned a study to determine what kind of technologies would be needed to build an Exaflop computer. The head of the study was very pessimistic about the possibility of having an Exaflop computer in the foreseeable future. We believe that many of the findings that caused this pessimistic outlook stem from the limitations of the sequential model. A paradigm shift may be needed in order to achieve affordable Exascale-class supercomputers.
Keywords :
mainframes; parallel processing; DARPA agency; Exaflop computer; HPC systems; United States; control-flow computation model; data-flow computation model; exascale class supercomputers; extreme level computing; high performance computing systems; parallel processing; sequential computing; von Neumann model; Computational modeling; Context; Instruction sets; Parallel processing; Runtime; Supercomputers; Synchronization; Data-Flow; Exascale; HPC; Supercomputing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Data-Flow Execution Models for Extreme Scale Computing (DFM), 2013
Conference_Location :
Edinburgh
Type :
conf
DOI :
10.1109/DFM.2013.17
Filename :
6919190