DocumentCode :
3301039
Title :
MC framework: High-performance distributed framework for standalone data analysis packages over Hadoop-based cloud
Author :
Chao-Chun Chen ; Nguyen Huu Tinh Giang ; Tzu-Chao Lin ; Min-Hsiung Hung
Author_Institution :
Dept. of Comp. Sci. & Inf. Eng., Nat. Cheng Kung Univ., Tainan, Taiwan
fYear :
2013
fDate :
13-15 Dec. 2013
Firstpage :
27
Lastpage :
32
Abstract :
Hadoop MapReduce is a programming model for designing scalable distributed computing applications that lets developers attain automatic parallelization. However, most complex manufacturing systems are difficult to migrate to private clouds because of platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimal effort spent on modifying source code, this paper designs a high-performance framework, called the Multi-users-based Cloud-Adaptor Framework (MC-Framework), which provides a simple interface through which users can fairly execute tasks requested from traditional standalone data analysis packages in MapReduce-based private cloud environments. The framework targets multiuser workloads, for which the default Hadoop scheduling scheme, i.e., FIFO, would increase delay. Hence, a new scheduling mechanism, called Job-Sharing Scheduling, is designed to fairly distribute jobs across the machines in the private cloud. We then prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyze the proposed MC-Framework. Our experimental results indicate that the proposed framework greatly improves time performance compared with the original package.
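(The abstract describes executing unmodified standalone analysis packages as MapReduce tasks; the record contains no source code, so the following Java sketch only illustrates one plausible wrapping pattern under that assumption. The class name, the executable name "vm_analysis", and its command-line flags are hypothetical placeholders, not the authors' implementation.)

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical wrapper mapper: each input record is handed, unchanged, to a
// standalone analysis executable, so the package itself needs no source changes.
public class PackageWrapperMapper extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // "vm_analysis" and its flag are placeholders, not the paper's actual package.
        ProcessBuilder pb = new ProcessBuilder("vm_analysis", "--input", value.toString());
        pb.redirectErrorStream(true);               // merge stderr into stdout
        Process proc = pb.start();

        StringBuilder output = new StringBuilder();
        try (BufferedReader reader =
                new BufferedReader(new InputStreamReader(proc.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                output.append(line).append('\n');   // collect the package's output
            }
        }
        int exitCode = proc.waitFor();
        if (exitCode == 0) {
            // Emit the original request as key and the analysis result as value.
            context.write(value, new Text(output.toString()));
        }
    }
}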
Keywords :
cloud computing; data analysis; distributed processing; job shop scheduling; manufacturing systems; Hadoop MapReduce; Hadoop-based cloud; MC-Framework; automatic parallelization; complex manufacturing systems; data analysis packages; high-performance distributed framework; job-sharing scheduling; multiusers-based cloud-adaptor framework; scalable distributed computing; system reconstruction; Adaptation models; Cloud computing; Computational modeling; Data analysis; Data models; Job shop scheduling; Mathematical model; Hadoop; MapReduce; cloud adaptor; multi-users scheduling;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Granular Computing (GrC), 2013 IEEE International Conference on
Conference_Location :
Beijing
Type :
conf
DOI :
10.1109/GrC.2013.6740375
Filename :
6740375