DocumentCode
606387
Title
CloudBench: Experiment Automation for Cloud Environments
Author
Silva, M.; Hines, M. R.; Gallo, Daniele; Liu, Qi; Ryu, Kyung Dong; da Silva, Dilma
Author_Institution
IBM Thomas J. Watson Research Center, Yorktown Heights, NY, USA
fYear
2013
fDate
25-27 March 2013
Firstpage
302
Lastpage
311
Abstract
The growth in the adoption of cloud computing is driven by distinct and clear benefits for both cloud customers and cloud providers. However, the increase in the number of cloud providers, as well as in the variety of offerings from each provider, has made it harder for customers to choose. At the same time, the number of options for building a cloud infrastructure, from cloud management platforms to different interconnection and storage technologies, poses a challenge for cloud providers. In this context, cloud experiments are as necessary as they are labor intensive. CloudBench [1] is an open-source framework that automates cloud-scale evaluation and benchmarking by running controlled experiments in which complex applications are automatically deployed. Experiments are described through experiment plans, whose directives have enough descriptive power to keep the descriptions brief while allowing customizable multi-parameter variation. Experiments can be executed on multiple clouds through a single interface, and CloudBench can manage experiments spread across multiple regions and over long periods of time. Its modular design allows external users to extend it directly with new cloud infrastructure APIs and benchmark applications. A built-in data collection system collects, aggregates, and stores metrics on cloud management activities (such as VM provisioning and VM image capture) as well as application runtime information. Experiments can be conducted in a highly controlled fashion in order to assess the stability, scalability, and reliability of multiple cloud configurations. We demonstrate CloudBench's main characteristics through the evaluation of an OpenStack installation, including experiments with approximately 1200 simultaneous VMs at an arrival rate of up to 400 VMs/hour.
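As a purely illustrative sketch (not taken from the paper or the record), an experiment plan of the kind the abstract describes, built from brief directives with multi-parameter variation, might look like the following; all directive names, cloud labels, and parameter values here are hypothetical and do not reproduce CloudBench's actual syntax:

    # Hypothetical CloudBench-style experiment plan (illustrative syntax only)
    cldattach openstack TESTCLOUD        # attach a target cloud through its API adapter
    expid scale_run_01                   # label the experiment for the data collector
    varyparam arrival_rate 100 200 400   # sweep the VM arrival rate (VMs/hour)
    vmattach tinyvm size=m1.small        # provision a VM; provisioning metrics are recorded
    appattach hadoop instances=1200      # deploy a complex multi-VM application
    monextract all                       # collect and aggregate management and runtime metrics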
Keywords
application program interfaces; cloud computing; public domain software; API; CloudBench; OpenStack; cloud environments; cloud management platforms; experiment automation; open-source framework; automation; benchmark testing; data collection; measurement; scalability; benchmarking; virtual machines
fLanguage
English
Publisher
IEEE
Conference_Title
2013 IEEE International Conference on Cloud Engineering (IC2E)
Conference_Location
Redwood City, CA, USA
Print_ISBN
978-1-4673-6473-7
Type
conf
DOI
10.1109/IC2E.2013.33
Filename
6529297