2013 IEEE 5th International Conference on Cloud Computing Technology and Science (CloudCom 2013)
Bristol, United Kingdom
Dec. 2, 2013 to Dec. 5, 2013
pp. 1-8
ABSTRACT
Although Cloud computing emerged for business applications in industry, public Cloud services have been widely adopted and encouraged for scientific computing in academia. The recently released Google Compute Engine (GCE) is claimed to support high-performance and computationally intensive tasks, yet few evaluation studies are available to reveal GCE's capabilities for scientific computing. Since fundamental performance benchmarking is the standard strategy for early-stage evaluation of new Cloud services, we followed the Cloud Evaluation Experiment Methodology (CEEM) to benchmark GCE and compare it with Amazon EC2, in order to understand GCE's elementary capability for dealing with scientific problems. The experimental results and analyses show both potential advantages of, and possible threats to, applying GCE to scientific computing. For example, compared with Amazon's EC2 service, GCE may better suit applications that require frequent disk operations, while it may not yet be ready for single-VM-based parallel computing. Following the same evaluation methodology, other evaluators can replicate and/or supplement this fundamental evaluation of GCE. Based on these fundamental evaluation results, suitable GCE environments can be further established for case studies of solving real science problems.
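The abstract does not name the specific benchmarks behind the disk-related finding; as a purely illustrative sketch (not the authors' benchmark suite), a fundamental sequential disk-write probe of the kind implied by "applications that require frequent disk operations" could look like the Python snippet below. The file path, total size, and block size are arbitrary assumptions for demonstration only.

import os
import time

def sequential_write_throughput(path="/tmp/gce_io_probe.bin", total_mb=256, block_kb=1024):
    """Illustrative probe: write total_mb MiB in block_kb KiB blocks,
    fsync, and report MiB/s. Hypothetical parameters, not the paper's setup."""
    block = os.urandom(block_kb * 1024)        # one block of random bytes
    blocks = (total_mb * 1024) // block_kb     # number of blocks to write
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())                   # include the device flush in the timing
    elapsed = time.perf_counter() - start
    os.remove(path)                            # clean up the probe file
    return total_mb / elapsed                  # MiB written per second

if __name__ == "__main__":
    print(f"sequential write: {sequential_write_throughput():.1f} MiB/s")

Running such a probe on comparably sized GCE and EC2 instances would give one data point of the fundamental I/O comparison the abstract refers to; the paper itself presumably relied on established benchmarking tools under the CEEM procedure.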
INDEX TERMS
Benchmark testing, Cloud computing, Throughput, Google, Measurement, Standards, Australia, Scientific Computing, Cloud Services Evaluation, Google Compute Engine, Public Cloud Service
CITATION
Zheng Li, Liam O'Brien, Rajiv Ranjan, Miranda Zhang, "Early Observations on Performance of Google Compute Engine for Scientific Computing," 2013 IEEE 5th International Conference on Cloud Computing Technology and Science, vol. 1, pp. 1-8, 2013, doi:10.1109/CloudCom.2013.7