2018 IEEE 34th International Conference on Data Engineering (ICDE) (2018)
Paris, France
Apr 16, 2018 to Apr 19, 2018
ISSN: 2375-026X
ISBN: 978-1-5386-5520-7
pp: 1252-1255
ABSTRACT
To fully exploit the resources of a main memory database cluster, we take independent parallelism into account and parallelize multiple pipelines of one query. However, scheduling resources across multiple pipelines is an intractable problem. Traditional static approaches may lead to a serious waste of resources and a suboptimal execution order of pipelines, because it is hard to predict the actual data distribution and fluctuating workloads at compile time. In response, we propose a dynamic scheduling algorithm, List with Filling and Preemption (LFPS), based on two techniques. (1) Adaptive filling improves resource utilization by issuing extra ready pipelines to fill idle resource "holes" during execution. (2) Cost-based preemption guarantees that pipelines on the critical path are scheduled first at run time. We implement LFPS in our prototype database system. Under TPC-H workloads, experiments show that LFPS improves the finish time of the parallelizable pipelines of one query by up to 2.3X over a static approach and 1.7X over serialized execution.
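The abstract describes a list scheduler extended with two run-time mechanisms: filling idle slots with extra ready pipelines, and preempting non-critical work when a critical-path pipeline becomes ready. The sketch below is a minimal illustration of that idea, not the paper's implementation; all class names, fields (e.g., `on_critical_path`, `cost`), and the slot-based resource model are assumptions made for the example.

```python
# Illustrative sketch of a list scheduler with adaptive filling and
# cost-based preemption. Names and the slot model are hypothetical,
# not taken from the LFPS paper.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Pipeline:
    priority: float                              # lower = more critical (e.g., longer path to the sink)
    name: str = field(compare=False)
    cost: float = field(compare=False)           # estimated running cost
    on_critical_path: bool = field(compare=False, default=False)

class ListScheduler:
    def __init__(self, total_slots: int):
        self.free_slots = total_slots
        self.ready = []                          # priority queue of ready pipelines
        self.running = {}                        # name -> Pipeline

    def submit(self, p: Pipeline):
        heapq.heappush(self.ready, p)

    def dispatch(self):
        """Adaptive filling: issue extra ready pipelines while slots are idle."""
        while self.free_slots > 0 and self.ready:
            p = heapq.heappop(self.ready)
            self.running[p.name] = p
            self.free_slots -= 1
            print(f"start {p.name} (critical={p.on_critical_path})")

    def on_ready(self, p: Pipeline):
        """Cost-based preemption: a newly ready critical-path pipeline may
        preempt a running non-critical one when no slot is free."""
        if self.free_slots == 0 and p.on_critical_path:
            victim = min(
                (r for r in self.running.values() if not r.on_critical_path),
                key=lambda r: r.cost,
                default=None,
            )
            if victim is not None:
                print(f"preempt {victim.name} for {p.name}")
                del self.running[victim.name]
                heapq.heappush(self.ready, victim)   # re-queue the preempted pipeline
                self.free_slots += 1
        self.submit(p)
        self.dispatch()

    def on_finish(self, name: str):
        """Release the slot and immediately fill the idle 'hole'."""
        del self.running[name]
        self.free_slots += 1
        self.dispatch()

# Toy usage: two slots, the critical-path pipeline becomes ready last.
if __name__ == "__main__":
    sched = ListScheduler(total_slots=2)
    sched.submit(Pipeline(priority=2.0, name="scan_lineitem", cost=5.0))
    sched.submit(Pipeline(priority=3.0, name="scan_orders", cost=3.0))
    sched.dispatch()
    sched.on_ready(Pipeline(priority=1.0, name="probe_join", cost=8.0,
                            on_critical_path=True))
    sched.on_finish("probe_join")
```

In this toy run, the cheaper non-critical pipeline is preempted and re-queued so the critical-path pipeline starts immediately; once a pipeline finishes, the freed slot is filled from the ready queue without waiting for a global scheduling step.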
INDEX TERMS
data handling, database management systems, query processing, resource allocation, scheduling, storage management
CITATION

Z. Fang, C. Weng, L. Wang and A. Zhou, "Parallelizing Multiple Pipelines of One Query in a Main Memory Database Cluster," 2018 IEEE 34th International Conference on Data Engineering (ICDE), Paris, France, 2018, pp. 1252-1255.
doi:10.1109/ICDE.2018.00123