2018 IEEE 32nd International Conference on Advanced Information Networking and Applications (AINA) (2018)
May 16, 2018 to May 18, 2018
Demand for computational grids has grown steadily with the rise of social networks, Artificial Intelligence based systems (machine learning), scientific applications, etc. Service providers aim to maximize grid utilization so that they can serve more customers efficiently. The key enabler of optimal grid utilization and better turnaround time is efficient scheduling of tasks on computational grids. However, designing efficient grid scheduling algorithms remains a challenge because the problem is NP-complete. Hence, several near-optimal approximation algorithms have been designed based on a plethora of techniques such as heuristics, bio-inspired methods, genetic algorithms, greedy approaches, etc. Thus there is scope for further improvement in scheduling algorithms to achieve earlier task completion and better grid utilization for precedence-constrained tasks. Moreover, with advances in computing hardware there is now a preference for parallel rather than sequential task execution, which introduces the notion of partial dependency between parallel tasks. Thus, there is a need to revisit the design of grid scheduling algorithms. Hence, in this paper we propose a novel grid scheduling algorithm for interdependent parallel tasks with varying dependencies (full, partial, none) on a computational grid using a greedy approach. Further, the correctness and performance of the proposed scheduling algorithm are evaluated by comparing it with a proposed brute-force scheduling algorithm.
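The paper's own algorithm is not reproduced in this record. As a rough illustration of the general idea of greedy scheduling for precedence-constrained tasks, the sketch below assigns each ready task to the machine that frees up earliest, ordering ready tasks longest-first; the function name `greedy_schedule`, the longest-processing-time tie-break, and the identical-machine model are illustrative assumptions, not the authors' design.

```python
import heapq
from collections import defaultdict

def greedy_schedule(durations, deps, num_machines):
    """Greedy list scheduling of precedence-constrained tasks
    onto identical machines (illustrative sketch, not the paper's algorithm).

    durations: {task: execution time}
    deps: {task: set of prerequisite tasks}
    Returns {task: (machine, start_time, finish_time)}.
    """
    indegree = {t: len(deps.get(t, ())) for t in durations}
    children = defaultdict(list)
    for t, prereqs in deps.items():
        for p in prereqs:
            children[p].append(t)

    finish = {}                                   # task -> finish time
    machines = [(0.0, m) for m in range(num_machines)]  # (free_at, id)
    heapq.heapify(machines)
    ready = [t for t, d in indegree.items() if d == 0]
    schedule = {}

    while ready:
        # greedy choice: longest ready task first (LPT heuristic)
        ready.sort(key=lambda t: -durations[t])
        t = ready.pop(0)
        free_at, m = heapq.heappop(machines)
        # a task cannot start before all its prerequisites finish
        earliest = max((finish[p] for p in deps.get(t, ())), default=0.0)
        start = max(free_at, earliest)
        end = start + durations[t]
        schedule[t] = (m, start, end)
        finish[t] = end
        heapq.heappush(machines, (end, m))
        for c in children[t]:
            indegree[c] -= 1
            if indegree[c] == 0:
                ready.append(c)
    return schedule
```

For example, with tasks A (2 units) and B (3 units) running in parallel on two machines and task C (1 unit) fully dependent on both, C starts only after B finishes at time 3, giving a makespan of 4.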
artificial intelligence, computational complexity, grid computing, network theory (graphs), optimisation, scheduling
S. Dibbur Byrappa, S. N. Hegde, R. M A and K. H K, "A Novel Task Scheduling Scheme for Computational Grids - Greedy Approach," 2018 IEEE 32nd International Conference on Advanced Information Networking and Applications (AINA), Krakow, Poland, 2018, pp. 1026-1033.