A fluid model for layered queueing networks
June 2013 (vol. 39 no. 6)
pp. 744-756
M. Tribastone, Department of Informatics, Ludwig-Maximilians University of Munich, Munich, Germany
Layered queueing networks are a useful tool for the performance modeling and prediction of software systems that exhibit complex characteristics such as multiple tiers of service, fork/join interactions, and asynchronous communication. These features generally give rise to nonproduct-form behavior, for which particularly efficient approximations based on mean value analysis (MVA) have been devised. This paper reconsiders the accuracy of such techniques by providing an interpretation of layered queueing networks as fluid models. Mediated by an automatic translation into the stochastic process algebra PEPA, a network is associated with a set of ordinary differential equations (ODEs) whose size is insensitive to the population levels in the system under consideration. A substantial numerical assessment demonstrates that this approach significantly improves the quality of the approximation for typical performance indices such as utilization, throughput, and response time. Furthermore, backed by established theoretical results on asymptotic convergence, the error decreases monotonically with larger population sizes, in sharp contrast with approximate mean value analysis, whose error instead tends to increase.
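To illustrate the idea of a fluid model whose size is insensitive to the population level, the following is a minimal sketch (not the paper's PEPA-based translation) of a mean-field ODE for a simple closed system: N clients cycle between a think phase and a multi-server station, and a single differential equation tracks the expected queue length regardless of N. All parameter values are hypothetical and chosen only for illustration.

```python
# Fluid (mean-field) ODE sketch for a closed queueing model:
# N clients alternate between thinking (rate lam each) and a station
# with m servers (rate mu each). One ODE tracks x(t), the expected
# number of jobs at the station; the ODE count does not grow with N.
# Parameters are illustrative, not taken from the paper.

def fluid_queue(N=10, lam=1.0, mu=2.0, m=2, dt=0.001, t_end=30.0):
    x = 0.0  # fluid approximation of the station's queue length
    for _ in range(int(t_end / dt)):
        arrivals = lam * (N - x)           # thinking clients submit jobs
        departures = mu * min(x, m)        # at most m servers are busy
        x += dt * (arrivals - departures)  # forward-Euler integration step
    throughput = mu * min(x, m)
    utilization = min(x, m) / m
    return x, throughput, utilization

x, tput, util = fluid_queue()
# Steady state solves lam*(N - x) = mu*min(x, m): here x = 6,
# throughput = 4, utilization = 1.
```

The fixed point of the ODE yields steady-state indices such as utilization and throughput directly, which is the kind of performance quantity the paper's assessment compares against approximate MVA.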
Index Terms:
Approximation methods, Unified Modeling Language, stochastic processes, sociology, statistics, servers, accuracy, mean value analysis, modeling and prediction, Markov processes, PEPA, ordinary differential equations, queueing networks
Citation:
M. Tribastone, "A fluid model for layered queueing networks," IEEE Transactions on Software Engineering, vol. 39, no. 6, pp. 744-756, June 2013, doi:10.1109/TSE.2012.66