41st Annual Symposium on Foundations of Computer Science
Hierarchical placement and network design problems
Redondo Beach, California
November 12-14, 2000
ISBN: 0-7695-0850-2
S. Guha, Dept. of Comput. Sci., Stanford Univ., CA, USA
A. Meyerson, Dept. of Comput. Sci., Stanford Univ., CA, USA
K. Munagala, Dept. of Comput. Sci., Stanford Univ., CA, USA
We give constant-factor approximations for a number of layered network design problems. We begin by modeling hierarchical caching, where caches are placed in layers and each layer satisfies a fixed percentage of the demand (bounded miss rates). We present a constant approximation to the minimum total cost of placing the caches and routing demand through the layers. We then extend this model to more general layered caching scenarios, giving a constant-factor combinatorial approximation for the well-studied multi-level facility location problem. We also consider a facility location variant, the load-balanced facility location problem, in which every demand is served by a single facility and each open facility must serve at least a specified amount of demand. By combining load-balanced facility location with our results on hierarchical caching, we obtain a constant approximation for the access network design problem.
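To make the load-balanced facility location objective concrete, the following is a minimal toy sketch of its cost function: each demand is assigned to a single facility, an open facility must serve at least a minimum load, and the cost is opening cost plus demand-weighted routing cost. All instance data and names here are invented for illustration; this is not the paper's algorithm.

```python
# Toy instance (all numbers hypothetical, for illustration only).
facility_cost = {"f1": 4.0, "f2": 3.0}       # cost to open each facility
demand = {"d1": 2.0, "d2": 1.0, "d3": 3.0}   # demand amounts
dist = {                                      # unit routing costs
    ("d1", "f1"): 1.0, ("d1", "f2"): 2.0,
    ("d2", "f1"): 2.0, ("d2", "f2"): 1.0,
    ("d3", "f1"): 1.0, ("d3", "f2"): 3.0,
}
min_load = 2.0  # lower bound on demand served by any open facility

def solution_cost(assignment):
    """Total cost of an assignment {demand: facility}, or None if some
    open facility serves less than min_load (infeasible)."""
    served = {}
    for d, f in assignment.items():
        served[f] = served.get(f, 0.0) + demand[d]
    if any(load < min_load for load in served.values()):
        return None  # violates the load lower bound
    routing = sum(demand[d] * dist[(d, f)] for d, f in assignment.items())
    opening = sum(facility_cost[f] for f in served)
    return opening + routing

# Assigning all demands to f1 meets the load bound:
print(solution_cost({"d1": "f1", "d2": "f1", "d3": "f1"}))  # 4 + (2+2+3) = 11.0
# Sending only d2 to f2 leaves f2 underloaded (1.0 < 2.0), so it is infeasible:
print(solution_cost({"d1": "f1", "d2": "f2", "d3": "f1"}))  # None
```

The lower bound is what distinguishes this variant from classical facility location, where any nonnegative load on an open facility is feasible.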
Index Terms:
facility location; resource allocation; approximation theory; hierarchical systems; network synthesis; subscriber loops; cache storage; file organisation; hierarchical placement; layered network design problems; constant approximations; hierarchical caching; bounded miss rates; minimum total cost; routing demand; layered caching scenarios; combinatorial approximation; multi-level facility location problem; load-balanced facility location problem; open facilities; access network design problem
S. Guha, A. Meyerson, K. Munagala, "Hierarchical placement and network design problems," Proceedings of the 41st Annual Symposium on Foundations of Computer Science (FOCS), p. 603, 2000.