Issue No. 2, February 2003 (Vol. 36)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MC.2003.10014
DUAL TRAGEDIES: IP RIGHTS IN INDUSTRY STANDARDS, PP. 25-27
Companies often participate in a cooperative standard-setting effort to minimize research and development risks. However, the industry's aggressive use of patents forces standard-setting organizations (SSOs) to struggle with producing the technically best standard while encouraging its widespread adoption by limiting the proprietary technology that users must license.
To balance these interests, many SSOs require members to disclose any patents related to a proposed standard. These disclosures potentially expose technical and market strategies to competitors. The "tragedy of the commons" and the "tragedy of the anticommons" illuminate this problem and can help SSOs implement disclosure policies that reduce member burdens and risks, thereby encouraging participation in creating high-quality standards.
BENCHMARKING INTERNET SERVERS ON SUPERSCALAR MACHINES, PP. 34-40
Yue Luo, Juan Rubio, Lizy Kurian John, Pattabi Seshadri, and Alex Mericas
Today's superscalar microprocessors tend to execute instructions in an order different from the instruction sequence fed to them. With the aid of sophisticated branch predictors, they identify the program flow's potential path to find instructions that can be executed in advance.
CPU-intensive benchmarks have been widely used to evaluate these processors. The authors report a study demonstrating that to maximize performance on Internet server applications, modern processor architectures need further enhancements and optimizations, particularly in memory system design.
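The branch predictors the abstract alludes to can be illustrated with a minimal sketch (not from the article): a classic 2-bit saturating-counter predictor, which mispredicts a loop-closing branch only once per loop exit after warming up. All names and the branch stream below are invented for illustration.

```python
# Minimal sketch of a 2-bit saturating-counter branch predictor.
class TwoBitPredictor:
    def __init__(self):
        self.state = 0  # 0,1 predict not-taken; 2,3 predict taken

    def predict(self):
        return self.state >= 2

    def update(self, taken):
        # Saturate at 0 and 3 so one anomalous outcome
        # does not immediately flip the prediction.
        if taken:
            self.state = min(3, self.state + 1)
        else:
            self.state = max(0, self.state - 1)

def accuracy(outcomes):
    p = TwoBitPredictor()
    correct = 0
    for taken in outcomes:
        correct += (p.predict() == taken)
        p.update(taken)
    return correct / len(outcomes)

# A loop-closing branch: taken 9 times, then not taken, repeated 10 times.
loop_branch = ([True] * 9 + [False]) * 10
print(f"accuracy = {accuracy(loop_branch):.2f}")  # → accuracy = 0.88
```

After the two warm-up mispredictions, the predictor misses only the final not-taken outcome of each loop, which is why real predictors achieve high accuracy on loop-dominated code.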
TPC-W E-COMMERCE BENCHMARK EVALUATION, PP. 42-48
Daniel F. García and Javier García
Correctly interpreting benchmark results requires basic knowledge of the benchmark's synthetic workload, so that users can judge how well it represents the real-world workloads of diverse e-commerce applications. Factors that influence these results include the characteristics of the system under test, the procedures used to execute the tests, and the performance metrics the benchmark generates.
TPC-W performs server evaluation in a controlled Internet e-commerce environment that simulates the activities of a business-oriented transactional Web server. The authors used experimental results from their TPC-W implementation to assess the benchmark's behavior, including its granularity and sensitivity to changes in workload and system parameters.
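TPC-W's primary throughput metric is WIPS, web interactions per second completed during the measurement interval. The sketch below shows the idea of that calculation; the function name, log format, and timestamps are illustrative assumptions, not the benchmark's actual implementation.

```python
# Hypothetical sketch of a WIPS-style throughput calculation:
# count web interactions completed within the measurement interval
# and divide by the interval's length in seconds.
def wips(completion_times, interval_start, interval_end):
    """Web interactions per second over [interval_start, interval_end)."""
    n = sum(interval_start <= t < interval_end for t in completion_times)
    return n / (interval_end - interval_start)

# Completion timestamps (seconds) for seven interactions, made up here.
log = [0.5, 1.2, 2.8, 3.1, 5.0, 7.7, 9.9]
print(wips(log, 0.0, 10.0))  # → 0.7
```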
SIMULATING A $2M COMMERCIAL SERVER ON A $2K PC, PP. 50-57
Alaa R. Alameldeen, Milo M.K. Martin, Carl J. Mauer, Kevin E. Moore, Min Xu, Mark D. Hill, David A. Wood, and Daniel J. Sorin
As dependence on database management systems and Web servers increases, so does the need for them to run reliably and efficiently, goals that rigorous simulation can help achieve. Execution-driven simulation models system hardware, capturing actual program behavior and detailed system interactions.
The authors have developed a simulation methodology that uses multiple simulations, pays careful attention to the effects of scaling on workload behavior, and extends Virtutech AB's Simics full-system functional simulator with detailed timing models.
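The "multiple simulations" part of the methodology addresses run-to-run variability: rather than trusting one run, one averages several perturbed runs and reports a confidence interval. A minimal sketch, with invented runtimes and a simple normal-approximation interval (the article's exact statistical procedure may differ):

```python
import statistics

def mean_with_ci(samples, z=1.96):
    """Mean and approximate 95% confidence half-width (normal z-value)."""
    m = statistics.mean(samples)
    s = statistics.stdev(samples)          # sample standard deviation
    half = z * s / len(samples) ** 0.5     # half-width shrinks as sqrt(n)
    return m, half

# Cycles per transaction from six perturbed simulation runs (made up).
runtimes = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3]
m, half = mean_with_ci(runtimes)
print(f"{m:.2f} ± {half:.2f}")  # → 10.13 ± 0.21
```

Reporting the interval rather than a single number makes it clear when two simulated design alternatives are statistically indistinguishable.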
QUEUING SIMULATION MODEL FOR MULTIPROCESSOR SYSTEMS, PP. 58-64
Thin-Fong Tsuei and Wayne Yamamoto
The processor queuing model provides memory-hierarchy and system-design evaluation of memory-intensive commercial online-transaction-processing workloads on large multiprocessor systems. It differs from detailed cycle-accurate and direct-execution simulations in that it does not simulate instruction execution. Instead, as in analytical models, the authors build the model from processor and workload characteristics that are easy to collect and estimate.
Because the authors believe that the processor model's function is to accurately generate memory traffic to the rest of the system, they model a minimal set of processor and workload characteristics that captures the important interactions between a complex processor and the system-memory hierarchy.
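The flavor of such an analytical abstraction can be sketched by treating the memory system as a simple M/M/1 queue fed by processor miss traffic; latency then follows directly from arrival and service rates, with no instruction simulation. This is a textbook formula, not the authors' model, and the rates below are invented.

```python
# M/M/1 sketch: mean time in system (queueing delay + service time)
# for Poisson arrivals at rate lambda and exponential service at rate mu.
def mm1_latency(arrival_rate, service_rate):
    assert arrival_rate < service_rate, "queue must be stable (lambda < mu)"
    return 1.0 / (service_rate - arrival_rate)

# e.g., 0.02 misses/cycle offered to a memory system serving 0.05 reqs/cycle
print(mm1_latency(0.02, 0.05))  # ≈ 33.3 cycles
```

The payoff of this style of model is speed: evaluating a design point is one formula, not billions of simulated instructions.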
COMPUTER ARCHITECTURE RESEARCH WORKLOADS, PP. 65-71
Lieven Eeckhout, Hans Vandierendonck, and Koen De Bosschere
Although architectural simulators model microarchitectures at a high abstraction level, the increasing complexity of both the microarchitectures themselves and the applications that run on them makes simulator use extremely time-consuming. Simulators must execute huge numbers of instructions to create a workload representative of real applications, creating an unreasonably long simulation time and stretching the time to market.
Using reduced input sets instead of reference input sets helps to solve this problem. The authors have developed a methodology that reliably quantifies program behavior similarity to verify whether reduced input sets result in program behavior similar to that of the reference inputs.
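One hedged sketch of quantifying such similarity: represent each input set as a vector of program characteristics (instruction mix, miss rates, and so on), normalize each dimension, and compare distances. This is only the general idea; the authors' methodology and the characteristic values below are not reproduced here.

```python
import math

def normalize(vectors):
    """Scale each dimension to zero mean and unit variance."""
    dims = list(zip(*vectors))
    means = [sum(d) / len(d) for d in dims]
    stds = [(sum((x - m) ** 2 for x in d) / len(d)) ** 0.5 or 1.0
            for d, m in zip(dims, means)]
    return [[(x - m) / s for x, m, s in zip(v, means, stds)] for v in vectors]

def distance(a, b):
    # Euclidean distance in the normalized characteristic space.
    return math.dist(a, b)

# Illustrative characteristics: load fraction, L1 miss rate, branch fraction.
reference = [0.42, 0.031, 0.18]
reduced   = [0.41, 0.034, 0.17]
train     = [0.30, 0.080, 0.25]
ref_n, red_n, tr_n = normalize([reference, reduced, train])
print(distance(ref_n, red_n) < distance(ref_n, tr_n))  # → True
```

A reduced input whose normalized vector stays close to the reference's can stand in for it during simulation; one that drifts far away cannot.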