The Future of Simulation: A Field of Dreams?
Joshua J. Yi, Lieven Eeckhout, David J. Lilja, Brad Calder, Lizy K. John, and James E. Smith
Currently, simulation is the approach of choice for quantitatively evaluating computer architecture performance. However, problems with the current simulation infrastructure, benchmarking, and simulation methodology can affect the accuracy of simulation results or the time required for simulation, which in turn can affect the conclusions drawn.
Researchers must encourage the use of analytical modeling and statistical theory in performance evaluation, and computer architecture education should include training in these tools and evaluation techniques.
From Molecule to Man: Decision Support in Individualized E-Health
Peter M.A. Sloot, Alfredo Tirado-Ramos, Ilkay Altintas, Marian Bubak, and Charles Boucher
The complete cascade of complex human systems—from genome, proteome, metabolome, and physiome to health—forms multiscale, multiscience systems and crosses many orders of magnitude in temporal and spatial scales. The interactions between these systems create exquisite multitiered networks, with each component in nonlinear contact with many interaction partners. These networks aren't just complicated; they're complex. Understanding, quantifying, and handling this complexity is one of the biggest scientific challenges of our time.
Computer science provides the language needed to study and understand these systems. Computer system architectures reflect the same laws and organizing principles used to build individualized biomedical systems, which can account for variations in physiology, treatment, and drug response.
Multiscale Modeling: Physiome Project Standards, Tools, and Databases
Peter J. Hunter, Wilfred W. Li, Andrew D. McCulloch, and Denis Noble
A characteristic of biological complexity is the intimate connection that exists between different length scales—from the nanometer-length scale of molecules to the highly structured meter scale of the whole human body. Subtle changes in molecular structure as a consequence of a single gene mutation can lead to catastrophic failure at the organ level, such as heart failure from reentrant arrhythmias that lead to ventricular fibrillation.
But information flows equally in the reverse direction: Mechanoreceptors at the cell level sense the mechanical load on the musculoskeletal system and influence gene expression via signal transduction pathways.
Thus, interpreting the interactions that occur across the length scales from genes and proteins to cells, tissues, organs, and organ systems requires a multiscale mathematical modeling framework. At present, the Physiome Project consists of the markup languages and associated tools for authoring, validating, displaying, and executing models, together with model databases that have been published in peer-reviewed journals.
CASA and LEAD: Adaptive Cyberinfrastructure for Real-Time Multiscale Weather Forecasting
Beth Plale, Dennis Gannon, Jerry Brotzge, Kelvin Droegemeier, Jim Kurose, David McLaughlin, Robert Wilhelmson, Sara Graves, Mohan Ramamurthy, Richard D. Clark, Sepi Yalda, Daniel A. Reed, Everette Joseph, and V. Chandrasekar
Whereas scientists generate today's forecasts on a fixed time schedule, new radar technologies and improved model physics are enabling on-demand forecasts in response to current weather events. These forecasts ingest regional atmospheric data in real time and can consume large computational resources in real time as well.
Two highly complementary projects are developing a hardware and software framework to enable real-time multiscale forecasting. Collaborative Adaptive Sensing of the Atmosphere and Linked Environments for Atmospheric Discovery are stand-alone systems that offer distinct benefits to their respective user communities, but when used together, promise a paradigm shift in atmospheric science research.
Designing an Integrated Architecture for Network Content Security Gateways
Ying-Dar Lin, Chih-Wei Jan, Po-Ching Lin, and Yuan-Cheng Lai
Increasing computer processing power has made it economically feasible to integrate multiple security functions into a single gateway, but the industry remains divided over whether such an all-in-one solution is preferable to a best-of-breed approach.
Researchers in this area have primarily focused on enhancing performance. Little research, however, has examined integrated content security architectures. To address this issue, the authors integrated five popular open source content security packages. Using a set of external and internal benchmarks, they compared packet-flow performance during content inspection between a loosely integrated arrangement and a tightly integrated content security gateway.
Their research indicated that the tightly integrated architecture substantially reduces overhead in interprocess communication and kernel/user-space interactions. The study also revealed that the main bottlenecks of content processing are string matching in Web filtering and RAM-disk access in mail processing.
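To see why string matching dominates Web-filtering cost, consider a minimal sketch (not the authors' implementation; the pattern list and function names are hypothetical): a content filter must test every payload against many blocked patterns, so a naive scan does one full pass over the payload per pattern, and inspection cost grows with the size of the pattern set.

```python
# Hypothetical toy content filter, illustrating why string matching is
# the hot spot in Web filtering: each payload is scanned once per pattern.
BLOCKED_PATTERNS = ["casino", "malware", "phish"]  # assumed example patterns

def inspect(payload: str) -> bool:
    """Return True if the payload matches any blocked pattern (naive scan).

    Cost is roughly O(len(payload) * number of patterns), which is why
    production gateways replace this loop with a multi-pattern automaton
    (e.g., Aho-Corasick) that scans the payload in a single pass.
    """
    return any(pattern in payload for pattern in BLOCKED_PATTERNS)
```

A production gateway would precompile the pattern set into a single automaton so that adding patterns does not add extra passes over each packet; the sketch above only shows the naive baseline that makes matching the bottleneck.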