The IEEE Computer Society offers a lineup of 13 peer-reviewed technical magazines that cover cutting-edge topics in computing, including scientific applications, design and test, security, Internet computing, machine intelligence, digital graphics, and computer history. Select articles from recent issues of Computer Society magazines are highlighted below.
Component-based software engineering poses new challenges for predicting software performance, but it also offers several advantages. In "Facilitating Performance Predictions Using Software Components" in the May/June issue of Software, authors Jens Happe, Heiko Koziolek, and Ralf Reussner discuss a simplified system implementation that demonstrates compositional reasoning about software performance. Developers can download and use the open source tool Palladio.
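The appeal of compositional reasoning can be conveyed with a toy model (my own sketch, not Palladio's actual prediction model, which also covers usage profiles, resource contention, and stochastic demands): each component's resource demand is specified in isolation, and the assembly's performance follows from the composition structure alone.

```python
# Toy illustration of compositional performance prediction. All names
# and numbers here are hypothetical; Palladio's real model is far richer.

class Component:
    """A software component with a fixed per-call processing demand (ms)."""
    def __init__(self, name, demand_ms):
        self.name = name
        self.demand_ms = demand_ms

def sequential_response_time(components):
    """Response time of a sequential composition: the sum of the parts.
    Each demand is specified per component, in isolation; the assembly's
    prediction needs no end-to-end measurement."""
    return sum(c.demand_ms for c in components)

pipeline = [Component("parser", 2.0),
            Component("validator", 1.5),
            Component("persister", 6.5)]
print(sequential_response_time(pipeline))  # 10.0
```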
In "Improving Users' Mental Models of Intelligent Software Tools" in the March/April issue of Intelligent Systems, authors Shane T. Mueller and Gary Klein describe the Experiential User Guide, which is designed to address the genuine cognitive challenges users have with the complex, intelligent software tools used in everyday life—commercial navigation systems and Web search algorithms are just two examples. The guide exposes learners to many of the experiences that an expert will have had over time, allowing both novice and expert users to experience a tool's strengths and weaknesses.
Current colorization methods based on image segmentation make it difficult to add or update color reliably and require considerable user intervention. Authors from the Chinese University of Hong Kong and the Chinese Academy of Sciences describe a new approach that gives similar colors to pixels with similar texture features. The method uses rotation-invariant Gabor filter banks and applies optimization in the feature space. Read more in "Colorization Using the Rotation-Invariant Feature Space" in the March/April issue of CG&A.
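The core idea, pooling Gabor filter responses across orientations so that rotated versions of a texture map to similar features, can be sketched as follows. This is a minimal illustration of rotation-invariant filter-bank features in general, not the authors' actual method, and all parameter values are arbitrary.

```python
import math

# Sketch of rotation-invariant Gabor texture features (illustrative
# simplification, not the CG&A authors' pipeline). A bank of Gabor
# filters at several orientations responds to oriented texture; pooling
# over the orientation axis (here, taking the max) yields a feature
# that barely changes when the texture is rotated.

def gabor_kernel(size, theta, wavelength=4.0, sigma=2.0):
    """Real part of a 2D Gabor kernel at orientation theta (radians)."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)  # rotated coord
            gauss = math.exp(-(x * x + y * y) / (2 * sigma ** 2))
            row.append(gauss * math.cos(2 * math.pi * xr / wavelength))
        kernel.append(row)
    return kernel

def response(kernel, patch):
    """Filter response of a patch the same size as the kernel."""
    n = len(kernel)
    return sum(kernel[y][x] * patch[y][x] for y in range(n) for x in range(n))

size, n_orient = 9, 8
bank = [gabor_kernel(size, k * math.pi / n_orient) for k in range(n_orient)]

half = size // 2
vertical = [[math.cos(math.pi * x / 2) for x in range(-half, half + 1)]
            for _ in range(size)]                    # vertical stripes
horizontal = [list(row) for row in zip(*vertical)]   # same texture, rotated 90 deg

# Max-pooling over orientations: the pooled feature is (nearly) the
# same for the patch and its rotated copy, unlike any single response.
feat_v = max(response(k, vertical) for k in bank)
feat_h = max(response(k, horizontal) for k in bank)
assert abs(feat_v - feat_h) < 1e-6
```

Clustering pixels on such pooled features, rather than raw intensities, is what lets similar textures receive similar colors regardless of their local orientation.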
Supercomputers are usually designed to achieve the highest possible performance. Their architectures have evolved from early custom-design systems to today's clusters of commodity multisocket, multicore systems. Twice a year, the supercomputing community ranks these systems (using the number of 64-bit floating-point operations per second) and produces the Top500 list (www.top500.org) showing the world's 500 highest-performing machines.
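As a back-of-the-envelope illustration (my own example, not from the Top500 methodology), a cluster's theoretical peak is just a product of its node count, socket and core counts, clock rate, and double-precision operations per cycle; the Top500 ranking itself uses measured Linpack performance, which always falls below this peak.

```python
def peak_gflops(nodes, sockets_per_node, cores_per_socket, ghz,
                flops_per_cycle):
    """Theoretical peak in GFLOP/s for a cluster of commodity nodes.
    All example figures below are hypothetical; real rankings use the
    measured Linpack result (Rmax), not this theoretical peak (Rpeak)."""
    return nodes * sockets_per_node * cores_per_socket * ghz * flops_per_cycle

# A hypothetical 100-node cluster: 2 sockets x 6 cores at 2.5 GHz,
# with 4 double-precision FLOPs per cycle per core.
print(peak_gflops(100, 2, 6, 2.5, 4))  # 12000.0 GFLOP/s, i.e. 12 TFLOP/s
```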
In "Trends in High-Performance Computing" in the May/June 2011 issue of CiSE, authors Volodymyr Kindratenko and Pedro Trancoso describe the technologies used in today's top-ranked machines and forecast next-generation supercomputing architecture trends.
Data leaks are often the result of usability failures. In healthcare, usability failures risk both patients' health and their identity. In "Usability Failures and Healthcare Data Hemorrhages," in the March/April issue of S&P, authors M. Eric Johnson and Nicholas D. Willey of Dartmouth College analyze samples of medically related files collected from peer-to-peer file-sharing networks. These leaked files contained significant protected health information and demonstrate the risk to patients and institutions. Through interviews and field research, Johnson and Willey document how usability failures lead to such hemorrhages.
Reducing domestic energy consumption is a hot topic in the pervasive computing and computer-human interaction communities, with a long and varied history. In "Look Back before Leaping Forward: Four Decades of Domestic Energy Inquiry," authors Mike Hazas, Adrian Friday, and James Scott give a brief overview of the history and current state of this problem from various perspectives—from pervasive computing feedback-oriented and technology-centric systems, to sociology- and economics-based studies. The article provides an introductory set of references to aid readers in exploring this topic's rich background in depth.
The measurement, characterization, and modeling of real workloads are key steps driving the design of cost-effective Internet applications and services. However, some of the most fundamental concepts and methods for characterizing Internet workloads are largely unknown to many Internet and Web practitioners.
Internet Computing's March/April special issue on Internet workloads presents articles that characterize and model different workload types for grid computing, HTTP forward caching, and multimedia services. The articles cover popular applications and draw insights useful for system planning, management, and optimization.
On-chip networks could become a critical shared resource in many-core systems, making efficient and fair packet scheduling an important and challenging problem. In "Aérgia: A Network-on-Chip Exploiting Packet Latency Slack" in the January/February issue of Micro, researchers from Pennsylvania State University, Carnegie Mellon University, and Microsoft Research describe a novel NoC architecture whose routers prioritize packets by their slack, the delay a packet can tolerate before late delivery begins to hurt application performance.
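The slack idea can be sketched as a simple priority rule (an illustration of the concept, not Aérgia's actual router logic, which estimates slack in hardware and encodes it in packet headers): packets whose data will not be needed for a while yield to packets with little or no slack.

```python
import heapq

# Sketch of slack-based packet prioritization. Field names and cycle
# counts are hypothetical; only the scheduling principle is real:
# low-slack packets are serviced first.

class Packet:
    def __init__(self, pid, need_cycle, expected_arrival):
        self.pid = pid
        # Slack: how many cycles the packet can be delayed beyond its
        # expected arrival before the core actually stalls on its data.
        self.slack = need_cycle - expected_arrival

def schedule(packets):
    """Return packet ids in service order: lowest slack first."""
    heap = [(p.slack, p.pid) for p in packets]
    heapq.heapify(heap)
    return [pid for _, pid in (heapq.heappop(heap) for _ in range(len(heap)))]

pkts = [Packet("a", need_cycle=50, expected_arrival=10),   # slack 40
        Packet("b", need_cycle=12, expected_arrival=10),   # slack 2
        Packet("c", need_cycle=30, expected_arrival=10)]   # slack 20
print(schedule(pkts))  # ['b', 'c', 'a']
```

The design intuition: delaying a high-slack packet costs the application nothing, so spending router bandwidth on low-slack packets first improves overall performance without starving anyone whose data is actually on the critical path.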
"While nearly everyone can agree that pecking out a love note on a tiny mobile phone keypad while simultaneously trying to operate a vehicle is a bad idea, what about the other activities that we perform on a day-to-day basis using the electronic devices either built into or brought into our cars?"
Authors Christian Müller of the German Research Institute for Artificial Intelligence and Garrett Weinberg of Mitsubishi Electric Research Labs start with this question in "Multimodal Input in the Car, Today and Tomorrow," in the January/March issue of MultiMedia. They give a brief overview of multimodal theory as it pertains to common in-vehicle tasks and devices before describing the state of the art and suggesting ways to safely broaden in-vehicle system capabilities in the future.
RFID technologies have been revolutionizing the way we perform asset tracking for more than 30 years, and RFID applications are increasingly appearing in areas such as transportation, banking, healthcare, and security—primarily owing to increased reliability, widespread adoption of international standards, and decreased costs. A passive ultrahigh-frequency tag is expected to cost just a few cents this year, prompting continued RFID growth. In its March/April 2011 special issue, IT Pro presents five articles that report recent experiences in developing applications, middleware, and security protocols for applying RFID in real-time systems.
Articles in the March/April issue of Design & Test highlight a range of problems that IC and system designers face in ensuring high performance under tight power constraints and escalating verification and test costs. In "Customizable Domain-Specific Computing," researchers from the University of California and Rice University show how to learn from nature—the human brain—to design a platform that achieves its efficiencies through customization.
"Kissinger's Computer: National Security Council Computerization, 1969-1972," a feature article in the January-March 2011 issue of Annals, recounts the history of Henry Kissinger's information-automation project during the Nixon administration. The project introduced computers into the White House and standardized information management within the US National Security Council.
"Beyond the NSC's day-to-day activities, the information-automation projects had a broader impact on US national security policy …," writes author John Laprise of Northwestern University. "By the end of the Nixon administration, the NSC saw computers as a primary information-management technology that could be applied to the economic spectrum and needed to be controlled to limit the economic strength of communist nations."