The IEEE Computer Society offers a lineup of 13 peer-reviewed technical magazines that cover cutting-edge topics in computing, including scientific applications, design and test, security, Internet computing, machine intelligence, digital graphics, and computer history. Select articles from recent issues of Computer Society magazines are highlighted below.
A computer system's security can be compromised in many ways—a denial-of-service attack can make a server inoperable, a worm can destroy a user's private data, or an eavesdropper can reap financial rewards by inserting himself into the communication link between a customer and her bank through a man-in-the-middle attack. The usual security paradigm assumes that the computers under attack are operated by benign, trusted users; if an untrustworthy user gains access to a system, this paradigm breaks down.
Guest editors Paolo Falcarin of the University of East London, Christian Collberg of the University of Arizona, Mikhail Atallah of Purdue University, and Mariusz Jakubowski of Microsoft Research introduce the March/April issue of Software, which focuses on protecting software from both inside and outside attacks.
The Web is a critical global infrastructure. Since its emergence in the mid-1990s, it has exploded into hundreds of billions of pages that touch most aspects of modern life. Today, more and more people's jobs depend on the Web; it is revolutionizing media, banking, and healthcare, and governments are even considering how to use it to run their countries. This is the second half of a two-part special issue featuring some of the best articles from the inaugural Web Science Conference. The five articles in the January/February issue of IS, joined with six that ran in a previous issue, show the scope and scale of the many facets of the emerging science of the Web and illustrate some of the many ways the Web can be studied.
Real-time rendering of large-scale vector maps over terrain surfaces requires displaying substantial numbers of polylines and polygons. A new method efficiently simplifies and renders such maps. First, it simplifies the vector map while maintaining the map's topological consistency and preventing local conflicts such as intersections or self-intersections. Second, it generates view-dependent level-of-detail (LOD) models. Finally, it overlays the maps onto multiresolution terrain models through the stencil shadow volume algorithm and other techniques. Read more in "Efficient Simplification of Large Vector Maps Rendered onto 3D Landscapes," in the March/April issue of CG&A.
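The article's method enforces topological consistency while simplifying, which is beyond a short sketch, but the basic polyline-simplification step it builds on can be illustrated with the classic Ramer-Douglas-Peucker algorithm (shown here purely for orientation; it does not prevent the intersection conflicts the article addresses):

```python
import math

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker polyline simplification (illustrative only;
    the article's method additionally preserves map topology)."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]

    def dist(p):
        # Perpendicular distance from p to the chord joining the endpoints.
        px, py = p
        num = abs((y2 - y1) * px - (x2 - x1) * py + x2 * y1 - y2 * x1)
        den = math.hypot(x2 - x1, y2 - y1)
        return num / den if den else math.hypot(px - x1, py - y1)

    # Find the interior vertex farthest from the chord.
    idx, dmax = max(((i, dist(p)) for i, p in enumerate(points[1:-1], 1)),
                    key=lambda t: t[1])
    if dmax > epsilon:
        # Keep that vertex and recurse on the two halves.
        left = rdp(points[:idx + 1], epsilon)
        right = rdp(points[idx:], epsilon)
        return left[:-1] + right
    # All interior vertices are within tolerance: keep only the endpoints.
    return [points[0], points[-1]]

line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6),
        (5, 7), (6, 8.1), (7, 9), (8, 9), (9, 9)]
simplified = rdp(line, 1.0)
print(len(line), "->", len(simplified))
```

Running successively looser tolerances against successively coarser terrain tiles is one way to think about the view-dependent LOD models the article generates.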
As the relationship between research and computing evolves, new tools are required not only to treat numerical problems but also to solve various problems involving large datasets in different formats, new algorithms, and computational systems such as databases and Internet servers. Python can help in developing these computational research tools by providing a balance of clarity and flexibility without sacrificing performance. Read more in "Python: An Ecosystem for Scientific Computing," in the March/April issue of CiSE.
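A small sketch of the clarity-with-performance balance the article describes: NumPy, a core package in the Python scientific ecosystem, lets numerical work read almost like the underlying mathematics while the inner loops run in optimized compiled code (the example itself is ours, not from the article):

```python
import numpy as np

def moving_average(samples, window):
    """Smooth a 1D signal with a simple moving average.

    The convolution delegates the per-element loop to NumPy's
    compiled implementation rather than interpreted Python.
    """
    kernel = np.ones(window) / window
    return np.convolve(samples, kernel, mode="valid")

signal = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(moving_average(signal, 3))  # [2. 3. 4.]
```

The same pattern—concise array expressions backed by fast libraries—scales from one-liners like this to the database- and server-backed pipelines the article discusses.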
Economic downturns create many stresses on firms as they seek to maintain profitability while serving their stakeholders. The January/February issue of S&P offers several points of view on ensuring information security within resource constraints.
In "Addressing Information Risk in Turbulent Times," authors M. Eric Johnson and Shari Lawrence Pfleeger of Dartmouth College present discussions and interviews with chief information security officers from a broad range of large firms about how they addressed the challenges of the economic downturn. The article provides both actionable ideas and clues for future research.
In the push to develop smart energy systems, designers have increasingly focused on systems that measure and predict user behavior to effect optimal energy consumption. While such focus is clearly an important component in these systems' success, designers pay substantially less attention to the human on the other side of the energy system loop—the supervisors of power-generation processes.
In "Paying Attention to the Man behind the Curtain" in the January/March issue of PvC, authors Mary L. (Missy) Cummings and Kristopher M. Thornburg of the Massachusetts Institute of Technology predict that pervasive computing will likely add to an already complex array of data streams and introduce a new layer of supervisory complexity when dynamically adapting energy management.
Provenance, from the French word provenir, "to come from," means the origin, or the source, of something, or the history of an object's ownership or location. A digital object's provenance (also referred to as its audit trail or lineage) contains information about the process and data used to derive the object. Provenance provides important documentation that's vital to preserving data, determining data quality and authorship, and reproducing and validating results. As increasing volumes of data are shared and modified over the Web, it's crucial to track their provenance for business, scientific, and social networking applications. A special issue of IC provides a snapshot of ongoing work in this area.
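As a rough illustration of the idea—not tied to any particular system in the issue, and with all names invented—a provenance record can be modeled as metadata naming the inputs, process, and agent that derived an object, so an audit trail can be walked back to the original sources:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ProvenanceRecord:
    """Hypothetical sketch of a provenance (audit-trail) entry."""
    object_id: str
    derived_from: list  # IDs of the input objects
    process: str        # the transformation that produced the object
    agent: str          # who or what ran the transformation
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def lineage(records, object_id):
    """Walk the audit trail from an object back to its original sources."""
    by_id = {r.object_id: r for r in records}
    trail, frontier = [], [object_id]
    while frontier:
        rec = by_id.get(frontier.pop())
        if rec:
            trail.append(rec)
            frontier.extend(rec.derived_from)
    return trail

records = [
    ProvenanceRecord("raw.csv", [], "sensor export", "lab-node-1"),
    ProvenanceRecord("clean.csv", ["raw.csv"], "outlier filtering", "etl-job"),
    ProvenanceRecord("report.pdf", ["clean.csv"], "render report", "analyst"),
]
print([r.process for r in lineage(records, "report.pdf")])
# ['render report', 'outlier filtering', 'sensor export']
```

Such a trail is what makes it possible to judge a shared object's quality and authorship, and to reproduce the steps that produced it.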
ReMAP is a reconfigurable architecture for accelerating and parallelizing applications within a heterogeneous chip multiprocessor. Clusters of cores share a common reconfigurable fabric adaptable for individual thread computation or fine-grained communication with integrated computation. In "ReMAP: A Reconfigurable Architecture for Chip Multiprocessors," in the January/February issue of Micro, authors Matthew A. Watkins of Harvey Mudd College and David H. Albonesi of Cornell University find that ReMAP demonstrates significantly higher performance and energy efficiency than hard-wired communication-only mechanisms.
The January/February issue of MultiMedia samples the state of the art in large-scale multimedia analysis techniques and explores how they can be leveraged to address the challenges in large-scale data collections. MultiMedia's guest editors present five representative articles that investigate large-scale multimedia analysis theory and systems across multiple application domains, such as Web event detection, landmark detection, image annotation, musical content mining, and cloud computing.
Increased awareness of the harmful effects of greenhouse gas emissions, combined with stringent new environmental legislation, concerns about electronic-waste disposal practices, and corporate image concerns, is pushing businesses and individuals to go green. Information technology has fundamentally altered our work and lives and improved our productivity, economy, and social well-being. IT now has a new role to play in creating a sustainable environment. Articles in the January/February special issue of IT Pro cover the next wave of green IT, assessments of green initiatives, the green potential of RFID, green Web browsing, and lessons from case studies on real-world green initiatives.
Hybrid embedded testbench acceleration (HETA), a new approach to reducing communication overhead in hardware accelerators, speeds up simulation of chip prototypes by avoiding communication between hardware and software. Experimental results on an industry design show that the proposed HETA approach is about 10 times faster than a commercial hardware accelerator, with only 0.57 percent hardware overhead. Read more in "Hybrid Testbench Acceleration for Reducing Communication Overhead" by Chin-Lung Chuang and Chien-Nan (Jimmy) Liu of Taiwan's National Central University in the March/April issue of D&T.
The January-March issue of Annals contains a two-part cover article in which B. Jack Copeland engagingly revisits the Manchester Baby project, highlighting the important roles of mathematicians Max Newman, I.J. Good, and Alan Turing. Copeland challenges traditional interpretations by relating how Turing tutored Tom Kilburn in computer architecture, how Good's instruction set provided the basis for the Baby's instruction set, and how Newman profoundly influenced F.C. Williams and Kilburn's overall design decisions. Also in this issue, John Laprise provides an important and highly original contribution to the history of government computing with his article on the adoption of computers by the National Security Council during the Nixon administration.