The IEEE Computer Society offers a lineup of 13 peer-reviewed technical magazines that cover cutting-edge topics in computing, including scientific applications, design and test, security, Internet computing, machine intelligence, digital graphics, and computer history. Select articles from recent issues of Computer Society magazines are highlighted below.
The Eclipse Rich Client Platform offers a much richer user experience than is possible via Web technologies. It lets developers build integrated development environment (IDE), non-IDE, and even non-GUI applications for both the client and the server side. Although Eclipse RCP's basic concepts are easy to grasp, users face an extremely steep learning curve. In "Composing Systems with Eclipse Rich Client Platform Plug-Ins" in the November/December issue of Software, authors Andreas Kornstädt and Eugen Reiswich describe how to get started, using their hands-on experiences as a guide.
Given the critical mission of international security and the data and technical challenges it entails, society has a pressing need for a science of security informatics. This field encompasses advanced information technologies, systems, algorithms, and databases for security-related applications, using an integrated technological, organizational, and policy-based approach. Intelligent systems have much to contribute to this emerging field. A set of three articles in the September/October issue of Intelligent Systems highlights unique, innovative research frameworks, computational methods, and selected results and examples.
"Establishing Trust in Cloud Computing" is one of four feature articles in IT Pro's September/October special issue on hot topics in cloud computing. In addition, the issue includes two columns on the topic. In "Ethics and the Cloud," IT Pro editorial board members Keith W. Miller and Jeffrey Voas look at incremental steps in a larger "virtualization" of human civilization. In "Time to Push the Cloud," John Walz of AT&T Lucent and David Alan Grier of George Washington University call for the Computer Society to lead the development of standards to build trust and solve problems in cloud data exchange, interoperability, and security.
Visualization is at a point in its development where practitioners frequently find themselves grappling with big questions about its nature and purpose. Classical visualization theory treats visualization as a process of encoding data variables as visual variables, which the viewer then decodes. Experiments on how design affects users' interpretations of simple visualizations suggest that structural elements such as borders, fills, and arrangement (in addition to the traditional marks) carry significant, predictable semantic information. Drawing on these findings as well as design traditions, authors Caroline Ziemkiewicz and Robert Kosara argue in "Beyond Bertin: Seeing the Forest Despite the Trees," in the September/October issue of CG&A, that visual structure's apparent dynamics play a major role in a user's understanding of data and must be considered in the design and evaluation of visualizations.
Years ago, users stored data on a physical medium using a programming language's low-level read and write functions. In fact, all I/O libraries were tightly bound to the underlying OS, and users lost data as computers or the programming languages themselves changed. Today, we have open systems, and it's common to move from one platform to another. Decades ago, such capabilities were extra features; now, they're mandatory. The question is, how can we be sure that data follows the workflow in a sound, trustable way? In the September/October issue of CiSE, author Marc Poinot describes one such storage option, Hierarchical Data Format, based on his experiences at French aerospace lab Onera migrating data storage for the CFD General Notation System (CGNS) standard to HDF.
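HDF stores data in a self-describing, portable hierarchy of groups and datasets, which is what makes it attractive for moving structured scientific data between platforms. The article doesn't prescribe an API, but a minimal sketch of the hierarchical model using the common third-party h5py bindings for HDF5 (the file name and group layout here are illustrative assumptions, not CGNS conventions) might look like this:

```python
# Sketch of HDF5's hierarchical storage model via h5py (third-party
# library); group names and data below are hypothetical examples.
import h5py
import numpy as np

with h5py.File("flow.h5", "w") as f:
    # Groups nest like directories in a filesystem.
    zone = f.create_group("Base/Zone1")
    # Datasets hold typed arrays; attributes carry metadata,
    # making the file self-describing.
    zone.create_dataset("pressure", data=np.linspace(0.0, 1.0, 5))
    zone.attrs["units"] = "Pa"

with h5py.File("flow.h5", "r") as f:
    p = f["Base/Zone1/pressure"][()]  # read back as a NumPy array
```

Because the layout and metadata travel with the file, any HDF5-aware tool on any platform can recover both the arrays and their meaning, which is the portability property the article highlights.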
Securing operating systems has become increasingly difficult as their size and complexity continue to grow. New advances in hard disk technologies, however, provide a means to help manage this complexity. The proposed SwitchBlade architecture provides isolation for multiple OSs running on a single machine by confining them to segments that users can access only with a physical token. In "New Security Architectures Based on Emerging Disk Functionality" in the September/October issue of S&P, a team of authors from Penn State shows that SwitchBlade's isolation guarantees are equivalent to those of physically separate systems, but without the traditional usability burdens.
Pervasive computing technology can save lives by both eliminating the need for humans to work in hostile environments and supporting them when they do. In general, environments that are hazardous to humans are hard on technology as well. The October-December issue of Pervasive Computing contains three articles and a Spotlight column that illustrate the challenges of designing this technology and implementing it in hostile environments.
Internet Computing has four new departments and columns. Expanding the Global Internet, coedited by Barry Leiba and members of the Internet Society, covers Internet issues in developing nations; Beyond Wires, edited by Cecilia Mascolo, covers mobile computing topics; View from the Cloud, edited by George Pallis, discusses cloud computing and the Internet; and Backspace, edited by Vint Cerf, covers trends and hot topics.
"Rethinking Flash in the Data Center" is one of eight articles selected for Micro's July/August special issue on datacenter-scale computing. David G. Andersen of Carnegie Mellon University and Steve Swanson of the University of California, San Diego, argue that flash memory is poised to make deep inroads into the datacenter. However, they see limitations arising from its deployment as a drop-in replacement for existing disk and DRAM technologies. The challenge in extending flash memory's performance advantages is to define interfaces and abstractions that make it easy to manage its unique quirks.
"Accelerating the Media Business with MPEG Extensible Middleware," in the July-September issue of MultiMedia, reviews the ISO/IEC MPEG working group's MXM set of protocols and APIs for improving interoperability in the increasingly heterogeneous multimedia content-delivery chain. The article summarizes the standard's purpose, architecture, protocols, and open issues. It also includes an example MXM application.
The 2010 Design Automation Conference included a session titled "Computing without Guarantees" that looked at the potential of computing platforms that don't conform to the traditional, axiomatic notion of specification and implementation in electronic system design. Some application domains, digital signal processing, for example, don't require such a strong notion of equivalence, owing to the presence of noise in the input data and the limited perceptual ability of the humans consuming their output. A roundtable in D&T's September/October issue captures participants' discussion of the computing possibilities for this "inherent resilience."
The July-September issue of Annals includes a short essay, "The Network Information Center and Its Archives," in its Anecdotes department. Elizabeth Feinler, director of SRI's Network Information Systems Center until 1989, discusses the early history of the Internet and the key role of SRI's NIC project in managing and distributing information about net research and standards. "The NIC was privileged to be a part of the beginning adventure and to collaborate with some of the best and brightest at the time," she writes, in a warm first-person recollection.