Issue No. 11, November 2009 (vol. 42)
Published by the IEEE Computer Society
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MC.2009.348
The IEEE Computer Society's 13 peer-reviewed magazines cover cutting-edge topics including software, IT and scientific computing applications, microprocessors, design and test, security, Internet computing, machine intelligence, digital graphics, and computer history.
Select articles from recent issues of Computer Society magazines are highlighted below.
More than 40 years ago, the term "software engineering" was coined as a challenge to establish software design and development on a firm engineering footing. Twenty years ago, Mary Shaw's classic article "Prospects for an Engineering Discipline of Software" assessed progress toward that goal. Shaw's latest update, in the most recent issue of Software, shows that the profession has made progress but still has much left to do.
Ontologies represent items of knowledge—ideas, facts, things—in a way that defines the relationships and classifications of concepts within a specified domain of knowledge. It's this ability to define various useful relationships among items of knowledge, and to implement these relationships in software, that makes an ontology such a powerful tool in the knowledge manager's toolbox. A new tutorial in IT Pro, "Just What Is an Ontology, Anyway?," by Thomas C. Jepsen, addresses several definitions of "ontology" as they relate to computer applications. Jepsen also gives an overview of common ontology-based applications.
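To make the idea concrete, here is a minimal sketch of ontology-style classification and relationships in Python. The "publications" domain, class names, and `appearsIn` relation are invented for illustration; they are not drawn from Jepsen's tutorial.

```python
# A toy ontology: an is-a hierarchy plus typed relations between concepts.
# Domain and names are hypothetical, chosen only to illustrate the idea.

class Ontology:
    def __init__(self):
        self.parents = {}      # concept -> parent concept (is-a link)
        self.relations = []    # (subject, predicate, object) triples

    def add_class(self, concept, parent=None):
        self.parents[concept] = parent

    def relate(self, subject, predicate, obj):
        self.relations.append((subject, predicate, obj))

    def is_a(self, concept, ancestor):
        # Walk up the is-a hierarchy to answer classification queries.
        while concept is not None:
            if concept == ancestor:
                return True
            concept = self.parents.get(concept)
        return False

onto = Ontology()
onto.add_class("Publication")
onto.add_class("Magazine", parent="Publication")
onto.add_class("Article", parent="Publication")
onto.relate("Article", "appearsIn", "Magazine")

print(onto.is_a("Magazine", "Publication"))  # True
```

Real ontology languages such as OWL add far richer constructs (property constraints, cardinality, inference), but the core value is the same: once relationships are explicit, software can answer classification queries automatically.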
Topics covered in CG&A's special issue on recent developments in 3D user interface research include reality- and imagination-based interaction, pointing techniques, analysis of rapid aimed movements, temporal-data visualizations, and navigation of augmented CAD models.
Virtually every Internet application relies on the Domain Name System, but security wasn't a major goal of its original design. The result is several critical vulnerabilities, reviewed in the September/October 2009 S&P special issue on DNS security. To address the security challenges, the security community developed DNS Security Extensions, which are undergoing deployment. Articles summarize key aspects of deploying DNSSEC at authoritative servers and resolvers, and of how resolvers learn public keys.
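The core of DNSSEC is a chain of trust: each parent zone publishes a signed digest (a DS record) of its child zone's key, so a resolver holding only the root's trust anchor can validate keys all the way down. The sketch below models that walk in Python; it is a toy illustration, with HMAC standing in for DNSSEC's real public-key RRSIG signatures and the key values invented.

```python
import hashlib
import hmac

# Toy model of the DNSSEC chain of trust. Real DNSSEC uses public-key
# signatures (RRSIG records); here HMAC with the zone key is a stand-in,
# and a SHA-256 digest of the child's key models the parent's DS record.

def ds_digest(zone_key):
    return hashlib.sha256(zone_key).digest()

def sign(zone_key, data):
    return hmac.new(zone_key, data, hashlib.sha256).digest()

def verify(zone_key, data, sig):
    return hmac.compare_digest(sign(zone_key, data), sig)

root_key = b"root-zone-key"            # the resolver's trust anchor
org_key = b"org-zone-key"              # hypothetical .org key
example_key = b"example.org-zone-key"  # hypothetical example.org key

# Each parent zone signs a DS digest of its child's key.
signed_ds = [
    (ds_digest(org_key), sign(root_key, ds_digest(org_key))),
    (ds_digest(example_key), sign(org_key, ds_digest(example_key))),
]
child_keys = [org_key, example_key]

def validate(trust_anchor, signed_ds, child_keys):
    """Walk from the trust anchor down, checking each link in turn."""
    parent = trust_anchor
    for (ds, sig), child in zip(signed_ds, child_keys):
        if not verify(parent, ds, sig):   # did the parent sign this DS?
            return False
        if ds != ds_digest(child):        # does the DS match the child key?
            return False
        parent = child
    return True

print(validate(root_key, signed_ds, child_keys))  # True
```

If any link is tampered with, validation fails, which is exactly the property that defeats cache-poisoning attacks against unsigned DNS.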
E-government and e-participation research aims to provide technologies and tools for more efficient public-administration systems and more participatory decision processes. To this end, interest is growing in how this challenging domain can benefit from emerging "intelligent" technologies, tools, and applications such as the Semantic Web, service-oriented architectures, Web 2.0, and social computing.
In "Transforming E-government and E-participation through IT," IS contributors Vassilios Peristeras, Gregoris Mentzas, Konstantinos A. Tarabanis, and Andreas Abecker note that governments invest heavily in information and communication technologies but are still far from satisfying their constituents.
Cloud computing is location-agnostic, providing dynamically scalable, virtualized resources as services over the Internet. In a recent special issue of IC, the guest editors provide broad introductory definitions of cloud computing concepts and introduce other articles that investigate some of the most fundamental issues concerning cloud services' development and deployment.
Photonic networks-on-chip have distinct advantages over electronic NoCs, including communication bandwidth approaching multiple terabits per second with limited power dissipation. In the July/August issue of Micro, Columbia University researchers explore the design of photonic NoCs for delivering a scalable solution to future multicore processors' performance requirements in "Photonic NoCs: System-Level Design Exploration."
Steganography is the art and science of writing messages in a way that hides the existence of communication. It can be combined with cryptography to achieve a high level of security. Steganographic schemes abound for hiding messages in images with low dynamic ranges. However, these schemes operate in a fixed luminance range and don't work with high-dynamic-range images, where each image spans a different luminance range. An article in the July-September issue of MultiMedia, "A Novel Approach to Steganography in High-Dynamic-Range Images," presents a message-hiding approach for HDR images. The approach supports authentication and a large embedding capacity with low image distortion.
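The classic low-dynamic-range schemes the article contrasts with its HDR approach typically hide message bits in the least significant bits of 8-bit pixel values. The sketch below illustrates that LSB technique only, not the article's HDR method; pixels are modeled as a flat list of integers rather than an actual image file.

```python
# A sketch of classic least-significant-bit (LSB) steganography for
# 8-bit (low-dynamic-range) pixel data. Illustrative only -- this is
# not the HDR approach described in the MultiMedia article.

def embed(pixels, message):
    """Hide message bytes, LSB-first, one bit per pixel."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("message too long for cover image")
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit   # overwrite only the LSB
    return out

def extract(pixels, length):
    """Recover `length` bytes from the pixels' least significant bits."""
    data = bytearray()
    for b in range(length):
        byte = 0
        for i in range(8):
            byte |= (pixels[b * 8 + i] & 1) << i
        data.append(byte)
    return bytes(data)

cover = [120, 121, 130, 97] * 20       # toy 8-bit cover "image"
stego = embed(cover, b"hi")
assert extract(stego, 2) == b"hi"
```

Because only the least significant bit changes, each pixel is altered by at most 1—imperceptible for 8-bit images, but meaningless for HDR pixels whose luminance range varies per image, which is the gap the article's approach addresses.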
Design & Test
The September/October D&T special issue on 3D IC integration is guest-edited by David Kung of IBM T.J. Watson Research Center and Yuan Xie of Pennsylvania State University. Four articles address different challenges to giving chip architects the flexibility and design options of 3D IC technology as they pursue solutions to the complexities and cost of scaling to 22 nm and beyond.
Nur Touba of the University of Texas at Austin, technical program chair for the 2008 International Test Conference, invited the authors of three outstanding conference papers to update their work for D&T.
In Annals' July–September Anecdotes department, Stanley Mazor recollects his work as liaison to Magnavox in developing the Intel 8244 custom chip for Magnavox's Odyssey2 videogame console. Mazor teamed with Intel chip designer Peter Salmon to deliver the 8244 on time to meet Magnavox's announced plan to release the console by the 1977 holiday season. Intel met its schedule, although the Odyssey2 system didn't appear until 1979.
Mazor joined Intel in 1969. He worked with Ted Hoff and Federico Faggin to deliver the first working CPU, the Intel 4004, in 1971.