GUEST EDITOR'S INTRODUCTION


by Kevin Rudd

November 2008—CHALLENGES AND OPPORTUNITIES IN COMPUTER ARCHITECTURE



Computer architecture is no different from any other subject area—there are successes and failures, ongoing research, and challenges and opportunities. This month in Computing Now, the focus is on challenges and opportunities in computer architecture and, as you might imagine, there are many. In many ways, Moore's Law, which characterizes the growth in transistor count over time, has been the driving force behind advances in computer architecture since it was first stated in 1965. It has motivated many businesses and many talented engineers across a wide range of disciplines to keep improving both the fundamental technologies involved and our ability to take advantage of those improvements. Today, our computing devices offer unprecedented power, often through multiple cores per chip. The challenge for architects and users alike is to use this power for good and not for evil.

The articles in this month's theme discuss these challenges and opportunities from several perspectives. "Some Computer Science Issues in Creating a Sustainable World" addresses environmental challenges and opportunities, considering both "mitigating the direct negative impact of computers ... and the indirect positive impact of computers." "Education, Outreach, and Training for High-Performance Computing" discusses education, outreach, and training challenges and opportunities in high-performance computing. "The Concurrency Challenge" covers some of the challenges and opportunities associated with the current state of the art of parallel programming. "Using Asymmetric Single-ISA CMPs to Save Energy on Operating Systems" addresses the challenges and opportunities of balancing useful performance with energy consumption, given the asymmetric requirements of applications and operating systems. "Virtualization Sparks Security Concerns" discusses the challenges and opportunities of securing virtual machines, given that such systems can't always be secured in the same way that physical systems can. And the three-part article "Why Computer Architecture Matters" provides a hands-on demonstration of "several aspects of computer architecture that scientific computer programmers should bear in mind when execution speed is important."

These articles provide a snapshot of many of the challenges and opportunities in computer architecture and, as an added bonus, a demonstration of why computer architecture matters. I hope that you enjoy them!

Kevin W. Rudd is a computer architect at Intel Corporation and is currently on sabbatical at Technische Universiteit Delft as a visiting professor. He's on the IEEE Micro editorial board. Contact him at ke6fzi at gmail dot com.



Theme — CHALLENGES AND OPPORTUNITIES IN COMPUTER ARCHITECTURE

   

Some Computer Science Issues in Creating a Sustainable World

by Jennifer Mankoff, Robin Kravets, and Eli Blevis
From the August 2008 issue of Computer

Among the biggest challenges the world faces today are the climate crisis and the broader issues of environmental sustainability raised in books such as Jared Diamond's Collapse: How Societies Choose to Fail or Succeed (Viking, 2004). Part of the solution to this problem depends on climate science, breakthrough technologies, and policy changes.


Education, Outreach, and Training for High-Performance Computing

by David Joiner, Charles Peck, Thomas Murphy, and Paul Gray
From the September/October 2008 issue of Computing in Science & Engineering

In the many-core future, parallel computing will be available for everyone, as processors with 16 or more cores run people's favorite applications and virtualization spreads from the data center to the entertainment center. There will be a great need for powerful high-performance computing (HPC)-enabled tools and curricula. (For an example, see "Applying Computational Science to Education: The Molecular Workbench Paradigm.")


The Concurrency Challenge

by Wen-mei Hwu, Kurt Keutzer, and Timothy G. Mattson
From the July/August 2008 issue of IEEE Design & Test of Computers

The semiconductor industry has settled on two main trajectories for designing microprocessors. The multicore trajectory began with two-core processors, with the number of cores doubling with each semiconductor process generation. A current exemplar is the recent Intel Core 2 Extreme microprocessor with four processor cores, each of which is an out-of-order, multiple-instruction-issue processor supporting the full x86 instruction set. The many-core trajectory began with a large number of far smaller cores and, once again, the number of cores doubles with each generation. A current example is the Nvidia GeForce 8800 GTX graphics processing unit (GPU) with 128 cores, each of which is a heavily multithreaded, single-instruction-issue, in-order processor that shares its control and instruction cache with seven other cores.
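These two trajectories ultimately meet the programmer in the form of explicitly parallel code. As a minimal sketch of the shared-memory, multicore side (my illustration, not an example from the article; the array size and compiler flag are assumptions), the loop below uses an OpenMP pragma to split a simple SAXPY computation across the available cores; build it with something like gcc -fopenmp.

    /* Minimal data-parallel sketch: SAXPY split across cores with OpenMP. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        const int n = 1 << 20;          /* illustrative problem size */
        const float a = 2.0f;
        float *x = malloc(n * sizeof *x);
        float *y = malloc(n * sizeof *y);
        if (!x || !y)
            return 1;

        for (int i = 0; i < n; i++) {
            x[i] = 1.0f;
            y[i] = 2.0f;
        }

        /* Each core receives an independent slice of the iteration space. */
        #pragma omp parallel for
        for (int i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];

        printf("y[0] = %f\n", y[0]);
        free(x);
        free(y);
        return 0;
    }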


Using Asymmetric Single-ISA CMPs to Save Energy on Operating Systems

by Jeffrey C. Mogul, Jayaram Mudigonda, Nathan Binkert, Parthasarathy Ranganathan, and Vanish Talwar
From the May/June 2008 issue of IEEE Micro

Our computer systems need to do better at balancing useful performance with energy consumption. In a recent paper, Barroso and Hölzle pointed out that most servers operate most of the time at relatively modest utilizations, seldom entirely idle and seldom fully utilized. However, typical servers consume almost as much energy when idle as they do when fully loaded, so their energy efficiency (useful work per joule) suffers in their normal operating region. Barroso and Hölzle argued that designers should "develop machines that consume energy in proportion to the amount of work performed."
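To see why flat power curves hurt, consider a back-of-the-envelope sketch (the wattage figures below are my own illustrative assumptions, not numbers from the article): if a server draws 150 W idle and 200 W at full load, its useful work per joule at 10 percent utilization is only about an eighth of what it achieves at full load.

    /* Hypothetical server whose power draw barely drops at idle; all
     * numbers are illustrative assumptions, not figures from the article. */
    #include <stdio.h>

    int main(void)
    {
        const double p_idle = 150.0;        /* watts when idle (assumed)    */
        const double p_peak = 200.0;        /* watts at full load (assumed) */
        const double best = 1.0 / p_peak;   /* work per joule at 100% load  */

        for (int pct = 10; pct <= 100; pct += 10) {
            double util  = pct / 100.0;
            double power = p_idle + util * (p_peak - p_idle); /* nearly flat */
            double wpj   = util / power;                      /* work/joule  */
            printf("%3d%% utilization -> %5.1f%% of peak energy efficiency\n",
                   pct, 100.0 * wpj / best);
        }
        return 0;
    }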


Virtualization Sparks Security Concerns

by Steven J. Vaughan-Nichols
From the August 2008 issue of Computer

Virtualization is rapidly becoming a standard technology for businesses. The technology lets a single PC or server simultaneously run multiple operating systems or multiple sessions of a single OS. This lets users put numerous applications and functions on a PC or server, instead of having to run them on separate machines as in the past.


Why Computer Architecture Matters

by Cosmin Pancratov, Jacob M. Kurzer, Kelly A. Shaw, and Matthew L. Trawick
From the May/June 2008, July/August 2008, and September/October 2008 issues of Computing in Science & Engineering

When we write computer programs for data analysis or simulations, we typically translate the relevant mathematical algorithm into computer code as directly as possible, with little regard for how the computer will actually perform the computations. We choose to ignore such details as how individual calculations are divided among the several arithmetic and floating-point units (FPUs) on the CPU, how data is shuttled between the CPU and the main memory, and how recently used data is temporarily stored in the computer's various levels of memory caches. These questions fall under the broad heading of "computer architecture," and in many cases, our intentional ignorance serves us well. A compiler can usually translate our programs into reasonably efficient machine code quite effectively—and for many short calculations, modern computers are already many times faster than they need to be. For longer computations, however, programming with a little bit of attention to the architecture can sometimes produce gains in execution speed that are significant enough to make the extra effort worthwhile.
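As one small illustration of the kind of architectural detail the series examines (my example, not the authors'; the matrix size is arbitrary), the two loop nests below compute the same sum over a matrix, but only the first traverses memory in the order C actually stores it. On most machines the second is markedly slower because its large stride defeats the caches.

    /* Same arithmetic, different memory access order. */
    #include <stdio.h>

    #define N 2048

    static double a[N][N];   /* stored row-major, as C requires */

    int main(void)
    {
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                a[i][j] = 1.0;

        double sum = 0.0;

        /* Cache-friendly: consecutive j values touch consecutive addresses. */
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                sum += a[i][j];

        /* Cache-hostile: each step jumps N * sizeof(double) bytes. */
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                sum += a[i][j];

        printf("sum = %f\n", sum);
        return 0;
    }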


 


What's New

   

Licensing Software Engineers?

by Philippe Kruchten
From the November/December 2008 issue of IEEE Software

In your trajectory to further your professional development, should you aim to become a licensed professional software engineer? Before I can answer this question, I must clarify what this really means.


Moving Scientific Codes to Multicore Microprocessor CPUs

by Paul R. Woodward, Jagan Jayaraj, Pei-Hung Lin, and Pen-Chung Yew
From the November/December 2008 issue of Computing in Science & Engineering

The IBM Cell processor represents the first and most extreme of a new generation of multicore CPUs. For scientific codes that can be formulated in terms of vector computing concepts, the Cell is, as far as we know, the most rewarding. In this article, our team at the University of Minnesota presents a method for implementing numerical algorithms for scientific computing so that they run efficiently on the Cell processor and other multicore CPUs. We present our method using the Piecewise-Parabolic Method (PPM) gas dynamics algorithm but believe that many other algorithms could benefit from our approach. Nevertheless, the code transformations are difficult to perform manually, so we're undertaking an effort to build simplified tools to assist with at least the most tedious of them.
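The article describes its transformations in detail; as a generic taste of this style of restructuring (not the authors' specific method, and the field names below are made up), the sketch reorganizes an array of structures into a structure of arrays so that an update loop becomes a set of unit-stride streams that a vectorizing compiler can handle.

    /* Generic illustration: array of structures (AoS) versus structure of
     * arrays (SoA); the SoA update is the form vectorizers prefer. */
    #include <stdio.h>

    #define N 1024

    struct cell_aos { double rho, u; };           /* fields interleaved */
    struct cells_soa { double rho[N], u[N]; };    /* fields contiguous  */

    static struct cell_aos  aos[N];
    static struct cells_soa soa;

    int main(void)
    {
        const double dt = 1.0e-3;

        for (int i = 0; i < N; i++) {
            aos[i].rho = soa.rho[i] = 1.0;
            aos[i].u   = soa.u[i]   = 0.5;
        }

        /* AoS update: consecutive rho values are a whole struct apart,
         * which hinders vectorization. */
        for (int i = 0; i < N; i++)
            aos[i].rho += dt * aos[i].u;

        /* SoA update: unit-stride accesses over each field. */
        for (int i = 0; i < N; i++)
            soa.rho[i] += dt * soa.u[i];

        printf("%f %f\n", aos[0].rho, soa.rho[0]);
        return 0;
    }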


IT Predictions for 2009

by Phillip A. Laplante
From the November/December 2008 issue of IT Professional

Earlier this year, I made a set of predictions ("IT Predictions for 2008," IT Professional, vol. 10, no. 1, 2008, pp. 62–64) that involved various scenarios of interest to the IT professional and were intended to remain valid for the next couple of years. It seems appropriate, then, to close the year out by seeing how accurate I've been so far and to formulate an updated forecast for 2009.


The Creation Process of Chinese Calligraphy and Emulation of Imagery Thinking

by Jun Dong, Miao Xu, Xian-jun Zhang, Yan-qing Gao, and Yun-he Pan
From the November/December 2008 issue of IEEE Intelligent Systems

Chinese calligraphy is a unique shape-based form of written expression used continuously for several thousand years. Examples of this renowned art form survive in various tablets and documents preserved throughout history. Today, professional calligraphers reference these materials in advertising, document-processing environments, and the generation of new calligraphic styles.


Experiencing the Past through the Senses: An M-Learning Game at Archaeological Parks

by Carmelo Ardito, Paolo Buono, Maria F. Costabile, Rosa Lanzilotti, Thomas Pederson, and Antonio Piccinno
From the October–December 2008 issue of IEEE MultiMedia

M-learning—the combination of e-learning with mobile technologies—captures the very nature of e-learning by providing users with independence from the constraints of time and location. To exploit the potential of mobile technologies for learning, researchers must define new teaching and learning techniques. The Explore! m-learning system implements an excursion-game technique to help middle school students (ages 11 through 13) acquire historic knowledge while playing in an archaeological park.


Financial Crisis and the Role of Risk-Management Software

by Greg Goth
From the November 2008 issue of IEEE Distributed Systems Online

The global financial services industry, perhaps more than any other, illustrates the overwhelming power and egalitarian nature of modern telecommunications. Anyone with an Internet connection and a cash account can buy and sell any number of financial instruments online. They can transfer assets from bank to bank in the blink of an eye. In an ad for a large US-based online trading firm, a customer extols his ability to sit in his home at midnight and trade shares on the Hang Seng exchange in Hong Kong.


From Cells to Cell Processors: The Integration of Health and Video Games

by Ben Sawyer
From the November/December 2008 issue of IEEE Computer Graphics and Applications

Too often, conventional wisdom states that video and computer games offer no educational benefit, waste time, and, worst of all, are inherently unhealthy. As games have become more powerful, providing exciting input methods and pushing out more graphical richness and realism, the chorus of naysayers has increased in volume, if not numbers. Thankfully, the curves provided by Moore's law and Metcalfe's law (a communication network's value is proportional to the square of the number of its users) are accompanied by growing appreciation for video and computer games beyond their entertainment underpinnings. The combined growth in the power of games and the wider understanding of their potential impact is creating a proper rebuttal to the conventional wisdom that games are simple stuff for kids.


The Future is Reconfigurable

by James Figueroa
Computing Now Exclusive Content — November 2008

It won't be easy for students to create a new clustering algorithm for FPGA CAD tools, but some ambitious minds are certain to try. Accomplishing that task would give the winning team bragging rights for the first design competition at the International Conference on Field-Programmable Technology (http://vda.ee.nctu.edu.tw/ICFPT/FPT08.htm) in December, along with the opportunity to have their algorithm published. More than that, such an accomplishment would illustrate just how far the world of reconfigurable computing has come and show what's possible in the future.


Slouching Toward a Dystopian Internet

by Shane Greenstein
From the September/October 2008 issue of IEEE Micro

Despite the long-term success of the commercial Internet, or perhaps because of its success, there is a vigorous and ongoing discussion about avoiding a dystopian future.


Cross-Border Data Flows and Increased Enforcement

by Peter McLaughlin
From the November/December 2008 issue of IEEE Security & Privacy

The term "privacy" is subject to many definitions and descriptions. According to Jim Harper of the Cato Institute, "Properly defined, privacy is the subjective condition people experience when they have the power to control information about themselves and when they exercise that power consistent with their interests and values." Regardless of whether we agree with Harper about the precise phrasing, the definition he posits reflects internationally accepted ideas that the collection, use, sharing, and protection of personal information should be subject to some degree of informed notice to and consent from the affected individual. (For more on this, see the "For further reading" sidebar.) The EU Data Protection Directive takes a somewhat different tack and defines personal data as data relating to an identified or identifiable individual, and then allocates a series of rights to the individual regarding the data, particularly regarding notice, consent, and other principles intended to grant an individual reasonable control over the data relating to him or her. (For more on the directive, see http://ec.europa.eu/justice_home/fsj/privacy/index_en.htm.)


The Dea[r]th of Human Understanding

by Neville Holmes
From the October 2008 issue of Computer

Three years ago in this column I expressed concerns about the personal effects of videogaming (Nov. 2005, pp. 116, 114–115). Since then, those concerns have grown. Recent reading suggests that my previously described concerns are too narrow and that the increasing use of digital technology is having a widespread and degrading influence on humanity.


Telework: A Productivity Paradox?

by Stephen Ruth and Imran Chaudhry
From the November/December 2008 issue of IEEE Internet Computing

Is it a solution to our reliance on foreign oil? A reversal of the trend toward outsourcing US jobs overseas? A technology fix that will change the way we live? The subject is telework, and a growing number of advocates feel that it's an underreported answer to several major challenges. Telework — moving the work to the workers instead of the workers to work — is a term that researcher Jack Nilles originated more than 30 years ago (http://jala.com/definitions.php). Also known as telecommuting, it's widely used in the private sector and is gaining popularity in the public sector, as more government branches mandate its implementation. A Washington state senator, noting that the average commute in his Central Puget Sound constituency had risen significantly in three years, said "I can't tell you the number of people who told me if they could just work from home, they wouldn't have to get on the road every day." The senator believes that telecommuting is a better solution than building new roads.


OCP-IP NoC Benchmarking WG Activities

by Ian R. Mackintosh
From the September/October 2008 issue of IEEE Design & Test of Computers

The Open Core Protocol (OCP) standard addresses the world of heterogeneous processor, multicore SoC development. Since the announcement of the OCP-IP organization in December 2001, eight technical working groups (WGs) have emerged to address activities on developing tools, technologies, and products supporting the OCP standard, with valuable deliverables available free to members.


Urban Social Tapestries

by Alice Angus, Dikaios Papadogkonas, George Papamarkos, George Roussos, Giles Lane, Karen Martin, Nick West, Sarah Thelwall, Zoetanya Sujon, and Roger Silverstone
From the October–December 2008 issue of IEEE Pervasive Computing

Urban Tapestries (UT) is an exploration into the potential costs and benefits of public authoring—that is, mapping and sharing local knowledge, memories, stories, sensed information, and experiences. It aims to reveal the potential of pervasive computing to create and support relationships that surpass established social and cultural boundaries and enable new practices around place, identity, and community. Proboscis, an artist-led studio, conceived and initiated UT in 2002, and since then, has further developed the concept in collaboration with several technical, academic, and civil society partner organizations. The core enabler is a pervasive computing platform developed specifically to support public authoring in its many expressions.