August 2008


Guest Editor's Introduction

By Joseph A. Paradiso



Hacking—a word laden with fear and apprehension for most people today, who associate it with malicious and often criminal computer break-ins and related mischief. Those of us from an earlier generation, however, remember this term as something more positive and even laudable. The gifted hacker was a true technological improviser—a collage artist who could rapidly juxtapose bits of found devices, on-hand materials, and pieces of software into new inventions. Hacking of this sort is endemic in American popular media, from TV's MacGyver to Marvel Comics' Tony Stark. This fascination with hacking perhaps dates back to our original settlers who had to build a life out of whatever they found on hand.


Hacking is also crucial in the process of innovation—the first realization of many ideas comes from a quickly hacked "demo" or proof-of-concept built by "lead users" who modify existing devices to solve a particular problem or attain new functionality. Indeed, studies of innovation indicate that, from the Industrial Revolution to modern times, most product innovation has come via lead user "hacking" rather than through established corporate development.


IEEE Computer Society publications, with their attention to professional rigor, aren't generally known as sources for information on hacking—one thinks more of hobbyist magazines like Circuit Cellar and Make, or the many DIY (do-it-yourself) Web sites cropping up. But behind the façade of every talented engineer lurks a weekend hacker, and if one peers a bit more deeply into Computer Society articles, the hacker will often manifest through ingenious workarounds, improvised test apparatus, quick prototypes, and just-in-time techniques. Accordingly, in this month's issue of Computing Now, we have assembled a sampling of articles from the IEEE Computer Society archives that highlight hacking in different ways, crowned by a set of articles from the July–September issue of IEEE Pervasive Computing (edited by Tom Zimmerman, John Heidemann, and myself). These articles celebrate the traditional practice of hacking in engineering and illustrate its ramifications for business, art, media, and innovation.


Hacking is much more than a trivial pastime or computer intrusion—indeed, benevolent hacking is crucial to creative engineering.


Joseph A. Paradiso is an associate professor at the MIT Media Laboratory, where he directs the Responsive Environments Group and codirects the Things That Think Consortium. Contact him at joep at media dot mit dot edu.



Theme — HACKING FOR INNOVATION

   

Hacking the Nintendo Wii Remote

By Johnny Chung Lee
From the July–September 2008 issue of IEEE Pervasive Computing

In November 2006, Nintendo released its fifth home videogame console, the Nintendo Wii. The company's previous game console, the Gamecube, hadn't fared well in terms of market share against the much higher-powered alternatives released by its competitors, Microsoft and Sony. At first the Wii also seemed significantly underpowered relative to its competitors. However, one year later it became the market leader of its console generation, selling over 20 million units worldwide. This success is largely attributable to the innovative interactive technology and game-play capabilities introduced by the console's game controller, the Wii remote.


Hacking, Mashing, Gluing: Understanding Opportunistic Design

By Björn Hartmann, Scott Doorley, and Scott R. Klemmer
From the July–September 2008 issue of IEEE Pervasive Computing

Opportunistic practices in interactive system design include copying and pasting source code from public online forums into your own scripts, taking apart consumer electronics and appropriating their components for design prototypes, and "Frankensteining" hardware and software artifacts by joining them with duct tape and glue code. We consider these opportunistic practices part of mashup design. Although many ubiquitous computing practitioners have engaged in these practices, design tools and software engineering research don't traditionally address them.


Engineered Reality: Prototyping Inventions for Television

By Andrew G. (Zoz) Brooks and Joe Grand
From the July–September 2008 issue of IEEE Pervasive Computing

Invention—the act of bringing something new into existence through ingenious thinking—conjures up images of the reclusive genius slaving away in a basement laboratory (or perhaps, these days, in a corporate R&D section). The inventor starts with a problem and ends with a tangible product that places new powers in the hands of the people who will eventually use it. This traditional view, however, omits the wide class of lead users, or hackers—individuals who adapt, modify, and improve existing inventions to create new or better versions, or to solve other problems that the invention was never intended to address. The prevalence of these intermediate tinkerers shows that people are fascinated not only with the result of the invention process, but also with the process itself. The television industry has attempted to fill this niche with programming dedicated to showcasing and elevating the traditions of both innovation and hacking.


Open Source Software in Industry

By Christof Ebert
From the May/June 2008 issue of IEEE Software

Innovation is driven by open source software. Today, we rarely see traditional software development, where one company designs and builds an application or product from scratch and then integrates and evolves it within a defined market. That's no longer a healthy business case. Engineering life cycles, business models, distribution channels, and services have dramatically changed. Software products are so complex that it would be impossible to develop, maintain, and evolve only unique software. Companies are moving up to the applications and service level, increasingly relying on reused, externally available components. Typical schemes today imply that software companies work in a market segment, building their specific products or services on top of existing frameworks, libraries, Web services, and other components. Many companies focus on a specific value creation model built upon reusing components with as little reworking as possible. Competition is fierce; it demands that we reuse software on a level that even reuse protagonists some 10 years ago would have considered hard to believe.


What Hackers Learn that the Rest of Us Don't: Notes on Hacker Curriculum

By Sergey Bratus
From the July/August 2007 issue of IEEE Security & Privacy

The hacker culture has accumulated a wealth of efficient practices and approaches to computer technologies—in particular, to analysis, reverse engineering, testing, and software and hardware modification—that differ considerably from those of both the IT industry and traditional academia. (Some groups in academia share and have influenced elements of the hacker culture, but these are exceptions, not the rule.) In particular, the "curriculum" a hacker experiences while learning his skills is substantially different from that of the typical computer science student. Yet, in many respects, this curriculum produces impressive results that enrich other cultures, and its influence on, and exchange of ideas with, the more traditional cultures are growing. Thus, understanding and describing this curriculum and these approaches is becoming more important by the day.


A review of T.F. Peterson's Nightwork: A History of Hacks and Pranks at MIT

By Dag Spicer
From the April–June 2007 issue of IEEE Annals of the History of Computing

This glimpse into the Massachusetts Institute of Technology's famous "hacker" culture showcases more than a century of silliness on the MIT campus. An important, albeit lighthearted, study of MIT and its engineering culture, the book supplements an earlier volume by Brian Leibowitz covering much the same topic—The Journal of the Institute for Hacks, TomFoolery, and Pranks at MIT, MIT Museum, 1990—and a second book, Is This the Way to Baker House? Most of the material in those two books is covered in this latest one, but with a more conversational (and less technically detailed) approach.




Keith Farkas, IEEE Pervasive Computing magazine's liaison to Computing Now, helped to assemble the articles in this theme. Keith is a staff engineer at VMware. Contact him at kfarkas at vmware dot com.

 


What's New

   

ZoneTag's Collaborative Tag Suggestions: What is This Person Doing in My Phone?

By Mor Naaman and Rahul Nair
From the July–September 2008 issue of IEEE MultiMedia

Thousands of people visit the Golden Gate Bridge every day. Many (if not most) of these visitors take photographs of the bridge from the same picture spots. Even though we're eight years into the 21st century, the overwhelming majority of these photographers will return home and load the photos into some folder on their computer's hard drive, each photo bearing a highly nondescript name such as DSC02211.jpg. Surely, we can do better than that.


Using Static Analysis to Find Bugs

By Nathaniel Ayewah, William Pugh, David Hovemeyer, J. David Morgenthaler, and John Penix
From the September/October 2008 issue of IEEE Software

Software quality is important, but often imperfect in practice. We can use many techniques to try to improve quality, including testing, code review, and formal specification. Static-analysis tools evaluate software in the abstract, without running the software or considering a specific input. Rather than trying to prove that the code fulfills its specification, such tools look for violations of reasonable or recommended programming practice. Thus, they look for places in which code might dereference a null pointer or overflow an array. Static-analysis tools might also flag an issue such as a comparison that can't possibly be true. Although the comparison won't cause a failure or exception, its existence suggests that it might have resulted from a coding error, leading to incorrect program behavior.
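
To make these checks concrete, here is a short C fragment of my own (illustrative only; it isn't taken from the article and isn't tied to any particular tool) containing the kinds of issues the authors describe: a possible buffer overflow, a possible null-pointer dereference, and a comparison that can never be true.

    /* Illustrative only: defects of the kind a static-analysis tool
       can report without ever running the program. */
    #include <stdio.h>
    #include <string.h>

    void greet(const char *name)
    {
        char buf[8];
        strcpy(buf, name);                 /* possible buffer overflow      */

        char *comma = strchr(buf, ',');    /* strchr can return NULL ...    */
        printf("split at '%c'\n", *comma); /* ... possible null dereference */

        if (strlen(buf) < 0)               /* strlen() is unsigned, so this
                                              comparison is always false --
                                              likely a coding error         */
            printf("unreachable\n");
    }

A human reviewer can easily overlook such spots, but a tool that reasons about the code in the abstract can report all three without a single test case being run.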


When Web 2.0 Becomes Web Uh-Oh

By Greg Goth
From the August issue of IEEE Distributed Systems Online

The promise of cross-organizational computing and communications has long been a Holy Grail for network architects. From the dawn of the Arpanet to today's deployments of service-oriented architectures (SOAs) and remotely hosted applications, wider reuse of standards-compliant software components has been a constant goal. The rise of social networks and Web 2.0 principles is the latest trend in reusing software on nonhierarchical architectures.


Computer Scientist, Software Engineer, or IT Professional: Which Do You Think You Are?

By Keith W. Miller and Jeffrey Voas
From the July/August 2008 issue of IT Professional

In our first year of publication, IT Professional included an article by Bill Lowell and Angela Burgess titled "A Moving Target: Studies Try to Define the IT Workforce." In that article, the authors complained that job titles were being invented and qualifications were shifting daily. They suggested a "core four" out of the plethora of job titles: computer scientist, computer engineer, systems analyst, and programmer. In a more recent article, Robert Glass, Venkataraman Ramesh, and Iris Vessey broke down the industry using the academic subdivisions of computer science, software engineering, and information systems. Given the fast rate of change in this industry, perhaps it's an appropriate time to revisit what an "IT professional" really is.


Exposing Fortran Derived Types to C and Other Languages

By Alexander Pletzer, Douglas McCune, Stefan Muszala, Srinath Vadlamani, and Scott Kruger
From the July/August 2008 issue of Computing in Science & Engineering

In the past decade, scientific codes written in Fortran have increasingly come to rely on derived types, also known as compound or record types, for multiple reasons. Using derived types can simplify the API and let the code evolve without breaking backward compatibility. Adding new features might involve merely extending the derived types or attaching new methods (the existing methods' signatures needn't change). Although derived types also provide a mechanism for data hiding and encapsulation, perhaps more importantly, they're a step toward object-oriented programming whereby a composite object is created, has a life span during which it can be altered, and is finally destroyed. The combination of data members and procedures acting on the derived type forms a "class" in the same sense as the class keyword in C++, Java, and Python, so programmers can emulate many object-oriented constructs in languages such as Fortran 90/95 (or C, for that matter) that don't formally have an object-oriented syntax.
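
As a rough sketch of the pattern described here (my own illustration in C rather than Fortran, with hypothetical names), a record type together with procedures that create, modify, and destroy it behaves much like a class whose instances have a managed life span:

    /* Illustrative sketch: emulating a "class" with a record type and
       procedures that act on it, in the spirit of the derived-type
       approach the authors describe. */
    #include <stdlib.h>

    typedef struct {            /* the data members (the "derived type") */
        double *values;
        int     n;
    } Profile;

    Profile *profile_create(int n)        /* the object's life span begins */
    {
        Profile *p = malloc(sizeof *p);
        p->values = calloc(n, sizeof *p->values);
        p->n = n;
        return p;
    }

    void profile_set(Profile *p, int i, double v)   /* alter the object */
    {
        if (i >= 0 && i < p->n)
            p->values[i] = v;
    }

    void profile_destroy(Profile *p)      /* the object is finally destroyed */
    {
        free(p->values);
        free(p);
    }

Because callers hold only a pointer to the record and work through these procedures, the internal layout can change without breaking them, which is the backward-compatibility benefit noted above; exposing such a type to another language then typically reduces to passing that opaque handle across the boundary.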


Joseph Weizenbaum (1923–2008)

By Joel Moses and Jeff Meldman
From the July/August 2008 issue of IEEE Intelligent Systems

Joseph Weizenbaum was born in Berlin in 1923 and died there on 5 March 2008 at the age of 85. His German-Jewish family left Germany in 1935 and came to the US. Joe made major contributions to computer science and to the applications of computers, but was best known as a critic of AI. While working for General Electric in the 1950s, he helped develop one of the first banking applications using the magnetically encoded fonts on checks.


Extensible GUIs for Remote Application Control on Mobile Devices

By Fabrizio Lamberti and Andrea Sanna
From the July/August 2008 issue of IEEE Computer Graphics and Applications

A chasm exists between the desktop and mobile worlds in terms of hardware performance, graphics capabilities, and input devices. Some kinds of software, such as 3D graphics or advanced word-processing tools, are still out of reach for handheld devices. So, although redesigning a desktop application for mobile environments is often possible, mobile devices' limitations discourage using such software in nomadic scenarios.


Hacked Devices, A New Game Experience, and a Wi-Fi Detector Shirt

Maria Ebling and Mark Corner, eds.
From the July–September 2008 issue of IEEE Pervasive Computing

This issue's New Products department covers several hacks for devices, new and old, and a couple of things that should appeal to the hacker tradition: the jDome game and a Wi-Fi detector shirt.


HandTalk: Assistive Technology for the Deaf

By David K. Sarji
From the July 2008 issue of Computer

Although embedded computing technology is widely used today in many safety-critical applications such as avionics and antilock brakes, researchers have only begun to explore its potential. As more commonplace items such as washing machines and thermostats incorporate embedded processing capability, the number of such systems could grow exponentially.


Open Wireless Networks on University Campuses

By Kjell J. Hole, Lars-Helge Netland, Yngve Espelid, André N. Klingsheim, Hallvar Helleseth, and Jan B. Henriksen
From the July/August 2008 issue of IEEE Security & Privacy

Universities are particularly interested in wireless communication networks because they let students using wireless-enabled mobile terminals download lecture slides, watch educational audio and video programs, and take online practice quizzes at any time and from anywhere on campus. This can both reduce paper handouts and simplify paperless assignments and submissions. Wireless networks can also strengthen teamwork among students and faculty, making it easier for them to email each other with preliminary results, use chat channels to discuss problems, and readily access information resources during problem-solving sessions. When students and faculty can access all the information they need via their own mobile terminals, universities can even consider retiring their expensive computer labs.


ERP is Dead, Long Live ERP

By Paul Hofmann
From the July/August 2008 issue of IEEE Internet Computing

Some mature information technologies, such as enterprise resource-planning (ERP) systems and relational databases (RDBs), are now undergoing commoditization, much like what happened in the automotive and chemical industries over the past 15 years. This trend is accelerated by reduced IT spending because of slowing economic growth.

In such an environment, market leaders in computing, networking, and telecommunications need to increase investment in disruptive markets and business models. Growing in a down market will require innovation; leaders must pick the right breakthrough technologies from current trends, including high-performance computing, pervasive connectivity, Web services, and service-oriented architecture (SOA).


Who Knew this "Experiment" would be so Successful?

By Betsy Weitzman
From the July/August 2008 issue of IEEE Design & Test

The Gigascale Systems Research Center (GSRC) truly represents a transformational success story in a unique industry-government-academia partnership. It is remarkable to observe how this research center has evolved over time. Not only has it become ever more responsive to the needs of its diverse sponsor community, but it has also infused its own visionary ideas into the process of redefining itself at critical junctures to drive truly innovative and relevant research in the system design arena.


Voting and Economic Asymmetry

By Shane Greenstein
From the July/August 2008 issue of IEEE Micro

Outside of a few square miles around Palo Alto, the vast majority of presidential voters decide their votes on the basis of policies other than those that shape high-tech markets. The list this year is long: the war in Iraq, the US healthcare system, the security of the nation in light of terrorist threats, the potential collapse of the financial system under the weight of unserviceable mortgages, and numerous social issues.

If you look a little deeper you will notice an asymmetry. Despite the unimportance of high-tech policy to voting decisions, the election results have enormous consequence for high tech. That asymmetry has been with us for many years, and it explains a lot of what we see in federal government policy. It shapes the tenor of elections, the prevalence of backroom decision making, and the type of policies for high-tech markets the US government makes.

This will take some explaining.