NEWS


Computing Now Exclusive Content — February 2010

News Archive

July 2012

Gig.U Project Aims for an Ultrafast US Internet

June 2012

Bringing Location and Navigation Technology Indoors

May 2012

Plans Under Way for Roaming between Cellular and Wi-Fi Networks

Encryption System Flaw Threatens Internet Security

April 2012

For Business Intelligence, the Trend Is Location, Location, Location

Corpus Linguistics Keep Up-to-Date with Language

March 2012

Are Tomorrow's Firewalls Finally Here Today?

February 2012

Spatial Humanities Brings History to Life

December 2011

Could Hackers Take Your Car for a Ride?

November 2011

What to Do about Supercookies?

October 2011

Lights, Camera, Virtual Moviemaking

September 2011

Revolutionizing Wall Street with News Analytics

August 2011

Growing Network-Encryption Use Puts Systems at Risk

New Project Could Promote Semantic Web

July 2011

FBI Employs New Botnet Eradication Tactics

Google and Twitter "Like" Social Indexing

June 2011

Computing Commodities Market in the Cloud

May 2011

Intel Chips Step Up to 3D

Apple Programming Error Raises Privacy Concerns

Thunderbolt Promises Lightning Speed

April 2011

Industrial Control Systems Face More Security Challenges

Microsoft Effort Takes Down Massive Botnet

March 2011

IP Addresses Getting Security Upgrade

February 2011

Studios Agree on DRM Infrastructure

January 2011

New Web Protocol Promises to Reduce Browser Latency

To Be or NAT to Be?

December 2010

Intel Gets inside the Helmet

Tuning Body-to-Body Networks with RF Modeling

November 2010

New Wi-Fi Spec Simplifies Connectivity

Expanded Top-Level Domains Could Spur Internet Real Estate Boom

October 2010

New Weapon in War on Botnets

September 2010

Content-Centered Internet Architecture Gets a Boost

Gesturing Going Mainstream

August 2010

Is Context-Aware Computing Ready for the Limelight?

Flexible Routing in the Cloud

Signal Congestion Rejuvenates Interest in Cell Paging-Channel Protocol

July 2010

New Protocol Improves Interaction among Networked Devices and Applications

Security for Domain Name System Takes a Big Step Forward

The ROADM to Smarter Optical Networking

Distributed Cache Goes Mainstream

June 2010

New Application Protects Mobile-Phone Passwords

WiGig Alliance Reveals Ultrafast Wireless Specification

Cognitive Radio Adds Intelligence to Wireless Technology

May 2010

New Product Uses Light Connections in Blade Server

April 2010

Browser Fingerprints Threaten Privacy

New Animation Technique Uses Motion Frequencies to Shake Trees

March 2010

Researchers Take Promising Approach to Chemical Computing

Screen-Capture Programming: What You See is What You Script

Research Project Sends Data Wirelessly at High Speeds via Light

February 2010

Faster Testing for Complex Software Systems

IEEE 802.1Qbg/h to Simplify Data Center Virtual LAN Management

Distributed Data-Analysis Approach Gains Popularity

Twitter Tweak Helps Haiti Relief Effort

January 2010

2010 Rings in Some Y2K-like Problems

Infrastructure Sensors Improve Home Monitoring

Internet Search Takes a Semantic Turn

December 2009

Phase-Change Memory Technology Moves toward Mass Production

IBM Crowdsources Translation Software

Digital Ants Promise New Security Paradigm

November 2009

Program Uses Mobile Technology to Help with Crises

More Cores Keep Power Down

White-Space Networking Goes Live

Mobile Web 2.0 Experiences Growing Pains

October 2009

More Spectrum Sought for Body Sensor Networks

Optics for Universal I/O and Speed

High-Performance Computing Adds Virtualization to the Mix

ICANN Accountability Goes Multinational

RFID Tags Chat Their Way to Energy Efficiency

September 2009

Delay-Tolerant Networks in Your Pocket

Flash Cookies Stir Privacy Concerns

Addressing the Challenge of Cloud-Computing Interoperability

Ephemeralizing the Web

August 2009

Bluetooth Speeds Up

Grids Get Closer

DCN Gets Ready for Production

The Sims Meet Science

Sexy Space Threat Comes to Mobile Phones

July 2009

WiGig Alliance Makes Push for HD Specification

New Dilemmas, Same Principles:
Changing Landscape Requires IT Ethics to Go Mainstream

Synthetic DNS Stirs Controversy:
Why Breaking Is a Good Thing

New Approach Fights Microchip Piracy

Technique Makes Strong Encryption Easier to Use

New Adobe Flash Streams Internet Directly to TVs

June 2009

Aging Satellites Spark GPS Concerns

The Changing World of Outsourcing

North American CS Enrollment Rises for First Time in Seven Years

Materials Breakthrough Could Eliminate Bootups

April 2009

Trusted Computing Shapes Self-Encrypting Drives

March 2009

Google, Publishers to Try New Advertising Methods

Siftables Offer New Interaction Model for Serious Games

Hulu Boxed In by Media Conglomerates

February 2009

Chips on Verge of Reaching 32 nm Nodes

Hathaway to Lead Cybersecurity Review

A Match Made in Heaven: Gaming Enters the Cloud

January 2009

Government Support Could Spell Big Year for Open Source

25 Reasons for Better Programming

Web Guide Turns PlayStation 3 Consoles into Supercomputing Cluster

Flagbearers for Technology: Contemporary Techniques Showcase US Artifact and European Treasures

December 2008

.Tel TLD Debuts As New Way to Network

Science Exchange

November 2008

The Future Is Reconfigurable

Faster Testing for Complex Software Systems

by George Lawton

Testing applications for large software systems can be a time-consuming and expensive process that ultimately delays the release date. As software systems incorporate more components, one of the biggest challenges is knowing which combinations to test. A University of Nebraska researcher is developing new algorithms that promise to improve this testing process.

In a large software system, each component might perform flawlessly by itself, but interaction faults can emerge when components work together. Developers must therefore test each component not only in isolation but also in every possible combination with other components. Consequently, said Myra Cohen, assistant professor of computer science at the University of Nebraska, the number of required tests can grow exponentially as systems add components and new configurations for them.
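
To get a feel for the scale of the problem, consider a hypothetical system with five configurable component types (the names and option counts below are illustrative, not drawn from Cohen's work). The number of full configurations is the product of the option counts, while the number of distinct two-way interactions grows much more slowly:

```python
# Illustration only: why exhaustive configuration testing explodes.
# Component types and option counts are hypothetical.
from itertools import combinations
from math import prod

options = {
    "database": 4,
    "web_server": 3,
    "cache": 3,
    "os": 5,
    "locale": 10,
}

# Every full configuration: multiply the option counts together.
exhaustive = prod(options.values())

# Two-way interactions: for each pair of component types, each pair of
# concrete choices.
pairwise = sum(a * b for a, b in combinations(options.values(), 2))

print(f"full configurations:  {exhaustive}")  # 1800
print(f"two-way interactions: {pairwise}")    # 233
```

Combinatorial testing techniques exploit this gap: because each test run exercises many interactions at once, roughly fifty well-chosen configurations can cover all 233 two-way interactions here, with no need to run all 1,800.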

Cohen said that no one has quantified the cost of interaction faults specifically. But a 2002 study by the US National Institute of Standards and Technology found that software bugs in general cost the US economy between $22.2 billion and $59.5 billion per year in repairs, lost business, and downtime. In some cases, the consequences of software failures are grave, said Cohen. For example, software errors have caused medical equipment to overdose patients with radiation and led to the explosion of an Ariane 5 rocket.

Testing Complexity

Cohen is working on two related projects to improve the testing of complex systems. The first, sponsored by the US National Science Foundation (NSF), is developing algorithms to sample the space of configurable components and test it efficiently, because the number of possible combinations is too large to cover exhaustively. End users, system administrators, and application providers can all manipulate a system's configuration. The NSF project is trying to ensure that the combinations most likely to fail are tested early in the development process.

The second project, Just Enough Testing (JET), is sponsored by the US Air Force Office of Scientific Research and focuses on testing software product lines, a subset of configurable systems built from a finite component set that the software developer controls. For example, cell phone developers might have different component types for graphics rendering, messaging, and Web browsing, which they could combine in different ways to create a family of cell phones.
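
Cohen's algorithms themselves aren't detailed here, but a minimal greedy pairwise sampler gives the flavor of this kind of combinatorial sampling. The sketch below is illustrative only: the feature names are hypothetical (loosely following the cell-phone example), and it brute-forces the full configuration space, which is feasible only for a toy example.

```python
# A minimal greedy pairwise sampler, in the spirit of combinatorial
# interaction testing. Hypothetical feature names; illustration only.
from itertools import combinations, product

features = {
    "graphics":  ["gfx_basic", "gfx_3d"],
    "messaging": ["sms", "mms", "im"],
    "browser":   ["wap", "html", "none"],
}

def pairs_of(config):
    """All two-way (type, choice) pairs exercised by one configuration."""
    return {frozenset(p) for p in combinations(sorted(config.items()), 2)}

def greedy_pairwise(features):
    """Pick full configurations until every two-way pair is covered."""
    types = sorted(features)
    remaining = {
        frozenset([(t1, v1), (t2, v2)])
        for t1, t2 in combinations(types, 2)
        for v1 in features[t1]
        for v2 in features[t2]
    }
    # Enumerating every configuration is workable only for a toy space;
    # practical algorithms avoid this step.
    all_configs = [dict(zip(types, vals))
                   for vals in product(*(features[t] for t in types))]
    suite = []
    while remaining:
        # Greedily take the configuration covering the most uncovered pairs.
        best = max(all_configs, key=lambda c: len(pairs_of(c) & remaining))
        remaining -= pairs_of(best)
        suite.append(best)
    return suite

suite = greedy_pairwise(features)
print(f"{len(suite)} tests cover every pair; exhaustive testing needs {2*3*3}")
```

For this toy 2 x 3 x 3 product line, the greedy loop returns around nine configurations instead of 18; the relative savings grow rapidly as component types are added.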

Certain faults aren't revealed unless you put the correct configurations together, said Cohen. "We would like to be able to sample that whole configuration space so that when the application ships, we have the confidence that those faults have been discovered."

The JET project is creating algorithms to intelligently reuse tests from prior software component combinations. These algorithms can identify specific component combinations to test 300 times faster than existing techniques. They can also reduce the number of required test cases by an average of 5 percent across 50 different benchmarks. Although other algorithms also do sampling, Cohen said, they don't work well with dependencies across multiple component types.

JET focuses on new feature combinations to improve the entire product family, said Cohen. "The idea is to do more testing in the same amount of time so we can improve the quality of the software."

An organization tests one combination of components, then tests a series of other combinations and compiles the results. Because it knows which combinations it has already tested across product versions, it can focus a new product's testing on feature combinations that haven't yet been exercised together.
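
As a rough sketch of that bookkeeping (with hypothetical feature names and records, not JET's actual data structures), an organization can log the feature pairs its earlier products exercised and diff a new product's configuration against the log:

```python
# Sketch of reusing prior test records: find the feature pairs a new
# product introduces that no earlier test has covered. Hypothetical data.
from itertools import combinations

def pairs_of(config):
    """All two-way (type, choice) pairs exercised by one configuration."""
    return {frozenset(p) for p in combinations(sorted(config.items()), 2)}

previously_tested = [
    {"graphics": "gfx_basic", "messaging": "sms", "browser": "wap"},
    {"graphics": "gfx_3d",    "messaging": "mms", "browser": "html"},
]
covered = set().union(*(pairs_of(c) for c in previously_tested))

new_phone = {"graphics": "gfx_3d", "messaging": "sms", "browser": "html"}
for pair in sorted(pairs_of(new_phone) - covered, key=sorted):
    print("still needs a test:", dict(pair))
```

Here only two of the new phone's three feature pairs need fresh tests; the gfx_3d-with-html pair was already exercised by an earlier product.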

"JET is an approach to test strategies and test-case specifications that uses pragmatism and appropriateness as its central pillars," said Ivan Ericsson, director at the SQS Software Quality Systems consultancy.

Future Work

Cohen said the research is still in the early phases. One challenge is tuning the algorithms against realistic product lines. Another is demonstrating the reliability of test results in new software component combinations. "We're asking people to reuse this testing information," Cohen explained, "so we have to be able to show that it is still valid."

Ericsson pointed to another potential problem: managers taking the JET ideas out of context. "There's a real risk that JET is interpreted as 'writing less tests,'" he said, "when it actually means 'write the appropriate level of tests.'"

David R. Luginbuhl, program manager for systems and software at the US Air Force Office of Scientific Research, said, "We think this will help in the development of software and deployment of software in the future. This could eventually be incorporated into a software testing suite."

In the long run, the number of software component combinations will continue to grow with the rise of the Web and more complex systems integration. Cohen said that this type of research will help scale software testing across these larger systems.

George Lawton is a freelance technology writer based in Guerneville, California. Contact him at glawton@glawton.com.