Vol. 43, no. 12, December 2010, pp. 23-26
Published by the IEEE Computer Society
ABSTRACT
Topics include energy-saving software that lets PCs work while in sleep mode, a new behavioral-screening technology for homeland-security officials, open source software that lets computers use graphics chips for general processing, and a program that uses wireless technology to help shop owners in rural India more easily access goods to sell.
Energy-Saving PCs Work while Sleeping
Researchers at the University of California, San Diego, have developed software that lets PCs work in sleep mode, cutting their energy consumption by about 70 percent. This could be significant because computer-related equipment accounts for an estimated 80 percent of modern offices' night and weekend electricity consumption.
UCSD's SleepServer software reduces the energy that networked PCs waste when they are left fully powered at all times so that employees and administrators can reach them, said research scientist Yuvraj Agarwal.
There are many cases in which a company would want computers to be accessible even when unused, Agarwal said. For example, a network administrator might want to back up information or update an application. Or a user might want to connect remotely to work on a file. However, leaving computers active to handle these tasks uses considerable energy.
Agarwal's solution, developed with UCSD professors Stefan Savage and Rajesh Gupta, is to create a lightweight software copy of a PC that maintains a network presence and that has only the application stubs needed for basic tasks such as replying to communications from other computers.


UCSD's server-based, protocol-neutral software maintains a version of a PC's operating system and undertakes tasks on behalf of the desktop machine while it is in low-energy sleep mode. The lightweight image of the PC has the same MAC and IP addresses as the actual computer and can run select applications.
The PC image can run just a few of the real PC's applications while the latter is sleeping. It wakes up the real PC for more complex tasks, such as instant messaging and processing incoming Skype calls. If, for example, a user puts a PC to sleep using conventional means and a Skype transmission arrives, the computer will appear to be offline and won't take the call.
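The article does not describe the mechanism SleepServer uses to rouse a sleeping machine. The standard way to wake a PC over a network is a Wake-on-LAN "magic packet," sketched below in Python under that assumption; the MAC address in the usage line is hypothetical.

```python
# A minimal Wake-on-LAN sketch: a proxy such as SleepServer could broadcast
# a "magic packet" (6 bytes of 0xFF followed by the target MAC repeated 16
# times) to rouse the real PC when a complex task, such as an incoming
# Skype call, arrives. WoL as the wake mechanism is an assumption here.
import socket

def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    packet = b"\xff" * 6 + mac_bytes * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(packet, (broadcast, port))

wake("00:11:22:33:44:55")  # hypothetical MAC of the sleeping desktop
```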
SleepServer maintains a fast connection over the organization's internal network to each desktop so that it can quickly transfer any files it has downloaded on behalf of a PC to the computer. The software also lets users remotely log in to their computer from home, in which case they work with the machine itself, rather than the PC image.
Even the latest low-power computers—not including the monitor—consume about 45 watts when idle. An entire SleepServer—which can host up to 500 PC images—uses just 300 watts, Agarwal noted.
In trials, 30 PCs utilized SleepServer for two weeks. Each computer's energy consumption dropped between 27 and 86 percent, with an average reduction of 70 percent, said Agarwal. The savings would be about $60 per computer each year, depending on local energy costs, he added. There are also ancillary savings from air-conditioning systems not having to work as hard as they do now to keep systems cool.
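A back-of-envelope check, assuming an electricity rate of about $0.20 per kilowatt-hour (the article gives no rate), shows how a 70 percent reduction on a 45-watt always-on PC lands near the cited $60 figure:

```python
# Rough savings estimate; the $/kWh rate is an assumption, since the
# article notes savings depend on local energy costs.
IDLE_WATTS = 45      # low-power PC at idle, monitor excluded (from article)
REDUCTION = 0.70     # average reduction measured in the two-week trial
RATE = 0.20          # assumed electricity price in $/kWh

kwh_per_year = IDLE_WATTS / 1000 * 24 * 365    # ~394 kWh if left on 24/7
saved_kwh = kwh_per_year * REDUCTION           # ~276 kWh saved per PC
print(f"~${saved_kwh * RATE:.0f}/PC/year")     # ~$55, near the cited $60
# Server overhead is small: 300 W / 500 hosted images = 0.6 W per PC.
```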
Some products can wake up sleeping computers but not carry out tasks on behalf of a machine in sleep mode, noted Frank Gillett, vice president and principal analyst at Forrester Research. The key question for SleepServer, Gillett said, is whether it will scale.
In UCSD's computer science building, 50 PCs are now running SleepServer. In a few months, according to Agarwal, his team will roll out the system to the entire UCSD campus. He expressed hope of spinning off a company and commercializing the technology within a year.
News Briefs written by Linda Dailey Paulson, a freelance technology writer based in Portland, Oregon. Contact her at ldpaulson@yahoo.com.
Behavioral Screening Technology Promises to Help with Security
The US Department of Homeland Security is working with a new behavioral-screening technology designed to help identify individuals who might intend to undertake harmful activities.
DHS plans to use its Future Attribute Screening Technology as an additional tool to help behavior-detection officers determine changes in a person's baseline behavior that could indicate potential security problems. FAST uses various technologies to gauge small changes in a person's body functions to determine if he or she intends harm. The technology examines a combination of physiological, behavioral, and voice-related cues.
The US government may implement FAST at numerous locations including border crossings and airports. DHS will also make the technology available for use by private organizations at settings such as sports stadiums and concert halls, said Robert Middleton, the department's FAST program manager.
DHS has developed two modules that are part of a mobile laboratory.
In one module, subjects stand on a monitoring pad that uses accelerometers to detect movement changes—fidgeting—as they answer officers' questions. In lab settings, Middleton said, there is a correlation between bad intentions and either increased fidgeting or a lack of fidgeting.
The second FAST module looks at a combination of physiological signals. Subjects are located next to a battery of off-the-shelf, remote-sensing equipment that includes an eye tracker that measures pupil diameter, blink rate, and gaze angle; a thermal camera that gauges factors such as changes in skin temperature and blood flow around the eyes; a system that records cardiac activity; and a low-power medical radar that measures respiration.
Each of these sensors provides data to a real-time decision algorithm that determines whether to refer an individual to secondary screening for additional evaluation.
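DHS has not published the decision algorithm, so the following Python sketch is purely hypothetical: it normalizes each cue against the subject's own baseline, combines the deviations with weights, and refers the subject when the combined score crosses a threshold. The cue names, weights, and threshold are all invented for illustration.

```python
# Hypothetical sensor-fusion sketch; not DHS's actual algorithm.
from dataclasses import dataclass

@dataclass
class Reading:
    baseline: float   # subject's own resting value
    observed: float   # value measured during questioning

# Illustrative weights per cue; a fielded system would calibrate these.
WEIGHTS = {
    "fidgeting": 0.25, "pupil_diameter": 0.15, "blink_rate": 0.15,
    "skin_temperature": 0.15, "heart_rate": 0.15, "respiration": 0.15,
}
THRESHOLD = 0.5  # referral cutoff, also illustrative

def refer_to_secondary(readings: dict[str, Reading]) -> bool:
    # Score each cue by its relative deviation from the personal baseline.
    score = sum(
        WEIGHTS[cue] * abs(r.observed - r.baseline) / max(abs(r.baseline), 1e-9)
        for cue, r in readings.items()
    )
    return score >= THRESHOLD

# A uniform 60 percent deviation on every cue triggers a referral.
readings = {cue: Reading(baseline=1.0, observed=1.6) for cue in WEIGHTS}
print(refer_to_secondary(readings))  # True
```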
In laboratory tests in which researchers assigned subjects to play roles either with or without bad intentions, Middleton said, FAST was 80 percent accurate in identifying whether participants were lying.
He noted that use of the technology might be controversial for various reasons. However, he added, the DHS Privacy Office has issued an approved Privacy Impact Assessment for FAST. Also, the DHS Office for Civil Rights and Civil Liberties is conducting a Civil Liberties Impact Assessment of the program.


In one FAST module, subjects stand on a monitoring pad that uses accelerometers to detect fidgeting. In another, a battery of off-the-shelf, remote-sensing equipment monitors a combination of physiological signals.

According to Middleton, FAST establishes baseline behaviors for each examined individual, which could make it an age-, race-, and culture-neutral screening tool. However, he noted, additional research is needed to confirm this.
FAST currently keeps post-scan data only for research purposes. Once the research has concluded, Middleton said, the data will be destroyed. When in operation, he added, FAST will not store information.
Neither the Electronic Privacy Information Center nor Privacy International responded to requests for comment about the program.
DHS currently has no timetable for implementing FAST, said Middleton.
Open Source Software Helps Computers Use Graphics Chips for General Processing
An academic research team has developed an open source software tool that lets computers use the processing power of graphics processing units for purposes other than rendering and manipulating images.
North Carolina State University scientists are continuing to work on their optimizing compiler tool, which would let developers write simple application code without needing to know how to program specifically for GPUs.
In essence, the tool takes an application and translates it into another program that does the same thing more efficiently on a GPU. Computers use GPUs to generate the complex, data-intensive graphics seen in games, virtual reality, data visualization, and other applications, explained North Carolina State associate professor Huiyang Zhou.
GPUs are much better than CPUs at vector processing—handling multiple sets of numeric operations on large arrays of structured data in parallel, explained Nathan Brookwood, Research Fellow with market-analysis firm Insight 64. They can thus offer considerably more performance than CPUs.
Generally, mainstream general-purpose CPUs offer a peak performance between 20 and 150 Gflops, while mainstream GPUs perform between 500 and 1,500 Gflops, he said.
GPUs have been used to accelerate general-computing programs for several years, and researchers are looking for ways to improve the process. The chips are fast in part because they process data in parallel. They thus must work with applications written to compute this way.
A key challenge is managing parallelism so that the chips can stay busy by efficiently handling the various threads they work with. Also challenging are the effective use of on-chip and off-chip memory, and the even distribution of off-chip memory accesses to avoid overusing some memory controllers while leaving others idle.
Zhou's tool compiles the application code to both manage parallelism and use GPU memory efficiently. It combines what would otherwise be multiple memory accesses into a single access, which makes data loading faster and thereby improves performance.
The tool then checks the code to determine if it can handle high-performance memory features such as memory coalescing, which lets a GPU efficiently handle memory-access requests from multiple threads in a single process. If so, the tool recompiles the code so that it can implement these features. It subsequently determines whether data can be reused within and across threads. If so, the tool stores data in on-chip shared memory for fast reuse.
Finally, Zhou explained, the tool recompiles the code so that when the system accesses off-chip memory, the accesses are evenly and efficiently distributed among multiple memory controllers.
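The NCSU tool itself is not publicly available, but the two memory techniques described above, coalesced loads and on-chip shared-memory reuse, can be illustrated directly. This sketch uses Numba's CUDA bindings (an assumption; any CUDA toolchain would do) for a three-point smoothing kernel: consecutive threads issue consecutive, coalescible loads into a shared-memory tile, and each loaded element is then reused by neighboring threads without another trip to off-chip DRAM.

```python
# Illustrative kernel only; it demonstrates the optimizations Zhou
# describes, not his compiler's actual output.
import numpy as np
from numba import cuda, float32

TPB = 128        # threads per block
TILE = TPB + 2   # tile plus one halo element on each side

@cuda.jit
def smooth(x, out):
    tile = cuda.shared.array(TILE, dtype=float32)  # fast on-chip memory
    i = cuda.grid(1)          # global element index
    t = cuda.threadIdx.x      # index within this block

    if i < x.size:
        # Coalesced load: consecutive threads read consecutive addresses,
        # so the hardware merges them into one memory transaction.
        tile[t + 1] = x[i]
        if t == 0 and i > 0:
            tile[0] = x[i - 1]            # left halo
        if t == TPB - 1 and i < x.size - 1:
            tile[TPB + 1] = x[i + 1]      # right halo
    cuda.syncthreads()

    # Reuse: each element now serves up to three threads from shared
    # memory instead of being fetched repeatedly from off-chip DRAM.
    if 0 < i < x.size - 1:
        out[i] = (tile[t] + tile[t + 1] + tile[t + 2]) / 3.0

x = np.arange(1024, dtype=np.float32)
out = np.zeros_like(x)
smooth[(x.size + TPB - 1) // TPB, TPB](x, out)
```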
Zhou said tests by his research team showed that programs automatically translated by the new tool ran more efficiently than manually optimized versions.
Brookwood said the North Carolina State tool works similarly to Nvidia's CUDA and AMD's ATI Stream Technology, proprietary frameworks that also enable GPUs to accelerate general-computing operations. It appears that Zhou's technique optimizes code more aggressively, he noted. Also, because the approach is not proprietary, it works with any hardware platform.
According to Zhou, his approach is more efficient because it yields optimized code that is better at parallelism management and memory utilization.
He said he eventually plans to release his tool as open source software.