Open source software has let large system integrators increase their profits through cost savings while reaching more customers with flexible pricing. This has resulted in the emergence of firms providing consulting services to open source projects.
These firms live or die by their ability to recruit and retain the right talent. For that talent, life has become both more difficult and more exciting as they commit to high-profile open source projects to further their careers.
David Budgen, Michael Rigby, Pearl Brereton, and Mark Turner
Data integration must look beyond static enterprise-based systems to a dynamic services-based solution. For healthcare systems, this means that widely distributed and autonomous agencies, such as hospitals and social services, can deliver functions in response to a user query, while software services help the user identify which functions and sources are relevant to that query.
The authors' research led them to evolve the "data as a service" paradigm and a system for using a broker to deliver services.
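The broker idea above can be sketched in a few lines: a registry maps topic keywords to autonomous services, and a query is matched against that registry to find relevant sources. This is a minimal illustration under assumed names (ServiceBroker, the keyword-matching scheme, the example services), not the authors' actual design.

```python
class ServiceBroker:
    """Registry mapping topic keywords to autonomous data services."""

    def __init__(self):
        self._services = {}  # keyword -> list of service names

    def register(self, service_name, keywords):
        for kw in keywords:
            self._services.setdefault(kw, []).append(service_name)

    def find(self, query):
        """Return services whose keywords appear in the user's query."""
        words = query.lower().split()
        matches = []
        for kw, names in self._services.items():
            if kw in words:
                matches.extend(n for n in names if n not in matches)
        return matches


# Hypothetical healthcare services registered with the broker.
broker = ServiceBroker()
broker.register("HospitalAdmissions", ["admission", "discharge"])
broker.register("SocialCareRecords", ["care", "housing"])

print(broker.find("recent discharge and housing status"))
# -> ['HospitalAdmissions', 'SocialCareRecords']
```

A production broker would match on richer service descriptions (ontologies, service metadata) rather than bare keywords, but the division of labor is the same: services stay autonomous, and the broker decides which of them are relevant to a query.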
Peter Aiken, M. David Allen, Burt Parker, and Angela Mattia
Few organizations manage data as an asset. Instead, most consider data management a maintenance cost.
A small shift in perception can dramatically change how an organization manages data. Properly managed data is an asset that can't be exhausted. Although data can be polluted, retired, destroyed, or become obsolete, it's the one organizational resource that can be repeatedly reused without deterioration, provided the appropriate safeguards are in place.
Nan Zhang and Wei Zhao
New applications like online data collection systems threaten individual privacy. Already companies share data mining models to obtain a richer set of data about mutual customers and their buying habits. The computing community must address data mining privacy before these techniques become widespread. The sticking point is how to protect privacy while preserving the usefulness of data mining results.
The authors took a systemic view of architectural requirements and design principles when investigating privacy preservation issues, exploring possible solutions that would lead to guidelines for building practical privacy-preserving data mining systems.
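One classic building block for this privacy/utility trade-off is randomized response: each individual perturbs their answer before reporting it, yet an analyst can still recover accurate aggregates. The sketch below is an illustration of that general idea, not necessarily the authors' technique; the function names and the 75% truth probability are assumptions.

```python
import random

def randomized_response(true_answer, p_truth=0.75, rng=random):
    """Report the true boolean answer with probability p_truth;
    otherwise report a coin flip, giving plausible deniability."""
    if rng.random() < p_truth:
        return true_answer
    return rng.random() < 0.5

def estimate_true_rate(reports, p_truth=0.75):
    """Invert the randomization: E[reported] = p*true + (1-p)*0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Simulate 100,000 respondents, ~30% of whom truly answer "yes".
rng = random.Random(42)
true_answers = [rng.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(a, rng=rng) for a in true_answers]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

No single report reveals an individual's true answer, yet the aggregate estimate converges on the true rate — the essence of preserving data mining utility while protecting privacy.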
Jiannong Cao, Yang Zhang, Guohong Cao, and Li Xie
The authors developed a 3D model that captures the main features of cache consistency schemes, helping them evaluate existing strategies and design new ones. Based on this model, they propose a hybrid, generic strategy: relay-peer-based cache consistency (RPCC).
Because RPCC uses relay peers between the source hosts and cache hosts to forward updated information, it can divide cache invalidation into two asynchronous procedures. Moreover, RPCC caters to different application requirements by providing a flexible and convenient way to set the consistency level and distance between the source data hosts and relay peers.
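The two-step invalidation idea can be sketched as follows: the source host hands an update to a relay peer and returns immediately, and the relay later forwards invalidations to the cache hosts as a separate procedure. The class and method names here are illustrative assumptions, not the paper's actual interfaces.

```python
from collections import deque

class RelayPeer:
    """Sits between the source host and cache hosts, decoupling
    the two halves of cache invalidation."""

    def __init__(self):
        self.pending = deque()   # updates received from the source
        self.cache_hosts = []

    def receive_update(self, key, version):
        # Procedure 1: source -> relay; the source is done here.
        self.pending.append((key, version))

    def forward(self):
        # Procedure 2: relay -> caches, run asynchronously later.
        while self.pending:
            key, version = self.pending.popleft()
            for cache in self.cache_hosts:
                cache.invalidate(key, version)

class CacheHost:
    def __init__(self):
        self.data = {}           # key -> (value, version)

    def invalidate(self, key, version):
        entry = self.data.get(key)
        if entry is not None and entry[1] < version:
            del self.data[key]   # drop the stale copy

relay = RelayPeer()
cache = CacheHost()
relay.cache_hosts.append(cache)
cache.data["x"] = ("old", 1)

relay.receive_update("x", 2)     # source's procedure completes here
assert "x" in cache.data         # cache untouched so far
relay.forward()                  # relay's procedure runs independently
print("x" in cache.data)         # -> False
```

Tuning when and how often the relay runs `forward()` is one way to picture RPCC's adjustable consistency level: a shorter delay between the two procedures means stronger consistency.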
Joe Gebis and David Patterson
The recent widening of the 80×86 architecture from 32-bit to 64-bit addresses ensures this dated architecture will last for decades more. The authors suggest that researchers explore how to improve these old architectures to better match the 21st century's IT demands. They provide an example of adding new functionality to these old architectures with a small set of extensions that let the 80×86 and PowerPC support a full vector architecture, primarily by enhancing their multimedia extensions.
Albrecht Mayer, Harry Siebert, and Klaus D. McDonald-Maier
SoCs are now commonly used in a wide range of product sectors. They prevail so strongly in control systems that many market-leading cars now contain up to 100 SoCs or microcontrollers.
For complex systems like automotive power-train control, developers must understand and analyze behavior in all possible scenarios to support high-quality software and reliable products. With the right tools, they can overcome these challenges and deliver dependable products on time and on budget.