Nearly half a century after Marvin Minsky predicted that computers would be as smart as humans, computing systems still cannot pass the Turing test. Despite impressive achievements in robotics, mathematical theorem proving, scientific classification, and advanced user interfaces, artificial intelligence remains elusive.
With the Internet and the World Wide Web, scientists and engineers have nearly realized Vannevar Bush's dream of a universal multimedia data-processing machine. It is now possible to foresee the development of highly secure, highly available, self-programming, self-managing, and self-replicating computer networks. However, creating intelligent networks that can program, manage, and replicate themselves remains a major challenge.
The China Knowledge Grid Research Group, established in 2001, is exploring the operating principles of this future interconnection environment.
Thomas Anderson, Larry Peterson, Scott Shenker, and Jonathan Turner
The prospects for significant change in the Internet's existing architecture appear slim. In addition to requiring changes in routers and host software, the Internet's multiprovider nature requires that ISPs jointly agree on any architectural change.
The canonical story about architectural research's potential impact has long maintained that if testbed experiments show an architecture to be promising, ISPs and router vendors might adopt it. This story might have been realistic in the Internet's early days, but not now: Not only is reaching agreement among the many providers difficult, attempting to do so also removes any competitive advantage from architectural innovation.
By providing easy access to virtual testbeds, the authors hope to foster a renaissance in applied architectural research that extends beyond incrementally deployable designs. Moreover, by replacing a discredited deployment story with a plausible one closely linked to the experimental methodology, they hope to raise the research community's sights.
Mark Baker, Amy Apon, Clayton Ferner, and Jeff Brown
The Grid has evolved from a carefully configured infrastructure that supported limited Grand Challenge applications to a seamless and dynamic virtual environment being driven by international development and take-up. Commercial participation has accelerated development of software that supports Grid environments outside academic laboratories. This in turn has impacted both the Grid's architecture and the associated protocols and standards.
The recent adoption of Web services has produced a somewhat fragmented landscape for application developers. Developers currently face the dilemma of deciding which of the many frameworks and specifications to follow.
The Open Grid Services Architecture and the Web Services Resource Framework represent significant cooperation among researchers in academia, government, and industry. These joint efforts point to a promising future for the Grid regardless of the problems developers currently face.
Christoph L. Schuba, Jason Goldschmidt, Michael F. Speer, and Mohamed Hefeeda
Over the past several years, one successful solution for managing huge amounts of data on the Internet has been to concentrate critical computing resources in Internet data centers (IDCs). An IDC collects computing resources and typically houses them in one physical location: a room, a building floor, or an entire building. Large enterprises that rely heavily on the Internet and e-commerce applications typically operate their own IDCs, while smaller companies may lease computing resources within an IDC owned and operated by a service provider.
The NEon architecture, a novel approach for implementing the network services that IDCs provide, is a paradigm shift away from special-purpose network devices. By employing new flow-handling mechanisms to merge heterogeneous network services into one system, NEon offers an integrated approach to architecting, operating, and managing network services.
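The idea of merging heterogeneous network services into one flow-handling system can be illustrated with a minimal sketch. This is not NEon's actual design; the rule fields, the two example services, and the priority-based tie-breaking are all illustrative assumptions. The point is that rules contributed by separate services (here, a hypothetical firewall and load balancer) are merged into a single flow table that each packet is classified against once, rather than traversing a chain of special-purpose devices.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Rule:
    """One classification rule contributed by a network service.
    Fields set to None act as wildcards."""
    service: str
    dst_port: Optional[int]     # match on destination port, or None for any
    src_prefix: Optional[str]   # match on source-address prefix, or None for any
    action: str
    priority: int

def matches(rule: Rule, pkt: dict) -> bool:
    """True if every non-wildcard field of the rule matches the packet."""
    if rule.dst_port is not None and pkt['dst_port'] != rule.dst_port:
        return False
    if rule.src_prefix is not None and not pkt['src'].startswith(rule.src_prefix):
        return False
    return True

def classify(flow_table: list, pkt: dict) -> str:
    """Single lookup over the merged table: highest-priority match wins,
    with plain forwarding as the default action."""
    hits = [r for r in flow_table if matches(r, pkt)]
    return max(hits, key=lambda r: r.priority).action if hits else 'forward'

# Rules from two hypothetical services, merged into one table
table = [
    Rule('firewall', None, '10.0.', 'drop', 100),
    Rule('load-balancer', 80, None, 'rewrite-dst', 50),
]
```

A packet from 10.0.1.2 to port 80 matches both services' rules, and the merged table resolves the conflict in one place (the firewall's higher priority wins) instead of depending on the physical ordering of devices.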
P. Oscar Boykin and Vwani P. Roychowdhury
The amount of unsolicited commercial e-mail—spam—has increased dramatically in the past few years. A recent study showed that 52 percent of e-mail users say spam has made them less trusting of e-mail, and 25 percent say that the volume of spam has reduced their e-mail use.
This crisis has prompted proposals for a broad spectrum of potential solutions. The objective of the various proposed legal and technical solutions is the same: to make sending spam unprofitable and thereby destroy the spammers' underlying business model.
Achieving these goals requires widespread deployment and use of antispam techniques. To gain user confidence, a prerequisite for wide deployment, the tool must be accurate, user friendly, and computationally efficient. The authors describe a technique, predicated on recognizing the unique characteristics inherent to social networks, that simultaneously achieves all these requirements.
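One social-network characteristic often used to separate legitimate correspondents from spammers is local clustering: a user's real contacts tend to know one another, while a spammer's recipients rarely do. The sketch below, a simplification and not the authors' actual algorithm, labels nodes of an email-contact graph by their clustering coefficient; the threshold value and the toy graph are illustrative assumptions.

```python
from itertools import combinations

def clustering_coefficient(adj: dict, node: str) -> float:
    """Fraction of a node's neighbor pairs that are themselves connected."""
    nbrs = adj.get(node, set())
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj.get(a, set()))
    return 2.0 * links / (k * (k - 1))

def label_nodes(adj: dict, threshold: float = 0.1) -> dict:
    """Label each node 'social' (well-clustered neighborhood) or
    'suspect' (star-like neighborhood typical of spam traffic)."""
    return {n: 'social' if clustering_coefficient(adj, n) >= threshold
               else 'suspect'
            for n in adj}

# Toy contact graph: a friend triangle vs. a star of mutual strangers
adj = {
    'alice':   {'bob', 'carol'},
    'bob':     {'alice', 'carol'},
    'carol':   {'alice', 'bob'},
    'spammer': {'x', 'y', 'z'},
    'x': {'spammer'}, 'y': {'spammer'}, 'z': {'spammer'},
}
labels = label_nodes(adj)
```

Because the classification depends only on graph structure, not message content, such a scheme is cheap to compute and hard for a spammer to game without building genuine social ties, which is one way the efficiency and accuracy requirements above can be met simultaneously.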