pp. 10-13
"Switching on PoE," by Andy Dornan
The IEEE might be poised to achieve a long-term goal of Power over Ethernet (PoE), making laptop power supplies unnecessary, in 2005. The organization began working on a new version of PoE in November 2004, targeting laptops and a new generation of power-forwarding switches that receive power and then redistribute it over Ethernet. The goal is to have the new version, PoE Plus, support roughly 70 watts, up from the 12.95 watts that the existing 802.3af standard supports. The current wattage can power a phone or a Wi-Fi access point, but not something as large as a cash register or a long-range WiMAX transmitter, both of which the higher output could serve.
"Which Wi-Fi?" by Andy Dornan
Hardware that incorporates the ever-expanding variety of 802.11 technologies should start cropping up over the next two years, forcing would-be buyers to choose from a confusing collection of competing proprietary approaches. To help these buyers, Dornan outlines today's best predictions for how Wi-Fi will work in the future. He covers five basic areas: access points (APs), location-aware networks, interoperability, antennas, and Ethernet.
APs of the future will be able to take independent action when necessary, Dornan says, and 802.11 technologies will have to be embedded into both APs and central devices. He suggests that radio frequency fingerprinting — which compares signals against a network's preexisting radio model — will likely surpass triangulation as the most effective way to locate users on a network. Public and private networks probably won't achieve interoperability in the next two years because cell phone carriers have little incentive to support Wi-Fi roaming, and the Wi-Fi industry hasn't even agreed on a way for clients to move between proprietary APs, let alone from Wi-Fi to cellular. Combining more radio spectrum and more antennas is the most likely scenario for achieving the higher TCP/IP throughputs of 802.11n because each has critical advantages. Ultimately, Wi-Fi vendors' long-term goal of eliminating Ethernet entirely will require devices that support other standards, including 3G and possibly WiMAX, and that automatically select the best technology based on the communication desired.
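To make the fingerprinting idea concrete, here is a minimal sketch, not any vendor's implementation, of locating a client by comparing an observed signal-strength vector against a prebuilt radio map. The locations, AP count, and RSSI values are invented for illustration:

```python
import math

# Hypothetical radio map: known locations -> RSSI readings (dBm) from
# three APs. Real fingerprinting systems build this model from a site
# survey of the network's radio environment.
RADIO_MAP = {
    "lobby":      (-40, -70, -80),
    "conference": (-65, -45, -75),
    "warehouse":  (-80, -72, -50),
}

def locate(observed):
    """Return the mapped location whose stored fingerprint is closest
    (Euclidean distance in signal space) to the observed RSSI vector."""
    def dist(fingerprint):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(fingerprint, observed)))
    return min(RADIO_MAP, key=lambda loc: dist(RADIO_MAP[loc]))

print(locate((-42, -68, -79)))  # -> lobby
```

Unlike triangulation, this approach needs no line-of-sight geometry; it only compares the observation against the network's preexisting radio model, which is why it copes better with walls and multipath.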
"Wireless on Wheels: Carmakers Are Taking Telematics to the Streets," by Stacy Lawrence
Automakers are increasingly adding a variety of wireless communications and computing options, known collectively as telematics, to their products. As a result, the number of telematics subscribers in North America is predicted to increase fivefold over the next five years. To combat expected competition from mobile phones, some automakers plan to outfit their cars with Bluetooth docking stations or wireless connections that can network with the phones. The number of handsets using Bluetooth jumped 65 percent in 2004 to about one-third of all handsets, whereas the number of Bluetooth-enabled vehicle models increased by 40 percent. ABI Research is predicting that 22 million vehicles, or one-fifth of all vehicles manufactured, will have factory-installed Bluetooth hardware by 2008.
14 December 2004
"Is P2P File Sharing Fading?" by Sebastian Rupley
Although most industry sources concede that peer-to-peer file sharing has slowed significantly thanks to recording industry lawsuits, a new study argues that P2P file sharing hasn't actually slowed, but instead has gone into hiding. The paper, "Is P2P Dying or Just Hiding?" (www.caida.org/outreach/papers/2004/p2p-dying/), was written by researchers at the University of California, Riverside, and the Cooperative Association for Internet Data Analysis, based at the San Diego Supercomputer Center at the University of California, San Diego. It claims that P2P file-sharing traffic hasn't dropped, and that previous studies have focused only on slowdowns at popular file-sharing services such as Kazaa without analyzing the ways in which P2P traffic is being camouflaged.
Dr. Dobb's Journal
"Adding Voice to XHTML," by Gerald McCobb and Jeff Kusnitz
In the future, developers hope to have standards that will let various modes of communication work across a variety of appliances. Such multimodal browsers could let a person dial into a portal by phone, inquire about flights from Atlanta to San Francisco, and see those flights listed on the phone's display. The person could then use either voice or a stylus to select a flight and have the system read additional information over the phone. IBM, Motorola, and Opera Software ASA have jointly created a multimodal markup language, XHTML+Voice (X+V), that allows for such a scenario, adding speech recognition to more common forms of user interaction such as typing and tapping. The authors outline the basics of X+V, which is built on the latest recommendations from the W3C for visual interaction, authoring event listeners and handlers, and voice interaction.
"Firefox Paws at IE," by Andrew Conry-Murray
The Firefox open-source browser has recently enjoyed remarkable popularity among Internet users, skyrocketing from 8 million downloads in December to more than 22 million by 1 February. However, few corporate customers are among the faithful. The Mozilla Foundation, which is behind Firefox's development, says the browser is making inroads among universities such as Yale, MIT, and Boston University, as well as some small, tech-savvy businesses. However, experts point out that larger enterprises are traditionally late adopters. Still, although superior security is one of Firefox's best selling points, it might not hold up for long: as Firefox gets more popular, hackers will be more likely to begin devoting attention to finding its vulnerabilities.
"Microsoft Declares War on Spam: The Once Insular Superpower Is Enlisting the Help of Allies," by Robert Buderi
A little-known group of spam fighters at Microsoft, the Safety, Technology, and Strategy Group, has already helped bring lawsuits against about 100 spammers and has developed several successful email-filtering technologies. The group has also reached out to ISPs such as AOL, Yahoo, and EarthLink to set standards, draft legislation, and educate consumers. Now, Microsoft is launching two new antispam technologies from the group. The first, Sender ID, aims to thwart spoofing and phishing by comparing the transmitting server's address against the machines authorized to handle the sender's email. The other, called computational proof, is a more generic tool that outfits email programs with software forcing any computer sending a message to spend a few seconds working out a small puzzle before the email is accepted. Computational proof shouldn't affect servers sending out normal volumes of email, but it's designed to significantly slow down spam servers that spit out blizzards of messages daily.
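The computational-proof idea is essentially a proof-of-work stamp: finding the puzzle answer is expensive, checking it is cheap. The sketch below is a generic hashcash-style illustration, not Microsoft's actual scheme; the difficulty constant and message format are invented for the example:

```python
import hashlib
from itertools import count

DIFFICULTY = 16  # leading zero bits required; raising this makes sending costlier

def make_stamp(message: str) -> int:
    """Search for a nonce whose hash over the message meets the difficulty
    target. The sender pays this cost once per message, which is negligible
    for normal mail volumes but crippling for millions of messages a day."""
    for nonce in count():
        digest = hashlib.sha256(f"{message}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") >> (256 - DIFFICULTY) == 0:
            return nonce

def check_stamp(message: str, nonce: int) -> bool:
    """Verification is a single hash, so the receiver pays almost nothing."""
    digest = hashlib.sha256(f"{message}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") >> (256 - DIFFICULTY) == 0

nonce = make_stamp("hello alice")
assert check_stamp("hello alice", nonce)
```

The asymmetry is the whole point: each extra bit of difficulty doubles the sender's expected work while leaving the receiver's check unchanged.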
Visual Studio Magazine
"Debug Partially Trusted .NET Apps," by Francesco Balena and Enrico Sabbadin
One of the .NET Framework's more valuable features, not widely known among developers, is its ability to run Windows Forms applications from the Internet or an intranet. However, doing so can be challenging because, like any executable launched from somewhere other than local disks, the applications are partially trusted code by default, which allows them only some of the permissions granted to standard applications. Balena and Sabbadin detail a few ways to debug such partially trusted applications, including simply changing the project's output path to point to a network share. They also explain how to embed resources into the assembly and how to use regular expressions to filter out duplications in a document.
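The authors work in .NET, but the duplicate-filtering regex technique translates directly to other languages. Here is a sketch in Python, with invented sample text, using a backreference to collapse immediately repeated words:

```python
import re

def dedupe(text: str) -> str:
    """Collapse immediately repeated words ("the the" -> "the") using a
    backreference: \\1 refers back to the first captured word."""
    return re.sub(r"\b(\w+)(\s+\1\b)+", r"\1", text, flags=re.IGNORECASE)

print(dedupe("Paris in the the spring"))  # -> "Paris in the spring"
```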
"Yahoo Tackles Email Forgery," by Andrew Conry-Murray
Because email authentication via digital signatures is widely seen as the best solution for thwarting email forgeries, the IETF has formed a fledgling group, Message Authentication Signature Standards (MASS), to generate standards for digital signatures. The leading proposal is Yahoo's DomainKeys, which uses public–private key pairs to verify email domains: a domain owner generates the key pairs using software such as OpenSSL, storing the public key in DNS and the private key where outgoing mail can access it. MASS is also considering a plan from Cisco Systems that creates a separate server to authorize signing keys. (In January, after this article was written, Yahoo started signing all outgoing Yahoo Web mail using DomainKeys, giving it a possible leg up over the competition.)
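The signing flow can be sketched end to end. The toy textbook-RSA parameters below are pedagogical stand-ins for a real OpenSSL-generated key, and the selector name and DNS dictionary are invented; real DomainKeys publishes the public key in a DNS TXT record:

```python
import hashlib

# Toy textbook-RSA keypair: n = 61 * 53, public exponent e, private
# exponent d. Parameters this small are insecure; they only illustrate
# the sign-then-verify flow.
N, E, D = 3233, 17, 2753

# Stand-in for the DNS record the domain owner publishes under a selector.
DNS = {"selector._domainkey.example.com": (N, E)}

def digest(msg: str) -> int:
    return int.from_bytes(hashlib.sha256(msg.encode()).digest(), "big") % N

def sign(msg: str) -> int:
    """The outgoing mail server signs with the private key it holds."""
    return pow(digest(msg), D, N)

def verify(msg: str, sig: int, domain: str) -> bool:
    """The receiver fetches the public key from DNS and checks the signature."""
    n, e = DNS[f"selector._domainkey.{domain}"]
    return pow(sig, e, n) == digest(msg)

sig = sign("From: alice@example.com\n\nHi Bob")
assert verify("From: alice@example.com\n\nHi Bob", sig, "example.com")
```

Because only the domain owner holds the private exponent, a verified signature ties the message to the domain; a forger can neither produce a valid signature nor alter the signed text without detection.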
14 December 2004
"Bots March In," by Jay Munro
The number of "bot" worms, commonly used to create armies of so-called zombie computers that do attackers' dirty work, increased 600 percent in the first six months of 2004, according to Symantec's Internet Security Threat Report. That increase made bots, small scripts designed to perform some function automatically (often a harmless one), the second most common attack threat during the period. In particular, bots' ability to sniff network packets has many worried that hackers will use those packets to obtain sensitive information such as financial data or passwords.
8 February 2005
"The Virus Wars — In Your Palm," by Sebastian Rupley
So far, according to an official at Trend Micro Mobile Security, most threats to mobile appliances and smartphones have involved proof-of-concept attacks, such as the Skull.a and Cabir.a viruses from 2004. However, analysts and security software companies are warning that smartphones and other mobile appliances will become major targets for security threats in 2005. The prediction is all the more dire given a Gartner report claiming that 90 percent of mobile devices given out by IT departments carry no security protection.
C/C++ User's Journal
"SOA is not SOAP," by David Houlding
To cut through the confusion and hype associated with the swift growth of service-oriented architectures (SOAs), which parallels the equally fast-growing area of Web services and SOAP, Houlding outlines some of SOA's key aspects and explains that SOAP is just one of the middleware options used to access services. In fact, he suggests that developers can't know all of an SOA's potential middleware requirements when it's implemented, and that the architecture's true benefits won't be realized until developers accept that they'll always need different middleware to access different services.
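Houlding's point, that the service abstraction must outlive any one middleware, can be sketched with a transport-agnostic registry. The service names and stubbed transports below are invented for illustration:

```python
from abc import ABC, abstractmethod

class ServiceTransport(ABC):
    """The SOA contract: callers see an operation, not a wire protocol."""
    @abstractmethod
    def invoke(self, operation: str, payload: dict) -> dict: ...

class SoapTransport(ServiceTransport):
    def invoke(self, operation, payload):
        # SOAP is just one middleware; a real implementation would build
        # an envelope and POST it. Here the call is stubbed.
        return {"via": "soap", "op": operation, **payload}

class MqTransport(ServiceTransport):
    def invoke(self, operation, payload):
        # A message-queue transport, stubbed the same way.
        return {"via": "mq", "op": operation, **payload}

# The registry binds each service to whatever middleware it needs, so
# new transports can be added long after the SOA is first deployed.
REGISTRY = {"billing": SoapTransport(), "inventory": MqTransport()}

def call(service: str, operation: str, payload: dict) -> dict:
    return REGISTRY[service].invoke(operation, payload)

assert call("billing", "getInvoice", {"id": 7})["via"] == "soap"
```

Callers never touch SOAP or the queue directly, which is what lets the middleware mix change without rewriting every consumer.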
Dr. Dobb's Journal
"SOAs and ESBs: Ensuring that Your ESB Delivers an SOA," by James Pasley
Pasley contends that an enterprise service bus (ESB), a collection of servers and components that provide tools to help convert existing systems into services, gives developers all the infrastructure and tools required to build an SOA. However, he says that amid the sheer number of such tools, developers sometimes lose sight of the SOA while working on the ESB. Pasley provides an overview of what an SOA is, describes an ESB's role in creating one, and outlines challenges that developers can expect to run into while deploying an SOA. The latter include laying out the existing system's core services, creating business services from those core services, and using those business services for business process modeling.
"Web Services' Minority Report," by David Greenfield
SOAs are generally considered the future of enterprise networks; the benefits that they and Web services deliver through XML and SOAP messaging, in terms of simplifying business processes, are too significant to ignore. However, because today's SOAs consume 30 to 50 times more bandwidth than other options, network administrators are bracing for the worst. Greenfield tries to help those administrators by walking them through the various network architectures in the marketplace. He also presents a checklist to help developers deploy their XML networks as effectively as possible by focusing on the XML function being handled: acceleration, security, or routing.
"Digital Rights Technology Sparks Interoperability Concerns," by David Geer
The creators of digital media work hard to protect their content from copying and illegal use. As a result, companies have produced digital rights management (DRM) technology to enforce licensing limits. However, most of these technologies are proprietary, making interoperability difficult at best. Several initiatives have popped up to address this challenge. RealNetworks, for instance, has built a measure of interoperability into one of its DRM systems. A consortium called Coral, whose members include Hewlett-Packard, InterTrust Technologies, Matsushita Electric Industrial, Philips Electronics, Samsung Electronics, Sony, and Twentieth Century Fox, is also working on the problem. Coral is considering using InterTrust's networked environment for media orchestration (NEMO) technology, which uses a software-based service-oriented architecture to let different DRM systems communicate.
IEEE Intelligent Systems
"Collaborative Filtering with Maximum Entropy," by Dmitry Pavlov et al.
Recommender systems — those that try to recreate word-of-mouth phenomena for things such as e-commerce and search engines — have become integral to the online world. However, these systems can be challenged by the very nature of the high-volume, fast-changing environment they're trying to describe, where users come and go and alter their preferences and goals frequently. The authors outline one possible solution: a maximum entropy algorithm that they say is compact, offers quick model querying, and generally generates more accurate results than its competitors. In addition, they offer a document-clustering approach to help train the model faster.
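A minimal sketch of the general idea follows; it is not the authors' algorithm, which adds document clustering and other optimizations. A conditional maximum entropy model over (previous item, next item) indicator features is fit by gradient ascent on an invented toy clickstream:

```python
import math
from collections import defaultdict

# Toy clickstream: (previous item, next item) pairs.
data = [("a", "x"), ("a", "x"), ("a", "y"), ("b", "y"), ("b", "y")]
ITEMS = ["x", "y"]
w = defaultdict(float)  # one weight per (prev, next) indicator feature

def probs(prev):
    """P(next | prev) under the maxent model: softmax over feature weights."""
    scores = {y: math.exp(w[(prev, y)]) for y in ITEMS}
    z = sum(scores.values())
    return {y: s / z for y, s in scores.items()}

# Gradient ascent on the conditional log-likelihood: the gradient is the
# empirical feature count minus the model's expected feature count.
for _ in range(200):
    grad = defaultdict(float)
    for prev, nxt in data:
        grad[(prev, nxt)] += 1.0
        p = probs(prev)
        for y in ITEMS:
            grad[(prev, y)] -= p[y]
    for k, g in grad.items():
        w[k] += 0.1 * g

p = probs("a")
assert p["x"] > p["y"]  # the model learned that "x" usually follows "a"
```

Querying the trained model is just one softmax over the candidate items, which is the source of the quick-querying property the authors emphasize.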
"That Obscure Object of Desire: Multimedia Metadata on the Web, Part 2," by Frank Nack, Jacco van Ossenbruggen, and Lynda Hardman
This conclusion to a two-part series explores the next phase in multimedia development: finding a way to put multimedia metadata on the Web to allow for Web-based multimedia searches. This installment analyzes the differences and similarities between the two existing approaches to creating content description that's both semantic-based and machine-processable. Those approaches — the Semantic Web and the Multimedia Content Description Interface (MPEG-7) — are both XML-based, but they can't interoperate, and the authors conclude that neither alone is adequate for a media-aware Semantic Web.
"Identity Resolution in a Global Environment," by Heather McCallum-Bayliss
Identity resolution — correctly identifying individuals with similar names or other characteristics — is becoming increasingly important in today's online world, but it's extraordinarily difficult for computers to do it well, for several reasons: such automated systems generate probabilities, not certainties; personal information is dynamic; human error can't be eliminated from the gathering process; and identity markers vary widely across cultures. McCallum-Bayliss calls for the development of a flexible, fast system that incorporates cultural differences; she contends that no such system currently exists.
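The probabilities-not-certainties point can be illustrated with a toy resolution score. The names are invented and the scoring is deliberately crude; real systems weight many more markers, such as dates, addresses, and culture-specific name structure:

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Crude resolution score in [0, 1]: character-level similarity after
    lowercasing and sorting tokens (so "Smith John" matches "John Smith")."""
    norm = lambda s: " ".join(sorted(s.lower().split()))
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

# The system emits a probability-like score, not a certainty: a threshold
# or a human analyst still has to decide whether two records are the
# same person.
print(name_similarity("Mohammed al-Rashid", "Muhammad Al Rashid"))
```

Transliteration variants like the pair above score high but not 1.0, which is exactly the cultural-variation problem the article describes: any fixed threshold will sometimes merge distinct people or split one person in two.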
IEEE Security & Privacy
"Risk-Based Systems Security Engineering: Stopping Attacks with Intention," by Shelby Evans et al.
The US National Security Agency has developed a methodology that can help systems security engineers understand which real-world attacks are the most serious and most likely, letting them make more informed decisions about trade-offs between functionality and security. The authors outline the NSA's Mission-Oriented Risk and Design Analysis (Morda), which generates unbiased risk metrics by using information on threats, attacks, and mission impact and by analyzing information systems through an adversary's eyes. They recommend that systems security engineers use Morda rather than the "combination of guesswork, blind adherence to outdated policies, and — as vulnerabilities are discovered — after-the-fact reactions" that too often forms the basis of such decisions.
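Morda's actual models aren't given in the summary. Purely as an illustration of adversary's-eye risk scoring, one might combine an attack's benefit-to-cost ratio for the adversary with its mission impact for the defender; every number and field name below is invented:

```python
# Hypothetical attack catalog: scores (1-10) assigned by analysts.
attacks = [
    {"name": "phish admin creds", "adversary_benefit": 9,
     "attack_cost": 2, "mission_impact": 8},
    {"name": "ddos public site", "adversary_benefit": 4,
     "attack_cost": 3, "mission_impact": 5},
]

def risk(a):
    # How attractive the attack is to the adversary (benefit relative to
    # cost), weighted by what the defender's mission stands to lose.
    return a["adversary_benefit"] / a["attack_cost"] * a["mission_impact"]

# Rank attacks so countermeasure trade-offs target the worst risks first.
for a in sorted(attacks, key=risk, reverse=True):
    print(a["name"], round(risk(a), 1))
```

The ranking, not the absolute numbers, is what drives design decisions: it tells engineers which functionality-versus-security trade-offs actually reduce the risks an adversary would pursue.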
"Web Metadata Standards: Observations and Prescriptions," by David Bodoff, Patrick C.K. Hung, and Mordechai Ben-Menachem
The argument for content metadata standards is simple: When a Web page's information is formalized and unambiguous, software can take that metadata and offer users services such as comparison shopping or areas of interest. However, the authors say that they're concerned about the already large number of existing standards for content metadata. They're also worried that the standards' creators might have overlooked lessons learned in other disciplines in their haste to deliver promised functionalities. The authors review several existing standards for content metadata, pointing out areas in which their developers might not have applied lessons that were hard-earned in areas such as software engineering, library science, and artificial intelligence.