MARCH/APRIL 2006 (Vol. 10, No. 2) pp. 9-15
1089-7801/06/$31.00 © 2006 IEEE
Published by the IEEE Computer Society
Networking Industry Has Plenty of Fruit, but Is the Trunk Rotting?
Networking industry veterans and analysts worldwide are beginning to mull in earnest the effects of market forces such as telecommunications provider consolidation, regulatory actions, and the migration of intellectual talent away from corporations to nimble startups. In December 2005, Alcatel, the venerable Paris-based networking equipment vendor, announced it was purchasing a 25 percent stake in 2Wire. The successful eight-year-old San Jose, Calif.-based provider of residential and small office/home office (SOHO) broadband gateways also attracted minority ownership stakes from AT&T and Telmex at the same time. Just a few weeks later, France Telecom, the largest telecommunications service provider in France, issued a profit warning that saw its stock tumble 10 percent in a week, as legally mandated local-loop sharing caused the company to lose 10,000 subscribers a week to voice-over-IP (VoIP) competition.
For a fortunate player such as 2Wire, the market trends are proving to be a boon. For some well-established players, such as France Telecom, public policy and market forces have been anything but kind. Digging beyond the surface story of winners and losers, however, long-term implications loom increasingly large: how must incumbent core technologies be cultivated for the network's edge to flourish?
At heart, these developments might be demonstrating a tension between two regulatory extremes. A blanket "open access" approach, in which incumbent core providers are forced to allow competitors access to their infrastructures without adequate assurance that their existing investment will be protected, could undermine the core providers' ability to ensure that an innovative and secure backbone exists. On the other hand, a blanket laissez-faire approach, in which the market theoretically determines which companies and access methods emerge without government encouragement of new infrastructure investment, might reduce the core providers' motivation to invest in new core technologies, such as higher-speed routers or advanced modulation techniques, as the excitement and investment money move quickly toward the networking industry's edge. Moreover, the network's innovation, health, and diversity are profoundly affected by how many access channels consumers can choose from. Until now, maintaining a healthy balance between public and private investment, and prioritizing those investments wisely, have been elusive goals.
Stifled Competition and Innovation
Bruce Page, Paris-based vice president for telecommunications services research at Current Analysis, says the exodus of customers from France Telecom shows that the law of unintended consequences has indeed struck the French market. "In a single day, France Telecom's stock fell by 8 percent. I think their market cap was somewhere up in the [US]$40 billion or $50 billion range, so there's a $5 billion hit or so. That's real money. The regulators in France were not targeting that kind of an impact when they said, 'Thou shalt unbundle the local loop.'"
Conversely, in the US, recent Federal Communications Commission (FCC) decisions have essentially left the consumer edge of the network a duopoly in which incumbent cable and telephone providers are competing head-to-head. Claiming that public funding of broadband networks unfairly competes with free enterprise, incumbents have strongly lobbied against insurgent attempts to enter the access market through initiatives such as municipal wireless or fiber-to-the-home. Vint Cerf, Google's chief Internet evangelist and one of the Internet's founding fathers, says the success of such lobbying inhibits innovation and hampers the US's global competitiveness — especially with the telecommunications industry consolidating further, as evidenced by the recent purchases of AT&T and MCI by SBC and Verizon, respectively.
"It sounds beguilingly attractive on the surface, and yet the side effect of all that is to inhibit innovation — and it's terrifying because we lived through a period of very poor innovation in the telecommunications industry until the AT&T system was broken up," Cerf says. "This is a very peculiar thing, because AT&T did this spectacular scientific research and turned up some pretty amazing things, and yet was not necessarily very innovative in its services."
Brion Feinberg, vice president of systems engineering at New Jersey-based Sereniti, a developer of home networking management technologies, began his career at one of the previous era's "motherships" of invention, AT&T's (and later, Lucent Technologies') Bell Labs. On one hand, Feinberg says, the past decade's economic dynamics have led to exciting new opportunities for technologies such as his company's; the gestation model has changed from in-house development to external, venture-funded startups, as evidenced by the Alcatel/AT&T/Telmex investment in 2Wire and Cisco's recent acquisition of set-top box maker Scientific Atlanta.
"We actually used to talk about this back when Lucent was worried about Cisco as a competitor," Feinberg says. "Was that a better model? Because they had much less invested in the research, they had to pay a premium for the ones that hit, but they didn't have to pay for the ideas that went nowhere."
On the other hand, with so much emphasis placed on value-added edge technologies, and the coincident shrinking of profit margins for carriers' traditional core services, Feinberg and others believe some fundamental assurances have gone begging.
"At AT&T, we broke our backs to make sure this stuff always, always worked," Feinberg says. "If the whole world moves to Skype, what happens if there's a massive failure of the Skype network? We had government regulations that said we had to guarantee it was never going to happen, and it still happened. Networks fail in clever and horrible ways."
Cerf says trying to reduce those "clever and horrible" failures should be a priority. "The existing Internet environment is quite risky. We have a lot of significant challenges in the use of this system, and the more we depend on it, the more risky and fragile it might prove to be. So I think there's some serious work to be done." However, though the US National Science Foundation has launched an initiative examining potential new backbone architectures, Cerf isn't optimistic about vital basic research becoming a priority any time soon.
"Here we are, with an industry that appears to be increasingly unable to support long-range and risky research and a government that isn't as inclined as it once was to undertake this risky research. We're now back to an interesting conundrum," he says. "The federal government, to the extent that it is persuaded that research in these areas is important for national interests, really bears an important burden to put money in that direction, to put people in the positions to recognize the importance of new ideas, and be prepared to back them up."
Cerf says the US government's deregulation has led to a vexing situation in which private sector carriers aren't providing core innovations, and a government dedicated to private enterprise has slackened its pace in advancing new technologies that feed innovation.
Jay Pultz, vice president and distinguished analyst at research firm Gartner Group, as well as a Bell Labs alumnus, uses an agricultural metaphor in describing the current state of core network technology.
"We're probably in more of a harvest mode, now and in the next couple of years, in terms of technologies in the backbone," Pultz says. "The next things that are going to happen in the backbone are things like IPv6 [Internet Protocol version 6]. There's not a lot of new work going on in transmission at the backbone — mainly because fiber technology is very highly ahead of demand right now. I knew a lot of startups in the late 1990s that were working on more efficient ways to move data through the backbone. We could even pack a lot more bits on a fiber pair if we wanted, but since there's so much fiber in the ground, no one wants to.
"So, a lot of the research on things like advanced modulation techniques and things people were very excited about in the late 1990s has pretty much dried up."
Peter Carbone, chief architect and acting chief technology officer at Canada-based Nortel Networks, says technologies such as wireless broadband, including the nascent IEEE 802.16 (WiMax), will change the equation for access points, but policymakers must be cognizant of economic realities.
"You have to make sure the infrastructure providers, whoever they are, can manage to put together a profitable business case and continue to make the investment so the broadband gets broader and keeps current. Otherwise, it will go into a state of decay.
"The traditional suppliers are no longer the gatekeepers," he says. "They have to deliver value or they'll be bypassed. There is now more wireless community out there than fixed, and that's going broadband, all of it. Then you add WiMax. If you're in a jurisdiction in which it can be deployed unlicensed, it offers a really interesting overlay capability that forces those big incumbents to become much more competitive."
Carbone argues that the technology base to concentrate on is Web services and service-oriented architectures, not whether end users get their Internet access via digital subscriber line access multiplexers (DSLAMs), wireless interfaces, or cable modems. That access question, he believes, the market will sort out on its own, given a nurturing environment for carrier competition.
"This [competition] is actually going to enrich the environment if you get it right," he says. "You have to get it right at the right time. If you're too far ahead or behind it's very, very challenging. That's the big question in the industry right now."