Viewed in the aggregate, computer science research funding worldwide appears to be in good overall health, with appropriations either holding steady or increasing slightly. However, while some nations are emphasizing research and providing new funds to encourage scientists to follow their inspiration, others, including the US and the European Union, are entering new funding rounds with what researchers say are insufficient resources to ensure a steady pipeline of innovation.
"Our Nation is faced today with a number of serious issues that are putting tremendous pressure on the government's budget, but we must remember that sufficient investment in the long-term future of this country is essential," says Peter Freeman, assistant director of the US National Science Foundation's Directorate for Computer and Information Science and Engineering.
The funding discussion is entering a controversial period. Many scientists are critical of what they call a new shorter-term focus, as well as reduced funding, at the US Defense Advanced Research Projects Agency. However, DARPA Director Anthony Tether has characterized mainstream media reports claiming that the agency is reducing funding and discouraging basic research as incorrect. In testimony before the US House of Representatives Science Committee ( www.house.gov/science/hearings/full05/may12/index.htm), Tether said the nature of research itself has changed, and all disciplines must learn to live in this new era.
"The major complaint about reduced university funding seems to be coming from one discipline—computer science," Tether testified. "But if DARPA's overall funding of universities is more or less constant, then other disciplines must be the recipients of DARPA's research funding. The key question becomes, 'What other discipline has grown significantly over the past five years at the expense of computer science?' As part of my investigation, I reviewed several dozen university Web sites to see if I could determine the new discipline that was on the rise.
"The answer is surprising on the one hand and obvious on the other: no single discipline has been taking over. Every university Web site I visited advertised that they had created centers for multidisciplinary research and professed that these centers were the harbinger of the future. What must be happening is that, while computer science is always part of the multidisciplinary efforts, its past dominance and, hence, respective share must be decreasing in relationship to the other disciplines involved in the effort."
In Europe, too, officials say scientists are complaining of budget targets unmet due to monetary crises and an increased emphasis on other disciplines.
Khalil Rouhana, head of the European Commission's Strategy for ICT (Information and Communication Technology) Research and Development unit, says computer scientists on the continent have also complained about getting short shrift in funding the past several years.
CS, Rouhana says, "was very high on the research agenda in Europe in the early and mid 1990s. There was a big community in formal methods, process, etc. There has been complaining since the beginning of this decade that this area is now lower than the other priorities. We're trying to see whether this is true."
The NSF's Freeman says that although multidisciplinary efforts to address important problems such as healthcare or crisis response are clearly needed, doing so at the expense of basic science is a mistake. An example, he says, is the steep rise in funding for research at the US National Institutes of Health, which saw its budget double from the late 1990s through 2003.
"You can't do many of the things NIH wants to do without the basic science that we support," Freeman says. "They're all focused on clinical science. We do much of the basic biology, we and DOE [US Dept. of Energy] do all the basic physics and basic chemistry, and in computer science, we're almost the only game in town."
William Scherlis, director of Carnegie Mellon University's software engineering PhD program, echoes Freeman's concerns.
"Among engineering building materials, software is uniquely unbounded and flexible," Scherlis says. "It can scale in complexity, capability, and size to an extent limited only by our human intellectual capacity. For this reason, software capability is a strategic source of market differentiation in many industries, ranging from financial services and healthcare to telecommunications and entertainment. For the same reasons, the dependence on software in infrastructural and national security systems is pervasive and increasing."
Scherlis also says a historical perspective on SE research demonstrates the importance of public funding.
"Despite a perception that 'COTS will solve it,' it is a safe claim that the most important fundamental breakthroughs in software engineering practice have originated in government-funded research projects."
The reason, he says, is that many of the more important breakthroughs have a nonappropriable character; that is, the intellectual property created can't easily be protected from commercial competitors. The responsibility to make these broadly valuable "raising the playing field" investments generally resides with government.
"This is not unusual—it is the same economic reality that drives the growing multi-billion-dollar government investment in bioscience, healthcare, and other strategically important basic scientific disciplines."
The proposed fiscal-year 2006 budget for the US Networking and Information Technology Research and Development program is US$2.15 billion, down slightly from last year's $2.2 billion ( www.aaas.org/spp/rd/06pch23.htm). This essentially flat curve is fairly consistent throughout the post-World War II period, except for the large spike in funding following the Soviet Union's launch of Sputnik in 1957 and during the Apollo space program's early years.
Of course, compared to the rest of the world, the US still spends enormous amounts on R&D. According to 2004 data collected by the Computer Research Association ( www.cra.org/info/research/overall.html#perworld), the US accounts for 37 percent of the world's $746.7 billion R&D expenditures, more than its next-closest competitors, Japan (14 percent) and China (10 percent), combined.
For funding measured as a percentage of GDP, the US is also among the global leaders. According to the Organization for Economic Cooperation and Development, the US spends 2.82 percent of its GDP on R&D. Only a handful of nations—Sweden, Finland, Iceland, Japan, and Korea—exceed that percentage ( www.oecd.org/dataoecd/49/45/24236156.pdf).
"I always get the sense that there's a lot of money to go around in the US," says Mark Keane, director of the Information and Communications Technology Division of Science Foundation Ireland, that nation's research agency. Instead of raw dollars or percentage of GDP, Keane suggests that R&D expenditure per capita might be a more salient measure of a nation's emphasis on research. For example, he says, SFI, which was chartered in 2000 with a €650 million budget, has spent about €250 million thus far, or about €62.50 per Irish citizen, while a French plan to spend some €200 million annually would work out to just €3 per French citizen.
Of that €250 million, Keane estimates SFI has spent about €30 million on software engineering—"I think that's way enough to get leverage in some particular area," he says.
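Keane's per-capita comparison is straightforward arithmetic; a minimal sketch follows. The population figures are assumptions (roughly 4 million for Ireland and 60 million for France, circa 2005) and are not given in the article:

```python
def per_capita(spend_eur: float, population: float) -> float:
    """Research spending in euros per citizen."""
    return spend_eur / population

# Assumed mid-2000s populations; not from the article.
IRELAND_POP = 4.0e6
FRANCE_POP = 60.0e6

ireland = per_capita(250e6, IRELAND_POP)  # SFI spending to date
france = per_capita(200e6, FRANCE_POP)    # proposed annual French plan

print(f"Ireland: €{ireland:.2f} per citizen")  # €62.50
print(f"France:  €{france:.2f} per citizen")   # ≈ €3.33
```

With these assumptions the Irish figure matches Keane's €62.50 exactly, and the French figure lands near his rounded €3.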
Of particular interest are the close ties between the US and Irish scientific communities and Ireland's adoption of several features of the US research infrastructure. In fact, SFI's director general, William Harris, is a former director of the NSF's Directorate of Mathematical and Physical Sciences.
"The [Irish] government sort of targeted, went out, and headhunted in a very aggressive way," Keane says. "When Bill Harris was appointed, the second-most important politician in Ireland went and stood in his office and said, 'We want you to do this.' If you walk down the corridor there are a lot of American accents."
Importing an experienced American staff helped SFI ramp up its agenda quickly, Keane says. The agency didn't just import and apply the NSF model; it adapted many NSF principles to an Irish context, including the establishment of research centers. Keane says a proposed software center, which will study software design in automotive, telecommunications, and medical device applications, is scheduled to be funded shortly pending board approval. Coincidentally, the center appears to be following the multidisciplinary philosophy to which DARPA's Tether testified before the House Science Committee.
"I thought that was probably good strategy, because what they were doing was working in an application domain to generate some generic solutions, and then look at the application across to a different domain," Keane says.
Canada, too, is placing new emphasis on multidisciplinary "big science" approaches. In fact, its main funding agency for science and engineering research, the Natural Sciences and Engineering Research Council ( www.nserc-crsng.gc.ca), is assessing proposed methodologies for evaluating, funding, and overseeing these multi-institution projects, in which SE research could play a large role.
"It could [play a large role], in terms of software engineering being at some point an essential contributing factor in tackling some huge projects like genomics or the whole area of bioinformatics," says Isabelle Blain, NSERC's vice president for research grants and scholarships. "It's at a stage now where biologists can't do it alone. So, if in order to address an issue that requires hundreds of millions of dollars, you need biologists and computer scientists and software engineers, then so be it. That's part of the rationale for invoking the big-science framework."
In addition to the codification of multidisciplinary procedures, discrete CS and SE research has also fared well in Canada. Blain says software research funding has gone from about Can$3.6 million in 2000–2001 to $5.56 million annually in 2004–2005. SE and CS research resources have grown 60 to 70 percent compared to 30 percent average growth across all disciplines over the same time period. Blain says the new resources are not only attracting resident researchers but also bringing expatriate Canadian researchers back home and tempting international researchers to relocate.
The EU is also trying to give researchers a large increase in resources. Rouhana says the preliminary proposal for the EU's 7th Framework Program, a multiyear research agenda slated to begin in 2007, nearly doubles annual ICT funding, from around €950 million to €1.82 billion. The proposal, he says, is the first real increase in research funding since the 3rd Framework Program of 1990–94. However, the proposal is a pawn in an argument between the European Commission and member states, which want to cap national contributions to the EU. Researchers throughout the continent have been campaigning for the full proposal ( www.initiative-science-europe.org/Appeal_Final.pdf), citing low R&D figures compared to the US. Rouhana says European software research especially might indeed need more public support to bolster competitiveness.
"Our packaged-software industry is very weak, aside from SAP [a software company], so the private investment in research in software remains weak in comparison to the US and Japan. Investment in our vertical industries, many of which are increasingly dependent on reliable software, however, is very strong. But will they do the theoretical computer science research on software? I'm not sure."
While scientists are trained to debate arguments on their merits, public funding on such a large scale as a national research budget depends far more on political factors and external triggers. The steep rise in the US research budget after Sputnik's launch, for instance, was a reaction to fear that the Soviet Union was outpacing the US in scientific capability. More recently, Japan's unveiling of the Earth Simulator supercomputer led to calls for more high-end computing research in the US. However, a recent study by the World Technology Evaluation Center concluded that, overall, the US remained ahead of Japan in high-end computing even before IBM's Blue Gene computers displaced the Earth Simulator as the world's fastest ( www.wtec.org/hec/report/hec-report.pdf).
China is portrayed as perhaps the next great strategic challenger to the US. Its R&D funding has risen quickly to 1.3 percent of its GDP, and it's encouraging more cross-pollination of knowledge between state-owned industries and labs and private firms.
"I think the Chinese realized their old system of the firewall between military and civilian uses wasn't good for either," says University of Oregon professor Richard P. Suttmeier, an expert on Asian nations' technology policy. Suttmeier says the Chinese government began encouraging more dual-use research in the 1980s, and while this process has been slow, the boundaries have become much more permeable. In fact, Suttmeier says many of the privately held companies, those most involved in ramping up to compete globally, might be offering the nation's strategists the most innovative ideas.
However, whether the Chinese ascendancy represents a Sputnik-like entrée for computer scientists to prove a need for greater funding remains an open question. Suttmeier estimates the Chinese basic research budget is only about five percent of total R&D, with far more emphasis on end-stage product development. He was echoed by William J. Perry, senior fellow at the Stanford Institute for International Studies and a former US Undersecretary of Defense for Research and Engineering. In April 2005 testimony to the US-China Economic Security and Review Commission, Perry said he thought China lacked the resources to challenge the US in basic research ( www.uscc.gov/hearings/2005hearings/transcripts/05_04_21_22.pdf).
"What I see from China today is product development," Perry told the commissioners. "I have looked carefully in my various visits to China for examples of technology-based development. I don't find it. I don't think they have it. So I think they're deficient in two respects. First of all, they lack the technology base we have. And, secondly, they do not have the culture that supports the innovation that we have. I think that's going to be a fundamental problem that is going to hold them back. So if I'm right in that, they're always going to be in what I call a tail chase on the new products."
However, another technology policy expert, Michael Pillsbury, senior fellow at the Atlantic Council of the United States, testified that the current prevailing sentiment—China doesn't have the capacity to provide a "Sputnik kind of shock"—might prove disastrous in the long run.
Citing a statement from a science advisory issued by President Bush, Pillsbury said, "He puts up figures of what caused the rise and fall of US science budget funding over a 50-year period, [and] he makes the point that it's external factors outside the community of scientists that determine budget increases and improvements in our ecosystem of scientific innovation. We need challenges from the outside. We need a tail chaser who is ruthless, closing in on us, and who threatens to surpass us. And according to the old paradigm, we certainly don't have that in China."
Whatever the motivator might be for a concerted campaign for more funding, the NSF's Freeman says it's time for the scientific community to reach a wider audience in describing the true severity of the challenges ahead.
"We are concerned, and do need to help the general public understand better the role of basic research and how that ultimately leads to the wonderful gee-whiz products people like to buy and build," he says. "I think most people sort of think those developments just spring full blown from Microsoft or Intel or whomever without any fundamental research. Most people don't understand that long process.
"Do we need a Sputnik to get us going? I'm not a historian. … I think perhaps some people would agree with the hypothesis that Americans are great at responding to crises, but whether that's what's needed or not, I wouldn't say."
Unfortunately, only hindsight will be able to tell policy makers and scientists if a reduced emphasis on computer science research will produce a disadvantage that takes decades to reverse. Twilight is beautiful to the eye but signals the beginning of darkness. If the US is on the verge of that twilight, the nation might not realize it until it is too late.
Computer security lapses by banks and credit card companies get the headlines, but they're barely the tip of the iceberg. Other corporate files, as well as governmental data, are at risk from hackers and others who want to buy or steal private data. The managers charged with preventing security breaches face huge challenges, but signs of help are appearing on the horizon. Standards bodies are moving forward with documents that provide the basis for protecting data from unauthorized outsiders, and corporations seem to be endorsing the efforts.
A growing number of companies have begun adopting the ISO 17799 security standard, also known as BS 7799. Originally developed in the UK, it saw little acceptance until the ISO adopted it, published it in 2000, and revised it in 2002. The US National Institute of Standards and Technology has completed an 88-page document, Special Publication 800-53, that will be the basis of federal requirements set to begin in December 2005.
At the same time, the IEEE, working in conjunction with NIST, is finalizing the Standard Security Architecture for Certification and Accreditation of Information Systems, more commonly known as P1700. That document should go to ballot by mid 2006.
Companies realize that the need exists for common approaches. "Security is too complicated not to use some standards. A standard gives people a recipe for best practices," says Jeffrey Voas, director of systems assurance at Science Applications International.
He notes that one list of top computer security risks has 300 items that must be checked. "That's a heck of a recipe if you have to check every ingredient," Voas says.
Another key benefit of standards is that they provide common ground for discussions. "One of the challenges is that everyone has their own lingo, so it's challenging to work together without common terms," says Willibert Fabritius, lead auditor at TUV Rheinland of North America's Northbrook, Ill., office.
That will make it easier for companies to compare notes. For example, companies can check each other's standards to confirm the security levels before signing contracts.
"If you have two companies that want to work together, this provides a way to help them decide if they're taking reasonable risk. If things aren't codified, there are a lot of varieties. This gives them a leg up; there are specific components to check," says Jack Cole, chairman of the IEEE P1700 Working Group.
Fabritius expects that, as the standards gain acceptance and more tools become available, insurance companies will use adherence to standards to help set rates for cyber insurance. It's also likely that companies will begin requiring that service providers adhere to the specifications. That's already set to happen with US government agencies, which must implement security measures mandated by the Federal Information Security Management Act of 2002.
"If you want to deal with the federal government, you have to demonstrate that you're addressing at least some subset of security standards," Voas says.
While US efforts are still in the early stages, some observers say that companies in other regions have focused more on security. "The idea of companies certifying their management systems is triggered by perceived needs and costs. Societies in Europe and Asia think more long term, so we're seeing tremendous growth there," Fabritius says.
While there's a strong push for standardization, observers note that these specifications won't provide the same level of detail that many technical specifications provide. Most security-related standards provide guidelines that let system developers create different levels of security for different applications.
"We're not writing anything highly specific; there are a lot of variations," Cole says.
This could help overcome one issue raised by those concerned about the possible negative consequences of standardizing security architectures. "One problem with standards is that a smart person can figure out where the holes in the standard are," Voas says.
But as in other fields, patches and suggested workarounds are expected to emerge quickly. And others note that the many variations the guidelines offer will also add a layer of complexity that will help slow down hackers, pirates, corporate spies, and others.
These standards also address some nontechnical security issues. For example, Fabritius recalls a visit to a company that had "excellent protection" for its secure server room. "They had invested thousands to prevent access to the server room, but it had a window that wasn't even locked," he says.