Issue No. 06, June 2012 (vol. 45)
Published by the IEEE Computer Society
Carl Chang, Iowa State University
Vladimir Getov, University of Westminster, London
Kelvin Sung, University of Washington Bothell
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MC.2012.203
The recent phenomenal growth in research activities has provided a favorable environment for computing advances in Asia.
Computing in Asia has come a long way. However, while some well-developed nations such as Japan have a high concentration of IT installations and skilled IT workers, such capabilities are not yet as widely available in the region's developing countries. Just as Asia is characterized by diverse people, languages, and cultures, its computing endeavors are also diverse, covering a wide spectrum of research and development activities that stem from regional economic concerns and priorities.
Computing as an Economic Factor
To some extent, because of the relative immaturity of their fixed IT facilities, Asia-Pacific economies might be well-positioned to take on the emerging challenges associated with moving to the next big thing in computing—the cloud computing paradigm.1 A good analogy is the rapid buildup of high-speed rail in China: beyond financial concerns and policy controversies, there seem to be ample opportunities for thinking clearly and acting quickly to solve related problems. However, poor alignment of national legal and regulatory policies could cast a shadow on Asia-Pacific regions where computing serves as an economic catalyst.
China is now well known as the world's largest manufacturer of electronic devices, including notebook computers, iPads, and other tablet PCs. Many examples confirm recent advances in the Asian computing industry. For example, the phenomenal growth in genomic research at the Beijing Genomics Institute exemplifies the favorable environment for computing in Asia, including lower costs and abundant human resources, compared with developed countries such as the US. As another example, according to the November 2011 TOP500 list, the world's two fastest supercomputers are the Japanese K computer and the Chinese Tianhe-1A (www.top500.org/lists/2011/11). It is important to emphasize that these are not isolated or random examples of success. Based on unprecedented achievements in the computing sector, both China and Japan continue to increase not only the number of their supercomputing systems on the TOP500 list but also their aggregate computing power.
In This Issue
The contributions from prominent research centers included in this theme issue provide an overview of computing advances in several regions in Asia.
In "Internetware: A Software Paradigm for Internet Computing," Hong Mei, Gang Huang, and Tao Xie from Peking University discuss the need for a new Internet computing paradigm and summarize the results of substantial research and development efforts by the Chinese software community in this area. Due to the global network environment's open, dynamic, and ever-changing nature, software on the Internet must demonstrate features that make it autonomous, cooperative, situational, evolvable, emergent, and trustworthy. Designing and supporting these features presents several challenges for developers of software technologies.
Sponsored by a national research program, researchers in China have proposed addressing these challenges by introducing the Internetware software paradigm. The authors describe Internetware's main characteristics and key technologies and outline the corresponding infrastructure support. They present this paradigm from four important aspects: the software model (what Internetware is), the runtime platform (how it operates), the engineering approach (how to design and develop it), and quality assurance (how well it all works). The article summarizes the progress and status of Internetware research and predicts future trends in this exciting new area.
"Computing for the Next-Generation Automobile," by Mikio Aoyama of Japan's Nanzan University, begins with a description of how crowdsourcing using in-car navigation systems assisted in the mapping of passable operational routes after the devastating earthquake and tsunami on 11 March 2011. Having established the potential and importance of computing resources in automobiles, the author discusses the state of three important areas of automotive computing: making vehicles greener, making them smarter, and merging transportation and information networks.
On the environmental front, the article surveys the current state of hybrid, plug-in hybrid, and electric vehicles, outlining the architecture and computing requirements for controlling and coordinating electrical and mechanical systems. The author frames the smarter vehicle discussion based on two safety modes: passive (mitigating the damage incurred in an accident) and active (intending to avoid an accident). The merging of traffic and information networks to create the ultimate cloud-based intelligent transport system is a work in progress with the goal of providing context-aware information based on location, traffic updates, and driving status. Soon your car will be able to send you tweets updating its status while you share traffic information with other drivers.
In "Computer-Assisted Audiovisual Language Learning," Lijuan Wang and her colleagues from Microsoft Research Asia discuss the technologies underlying Engkoo, an innovative Web-based computer-assisted audiovisual language-learning service that combines two emerging speech processing technologies—talking head and phonetic similarity search. The system incorporates advanced speech, language, and multimedia technologies as a virtual tutor that 10 million people in China use to learn English on the Web. Engkoo's tutoring interaction is modeled on karaoke, a favorite pastime in China, and users learn from a photorealistic lip-synced talking head within a search and discovery ecosystem. The authors describe the text-to-speech architecture, which supports real-time video and audio rendering, and discuss Engkoo's solution to detecting and correcting input errors due to phonetic mispronunciation.
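The article itself details Engkoo's architecture; purely as a hypothetical illustration of what a phonetic similarity search can look like in general (this is not Microsoft's implementation), the sketch below retrieves dictionary words whose coarse phonetic key matches a possibly misspelled query, then ranks them by spelling edit distance. The classic Soundex encoding stands in for a real phonetic encoder, and the vocabulary and function names are assumptions for the example.

```python
def soundex(word):
    """Coarse phonetic key (classic Soundex); a stand-in for a
    production phonetic encoder."""
    codes = {"b": "1", "f": "1", "p": "1", "v": "1",
             "c": "2", "g": "2", "j": "2", "k": "2",
             "q": "2", "s": "2", "x": "2", "z": "2",
             "d": "3", "t": "3", "l": "4",
             "m": "5", "n": "5", "r": "6"}
    word = word.lower()
    key = word[0].upper()
    prev = codes.get(word[0], "")
    for ch in word[1:]:
        code = codes.get(ch, "")
        if code and code != prev:
            key += code
        prev = code
    return (key + "000")[:4]

def edit_distance(a, b):
    """Standard Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def phonetic_candidates(query, vocabulary):
    """Return vocabulary words that sound like the query,
    ranked by how close their spelling is to the input."""
    qkey = soundex(query)
    matches = [w for w in vocabulary if soundex(w) == qkey]
    return sorted(matches, key=lambda w: edit_distance(query, w))
```

For instance, a query such as "smyth" would retrieve "smith" ahead of other phonetically similar words, which is the kind of tolerance to mispronunciation-driven misspellings that a learner-facing search system needs.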
In "Cloud Computing in Taiwan," William Cheng-Chung Chu and his colleagues from several Taiwanese universities provide a comprehensive summary of recent developments in cloud computing in Taiwan. They describe the emerging industry based on different system levels, including software as a service, platform as a service, and infrastructure as a service, and discuss relevant open source efforts in support of cloud computing. The authors survey the major Taiwanese cloud computing projects and systems and explain the country's strategy to further the local cloud computing industry and its collaboration with companies worldwide. The article provides up-to-date information about the technology, policies, trends, and international collaborations in Taiwan's cloud computing research and development program.
In "Bioinformatics Applications in Genomics," Wing-Kin Sung of the National University of Singapore describes how computing technology researchers are collaborating with world-class biologists to advance the multidisciplinary study of genomics. Coupling bioinformatics with biotechnologies makes it possible for researchers to evaluate genome-wide data consisting of hundreds of billions of bits of raw data. They are using next-generation sequencing technology to reconstruct genomes and understand the gene-expression control mechanisms. This article introduces to Computer's general audience the importance of big data analytics in bioscience fields, emphasizing the significance of such research projects in improving our understanding and management of life-critical issues such as disease control.
"Fault Localization Based Only on Failed Runs" by Zhenyu Zhang, W.K. Chan, and T.H. Tse introduces an effective program debugging technique. As the article notes, any measurable improvement in software developers' productivity in program testing and debugging would be enticing, as more than 50 percent of software costs can be attributed to such activities. The authors present FOnly, a novel fault-localization method based on trend estimation with statistical means that uses failed runs but disregards passed runs, and which performs on a par with or better than known techniques. Interested readers are encouraged to contact the authors to gain more insight into their technique and perhaps provide challenging cases for them to consider in future research.
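FOnly's statistical trend estimation is described in the article itself; as a rough, hypothetical illustration of the underlying idea—scoring program statements using evidence from failed runs only, with no passed runs required—the toy sketch below ranks statements by how consistently the failed runs execute them. The scoring function and data layout here are assumptions for the example, not the authors' algorithm.

```python
from collections import Counter

def rank_by_failed_runs(failed_run_coverage):
    """Rank statements by how consistently failed runs execute them.

    failed_run_coverage: a list with one entry per failed run, each
    entry being the set of statement ids executed in that run.
    Returns statement ids, most suspicious first.
    """
    counts = Counter()
    for covered in failed_run_coverage:
        counts.update(covered)
    n = len(failed_run_coverage)
    # Toy suspiciousness score: the fraction of failed runs that
    # execute the statement. A statement executed by every failed
    # run is the strongest candidate for containing the fault.
    return sorted(counts, key=lambda s: counts[s] / n, reverse=True)
```

For example, if three failed runs cover statements {1, 2, 3}, {2, 3}, and {3}, statement 3 is ranked most suspicious because every failing execution passes through it. The appeal of this family of techniques is practical: failed runs are exactly what a bug report provides, whereas a comparable corpus of passed runs may not exist.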
Although these articles provide only a small-scale sampling of advances in computing in Asia, they offer a glimpse into the landscape in which computing researchers and professionals are working diligently to develop the technologies needed to meet the challenges of next-generation computing.
Carl Chang is a professor and chair of computer science at Iowa State University. His research interests include requirements engineering, software architecture, software evolution, and successful aging. He is a Fellow of IEEE and a Fellow of AAAS. Formerly, he was president of the IEEE Computer Society (2004), the editor in chief of IEEE Software (1991-1994), and editor in chief of Computer (2007-2010). Contact him at email@example.com.
Vladimir Getov is a professor of distributed and high-performance computing at the University of Westminster, London. His research interests include parallel architectures and performance, autonomous distributed computing, and high-performance programming environments. He is a senior member of IEEE, a member of ACM, a Fellow of the BCS, and Computer's area editor for high-performance computing. Contact him at firstname.lastname@example.org.
Kelvin Sung is a professor in the computing and software systems program at the University of Washington Bothell. His research interests focus on studying the role of technology in supporting human communication. His recent work is in the areas of serious games and topics related to teaching and learning foundational concepts in programming based on computer games. Sung is the editor of Computer's Entertainment Computing column. Contact him at email@example.com.