
[Conference News] Making the Most of Structured Data on the Web

A rich repository of structured data on the Web, coupled with new tools for information management and visualization, is enabling structured data to have a profound impact on many aspects of our lives.

The success of this trend depends on solving several longstanding data-management problems. We must continue developing tools that enable a broader set of users to manage data and create compelling visualizations. We need methods for identifying high-quality data from the Web and other corpora. And we should be able to recover these datasets’ semantics well enough for them to be displayed in response to user queries and combined with other datasets.

At the 2013 IEEE 29th International Conference on Data Engineering (ICDE 2013), scientists from Google Research USA presented a paper discussing Google Fusion Tables, a cloud-based service that aims to support an ecosystem of structured data on the Web by providing a tool for managing and visualizing data on the one hand, and for searching for and exploring data on the other.
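
As a loose illustration of the service’s query-oriented side, a client might retrieve rows with a SQL-like statement over HTTP. The endpoint, table ID, and parameters below are invented placeholders, not the actual Fusion Tables API:

    # Hypothetical sketch of querying a Fusion Tables-style service.
    # The endpoint, table ID, and query are invented placeholders.
    import urllib.parse
    import urllib.request

    TABLE_ID = "1aBcD"  # hypothetical table identifier
    sql = f"SELECT Location, Population FROM {TABLE_ID} WHERE Population > 100000"
    url = ("https://www.example.com/fusiontables/query?"
           + urllib.parse.urlencode({"sql": sql}))
    with urllib.request.urlopen(url) as resp:
        print(resp.read().decode())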

“Recent Progress towards an Ecosystem of Structured Data on the Web” and other papers from ICDE 2013 are available to both IEEE Computer Society members and paid subscribers via the Computer Society Digital Library.

[Conference News] Making Body Sensor Networks More Efficient

Body sensor networks (BSNs) consist of groups of wireless sensors with various monitoring capabilities. The data a BSN collects is transmitted for analysis by, for example, doctors. In a BSN activity-recognition system, sensor sampling and communication quickly deplete battery reserves. Reducing sampling and communication saves energy, but usually at the cost of reduced recognition accuracy.

At the 2013 IEEE 19th Real-Time and Embedded Technology and Applications Symposium (RTAS 2013), researchers from the College of William and Mary presented a paper proposing AdaSense, a framework that reduces BSN sensors’ sampling rates while meeting a user-specified accuracy requirement. AdaSense uses a set of classifiers to perform either multiactivity classification, which requires a high sampling rate, or single-activity event detection, which demands a very low sampling rate. Furthermore, it uses a novel genetic-programming algorithm to determine optimal sampling rates.
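
The paper specifies the actual classifiers and the genetic-programming search; the sketch below only illustrates the control loop described above, escalating from a cheap low-rate event detector to an expensive high-rate classifier. All names, rates, thresholds, and the sensor interface are hypothetical:

    # Minimal sketch of AdaSense-style sampling-rate switching. Rates,
    # classifiers, thresholds, and the sensor API are hypothetical.
    LOW_HZ, HIGH_HZ = 5, 100

    def detect_activity_change(window):
        # Cheap single-activity event detector: works at a low sampling rate.
        return max(window) - min(window) > 1.5

    def classify_activity(window):
        # Expensive multiactivity classifier: needs a high sampling rate.
        return "walking" if sum(window) / len(window) > 0.5 else "sitting"

    def control_loop(sensor):
        rate = LOW_HZ
        while True:
            window = sensor.read(seconds=2, hz=rate)  # assumed sensor API
            if rate == LOW_HZ:
                if detect_activity_change(window):
                    rate = HIGH_HZ   # escalate only when the activity changes
            else:
                yield classify_activity(window)
                rate = LOW_HZ        # drop back to the cheap detector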

“AdaSense: Adapting Sampling Rates for Activity Recognition in Body Sensor Networks” and other papers from RTAS 2013 are available to both IEEE Computer Society members and paid subscribers via the Computer Society Digital Library.

[Conference News] Middleware Keeps Up with Theory for Real-Time Multicore Scheduling

Real-time scheduling theory aimed at multicore systems has become increasingly sophisticated and diverse as multicore computing hardware becomes more prevalent. Real-time operating systems (RTOSs) are ill-suited for this kind of rapid change, and the slow-moving RTOS ecosystem is falling behind advances in real-time scheduling theory.

In a paper presented at the 2013 IEEE 19th Real-Time and Embedded Technology and Applications Symposium (RTAS 2013), researchers from the University of North Carolina at Chapel Hill describe a middleware solution running in userspace, that is, outside the RTOS kernel. The solution supports preemptive, dynamic-priority, migrating real-time tasks on multicore hardware. Empirical latency and overhead measurements on an eight-core Intel Xeon platform are in the range of ones to tens of microseconds under most tested configurations. The authors see this approach as potentially superior to a kernel-based approach for a subset of future real-world real-time applications.
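
As a far simpler illustration of dynamic-priority scheduling than the paper’s middleware, the sketch below uses earliest-deadline-first (EDF), a standard dynamic-priority policy chosen here only as an example; preemption, migration, and multicore dispatch are all out of scope:

    # Toy earliest-deadline-first (EDF) dispatcher, a common dynamic-priority
    # policy. Illustrative only; the paper's middleware additionally handles
    # preemption, migration, and multicore hardware.
    import heapq

    class EDFQueue:
        def __init__(self):
            self._heap = []  # entries are (absolute deadline, task name)

        def release(self, deadline, task):
            heapq.heappush(self._heap, (deadline, task))

        def pick_next(self):
            return heapq.heappop(self._heap) if self._heap else None

    q = EDFQueue()
    q.release(30, "video-frame")
    q.release(10, "sensor-poll")
    q.release(20, "control-update")
    while (job := q.pick_next()) is not None:
        print("dispatch", job[1], "with deadline", job[0])
    # Dispatches sensor-poll, control-update, video-frame, in deadline order.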

“Bringing Theory into Practice: A Userspace Library for Multicore Real-Time Scheduling” and other papers from RTAS 2013 are available to both IEEE Computer Society members and paid subscribers via the Computer Society Digital Library.

[Conference News] Extracting Hidden Behavioral Patterns from Social Network Data

Massive amounts of information about human behavior are continuously generated by Web-based services, both public and private. These data include traces not only of individual activities but also of collaborative work, and the social networks that can be extracted from such datasets offer a kind of knowledge that’s independent of user awareness.

In a paper presented at the 2013 International Conference on Social Intelligence and Technology (Social 2013), researchers from the Wroclaw University of Technology in Poland describe a data-driven approach to social network analysis that enables various applications of knowledge about human behavior. They illustrate selected models and analytical methods in applications to recommender systems, organizational structure analysis, and social group evolution.
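
As a minimal sketch of the data-driven starting point (the log format and names are invented), a weighted social network can be derived directly from raw interaction logs, with edge weights counting how often two users interacted:

    # Sketch: deriving a weighted social network from interaction logs.
    # The log format and user names are invented for illustration.
    from collections import Counter

    logs = [("alice", "bob"), ("alice", "bob"), ("bob", "carol"),
            ("carol", "alice"), ("bob", "carol")]

    edges = Counter(tuple(sorted(pair)) for pair in logs)
    for (u, v), weight in edges.most_common():
        print(f"{u} -- {v}: {weight} interactions")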

“From Data to Human Behaviour” and other papers from Social 2013 are available to both IEEE Computer Society members and paid subscribers via the Computer Society Digital Library.

[Conference News] Reducing Overhead in Named Data MANETs

Named Data Networks (NDNs) use data names instead of host addresses to locate data. The NDN architecture assumes pull-based forwarding and a one-interest-one-data principle: to initiate a data transfer, a data consumer must send an interest packet requesting the corresponding data packet. NDN’s chunk-based caching is beneficial in coping with the mobility and intermittent-connectivity challenges of mobile ad hoc networks (MANETs).
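
A minimal sketch of this pull model, with invented classes standing in for real NDN nodes, shows why in-network caching helps: each node that forwards a data packet can also cache the chunk, so later interests are satisfied locally even when upstream connectivity is lost:

    # Sketch of NDN's pull-based, one-interest-one-data exchange with a
    # content store (cache). Classes and the topology are invented.
    class Node:
        def __init__(self, name, upstream=None):
            self.name = name
            self.upstream = upstream
            self.content_store = {}  # data name -> data packet

        def on_interest(self, data_name):
            if data_name in self.content_store:
                return self.content_store[data_name]      # cache hit
            if self.upstream is None:
                return None                               # cannot pull further
            data = self.upstream.on_interest(data_name)   # one interest, one data
            if data is not None:
                self.content_store[data_name] = data      # cache the chunk en route
            return data

    producer = Node("producer")
    producer.content_store["/video/seg1"] = b"chunk-1"
    consumer = Node("consumer", upstream=Node("router", upstream=producer))
    print(consumer.on_interest("/video/seg1"))  # pulled, and cached along the path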

In a paper presented at the 2013 International Conference on Computing, Networking and Communications (ICNC 2013), researchers from the University of California, Los Angeles, and IBM T.J. Watson Research Center describe a study of Named Data MANET (NDM) forwarding designs. They propose Neighborhood-Aware Interest Forwarding (NAIF) to reduce the bandwidth usage induced by the indiscriminate interest flooding that afflicts other NDM forwarding designs. Their results show that NAIF reduces bandwidth usage by up to 54 percent compared with other approaches.
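
The summary doesn’t spell out NAIF’s mechanics, so the following is only a hypothetical sketch of neighborhood-aware rate limiting in general: a relay forwards fewer interests when its neighborhood has been satisfying most of them anyway:

    # Hypothetical sketch of neighborhood-aware interest throttling. This is
    # an invented illustration of the general idea, not NAIF's algorithm.
    import random

    class Relay:
        def __init__(self):
            self.seen = 0             # interests overheard
            self.answered_nearby = 0  # of those, satisfied by neighbors

        def note_answered(self):
            self.answered_nearby += 1

        def forwarding_rate(self):
            if self.seen == 0:
                return 1.0  # no history yet: behave like flooding
            # Forward less when neighbors usually satisfy interests anyway.
            return max(0.1, 1.0 - self.answered_nearby / self.seen)

        def should_forward(self):
            self.seen += 1
            return random.random() < self.forwarding_rate()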

“Interest Propagation in Named Data MANETs” and other ICNC 2013 papers are available to both IEEE Computer Society members and paid subscribers via the Computer Society Digital Library.

[Conference News] Enabling Survivability in Cloud-Networking Services

As cloud computing services expand across interconnected datacenters, reliability and survivability are becoming major concerns among users. Current failure-recovery strategies aren’t always effective against large-scale failures, so the design of survivable virtual network (VN) mappings is of key interest.

At the 2013 International Conference on Computing, Networking and Communications (ICNC 2013), researchers from Cisco Systems, Kuwait University, and the University of New Mexico presented a paper proposing a way to compute VN mappings so that each service request can recover from a single regional failure.
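
The paper’s contribution is how to compute such mappings; the toy check below (regions, datacenters, and placements are all invented) only illustrates the survivability condition itself, namely that no single regional failure can wipe out both the primary and backup placements of a request:

    # Toy survivability check for a single regional failure. A placement is
    # treated as lost only if all of its nodes lie in the failed region.
    # Regions and datacenter names are invented for illustration.
    regions = {"west": {"dc1", "dc2"}, "east": {"dc3", "dc4"}}

    def survives_single_regional_failure(primary, backup):
        """primary/backup: sets of datacenters hosting a service request."""
        for failed in regions.values():
            if primary <= failed and backup <= failed:
                return False  # one regional failure takes out both placements
        return True

    print(survives_single_regional_failure({"dc1"}, {"dc3"}))  # True
    print(survives_single_regional_failure({"dc1"}, {"dc2"}))  # False: both in west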

“Survivable Cloud Networking Services” and other papers from ICNC 2013 are available to both IEEE Computer Society members and paid subscribers via the Computer Society Digital Library.

[Conference News] Improving Mashup Quality

Web mashups are a new generation of applications based on the composition of ready-to-use, heterogeneous components. They have the potential to turn users from passive recipients into active creators of applications. However, some issues remain largely unexplored, particularly those related to quality and to identifying adequate components for use in mashups.

At the 2012 Eighth International Conference on the Quality of Information and Communications Technology (Quatic 2012), Italian researchers presented a paper reviewing ways to capture the intrinsic quality of mashup components, as well as the components’ capacity to maximize the final application’s quality and value. The authors also propose a process in which quality becomes the driver for suggesting how users can complete mashups, based on the integration of quality-assessment and recommendation techniques within a development tool.
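
As a minimal sketch of quality-driven recommendation (the metrics, weights, and components are invented, not the authors’ model), candidate components can be ranked by a weighted quality score:

    # Sketch: ranking candidate mashup components by a weighted quality
    # score. Metrics, weights, and component names are invented.
    weights = {"reliability": 0.5, "documentation": 0.2, "popularity": 0.3}

    candidates = {
        "map-widget":   {"reliability": 0.9, "documentation": 0.7, "popularity": 0.8},
        "chart-widget": {"reliability": 0.6, "documentation": 0.9, "popularity": 0.5},
    }

    def quality(metrics):
        return sum(weights[m] * value for m, value in metrics.items())

    ranked = sorted(candidates, key=lambda name: quality(candidates[name]),
                    reverse=True)
    for name in ranked:
        print(f"{name}: {quality(candidates[name]):.2f}")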

“Quality-Aware Mashup Composition: Issues, Techniques and Tools” and other papers from Quatic 2012 are available to both IEEE Computer Society members and paid subscribers via the Computer Society Digital Library.

[Conference News] Standards-Based Integration of Test and Risk Management

Enterprises have started to establish dedicated risk management (RM) functions to address risks from different sources, such as currency exchange and credit risk. In most companies and projects, RM and testing functions operate independently.

However, in a paper presented at the 2012 Eighth International Conference on the Quality of Information and Communications Technology (Quatic 2012), researchers from SQS Software Quality Systems AG demonstrate the relationships between RM and test management (TM). They further describe an integration, based on two widely used standards for the respective disciplines, that leverages the synergies between RM and TM.
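
One common way to link the two disciplines, shown below purely as an invented illustration (the paper builds on two specific standards, which this sketch does not model), is to prioritize tests by the risk exposure, probability times impact, of the requirements they cover:

    # Sketch: risk-based test prioritization. Requirements, figures, and the
    # test-to-requirement linkage are invented for illustration.
    risks = {  # requirement -> (failure probability, impact)
        "payment": (0.3, 10),
        "login":   (0.1, 8),
        "reports": (0.4, 2),
    }

    tests = {"t_pay_flow": "payment", "t_auth": "login", "t_csv_export": "reports"}

    def exposure(requirement):
        probability, impact = risks[requirement]
        return probability * impact

    for test, requirement in sorted(tests.items(),
                                    key=lambda kv: exposure(kv[1]), reverse=True):
        print(f"{test}: exposure {exposure(requirement):.1f}")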

“Integrating Test and Risk Management” and other papers from Quatic 2012 are available to both IEEE Computer Society members and paid subscribers via the Computer Society Digital Library.

[Conference News] Pattern-Based Programming for Many-Core Accelerators

Efficient programming of general-purpose many-core accelerators poses several challenging problems, including the peculiarities of the interconnection network and the complexity of the memory hierarchy.

In a paper presented at the 21st Euromicro International Conference on Parallel, Distributed, and Network-Based Processing (PDP 2013), researchers from the University of Pisa propose parallel design patterns, implemented using algorithmic skeletons, as a means to abstract and hide most of these challenges. Specifically, they ported the FastFlow framework to the Tilera TilePro64 architecture. Results from running synthetic benchmarks and real application kernels demonstrate the efficiencies achieved, both in programming stand-alone skeleton-based parallel applications and in accelerating existing sequential code.
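
FastFlow itself is a C++ framework; the Python sketch below is only an analogy for what a farm skeleton abstracts, namely replicating a worker over a stream of tasks while hiding the scheduling details from the programmer:

    # Language-shifted sketch of a farm skeleton (FastFlow is C++). The
    # skeleton hides worker replication and scheduling from the programmer.
    from concurrent.futures import ProcessPoolExecutor

    def worker(task):
        # The business-logic kernel supplied by the application programmer.
        return task * task

    def farm(tasks, n_workers=4):
        with ProcessPoolExecutor(max_workers=n_workers) as pool:
            return list(pool.map(worker, tasks))

    if __name__ == "__main__":
        print(farm(range(8)))  # [0, 1, 4, 9, 16, 25, 36, 49]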

“Parallel Patterns for General Purpose Many-Core” and other papers from PDP 2013 are available to both IEEE Computer Society members and paid subscribers via the Computer Society Digital Library.

[Conference News] New Approach Reduces JavaScript Applications’ Memory Problems

JavaScript is the dominant language for implementing dynamic webpages in browsers. Although the language is standardized, many browsers implement it and its browser bindings in incompatible ways. As a result, many Web development frameworks have been developed to hide cross-browser issues and ease the development of large Web applications. An unwelcome side effect of these frameworks is that they can introduce memory leaks, despite JavaScript’s garbage collection.

At the 2013 IEEE/ACM International Symposium on Code Generation and Optimization (CGO 2013), researchers from Purdue University and Google presented a paper describing a compiler extension that helps address this problem. JSWhiz extends the open-source Closure JavaScript compiler to detect common coding patterns that cause leaks. In an analysis of numerous Google applications, including Closure, it found a total of 89 memory leaks. JSWhiz also contributed significantly to a recent effort to reduce Gmail’s bloat and memory footprint.
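
JSWhiz works on the Closure compiler’s typed AST; the toy check below is only a crude textual stand-in for the kind of pattern such a tool hunts, a listener that is attached but never cleaned up (goog.events.listen and goog.events.unlisten are real Closure Library calls; the rest is invented):

    # Toy illustration only: a crude textual check for listeners that are
    # added but never removed. Real tools like JSWhiz analyze the compiler's
    # typed AST rather than matching text.
    import re

    js_source = """
    this.key = goog.events.listen(btn, 'click', this.onClick);
    // dispose() below forgets to clean the listener up
    """

    adds = re.findall(r"goog\.events\.listen\(", js_source)
    removes = re.findall(r"goog\.events\.unlisten\(", js_source)
    if len(adds) > len(removes):
        print(f"possible leak: {len(adds) - len(removes)} unmatched listener(s)")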

“JSWhiz: Static Analysis for JavaScript Memory Leaks” and other papers from CGO 2013 are available to both IEEE Computer Society members and paid subscribers via the Computer Society Digital Library.
