From the July-September 2014 issue

Evolutionary Scheduling of Dynamic Multitasking Workloads for Big-Data Analytics in Elastic Cloud

By Fan Zhang, Junwei Cao, Wei Tan, Samee U. Khan, Keqin Li, and Albert Y. Zomaya

Scheduling dynamic, multitasking workloads for big-data analytics is a challenging problem, as it requires a significant amount of parameter sweeping and iteration. Real-time scheduling therefore becomes essential to increase the throughput of many-task computing. The difficulty lies in obtaining a series of optimal yet responsive schedules. In dynamic scenarios, such as virtual clusters in the cloud, scheduling must be performed fast enough to keep pace with unpredictable workload fluctuations and optimize overall system performance. In this paper, ordinal optimization using rough models and fast simulation is introduced to obtain suboptimal solutions in a much shorter timeframe. While the scheduling solution for each period may not be the best, ordinal optimization can be applied quickly, in an iterative and evolutionary way, to capture the dynamism of big-data workloads. Experimental results show that, compared with existing methods such as Monte Carlo and Blind Pick, our evolutionary approach achieves higher overall average scheduling performance, such as throughput, in real-world applications with dynamic workloads. Furthermore, additional performance improvement is obtained by implementing an optimal computing budget allocation method that smartly allocates computing cycles to the most promising schedules.
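To make the ordinal-optimization idea concrete, here is a minimal, purely illustrative Python sketch (not the authors' implementation): candidate schedules are first ranked with a cheap, noisy "rough" model, and the expensive, accurate simulation budget is then spent only on the short-listed candidates. The function names, candidate encoding, and scoring functions are hypothetical.

    import random

    def ordinal_select(candidates, rough_eval, precise_eval, top_k=10):
        # 1. Score every candidate schedule with a cheap, rough model.
        ranked = sorted(candidates, key=rough_eval, reverse=True)
        # 2. Keep only the most promising candidates.
        shortlist = ranked[:top_k]
        # 3. Spend the expensive, precise simulation budget on the shortlist only.
        return max(shortlist, key=precise_eval)

    # Hypothetical usage: candidates are random task-to-node assignments and the
    # scores are throughput estimates with different levels of noise and cost.
    rng = random.Random(0)
    candidates = [tuple(rng.sample(range(20), 5)) for _ in range(1000)]
    rough = lambda s: sum(s) + rng.gauss(0, 5.0)    # fast but noisy estimate
    precise = lambda s: sum(s) + rng.gauss(0, 0.5)  # slower, accurate simulation
    best_schedule = ordinal_select(candidates, rough, precise)

The optimal computing budget allocation step mentioned in the abstract would further skew simulation effort toward the closest-scoring candidates; that refinement is omitted from this sketch.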

View the PDF of this article | View this issue in the digital library


Editorials and Announcements

Announcements

  • EICs Undergoing Reappointment for 2016-2017 Terms: IEEE Computer Society publications have editors in chief who are currently standing for reappointment to a second two-year term. The Publications Board invites comments on the tenures of the individual editors. Please click here for more details.

  • A Welcome Letter from Thomas M. Conte (PDF)

  • We are pleased to announce that Fabrizio Lombardi, a professor at Northeastern University, Boston, has been appointed as the inaugural EIC of the IEEE Transactions on Emerging Topics in Computing, effective immediately. Dr. Lombardi is an IEEE Fellow, a member of the Computer Society Board of Governors, and a past EIC and Associate EIC of the IEEE Transactions on Computers.

Editorials


Call for Papers

Special Issue on Circuit and System Design Methodologies for Emerging Technologies

Submission deadline: February 1, 2015. View PDF.

The demand for ever smaller, portable, low-power, and high-performance electronic systems has been the primary driver of CMOS technology scaling. As CMOS scaling approaches physical limits, it has been fraught with challenges that required the introduction of new processes and materials. High-κ oxide and metal-gate stacks were introduced to mitigate gate-oxide leakage. Thin-body, undoped channels were introduced to mitigate subthreshold leakage. 3D transistors such as FinFETs and tri-gates were introduced to improve ON current while maintaining layout efficiency. While these incremental adjustments have allowed CMOS technology to continue scaling, a number of alternative devices have been proposed to replace CMOS transistors, such as graphene transistors (GFETs), tunnel transistors, graphene nanoribbon tunnel transistors, quantum dots, and single-electron devices (SETs). Newer memory technologies such as resistive RAMs, memristors, and STT-RAMs similarly promise to revolutionize the design landscape. However, for these alternative technologies to become practical, design methodologies that allow efficient modeling, design-space exploration, and trade-off analysis are crucial. This is the driving motivation for this special issue.

Special Issue on Emerging Security Trends for Deeply-Embedded Computing Systems

Submission deadline: February 1, 2015. View PDF.

Unlike traditional embedded systems, today's emerging computing systems are embedded in every aspect of human lives. These deeply embedded computing systems often perform extremely sensitive tasks, and in some cases, such as health-care IT, these tasks are life-saving. Thus, in addition to the security threats facing traditional embedded systems, emerging deeply embedded computing systems exhibit a larger attack surface and are prone to more serious or even life-threatening malicious attacks. This calls for revisiting traditional security mechanisms, not only because of the new facets of threats and the more adverse effects of breaches, but also due to the resource limitations of these often battery-powered and extremely constrained computing systems. As such, new trends for providing security for deeply embedded systems are emerging, many of which abandon the use of cryptographic computations altogether or make use of lightweight cryptosystems feasible for these computing platforms. Indeed, there is great potential for applying these emerging security approaches to sensitive applications such as health-care IT for implantable medical devices, big data analytics and machine learning in deeply embedded systems, smart buildings, and smart fabrics. The focus of this special issue will be on novel security methods for deeply embedded computing systems, emerging cryptographic solutions applicable to extremely constrained applications such as green cryptography, and advancements in feasible security measures for evolving interdisciplinary research trends such as computing for health-care IT, cyber-physical embedded systems, big data, and smart buildings/fabrics.

Special Issue on Advances in Mobile Cloud Computing

Submission deadline: March 1, 2015. View PDF.

There has been a phenomenal burst of research activity in mobile cloud computing, which extends cloud computing functions, services, and results to the world of future mobile communications applications, and extends the paradigm of cloud computing and virtualization to mobile networks. Mobile applications demand greater resources and improved interactivity for a better user experience. Resources in cloud computing platforms such as Amazon Web Services, Google App Engine, and Microsoft Azure are a natural fit to remedy the lack of local resources in mobile devices. The availability of cloud computing resources on a pay-as-you-go basis, the advances in network virtualization and software-defined networks, and the emergence of advanced wireless networks such as cloud-based radio access networks (C-RANs) create a new space of rich research problems. The objective of this special section is to cover the most recent research and development on mobile cloud computing technologies and to offer a venue for industry and academia to showcase their recent progress and potential research directions.

Special Issue on Emerging Trends in Education

Submission deadline: March 1, 2015. View PDF.


Technological advancements, such as those seen in cloud computing, mobile devices, and big, open, and linked data, to name just a few, bring with them great opportunities for broadening the reach of, and enriching, the educational experience. For instance, virtual learning environments are becoming commonplace in the communication between students and teachers, who can use a plethora of web-based tools and applications to publish assignments and submit them for grading, perform automatic assessment, and so on. At the same time, mobile computing is helping to expand the reach of learning content and frameworks, which are becoming accessible in an always-on and ready-to-go fashion. By leveraging smartphones and tablets, new pedagogical tools are being implemented that exploit their innovative interaction capabilities and rich sets of sensors to create immersive and interactive experiences not previously possible. Furthermore, the massive volume of information produced by these tools and environments opens up greater possibilities, including the sharing, analysis, and visualization of educational data patterns and trends. Although there are many different visions for the education of the future, great effort will be needed to achieve a deep integration between technologies that are already well established and those that are still considered emerging. By building on a solid scientific and methodological foundation where theory and practice converge, this special issue aims to present both the current trends that characterize the learning and teaching domains of today and the expected evolution that will shape the education of tomorrow.

Special Issue on Big Data Benchmarks, Performance Optimization, and Emerging Hardware

Submission deadline: June 1, 2015. View PDF.

Big data is emerging as a strategic asset of nations and organizations, and there is a pressing need to generate value from it. However, the sheer volume of big data requires significant storage capacity, transmission bandwidth, computation, and power consumption. Systems of unprecedented scale are expected to resolve the problems posed by the daunting volume and variety of big data. Nevertheless, without big data benchmarks, it is very difficult for big data owners to decide which system best meets their specific requirements; they also face challenges in optimizing these systems for specific or even comprehensive workloads. Meanwhile, researchers are working on innovative data management systems, hardware architectures, and operating systems to improve performance in dealing with big data. The focus of this special issue will be on architecture and system support for big data systems.

Special Issue on Methods and Techniques for Processing Streaming Big Data in Datacentre Clouds

Submission deadline: June 1, 2015. View PDF.

The Internet of Things (IoT) is part of the Future Internet and comprises many billions of Internet-connected objects (ICOs), or 'things', which can sense, communicate, compute, and potentially actuate, as well as have intelligence, multi-modal interfaces, and physical/virtual identities and attributes. ICOs include sensors, RFIDs, social media, actuators (such as machines and equipment fitted with sensors), lab instruments (e.g., a high-energy physics synchrotron), and smart consumer appliances (smart TVs, smartphones, etc.). The IoT vision has recently given rise to IoT big data applications that are capable of producing billions of data streams and tens of years of historical data to support timely decision making. Some of the emerging IoT big data applications, e.g., smart energy grids, syndromic bio-surveillance, environmental monitoring, emergency situation awareness, digital agriculture, and smart manufacturing, need to process and manage massive, streaming, multi-dimensional data from multiple, geographically distributed data sources.

Despite recent technological advances in data-intensive computing paradigms (e.g., the MapReduce paradigm, workflow technologies, stream processing engines, and distributed machine learning frameworks) and datacentre clouds, large-scale, reliable system-level software for IoT big data applications is yet to become commonplace. As new and diverse IoT applications begin to emerge, there is a need for optimized techniques to distribute the processing of the streaming data produced by such applications across multiple datacentres that combine multiple, independent, and geographically distributed software and hardware resources. However, the capability of existing data-intensive computing paradigms is limited in several important respects: (i) they can only process data on compute and storage resources within a centralised local area network, e.g., a single cluster within a datacentre, which leads to unsatisfactory Quality of Service (QoS) in terms of timeliness of decision making, resource availability, data availability, etc., as application demands increase; (ii) they do not provide mechanisms to seamlessly integrate data spread across multiple distributed heterogeneous data sources (ICOs); (iii) they lack support for rapid formulation of intuitive queries over streaming data based on general-purpose concepts, vocabularies, and data discovery; and (iv) they do not provide any decision-making support for selecting optimal data mining and machine learning algorithms, data application programming frameworks, and NoSQL database systems based on the nature of the big data (volume, variety, and velocity). Furthermore, the adoption of existing datacentre cloud platforms for hosting IoT applications is yet to be realised due to the lack of techniques and software frameworks that can guarantee QoS under uncertain big data application behaviours (data arrival rate, number of data sources, decision-making urgency, etc.), unpredictable datacentre resource conditions (failures, availability, malfunction, etc.), and capacity demands (bandwidth, memory, storage, and CPU cycles). It is clear that existing data-intensive computing paradigms and related datacentre cloud resource provisioning techniques either fall short of the IoT big data challenge or do not exist.

Special Issue on Approximate and Stochastic Computing Circuits, Systems and Algorithms

Submission deadline: September 1, 2015. View PDF.

The last decade has seen renewed interest in non-traditional computing paradigms. Several (re-)emerging paradigms aim to leverage the error resiliency of many systems by relaxing the strict requirement of exactness in computing. This special issue of TETC focuses on two specific lines of research, known as approximate and stochastic computing.

Approximate computing is driven by considerations of energy efficiency. Applications such as multimedia, recognition, and data mining are inherently error-tolerant and do not require perfect accuracy in computation. The results of signal processing algorithms used in image and video processing are ultimately judged by human perception, so strict exactness may not be required and an imprecise result may suffice. In these applications, approximate circuits aim to improve energy efficiency by maximally exploiting the tolerable loss of accuracy and trading it for energy and area savings.
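As a purely illustrative software sketch (not part of the call itself), the hypothetical Python model below shows the kind of accuracy-for-savings trade-off that approximate arithmetic exploits, here by ignoring the low-order bits of an addition; in hardware, dropping the low-order carry chain shortens the critical path and reduces energy.

    def approximate_add(a, b, truncated_bits=4):
        # Ignore the lowest `truncated_bits` bits of both operands, mimicking an
        # approximate adder that omits the low-order carry chain.
        mask = ~((1 << truncated_bits) - 1)
        return (a & mask) + (b & mask)

    exact = 1234 + 5678
    approx = approximate_add(1234, 5678)
    # The absolute error is bounded by 2 * (2**truncated_bits - 1), i.e. 30 here.
    print(exact, approx, exact - approx)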

Stochastic computing is a paradigm that achieves fault tolerance and area savings through randomness. Information is represented by random binary bit streams, where the signal value is encoded by the probability of obtaining a one versus a zero. The approach is applicable to data-intensive applications such as signal processing, where small fluctuations can be tolerated but large errors are catastrophic. In such contexts, it offers savings in computational resources and provides tolerance to errors, and this fault tolerance scales gracefully to high error rates. The focus of this special issue will be on the novel design and analysis of approximate and stochastic computing circuits, systems, algorithms, and applications.
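For readers unfamiliar with the encoding, here is a minimal illustrative sketch in Python (assuming a unipolar encoding; the function names are hypothetical): each value in [0, 1] becomes a bit stream whose fraction of ones equals the value, and multiplying two independent streams reduces to a bitwise AND.

    import random

    def to_stochastic(p, n, rng):
        # Unipolar encoding: each bit is 1 with probability p.
        return [1 if rng.random() < p else 0 for _ in range(n)]

    def from_stochastic(bits):
        # Decode by counting the fraction of ones in the stream.
        return sum(bits) / len(bits)

    rng = random.Random(42)
    n = 10_000
    stream_a = to_stochastic(0.8, n, rng)
    stream_b = to_stochastic(0.5, n, rng)

    # Multiplying two independent unipolar streams is a bitwise AND,
    # since P(a AND b) = P(a) * P(b).
    product = [x & y for x, y in zip(stream_a, stream_b)]
    print(from_stochastic(product))  # close to 0.4; shorter streams lose accuracy gracefully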

Special Issue/Section on Low-Power Image Recognition

Submission deadline: September 1, 2015. View PDF.

Digital images have become an integral part of everyday life. It is estimated that 10 million images are uploaded to social networks each hour and 100 hours of video are uploaded for sharing each minute. Sophisticated image and video processing has fundamentally changed how people interact; for example, automatic classification or tagging can mediate how photographs are disseminated to friends. Many of today's images are captured using smartphones, and the cameras in smartphones can be used for a wide range of imaging applications, from high-fidelity location estimation to posture analysis. Image processing is computationally intensive and can consume significant amounts of energy on mobile systems. This special issue focuses on the intersection of image recognition and energy conservation. Papers should describe energy-efficient systems that perform object detection and recognition in images.

Special Issue/Section on New Paradigms in Ad Hoc, Sensor and Mesh Networks: From Theory to Practice

Submission deadline: December 1, 2015. View PDF.

Ad hoc, sensor, and mesh networks have attracted significant attention from academia and industry in the past decade. In recent years, however, new paradigms have emerged due to the large increase in the number and processing power of smartphones and other portable devices. Furthermore, new applications and emerging technologies have created new research challenges for ad hoc networks. The emergence of new operational paradigms such as the Smart Home and Smart City, Body Area Networks and E-Health, Device-to-Device Communications, Machine-to-Machine Communications, Software Defined Networks, the Internet of Things, RFID, and Small Cells requires substantial changes in traditional ad hoc networking. The focus of this special issue is on novel applications, protocols and architectures, non-traditional measurement, modeling, analysis and evaluation, prototype systems, and experiments in ad hoc, sensor, and mesh networks.

General Call for Papers: IEEE Transactions on Emerging Topics in Computing

Submit your manuscript at www.computer.org/tetc. TETC aggressively seeks proposals for Special Sections and Issues focusing on emerging topics. TETC is an open access journal, which allows for wider dissemination of information. Prospective Guest Editors should contact the TETC EIC Fabrizio Lombardi at lombardi@ece.neu.edu for further details.

View complete call for papers.


Access recently published TETC Articles

Subscribe to the RSS feed of the latest TETC content added to the Digital Library.

Sign up for the Transactions Connection newsletter.



IEEE Transactions on Emerging Topics in Computing (TETC) is now accepting manuscript submissions. To submit your manuscript, please use the ScholarOne Manuscripts submission site.