2015 3rd International Conference on Future Internet of Things and Cloud (FiCloud) (2015)
Aug. 24, 2015 to Aug. 26, 2015
Combining mobile computing and cloud computing has recently opened the door to numerous applications that were not possible before due to the limited capabilities of mobile devices. Computation-intensive applications are offloaded to the cloud, saving the phone's energy and extending its battery life. However, energy savings are influenced by the wireless network conditions. In this paper, we propose considering contextual network conditions when deciding whether or not to offload to the cloud. An energy model is proposed to predict the energy consumed in offloading data under the current network conditions. Based on this prediction, a decision is taken whether to offload, to execute the application locally, or to delay offloading until an improvement in network conditions is detected. We evaluated our approach by extending ThinkAir, an existing computation offloading framework, with our proposed energy model and delayed offloading algorithm. Experimental results showed considerable energy savings, averaging 57% of the energy consumed by the application compared with the original static decision module implemented by ThinkAir.
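The three-way decision described in the abstract (offload, execute locally, or delay until the network improves) can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the transmit-power value, the expected "improved" bitrate, and all function names are assumptions, and the energy model is reduced to transmission energy at a measured bitrate.

```python
# Illustrative sketch (assumed names and constants, not the paper's code) of an
# energy-aware offloading decision with a delay option.

def offload_energy_j(data_bytes: float, bitrate_bps: float,
                     tx_power_w: float = 1.0) -> float:
    """Estimated energy (J) to transmit the data at the given bitrate:
    transmit power multiplied by transmission time."""
    return tx_power_w * (data_bytes * 8.0 / bitrate_bps)

def decide(data_bytes: float, current_bitrate_bps: float,
           local_energy_j: float,
           expected_good_bitrate_bps: float = 54e6) -> str:
    """Return 'offload', 'local', or 'delay'.

    - Offload if transmitting now is cheaper than local execution.
    - Otherwise, delay if transmitting under improved (expected) network
      conditions would be cheaper than local execution.
    - Otherwise, execute locally.
    """
    e_now = offload_energy_j(data_bytes, current_bitrate_bps)
    if e_now < local_energy_j:
        return "offload"
    e_improved = offload_energy_j(data_bytes, expected_good_bitrate_bps)
    if e_improved < local_energy_j:
        return "delay"
    return "local"
```

For example, with 1 MB of data and a local-execution cost of 1 J, a 54 Mb/s link yields an estimated transmission cost of about 0.15 J, so the sketch offloads immediately; on a 100 kb/s link the same transfer would cost about 80 J, so it delays instead.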
Energy consumption, Bit rate, Servers, Mathematical model, Mobile communication, IEEE 802.11 Standard, Delays
M. Akram and A. ElNahas, "Energy-Aware Offloading Technique for Mobile Cloud Computing," 2015 3rd International Conference on Future Internet of Things and Cloud (FiCloud), Rome, Italy, 2015, pp. 349-356.