Issue No. 07 - July (2011 vol. 22)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPDS.2010.159
Chun-Lung Lin , Tsing Hua University, Hsinchu
Chen-Lung Chan , Tsing Hua University, Hsinchu
Jia-Shung Wang , Tsing Hua University, Hsinchu
Lossy compression techniques are commonly used in long-term data-gathering applications that seek trends or other interesting patterns across an entire system, because a data packet need not always be transmitted to the sink completely and immediately. In these applications, a nonterminal sensor node jointly encodes its own sensed data and the data received from nearby nodes. Because such nodes tend to exhibit high spatial correlation, their data packets can be compressed together efficiently using a rate-distortion strategy. This paper addresses the optimal rate-distortion allocation problem: determining the bit rate of each sensor, subject to a target overall distortion, so as to minimize the network transmission cost. We propose an analytically optimal rate-distortion allocation scheme and extend it to a distributed version. Building on these allocation schemes, we also propose a greedy heuristic algorithm that constructs an efficient data transmission structure to further reduce the transmission cost. The proposed methods were evaluated in simulations with real-world data sets. The results indicate that the optimal allocation strategy reduces the transmission cost to 6 to 15 percent of that of the uniform allocation scheme.
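To illustrate the flavor of cost-aware rate-distortion allocation described above, the sketch below solves a simplified, textbook variant: minimize the weighted total rate (each sensor's rate weighted by a per-bit transmission cost, e.g. its hop count) subject to a target total distortion, using the Gaussian rate-distortion function and a water-filling argument. This is a hypothetical simplification for intuition only, not the paper's actual algorithm; the variances, weights, and function name are illustrative assumptions.

```python
import math

def allocate_rates(variances, weights, d_total):
    """Hypothetical weighted reverse water-filling sketch (NOT the paper's
    scheme): minimize sum_i w_i * R_i subject to sum_i D_i <= d_total,
    with the Gaussian rate-distortion function R_i = 0.5*log2(var_i / D_i).
    Setting the Lagrangian's derivative to zero gives D_i proportional to
    w_i, capped at var_i; the shared "water level" c is found by bisection
    so the distortions sum to the target."""
    # Upper bound on c: large enough that every sensor is capped at var_i.
    lo, hi = 0.0, max(v / w for v, w in zip(variances, weights)) + 1.0
    for _ in range(100):
        c = (lo + hi) / 2
        total = sum(min(c * w, v) for v, w in zip(variances, weights))
        if total < d_total:
            lo = c  # too little distortion spent; raise the water level
        else:
            hi = c  # distortion budget exceeded; lower the water level
    dist = [min(c * w, v) for v, w in zip(variances, weights)]
    # Sensors whose distortion hits their variance send nothing (rate 0).
    rates = [0.5 * math.log2(v / d) if d < v else 0.0
             for v, d in zip(variances, dist)]
    return rates, dist
```

Under this toy model, sensors with higher transmission cost are allotted more distortion (hence fewer bits), which mirrors the intuition that bits from distant nodes are the most expensive to carry to the sink.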
Index Terms: Sensor networks, compression, rate-distortion allocation, distributed applications, optimization, transform coding.
C.-L. Lin, C.-L. Chan, and J.-S. Wang, "Optimization of Rate Allocation with Distortion Guarantee in Sensor Networks," in IEEE Transactions on Parallel and Distributed Systems, vol. 22, no. 7, pp. 1230-1237, July 2011.