2016 International Conference on Advanced Cloud and Big Data (2016)
Chengdu, Sichuan, China
Aug. 13, 2016 to Aug. 16, 2016
ISBN: 978-1-5090-3677-6
pp: 334
ABSTRACT
After a disaster such as an earthquake, debris flow, forest fire, or landslide, many people must leave their homes and gather in shelters. These evacuees also suffer from shortages of essential resources because lifeline infrastructure, such as roads and communication networks, is damaged. How much the damage can be mitigated depends on meeting each shelter's demand for food, water, daily necessities, and communication resources, so allocating resources effectively and efficiently based on an accurate grasp of the disaster situation is an important issue. We estimate the severity of a disaster by collecting and analyzing big data from social networking services (SNS), and we build a platform through which communication resources can be allocated efficiently and effectively. To achieve this goal, we address the following issues.

A) Understanding situations (user requirements) after a disaster occurs. SNS streams large-scale semantic information about real-time conditions in society, especially during and after a disaster. Processing this heterogeneous, large data set to extract precise situational content with reduced semantic uncertainty is both a domain-specific and a computational challenge. Machine learning (ML) and natural language processing (NLP) toolkits are useful for semantic analysis, but they still require domain-specific implementation and computational improvement before situations can be understood from SNS big data (a minimal classification sketch follows the abstract).

B) Understanding the distribution patterns of situations and users' requirements. Disaster-related situations are spatiotemporally correlated and vary dynamically in space and time. Estimating the spatiotemporal distribution of disaster effects from spatial big data on SNS is likewise a domain-specific and computational challenge. Scan statistics such as the spatial scan provide well-tested mathematical tools and software for spatial data mining (a standard formulation is sketched after the abstract), but new methodologies are needed because those assumptions do not hold for spatial big data from SNS, and the computational complexity of spatial big data is a bottleneck for real-time processing.

C) Resolving the uncertainty of big crowd data. A major feature of big crowd data such as SNS data is the uncertainty behind the data. In a disaster scenario in particular, the collection period cannot be long enough to smooth the data automatically. Efficiently resolving this uncertainty in big crowd data during a disaster is a new and significant challenge for disaster management.
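
For item A), the abstract notes that ML and NLP toolkits help but need domain-specific implementation. The following is a minimal sketch of such a situational classifier using scikit-learn; the labels, training posts, and pipeline choices are illustrative assumptions, not the authors' actual method.

```python
# Minimal sketch (hypothetical, not the authors' pipeline): classify SNS posts
# into resource-need categories so shelters' requirements can be aggregated.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled examples: post text -> needed resource category.
train_posts = [
    "We ran out of drinking water at the school shelter",
    "Road to the shelter is blocked, need heavy equipment",
    "Mobile network is down, cannot reach family",
    "Beautiful sunset today",
]
train_labels = ["water", "road", "communication", "irrelevant"]

# TF-IDF features plus logistic regression as a simple situational classifier.
clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
clf.fit(train_posts, train_labels)

# Predict the category of a new, unlabelled post.
print(clf.predict(["No signal at the gym shelter, chargers needed"]))
```

In practice such a model would be trained on a much larger, domain-specific corpus; the point of the sketch is only the shape of the pipeline (text features, then a supervised classifier over situational categories).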
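
For item B), the abstract names scan statistics such as the spatial scan as the starting point. As background, a standard Kulldorff-style Poisson formulation is sketched below; the paper's variant for SNS big data may differ, and the symbols are introduced here only for illustration.

```latex
% Background sketch of the Poisson spatial scan statistic
% (standard formulation, not necessarily the paper's variant).
% Z     : a candidate spatial (or space-time) window
% c_Z   : observed count of geo-tagged SNS posts inside Z
% \mu_Z : expected count inside Z under the null hypothesis
% C     : total observed count over the study region
\[
\mathrm{LR}(Z) =
\left(\frac{c_Z}{\mu_Z}\right)^{c_Z}
\left(\frac{C - c_Z}{C - \mu_Z}\right)^{C - c_Z}
\mathbf{1}\!\left[\frac{c_Z}{\mu_Z} > \frac{C - c_Z}{C - \mu_Z}\right],
\qquad
\Lambda = \max_{Z} \mathrm{LR}(Z)
\]
```

The window $Z^{*}$ maximizing $\mathrm{LR}(Z)$ is the most likely cluster, and its significance is usually assessed by Monte Carlo replication under the null; that replication step is one source of the computational cost the abstract identifies as a bottleneck for real-time processing.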
INDEX TERMS
Big data, Uncertainty, Spatial databases, Semantics, Real-time systems, Data mining, Spatiotemporal phenomena
CITATION

Z. Cheng, J. Wang, N. Yen and Y. Wu, "Allocation of Resources after Disaster Based on Big Data from SNS and Spatial Scan," 2016 International Conference on Advanced Cloud and Big Data (CBD), Chengdu, Sichuan, China, 2016, pp. 334.
doi:10.1109/CBD.2016.066