2010 14th Panhellenic Conference on Informatics (2010)
Sept. 10, 2010 to Sept. 12, 2010
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/PCI.2010.47
Hadoop is a fault-tolerant Java framework that supports data distribution and process parallelization on commodity hardware. Exploiting its scalability and the independence of task execution, we combined Hadoop with crawling techniques to implement several applications that handle large amounts of data. Our experiments show that Hadoop is a very useful and trustworthy tool for creating distributed programs that deliver better computational efficiency.
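The key property the abstract relies on is the independence of the crawling tasks: each unit of work can fetch and process its slice of data without coordinating with the others, and the results are aggregated afterwards. The sketch below illustrates that idea in plain Java using an `ExecutorService`; it is a hypothetical, single-machine analogue, not the authors' Hadoop code, and `fetchSize` is a placeholder for a real HTTP fetch.

```java
import java.util.*;
import java.util.concurrent.*;

// Hypothetical single-machine sketch of the task-independence idea the paper
// exploits: independent "map"-style fetch tasks, followed by an aggregation step.
public class ParallelCollector {
    // Placeholder for a real crawl; a live system would issue an HTTP request here.
    static int fetchSize(String url) {
        return url.length() * 10; // simulated "bytes downloaded"
    }

    public static long collect(List<String> urls, int workers) {
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        List<Future<Integer>> futures = new ArrayList<>();
        for (String url : urls) {
            // Each task touches only its own URL, so tasks run independently.
            futures.add(pool.submit(() -> fetchSize(url)));
        }
        long total = 0;
        try {
            for (Future<Integer> f : futures) total += f.get(); // aggregate ("reduce")
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
        return total;
    }

    public static void main(String[] args) {
        List<String> urls = Arrays.asList("http://a.example", "http://bb.example");
        System.out.println(collect(urls, 2));
    }
}
```

Because the tasks share no state, adding workers (or, in Hadoop, cluster nodes) scales the collection step without changing the aggregation logic.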
parallelization, distributed systems, Hadoop, commodity hardware, live data collection
K. Chalkias, K. Talattinis, G. Stephanides and A. Sidiropoulou, "Parallel Collection of Live Data Using Hadoop," 2010 14th Panhellenic Conference on Informatics (PCI), Tripoli, Greece, 2010, pp. 66-71.