2015 International Conference on Big Data and Smart Computing (BigComp) (2015)
Jeju, South Korea
Feb. 9, 2015 to Feb. 11, 2015
ISBN: 978-1-4799-7303-3
pp: 27-30
Young-Seob Jeong , Department of Computer Science, KAIST, 291 Daehak-ro, Yuseong-gu, Daejeon 305-701, South Korea
GiRyong Choi , Department of Computer Science, KAIST, 291 Daehak-ro, Yuseong-gu, Daejeon 305-701, South Korea
Ho-Jin Choi , Department of Computer Science, KAIST, 291 Daehak-ro, Yuseong-gu, Daejeon 305-701, South Korea
Youssef Iraqi , Khalifa University, Sharjah, United Arab Emirates
ABSTRACT
Due to the growing number of unlabeled documents, it is becoming important to develop unsupervised methods capable of automatically extracting information. Topic models and neural networks are two such methods, and because their parameters cannot be computed exactly, parameter approximation algorithms are typically employed to estimate them. A well-known weakness of these approximation algorithms is that they do not find the global optimum but instead converge to one of many local optima. It is also known that the initialization of the parameters affects the outcome of the parameter approximation process. In this paper, we hypothesize that the order in which data classes are presented is another factor that affects the parameter approximation results. Through digit recognition experiments on the MNIST dataset, we show that this hypothesis holds and argue that fully shuffled data should always be used to avoid drawing incorrect conclusions.
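The effect described in the abstract can be illustrated with a small experiment of the following kind. The sketch below is not the authors' code: it assumes scikit-learn is available, uses the bundled 8x8 digits dataset as a lightweight stand-in for MNIST, and trains the same small neural network twice, once on class-ordered data and once on fully shuffled data, so the two presentation orders can be compared.

```python
# Illustrative sketch (not the paper's experimental setup): compare a small
# neural network trained on class-ordered data versus fully shuffled data.
# The 8x8 digits dataset stands in for MNIST to keep the example lightweight.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

def accuracy_with_ordering(order):
    # Train on the examples in the given presentation order; shuffle=False
    # ensures the network sees them exactly in that order every epoch.
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=30,
                        shuffle=False, random_state=0)
    clf.fit(X_train[order], y_train[order])
    return clf.score(X_test, y_test)

# Class-ordered presentation: all 0s first, then all 1s, ..., then all 9s.
class_order = np.argsort(y_train, kind="stable")

# Fully shuffled presentation of the same training examples.
rng = np.random.default_rng(0)
shuffled_order = rng.permutation(len(y_train))

print("class-ordered accuracy:", accuracy_with_ordering(class_order))
print("fully shuffled accuracy:", accuracy_with_ordering(shuffled_order))
```

Under these assumptions, the class-ordered run would typically settle at a worse local optimum than the shuffled run, which is consistent with the paper's recommendation to use fully shuffled data.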
INDEX TERMS
Approximation methods, Indexes, Approximation algorithms, Support vector machines, Accuracy, Neural networks, Training
CITATION

Y.-S. Jeong, G. Choi, H.-J. Choi and Y. Iraqi, "Relationship between class order and parameter approximation in unsupervised learning," 2015 International Conference on Big Data and Smart Computing (BigComp), Jeju, South Korea, 2015, pp. 27-30.
doi: 10.1109/35021BIGCOMP.2015.7072844