Research Article | Open Access
Volume 10 | Number 1 | Year 2014 | Article Id. IJCTT-V10P107 | DOI : https://doi.org/10.14445/22312803/IJCTT-V10P107
Ensemble Classifiers and Their Applications: A Review
Akhlaqur Rahman, Sumaira Tasnim
Citation:
Akhlaqur Rahman, Sumaira Tasnim, "Ensemble Classifiers and Their Applications: A Review," International Journal of Computer Trends and Technology (IJCTT), vol. 10, no. 1, pp. 31-35, 2014. Crossref, https://doi.org/10.14445/22312803/IJCTT-V10P107
Abstract
An ensemble classifier is a group of individual classifiers that are cooperatively trained on a data set for a supervised classification problem. In this paper we review ensemble classifiers commonly used in the literature. Some ensemble classifiers are developed to target specific applications, and we also present a selection of these application-driven ensemble classifiers.
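For readers new to the idea, the following minimal sketch (ours, not part of the paper) illustrates the simplest form of such an ensemble: a few heterogeneous base classifiers trained on the same data set and combined by majority vote. The choice of scikit-learn and the Iris data set here is an assumption made purely for illustration.

from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Toy data set, split into training and test portions (illustrative only).
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Three individual classifiers trained on the same training set;
# "hard" voting combines their predictions by majority vote.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("tree", DecisionTreeClassifier(random_state=0)),
    ],
    voting="hard",
)
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", ensemble.score(X_test, y_test))

Schemes such as bagging, boosting, random subspace, and cluster-based ensembles differ mainly in how the individual classifiers are trained and in how their outputs are combined.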
Keywords
Ensemble classifier, Multiple classifier systems, Mixture of experts.