Developing a Hybrid Intelligent Classifier by using Evolutionary Learning (Genetic Algorithm and Decision Tree)
Objective: To develop a hybrid classifier that combines a genetic algorithm with a decision tree through evolutionary learning. Methods: The proposed algorithm was tested on eight data sets and implemented in MATLAB. Standardized data sets were used in all experiments, which makes ensemble construction with a genetic algorithm particularly suitable. Results: A subspace-learning technique is proposed. We compared a series of different methods along with updated ensemble-combination schemes. The comparison showed that when the number of samples or the number of features is large, the proposed hybrid classification approach driven by a genetic algorithm performs best among the methods tested. Conclusion: The proposed approach was applied to classification in noisy (error-prone) environments. A clear gain in accuracy is observed on both the test and validation data, and this gain is measured against ensemble classifiers whose accuracy remains stable.
Keywords: Combination of Genetic Algorithm and Decision Tree, Consensus of Classifiers.
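The abstract describes using a genetic algorithm to build decision-tree classifiers over feature subspaces. The paper itself was implemented in MATLAB and its exact algorithm is not given here, so the following is only a minimal illustrative sketch in Python, under assumed details: a GA evolves binary feature-subset chromosomes, and each chromosome's fitness is the accuracy of a simple one-level decision tree (stump) restricted to the selected features, with a small penalty for large subsets. All names (`make_data`, `stump_accuracy`, `ga_select`) and parameters are hypothetical.

```python
import random

random.seed(0)

# Hypothetical toy data: 6 features, only features 0 and 3 carry signal.
def make_data(n=200):
    X, y = [], []
    for _ in range(n):
        row = [random.random() for _ in range(6)]
        y.append(1 if (row[0] + row[3]) > 1.0 else 0)
        X.append(row)
    return X, y

def stump_accuracy(X, y, feats):
    """Accuracy of the best single-feature threshold rule over `feats`."""
    if not feats:
        return 0.0
    best = 0.0
    for f in feats:
        for thr in (0.3, 0.5, 0.7):
            acc = sum((x[f] > thr) == bool(t) for x, t in zip(X, y)) / len(y)
            best = max(best, acc, 1.0 - acc)  # allow flipped prediction
    return best

def ga_select(X, y, pop_size=20, gens=30, mut=0.1):
    """GA over feature-subset bitmasks; fitness = stump accuracy - size penalty."""
    n_feat = len(X[0])
    pop = [[random.randint(0, 1) for _ in range(n_feat)] for _ in range(pop_size)]

    def fitness(chrom):
        feats = [i for i, b in enumerate(chrom) if b]
        return stump_accuracy(X, y, feats) - 0.01 * len(feats)

    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 2]          # keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)   # one-point crossover of two elites
            cut = random.randrange(1, n_feat)
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < mut else g for g in child]
            children.append(child)
        pop = elite + children
    best = max(pop, key=fitness)
    return [i for i, b in enumerate(best) if b]

X, y = make_data()
selected = ga_select(X, y)
print("selected feature subset:", selected)
```

On this toy data the GA converges toward a compact subset containing at least one informative feature (0 or 3), since noise features cannot exceed chance accuracy. A full hybrid classifier in the paper's sense would evaluate complete decision trees rather than stumps; the stump keeps the sketch self-contained.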
This work is licensed under a Creative Commons Attribution 3.0 License.