Khan, Zardad and Gul, Asma and Perperoglou, Aris and Miftahuddin, Miftahuddin and Mahmoud, Osama and Adler, Werner and Lausen, Berthold (2020) Ensemble of Optimal Trees, Random Forest and Random Projection Ensemble Classification. Advances in Data Analysis and Classification, 14 (1). pp. 97-116. DOI https://doi.org/10.1007/s11634-019-00364-9
Abstract
The predictive performance of a random forest ensemble is strongly associated with the strength of its individual trees and their diversity. An ensemble of a small number of accurate and diverse trees, provided prediction accuracy is not compromised, also reduces the computational burden. We investigate the idea of integrating trees that are accurate and diverse. For this purpose, we utilize out-of-bag observations as a validation sample from the training bootstrap samples to choose the best trees based on their individual performance, and then assess these trees for diversity using the Brier score on an independent validation sample. Starting from the first best tree, a tree is selected for the final ensemble if its addition to the forest reduces the error of the trees that have already been added. Unlike random projection ensemble classification, our approach does not use an implicit dimension reduction for each tree. A total of 35 benchmark classification and regression problems are used to assess the performance of the proposed method and to compare it with random forest, random projection ensemble, node harvest, support vector machine, kNN and classification and regression tree (CART). We compute unexplained variances or classification error rates for all the methods on the corresponding data sets. Our experiments reveal that the size of the ensemble is reduced significantly and better results are obtained in most cases. Results of a simulation study are also given, where four tree-style scenarios are considered to generate data sets with several structures.
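The selection procedure described in the abstract can be sketched roughly as follows: grow trees on bootstrap samples, rank them by out-of-bag performance, then greedily keep a candidate tree only if adding it lowers the ensemble's Brier score on an independent validation sample. This is a minimal illustrative sketch for binary classification, not the authors' implementation; the function and parameter names (`select_optimal_trees`, `n_trees`, `k_best`) are assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def select_optimal_trees(X_train, y_train, X_val, y_val,
                         n_trees=100, k_best=30, rng=None):
    """Sketch of OOB-based ranking followed by greedy Brier-score selection."""
    rng = np.random.default_rng(rng)
    n = len(X_train)

    # Phase 1: grow trees on bootstrap samples, score each on its OOB rows.
    scored = []
    for _ in range(n_trees):
        idx = rng.integers(0, n, n)               # bootstrap sample (with replacement)
        oob = np.setdiff1d(np.arange(n), idx)     # out-of-bag observations
        tree = DecisionTreeClassifier().fit(X_train[idx], y_train[idx])
        acc = tree.score(X_train[oob], y_train[oob]) if len(oob) else 0.0
        scored.append((acc, tree))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    candidates = [tree for _, tree in scored[:k_best]]  # best individual trees

    # Phase 2: starting from the best tree, add a candidate only if it
    # reduces the ensemble Brier score on the independent validation sample.
    ensemble, probs_sum, best_brier = [], np.zeros(len(X_val)), np.inf
    for tree in candidates:
        proba = tree.predict_proba(X_val)
        # probability of class 1 (guard against single-class bootstrap trees)
        if 1 in tree.classes_:
            p1 = proba[:, list(tree.classes_).index(1)]
        else:
            p1 = np.zeros(len(X_val))
        trial = (probs_sum + p1) / (len(ensemble) + 1)  # averaged probabilities
        brier = np.mean((trial - y_val) ** 2)           # Brier score
        if brier < best_brier:
            ensemble.append(tree)
            probs_sum += p1
            best_brier = brier
    return ensemble, best_brier
```

The greedy check means the final ensemble is typically much smaller than the initial forest, matching the abstract's claim that ensemble size is reduced without compromising accuracy.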
Item Type: | Article |
---|---|
Uncontrolled Keywords: | Ensemble classification; Ensemble regression; Random forest; Random projection ensemble classification; Accuracy and diversity |
Subjects: | Q Science > QA Mathematics |
Divisions: | Faculty of Science and Health > Mathematics, Statistics and Actuarial Science, School of |
SWORD Depositor: | Unnamed user with email elements@essex.ac.uk |
Depositing User: | Unnamed user with email elements@essex.ac.uk |
Date Deposited: | 06 Jun 2019 14:15 |
Last Modified: | 30 Oct 2024 16:17 |
URI: | http://repository.essex.ac.uk/id/eprint/21533 |
Available files
Filename: Khan2019_Article_EnsembleOfOptimalTreesRandomFo.pdf
Licence: Creative Commons: Attribution 3.0