Khan, Zardad and Gul, Naz and Faiz, Nosheen and Gul, Asma and Adler, Werner and Lausen, Berthold (2021) Optimal trees selection for classification via out-of-bag assessment and sub-bagging. IEEE Access, 9. pp. 28591-28607. DOI https://doi.org/10.1109/ACCESS.2021.3055992
Abstract
The effect of training data size on machine learning methods has been well investigated over the past two decades. The predictive performance of tree-based machine learning methods, in general, improves at a decreasing rate as the size of training data increases. We investigate this in the optimal trees ensemble (OTE), where the method fails to learn from some of the training observations due to internal validation. Modified tree selection methods are thus proposed for OTE to cater for the loss of training observations in internal validation. In the first method, the corresponding out-of-bag (OOB) observations are used in both the individual and collective performance assessment of each tree. Trees are ranked based on their individual performance on the OOB observations. A certain number of top-ranked trees is selected, and, starting from the most accurate tree, subsequent trees are added one by one; the impact of each is recorded using the OOB observations left out of the bootstrap sample taken for the tree being added. A tree is selected if it improves the predictive accuracy of the ensemble. In the second approach, trees are grown on random subsets of the training data taken without replacement (known as sub-bagging) instead of bootstrap samples (taken with replacement). The remaining observations from each sample are used in both the individual and collective assessments of the corresponding tree, as in the first method. Analyses of 21 benchmark datasets and simulation studies show improved performance of the modified methods in comparison to OTE and other state-of-the-art methods.
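The following is a minimal sketch of the two selection schemes described in the abstract, assuming scikit-learn decision trees and integer-coded class labels. The function names `select_trees` and `majority_accuracy` and the parameters `n_candidates` and `subbag_frac` are illustrative choices, not the authors' implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def majority_accuracy(trees, members, X_val, y_val):
    """Accuracy of a majority vote over the listed member trees.

    Assumes integer-coded class labels (needed for np.bincount)."""
    votes = np.stack([trees[m].predict(X_val) for m in members])
    maj = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
    return np.mean(maj == y_val)

def select_trees(X, y, n_trees=500, n_candidates=100,
                 subbag=False, subbag_frac=0.7, seed=0):
    # Illustrative sketch of the paper's selection idea, not the
    # authors' code; parameter names and defaults are assumptions.
    rng = np.random.default_rng(seed)
    n = len(y)
    trees, left_out_masks, scores = [], [], []

    # Grow each tree on its own sample; the observations left out of
    # the sample (OOB for bootstrapping, the remainder for sub-bagging)
    # are used to assess that tree individually.
    for _ in range(n_trees):
        if subbag:  # second method: sample WITHOUT replacement
            idx = rng.choice(n, size=int(subbag_frac * n), replace=False)
        else:       # first method: ordinary bootstrap (WITH replacement)
            idx = rng.integers(0, n, size=n)
        left_out = np.ones(n, dtype=bool)
        left_out[idx] = False
        tree = DecisionTreeClassifier(random_state=int(rng.integers(2**31 - 1)))
        tree.fit(X[idx], y[idx])
        trees.append(tree)
        left_out_masks.append(left_out)
        scores.append(tree.score(X[left_out], y[left_out])
                      if left_out.any() else 0.0)

    # Rank trees by individual accuracy on the left-out observations
    # and keep only the top candidates.
    order = np.argsort(scores)[::-1][:n_candidates]

    # Starting from the most accurate tree, add candidates one by one;
    # a tree is kept only if it improves the ensemble's majority-vote
    # accuracy on the left-out observations of the tree being added.
    selected = [order[0]]
    for k in order[1:]:
        mask = left_out_masks[k]
        if not mask.any():
            continue
        X_val, y_val = X[mask], y[mask]
        if (majority_accuracy(trees, selected + [k], X_val, y_val)
                > majority_accuracy(trees, selected, X_val, y_val)):
            selected.append(k)
    return [trees[i] for i in selected]
```

With `subbag=True` the same selection procedure runs on sub-bagged trees, where the observations not drawn into each subset play the role of the OOB sample, matching the second approach in the abstract.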
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | Tree selection, Classification, Ensemble learning, Out-of-bag sample, Random forest, Sub-bagging |
| Divisions: | Faculty of Science and Health; Faculty of Science and Health > Mathematics, Statistics and Actuarial Science, School of |
| Date Deposited: | 29 Jan 2021 08:44 |
| Last Modified: | 30 Oct 2024 17:16 |
| URI: | http://repository.essex.ac.uk/id/eprint/29665 |
Available files
Filename: 09343298.pdf
Licence: Creative Commons Attribution 3.0