Bulut, Faruk and Dönmez, İknur (2026) Meta‐Learning Analysis of Deep Neural Network Architectures on Diverse Numeric Datasets via Geometric Complexity Descriptors. International Journal of Intelligent Systems, 2026 (1). DOI https://doi.org/10.1155/int/8573962
Abstract
Meta-learning techniques aim to predict the most suitable learning algorithm for a given dataset based on its intrinsic structural characteristics. These techniques provide a robust framework for understanding algorithmic behavior across diverse data distributions and attributes. Although state-of-the-art deep learning models such as convolutional neural networks (CNNs) and transformers are widely applied in various machine learning tasks, their use on numerical datasets remains underexplored due to the complexity of their internal structures. In particular, it remains unclear which specific attributes of a dataset positively or negatively affect the performance of these models. This study therefore aims not only to predict the performance of two black-box deep learning models on static datasets but also to conduct a behavioral analysis that identifies which meta-features most strongly influence their outcomes. To bridge this gap, we constructed a meta-dataset of 296 datasets, each characterized by 20 meta-features describing its statistical, geometric, and structural properties. The analysis identifies which intrinsic dataset properties influence model accuracy, without relying on raw data or hyperparameter tuning. Results show that both models perform best on datasets with high feature discriminability, as captured by meta-features such as maximum feature efficiency, collective feature efficiency, and directional separability. In contrast, performance declines with increasing class boundary complexity and nonlinearity, reflected in features such as class separability measures and the linear classifier nonlinearity metric. While CNNs are more sensitive to local geometric complexity, transformers respond more strongly to global statistical measures such as mutual information and entropy, highlighting their distinct inductive biases. The proposed meta-model accurately predicts the performance of both architectures on unseen datasets (correlation coefficient 0.96, MAE 0.019, and RMSE 0.025 for CNNs; correlation coefficient 0.92, MAE 0.027, and RMSE 0.036 for transformers), enabling performance estimation without costly training. These findings emphasize the importance of aligning model architecture with dataset geometry and structure. Additionally, the framework supports more interpretable, efficient, and sustainable deep learning model selection in structured data settings.
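The sketch below is not the authors' code; it only illustrates the kind of meta-learning pipeline the abstract describes, under simplifying assumptions: a small scikit-learn MLP stands in for the CNN/transformer base learners, synthetic `make_classification` datasets stand in for the 296 real datasets, and two illustrative Fisher-ratio descriptors stand in for the paper's 20 geometric complexity meta-features. A meta-regressor is then fit to predict base-learner accuracy from the meta-features and is scored with the same metrics reported in the abstract (correlation, MAE, RMSE).

```python
# Hypothetical sketch of a meta-learning accuracy-prediction pipeline (not the paper's code).
import numpy as np
from scipy.stats import pearsonr
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def meta_features(X, y):
    """Two illustrative complexity descriptors (assumed, not the paper's exact formulas)."""
    ratios = []
    for j in range(X.shape[1]):
        mu0, mu1 = X[y == 0, j].mean(), X[y == 1, j].mean()
        v0, v1 = X[y == 0, j].var(), X[y == 1, j].var()
        ratios.append((mu0 - mu1) ** 2 / (v0 + v1 + 1e-12))  # per-feature Fisher ratio
    # "max" and "mean" discriminability as proxies for maximum / collective feature efficiency
    return np.array([max(ratios), np.mean(ratios)])

rng = np.random.default_rng(0)
meta_X, meta_y = [], []
for _ in range(60):  # 60 synthetic datasets stand in for the paper's 296 real ones
    X, y = make_classification(n_samples=400, n_features=10,
                               n_informative=int(rng.integers(2, 8)),
                               flip_y=float(rng.uniform(0.0, 0.2)),
                               random_state=int(rng.integers(1_000_000)))
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
    # Stand-in base learner; the paper trains CNNs and transformers instead.
    base = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0).fit(Xtr, ytr)
    meta_X.append(meta_features(X, y))
    meta_y.append(base.score(Xte, yte))  # accuracy the meta-model will try to predict

meta_X, meta_y = np.array(meta_X), np.array(meta_y)
mXtr, mXte, mytr, myte = train_test_split(meta_X, meta_y, test_size=0.25, random_state=0)
meta_model = RandomForestRegressor(random_state=0).fit(mXtr, mytr)
pred = meta_model.predict(mXte)

print("corr:", pearsonr(myte, pred)[0])
print("MAE :", mean_absolute_error(myte, pred))
print("RMSE:", np.sqrt(mean_squared_error(myte, pred)))
```

Because the meta-model sees only per-dataset descriptors, predicting accuracy for a new dataset requires computing its meta-features once rather than training the base learner, which is the cost saving the abstract refers to.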
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | accuracy prediction; CNN; complexity measures; dataset geometry; meta-attributes; model selection; transformer |
| Divisions: | Faculty of Science and Health; Faculty of Science and Health > Computer Science and Electronic Engineering, School of |
| SWORD Depositor: | Unnamed user with email elements@essex.ac.uk |
| Depositing User: | Unnamed user with email elements@essex.ac.uk |
| Date Deposited: | 29 Apr 2026 14:18 |
| Last Modified: | 29 Apr 2026 14:18 |
| URI: | http://repository.essex.ac.uk/id/eprint/43195 |
Available files
Filename: Bulut Meta Learning Analysis of Deep Neural Network Architectures.pdf
Licence: Creative Commons: Attribution 4.0