Kumar Dubey, Nishant Pravin and Behera, Lalatendu and Kumar Rout, Ranjeet and Umer, Saiyed and Jain, Deepak Kumar and Andreu-Perez, Javier (2026) PlanetNet-MMG: A Robust Multi-Modal Graph-Based Deep Learning Model for Exoplanet Candidate Classification. Expert Systems With Applications. p. 132396. DOI https://doi.org/10.1016/j.eswa.2026.132396
Abstract
The search for exoplanets has advanced into the era of intelligent automation, yet most deep learning pipelines remain constrained to single-modality inputs or isolated views of astronomical data. We present PlanetNet-MMG, a novel multi-modal deep learning architecture that combines structured stellar metadata, raw lightcurve sequences, and graph-based relational context into a unified classification model. Our approach fuses three powerful encoders: a Tabular Transformer for domain-aware feature projection, a PatchGRU enhanced with a Vision Transformer (ViT) for learning fine-grained temporal patterns in segmented lightcurve patches, and a Graph PARE encoder that models inter-object similarity via a relational graph. Trained on a harmonized dataset derived from Kepler, TESS, and confirmed exoplanet archives, PlanetNet-MMG outperforms all state-of-the-art baselines, achieving a peak test accuracy of 90.4% and a class-averaged AUC of 0.973. Extensive experiments across 10–100 epochs and comparative evaluations against Astronet, ExoNet, OsbornNet, GCN (Lu), and ExoMiner confirm the effectiveness of our multimodal fusion. We further provide interpretability through attention overlays, t-SNE projections, and confidence histograms, reinforcing PlanetNet-MMG’s transparency and reliability for scientific discovery in astrophysics.
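The abstract describes a late-fusion design: three modality-specific encoders (tabular, lightcurve sequence, graph) each produce an embedding, and the embeddings are combined for classification. The sketch below is a minimal, hypothetical illustration of that fusion step only, using NumPy with made-up embedding dimensions and an untrained linear head; it is not the authors' implementation, whose encoder internals (Tabular Transformer, PatchGRU+ViT, Graph PARE) are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical embedding sizes for the three branches and the class count
# (e.g., confirmed planet / candidate / false positive) -- illustration only.
D_TAB, D_SEQ, D_GRAPH, N_CLASSES = 16, 32, 24, 3

def softmax(z):
    # Numerically stable softmax over class logits.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def fuse_and_classify(tab_emb, seq_emb, graph_emb, W, b):
    """Concatenate per-modality embeddings, then apply a linear classifier."""
    fused = np.concatenate([tab_emb, seq_emb, graph_emb])
    return softmax(W @ fused + b)

# Toy vectors standing in for the three encoder outputs.
tab_emb = rng.standard_normal(D_TAB)
seq_emb = rng.standard_normal(D_SEQ)
graph_emb = rng.standard_normal(D_GRAPH)

# Randomly initialized (untrained) classifier head.
W = 0.1 * rng.standard_normal((N_CLASSES, D_TAB + D_SEQ + D_GRAPH))
b = np.zeros(N_CLASSES)

probs = fuse_and_classify(tab_emb, seq_emb, graph_emb, W, b)
print(probs.shape)  # (3,) -- one probability per class, summing to 1
```

In practice each branch would be a trained network and the fusion could use attention rather than plain concatenation; this sketch only fixes the data flow the abstract outlines.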
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | Exoplanet Detection, Multi-Modal Learning, Tabular Transformer, PatchGRU, Vision Transformer, Graph Neural Network, Deep Learning, Lightcurve Analysis, Planetary Classification, Explainable AI |
| Subjects: | Z Bibliography. Library Science. Information Resources > ZR Rights Retention |
| Divisions: | Faculty of Science and Health > Computer Science and Electronic Engineering, School of |
| SWORD Depositor: | Unnamed user with email elements@essex.ac.uk |
| Depositing User: | Unnamed user with email elements@essex.ac.uk |
| Date Deposited: | 23 Apr 2026 11:20 |
| Last Modified: | 23 Apr 2026 11:20 |
| URI: | http://repository.essex.ac.uk/id/eprint/43150 |
Available files
Filename: accepted manuscript.pdf
Licence: Creative Commons: Attribution 4.0