Abstract
Remote sensing has become a key tool for monitoring Earth’s surface over time, offering valuable insights into both natural and human-driven changes. Among its many applications, change detection analyzes multi-temporal imagery to reveal how specific areas evolve across different time periods. It plays a pivotal role in Earth observation applications, including urban development monitoring, environmental degradation assessment, and disaster response. However, existing approaches often struggle with limited contextual awareness, high sensitivity to noise, and imprecise localization of change boundaries, especially in high-resolution imagery. This thesis investigates the complex problem of change detection in remote sensing imagery by proposing two novel deep learning frameworks, the Transformer Change Detection Network (TrCD-Net) and the Triple Attention Network (TANet), each designed to capture nuanced transformations across multi-temporal datasets through a distinct modeling strategy. TrCD-Net uses a transformer encoder to model long-range spatial dependencies and ensure consistency across temporal inputs, followed by a unified decoder with a semantic integration module that promotes spatial precision and refined feature fusion. TANet adopts a hybrid architecture that combines convolutional operations with a multi-representational attention strategy, including specialized mechanisms in both the spatial and frequency domains. These modules enhance the model’s ability to prioritize discriminative features, suppress noise, and accurately localize changes. This research emphasizes architectural innovation that improves change detection performance and robustness to natural variations such as seasonal and illumination changes. To support this aim, the models are evaluated across diverse land cover types and sensing platforms to ensure generalization and reliability. Results show that both TrCD-Net and TANet outperform existing state-of-the-art methods, and visualizations confirm improved spatial precision and clarity. Comprehensive ablation studies validate the design choices, confirming that each component contributes meaningfully. The frameworks show strong potential for applications requiring accurate, fine-grained change detection. Future work may explore multi-modal data fusion and lightweight architectures for resource-constrained environments.
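For readers unfamiliar with the bi-temporal setup described above, the following is a minimal, hypothetical PyTorch sketch of a generic Siamese change-detection forward pass: a shared encoder is applied to the images from both acquisition dates, the features are fused and re-weighted by a simple attention gate, and a decoder produces a per-pixel change map. All class and layer names here are illustrative assumptions; this is not the TrCD-Net or TANet implementation from the dissertation.

```python
# Illustrative sketch only: a generic Siamese change-detection forward pass.
# The modules below are hypothetical stand-ins, NOT the dissertation's models.
import torch
import torch.nn as nn

class SiameseChangeDetector(nn.Module):
    """Shared-weight encoder applied to both dates; the decoder maps the
    fused bi-temporal features to a per-pixel change logit map."""
    def __init__(self, in_ch=3, feat_ch=64):
        super().__init__()
        # Shared convolutional encoder (a placeholder for the transformer or
        # hybrid CNN-attention encoders discussed in the abstract).
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Simple channel-attention gate (a placeholder for the spatial- and
        # frequency-domain attention mechanisms mentioned for TANet).
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * feat_ch, 2 * feat_ch, 1),
            nn.Sigmoid(),
        )
        # Decoder producing a single-channel change logit map.
        self.decoder = nn.Sequential(
            nn.Conv2d(2 * feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, 1, 1),
        )

    def forward(self, img_t1, img_t2):
        f1, f2 = self.encoder(img_t1), self.encoder(img_t2)  # shared weights
        fused = torch.cat([f1, f2], dim=1)                   # bi-temporal fusion
        fused = fused * self.attn(fused)                     # re-weight features
        return self.decoder(fused)                           # change logits

# Usage: two co-registered images of the same scene at different dates.
t1 = torch.randn(1, 3, 256, 256)
t2 = torch.randn(1, 3, 256, 256)
change_logits = SiameseChangeDetector()(t1, t2)
change_mask = torch.sigmoid(change_logits) > 0.5             # binary change map
```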
School
School of Sciences and Engineering
Department
Computer Science & Engineering Department
Degree Name
PhD in Applied Sciences
Graduation Date
Fall 2025
Submission Date
9-17-2025
First Advisor
Ahmed Rafea
Second Advisor
Seif Eldawlatly
Committee Member 1
Moustafa Youssef
Committee Member 2
Hossam Sharara
Extent
167 p.
Document Type
Doctoral Dissertation
Institutional Review Board (IRB) Approval
Not necessary for this item
Disclosure of AI Use
No use of AI
Recommended Citation
APA Citation
Badawy, H. (2025). Deep Learning-Based Change Detection in High-Resolution Remote Sensing Imagery [Doctoral dissertation, The American University in Cairo]. AUC Knowledge Fountain.
https://fount.aucegypt.edu/etds/2588
MLA Citation
Badawy, Hazem. Deep Learning-Based Change Detection in High-Resolution Remote Sensing Imagery. 2025. The American University in Cairo, Doctoral dissertation. AUC Knowledge Fountain.
https://fount.aucegypt.edu/etds/2588
