ESTIMATION OF FOREST BURNT AREA USING DEEP LEARNING FRAMEWORK ON OPTICAL AND SAR REMOTE SENSING IMAGES


Harisha S, Suresha D

Abstract

Wildfire monitoring and burned-area detection are critical to environmental management and disaster response. This work presents DARU-Net (Dual-input Attention Residual U-Net), a deep learning model for the automatic identification of burned areas from multi-satellite imagery. The method integrates Sentinel-1 SAR images and Sentinel-2 optical images through a dual-input architecture that exploits the complementary strengths of both sensors. DARU-Net employs a U-Net backbone with residual connections and attention mechanisms to improve feature extraction and spatial precision. The model is trained in parallel on single-channel Sentinel-1 SAR data and twelve-channel multispectral Sentinel-2 data, enabling reliable detection under diverse atmospheric conditions. A complete preprocessing pipeline prepares the data, and an adaptive thresholding strategy refines the classification output. The system achieves 91.81% accuracy and an F1-score of 0.9187 on validation sets, outperforming single-sensor approaches. A web-based interface supports real-time analysis and provides automatic burned-area measurements in square kilometers. Experimental results confirm that fusing the two sensors substantially improves detection reliability, especially under cloud cover or smoke interference. DARU-Net thus offers a field-ready solution for rapid burned-area measurement, helping emergency response agencies and environmental agencies make informed, timely decisions. By leveraging complementary sensor data, such agencies can respond more effectively to emerging wildfire threats, improving land assessment accuracy and situational awareness, and ultimately helping to save lives and protect ecosystems.
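The final stage described above (adaptive thresholding of the model's output followed by area measurement in square kilometers) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the iterative intermeans (Ridler-Calvard-style) thresholding rule, and the assumed 10 m Sentinel-2 pixel size are all assumptions made for the example.

```python
import numpy as np

def burned_area_km2(prob_map, threshold=None, pixel_size_m=10.0):
    """Convert a per-pixel burn-probability map into an area estimate.

    prob_map     : 2-D array of burn probabilities in [0, 1].
    threshold    : fixed cut-off; if None, a simple adaptive threshold
                   is found by iterating the midpoint between the mean
                   probabilities of the two classes (an assumption --
                   the paper's exact adaptive strategy is not given).
    pixel_size_m : ground sampling distance (10 m assumed here, as for
                   Sentinel-2 visible/NIR bands).
    """
    p = np.asarray(prob_map, dtype=float)
    if threshold is None:
        # Iterative intermeans thresholding: start at the global mean,
        # then move to the midpoint of the two class means until stable.
        t = p.mean()
        for _ in range(100):
            lo, hi = p[p <= t], p[p > t]
            if lo.size == 0 or hi.size == 0:
                break
            new_t = 0.5 * (lo.mean() + hi.mean())
            if abs(new_t - t) < 1e-6:
                break
            t = new_t
        threshold = t
    burned_pixels = int((p > threshold).sum())
    # Each pixel covers pixel_size_m**2 square metres; 1 km^2 = 1e6 m^2.
    return burned_pixels * pixel_size_m**2 / 1e6

# Toy example: a 100 x 100 probability map with a 40 x 40 burned patch.
prob = np.full((100, 100), 0.05)
prob[10:50, 10:50] = 0.95
print(burned_area_km2(prob))  # 1600 pixels x 100 m^2 each -> 0.16
```

Separating thresholding from area conversion keeps the cut-off strategy swappable (a fixed 0.5, Otsu's method, or any learned rule) without touching the measurement code.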
