Next-Day Wildfire Spread Prediction Using Deep Segmentation Networks
Predicting the next-day spread of wildfires from satellite and meteorological imagery is a binary pixel-wise segmentation task characterised by one of the most severe class-imbalance conditions in applied deep learning. In the Next-Day Wildfire Spread (NDWS) dataset, fire pixels constitute only 1.20% of all valid pixels — an 82:1 no-fire-to-fire ratio. We systematically compare seven deep segmentation architectures on this task: Baseline U-Net, VGG16-UNet, Attention U-Net, AdvancedFusionViT (SegFormer-based dual-branch), Mask2Former, DeepLabV3+, and a SAM-inspired hybrid encoder-decoder. All models are trained on 12 geophysical input channels at 64×64 resolution under identical conditions, with a three-stage imbalance mitigation strategy: 1:1 balanced patch sampling, Hybrid Tversky Loss (α=0.3, β=0.7), and strict invalid-pixel masking. On the test set, the Baseline U-Net achieves the best F1-score of 0.3967 with a high recall of 0.5845, while the SAM-inspired model achieves a validation F1 of 0.3546 at epoch 13 with an optimised threshold of 0.30.
Index Terms—wildfire spread prediction, semantic segmentation, class imbalance, Tversky loss, U-Net, SegFormer, SAM, remote sensing, NDWS
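The Tversky loss mentioned above generalises the Dice loss by weighting false positives and false negatives asymmetrically; with β > α, false negatives on the rare fire class are penalised more heavily, pushing the model toward higher recall. The following is a minimal NumPy sketch under the stated α=0.3, β=0.7 setting; the function name, the optional validity mask argument, and the ε smoothing constant are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def tversky_loss(y_true, y_pred, alpha=0.3, beta=0.7, eps=1e-7, valid_mask=None):
    """Tversky loss for binary segmentation.

    alpha weights false positives, beta weights false negatives;
    beta > alpha biases the optimum toward recall on the minority
    (fire) class. `valid_mask` illustrates strict invalid-pixel
    masking: loss is computed only over valid pixels.
    """
    y_true = np.asarray(y_true, dtype=np.float64).ravel()
    y_pred = np.asarray(y_pred, dtype=np.float64).ravel()
    if valid_mask is not None:
        valid_mask = np.asarray(valid_mask, dtype=bool).ravel()
        y_true, y_pred = y_true[valid_mask], y_pred[valid_mask]

    tp = np.sum(y_true * y_pred)          # true positives (soft)
    fp = np.sum((1.0 - y_true) * y_pred)  # false positives
    fn = np.sum(y_true * (1.0 - y_pred))  # false negatives

    tversky_index = (tp + eps) / (tp + alpha * fp + beta * fn + eps)
    return 1.0 - tversky_index
```

With α = β = 0.5 this reduces to the Dice loss; the β = 0.7 setting makes a missed fire pixel roughly 2.3× as costly as a spurious one, which matches the recall-oriented results the abstract reports.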