Automated Detection of Dust Storms from Ground-Based Weather Station Imagery Using Neural Network Classification

Abstract

 

Dust storms are a severe weather phenomenon that can reduce visibility on roadways, creating hazardous driving conditions for commercial and private drivers. Efforts to reduce these hazards could include an early-warning dust forecast system, and one initial step toward such a system is processing footage from weather-station cameras to classify dust events. This project demonstrates a methodology for classifying dust storms in continuous (24/7) weather station video footage. Five cameras at the Animas Playa in western New Mexico, USA, recorded 1.8 TB of video over multiple years; the footage from 2015 captured 77 large dust storms. This footage was manually labeled and used to train, test, and validate a feed-forward neural network that autonomously classifies dust storms. Imagery in both the Red-Green-Blue (RGB) and Hue-Saturation-Value (HSV) color spaces was tested. Results indicate an accuracy of 96.87%, a recall of 99.83%, and a precision of 94.24% for dust storm classification using a combination of the hue, saturation, and value bands; only 112 of 3574 test frames were misclassified.
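
To make the described pipeline concrete, the following is a minimal sketch, not the authors' implementation. It assumes scikit-learn's MLPClassifier as the feed-forward network, matplotlib's rgb_to_hsv for the color-space conversion, and simple per-band HSV statistics as frame features; the frames, labels, hidden-layer size, and train/test split are illustrative placeholders, since the abstract does not specify them.

```python
"""Sketch: classify dust vs. clear frames from HSV band statistics
using a small feed-forward neural network (assumed scikit-learn MLP)."""
import numpy as np
from matplotlib.colors import rgb_to_hsv
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score

def frame_to_hsv_features(frame_rgb: np.ndarray) -> np.ndarray:
    """Reduce one RGB frame (H x W x 3, uint8) to per-band HSV statistics."""
    hsv = rgb_to_hsv(frame_rgb.astype(np.float64) / 255.0)  # H, S, V in [0, 1]
    pixels = hsv.reshape(-1, 3)
    # Mean and standard deviation of each HSV band as a compact feature vector.
    return np.concatenate([pixels.mean(axis=0), pixels.std(axis=0)])

# Placeholder data: random "frames" stand in for the manually labeled footage.
rng = np.random.default_rng(0)
frames = rng.integers(0, 256, size=(500, 120, 160, 3), dtype=np.uint8)
labels = rng.integers(0, 2, size=500)  # 1 = dust storm, 0 = clear

X = np.stack([frame_to_hsv_features(f) for f in frames])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.3, random_state=0)

# Small feed-forward network; one hidden layer chosen for illustration only.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
print("accuracy :", accuracy_score(y_test, pred))
print("recall   :", recall_score(y_test, pred))
print("precision:", precision_score(y_test, pred))
```

On real labeled frames, the three printed metrics correspond to the accuracy, recall, and precision figures reported above for the HSV-band classifier.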