Transfer Learning with Convolutional Neural Networks for Rainfall Detection in Single Images
Near real-time rainfall monitoring at the local scale is essential for urban flood risk mitigation. Previous research on the visual effects of precipitation supports the idea of vision-based rain sensors but tends to be device-specific. We aimed to exploit a variety of commonly available imaging devices to develop a dense network of low-cost sensors. Using transfer learning with a convolutional neural network, rainfall detection was performed on single images taken under heterogeneous conditions by static or moving cameras without adjusted parameters. The chosen images encompass the unconstrained, realistic settings of their sources: the Image2Weather dataset, dash-cams in the Tokyo metropolitan area, and experiments in the NIED Large-scale Rainfall Simulator. The model reached a test accuracy of 85.28% and an F1 score of 0.86. Applicability to real-world scenarios was demonstrated in an experiment with a pre-existing surveillance camera in Matera (Italy), obtaining an accuracy of 85.13% and an F1 score of 0.85. The model can be easily integrated into warning systems to automatically monitor the onset and end of rain-related events, exploiting pre-existing devices with parsimonious use of economic and computational resources. The limitation is intrinsic to the output (detection without measurement). Future work concerns developing a CNN based on the proposed methodology to quantify precipitation intensity.
Nicla Maria Notarangelo (author) / Kohin Hirano (author) / Raffaele Albano (author) / Aurelia Sole (author)
2021
Journal article
Electronic resource
Unknown
Metadata by DOAJ is licensed under CC BY-SA 1.0
Transfer learning and deep convolutional neural networks for safety guardrail detection in 2D images
British Library Online Contents | 2018
Vision-based defects detection for bridges using transfer learning and convolutional neural networks
Taylor & Francis Verlag | 2020