Abstract
Data mining applications have the potential to address current deficiencies in the provision of humanitarian aid after natural disasters. Simultaneous analysis of text and images in crowd-sourced data can improve the quality of humanitarian aid information. Specifically, we select Bidirectional Encoder Representations from Transformers (BERT) and its descendant ALBERT as pre-trained deep networks for the text modality, and ConvNeXt, RegNet, and Faster R-CNN for the image modality. The developed framework demonstrates its application to classifying humanitarian aid information in three key respects. Firstly, it shows the effective performance of ConvNeXt and BERT in classifying humanitarian aid. Secondly, it investigates the efficiency of generative adversarial networks (GANs) in generating synthetic images for imbalanced input datasets; this augmentation improves the accuracy, precision, recall, and F1-score of the framework on unseen test data. Finally, the study highlights the potential of SHapley Additive exPlanations (SHAP) for interpreting the behaviour of the developed framework, supporting the timely classification of humanitarian aid information from crowd-sourced data after natural disasters.
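The abstract pairs a BERT text encoder with a ConvNeXt image encoder. As a rough illustration only (not the authors' code), the sketch below shows one common way to combine the two: late fusion, where pooled BERT features and pooled ConvNeXt features are concatenated and passed to a linear classification head. The specific checkpoints (`bert-base-uncased`, ConvNeXt-Tiny), the class count of 4, and the concatenation fusion are assumptions, since the abstract does not state them.

```python
import torch
import torch.nn as nn
from torchvision.models import convnext_tiny, ConvNeXt_Tiny_Weights
from transformers import BertModel, BertTokenizerFast

class TextImageClassifier(nn.Module):
    """Late-fusion sketch: BERT (text) + ConvNeXt-Tiny (image).

    Checkpoints, class count, and fusion strategy are illustrative
    assumptions, not the paper's reported configuration.
    """

    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.text_encoder = BertModel.from_pretrained("bert-base-uncased")
        self.image_encoder = convnext_tiny(weights=ConvNeXt_Tiny_Weights.DEFAULT)
        # Drop ConvNeXt's final linear layer so it emits pooled 768-d features.
        self.image_encoder.classifier[2] = nn.Identity()
        # Concatenate BERT's 768-d pooled output with the 768-d image features.
        self.head = nn.Linear(768 + 768, num_classes)

    def forward(self, input_ids, attention_mask, pixel_values):
        text_feat = self.text_encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).pooler_output                                # (B, 768)
        image_feat = self.image_encoder(pixel_values)  # (B, 768)
        return self.head(torch.cat([text_feat, image_feat], dim=-1))

# Usage with a dummy tweet/image pair.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = TextImageClassifier(num_classes=4).eval()
batch = tokenizer(["flood waters rising near the bridge"],
                  return_tensors="pt", padding=True, truncation=True)
image = torch.randn(1, 3, 224, 224)  # stand-in for a preprocessed RGB image
with torch.no_grad():
    logits = model(batch["input_ids"], batch["attention_mask"], image)
print(logits.shape)  # torch.Size([1, 4])
```

In practice the GAN-generated images described in the abstract would be added to the minority classes of the training set before fine-tuning such a model, and SHAP would be applied to the trained classifier to attribute predictions to input features.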
| Original language | English |
| --- | --- |
| Article number | 103972 |
| Pages (from-to) | 1-15 |
| Number of pages | 15 |
| Journal | International Journal of Disaster Risk Reduction |
| Volume | 96 |
| DOIs | |
| Publication status | Published - 1 Oct 2023 |
Bibliographical note
Funding Information: The authors are grateful for support from the Australian Research Council (ARC) through the Linkage project (LP180101080). The authors acknowledge the contributions of the members of ASCII Lab at Monash University for critiquing the manuscript and providing constructive feedback.
Publisher Copyright:
© 2023 The Authors