MEMIS: Multimodal Emergency Management Information System

Published in The Forty-Second European Conference on Information Retrieval (ECIR), 2020

Recommended citation: Mansi Agarwal*, Maitree Leekha*, Ramit Sawhney, Rajiv Ratn Shah, Rajesh Yadav, Dinesh Vishwakarma. "MEMIS: Multimodal Emergency Management Information System." In Proceedings of the Forty-Second European Conference on Information Retrieval (ECIR), 2020.

[PDF] [DOI]

Abstract

The recent upsurge in social media usage, and in the multimedia data generated therein, has attracted many researchers to analyzing and decoding this information to automate decision-making in several fields. This work focuses on one such application: disaster management in times of crises and calamities. Existing research on disaster damage analysis has primarily taken only unimodal information, in the form of either text or images, into account. These unimodal systems, although useful, fail to model the relationships between the various modalities. Different modalities often present supporting evidence about the task, and therefore learning them together can enhance performance. We present MEMIS, a system that can be deployed in emergencies like disasters to identify and analyze the damage indicated by user-generated multimodal social media posts, thereby helping disaster management groups make informed decisions. Our leave-one-disaster-out experiments on a multimodal dataset suggest not only that fusing information from different media forms improves performance, but also that our system generalizes well to new disaster categories. Further qualitative analysis reveals that the system is responsive and computationally efficient.
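To make the leave-one-disaster-out protocol concrete, here is a minimal sketch of how such an evaluation loop might be set up. It is not the paper's actual pipeline: the features, labels, disaster names, and classifier below are hypothetical stand-ins, and only the grouping scheme (train on all disasters except one, test on the held-out one) reflects the protocol described in the abstract.

```python
# Sketch of a leave-one-disaster-out evaluation loop.
# All data fields here are toy placeholders, not the paper's dataset.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

# Hypothetical stand-ins: fused multimodal features, damage labels,
# and the disaster each post came from (the grouping variable).
X = np.random.rand(100, 16)                       # fused text+image features
y = np.random.randint(0, 2, size=100)             # damage / no-damage labels
groups = np.random.choice(
    ["earthquake", "flood", "hurricane", "wildfire"], size=100
)

logo = LeaveOneGroupOut()
for train_idx, test_idx in logo.split(X, y, groups=groups):
    held_out = groups[test_idx][0]                # the unseen disaster category
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    preds = clf.predict(X[test_idx])
    print(f"held out: {held_out:10s}  F1 = {f1_score(y[test_idx], preds):.3f}")
```

Each fold trains on posts from all but one disaster and evaluates on the excluded one, so the reported scores measure generalization to disaster categories never seen during training.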