AI for crisis decisions

Tina Comes*

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › Peer-review



Increasingly, our cities are confronted with crises. Fuelled by climate change and a loss of biodiversity, increasing inequalities and fragmentation, challenges range from social unrest and outbursts of violence to heatwaves, torrential rainfall, and epidemics. As crises require rapid interventions that overwhelm human decision-making capacity, AI has been portrayed as a potential avenue to support or even automate decision-making. In this paper, I analyse the specific challenges of AI in urban crisis management as an example and test case for many super wicked decision problems. These super wicked problems are characterised by a coincidence of great complexity and urgency. I will argue that from this combination, specific challenges arise that are only partially covered in current guidelines and standards around trustworthy or human-centred AI. Following a decision-centric perspective, I argue that to solve urgent crisis problems, the context, capacities, and networks need to be addressed. AI for crisis response needs to follow dedicated design principles that ensure (i) human control in complex social networks, where many humans interact with AI; (ii) principled design that considers core principles of crisis response such as solidarity and humanity; and (iii) design for the most vulnerable. As such, this paper is meant to inspire researchers, AI developers and practitioners in the space of AI for (urban) crisis response – and other urgent and complex problems that urban planners are confronted with.

Original language: English
Article number: 12
Journal: Ethics and Information Technology
Issue number: 1
Publication status: Published - 2024


  • Crisis management
  • Decision theory
  • Human-AI interaction
  • Human-centred AI
  • Responsible AI


