The Global Assessment Report on Disaster Risk Reduction 2022, entitled "Our World at Risk: Transforming Governance for a Resilient Future" and published by the United Nations Office for Disaster Risk Reduction (UNDRR), explained how human biases and decision processes affect risk reduction outcomes.
The report noted that, in most daily situations, people rely on quick mental shortcuts (heuristics) that produce mostly accurate decisions, rather than on a deep and full assessment of the relative costs and benefits of each decision.
Research into decision-making has concluded that this occurs for a variety of reasons relating to the basic architecture of human minds and the large amount of information processed every waking minute.
Habits of mind become biases that interact with people’s social motives and the world around them to determine the decisions they make.
This also affects the decisions made individually and collectively about how to cope with disasters. This chapter offers insights into why human minds form habits that are resistant to change, how these cognitive biases can result in suboptimal decision-making around disasters, and how understanding this can be harnessed to accelerate effective risk reduction.
Heuristic-based decision-making is one of two modes of thinking. This “intuitive thinking” approach is fast and relatively low effort in terms of the amount of mental attention it requires, and is also termed “thinking fast” (Kahneman, 2013).
Humans tend to use this approach to make decisions in situations that either require relatively little attention or that are complex and rapidly evolving. When presented with the need to make rapid decisions, especially in conditions where there are multiple issues competing for their attention, heuristic-based decisions allow people to make a decision and move on relatively quickly.
This is significant for DRR because when sudden-onset disasters occur, according to the report, there is a need for rapid decisions under situations of incomplete information with many issues competing for attention – conditions in which intuitive thinking is the typical approach to decision-making.
Experts also use these mental shortcuts: a study of decision-making in humanitarian disaster response found that intuitive, heuristic-based decisions were the dominant approach to decisions in disaster response (Comes, 2016).
In contrast, decision-making to prevent the development of new risk, to reduce known risk outside the context of an immediate crisis, and to perceive and address systemic risk requires deliberative thinking, or “thinking slow”.
Heuristics may also be tuned to optimize perceptions of cost and benefit in a person’s local environment.
They provide quick answers to common problems and have developed precisely because they work well in most situations.
Heuristics respond to specific and immediate environmental cues. They focus attention and decisions on imminent crises, but they mean that slower-moving risks, frequent low-impact disasters or crises with long lead times, and their systemic impacts, can easily be overlooked by intuitive thinking (Broomell, 2020).
While any individual can successfully engage in deep, deliberative decision-making, in aggregate “thinking fast” is the most common way that people engage with decisions.
However, these heuristics introduce identifiable biases that do not always result in good decisions, especially when the situation is complex or high pressure. Biases that can emerge, and which are particularly relevant in disaster decision-making, include:
● Myopia and simplification, or the tendency to simplify complex problems and make decisions based on limited and personally relevant information.
● Availability, or the tendency to overemphasize information that is more easily remembered or made salient by a specific environment.
● Anchoring, or using an irrelevant number as the basis for a decision under conditions of great uncertainty.
● Optimism and overconfidence, or a general tendency for people to see situations as less threatening than they are and to see themselves as more capable than they are.
● The status quo bias and loss aversion, or the tendency to accept existing situations (even if negative) and to be concerned more about the risk of loss than about the potential gain.
Not all decisions are made by heuristics. The second process of decision-making, “deliberative thinking”, involves a conscious consideration of the different benefits and risks of different possible choices.
Such rational decision-making is exceptionally powerful and is at the core of humans’ evolutionary success – but it is also effortful in time and attention, and is something people do not always do.
Some theories suggest people do so only if they feel the automatic response needs to be double-checked or corrected. People are more likely to use deliberative models when aware that the decisions are highly important, when they have time to make a decision and when they feel they have sufficient information to make a good decision.
In practice, this means people are more likely to take problems seriously and engage with the need for DRR when those problems are consequential, made salient or active by the environment, when they threaten direct and personal loss, and when they affect individuals directly.
An example of this comes from risk reduction decisions around volcanic activity. Some volcanic eruptions easily meet the criteria above: they are characterized by visible indicators of danger or rapidly evolving situations that focus attention, loud noises, or other elements that drive salience, loss aversion and other heuristics to encourage people to pay attention – and react – to imminent risk of disaster.
In contrast, other types of volcanic activity have fewer of these elements but are equally dangerous. An assessment of the social dynamics of volcanic risk found successful communication was facilitated in part by the consistent transmission of specific risk information, particularly in locally relevant languages and by locally trusted representatives (Barclay et al., 2008).
When the risk was seen as a slower developing risk over a longer term, or was less clear or politically polarized – as in volcanic dangers in Guadeloupe, Montserrat and Tenerife – at-risk populations were much less likely to engage effectively in DRR.
Therefore, the challenges for governments are how to promote good decisions and how to create systems that expose risky cognitive biases and incentivize good decisions instead.
Awareness is not enough
Research into decision-making has found awareness of risk is not enough to drive behaviour change. In fact, people regularly fail to reduce their personal risk even when they know in the abstract that such risk is real.
This is because risk decision-making is a process (Ajzen, 2020). Biases and motivated reasoning can influence the decision and its execution at each step – from awareness of risk, to understanding options, to confidence that such options can be executed, to selection of a course of action, to execution of that action.
One aspect of the challenge in promoting effective risk reduction relates to the availability of accurate information about risk.
Forecasts may be accurate but uncertain, so governance systems and decision makers must tolerate a degree of uncertainty in decision-making in order to manage systemic risk.
However, people are more likely to engage in risk reduction behaviour when they are aware of a risk, feel confident they have specific knowledge about what to do to reduce the risk and have the agency to act.