Forthright Irma: A discussion of Hurricane Ambiguity.

It’s hard to get people to evacuate. Cara Cuite and Rebecca Morss, risk communication and hurricane experts, write about several factors that can cause people to ignore evacuation warnings: some people don’t like being told what to do; some judge fear-based messaging as “overblown” and disregard it; for others, the cost and logistical nightmare of evacuating makes them prefer to shelter in place. (Do read their article; it’s so interesting.)

But my favorite factor causing people to ignore evacuation warnings is ambiguity. Ambiguity is systemic and unavoidable, and, worse, humans are terrible at managing it. Some individuals and cultures are better at tolerating uncertainty than others (as Hofstede points out), but generally humans don’t like to take action when they can’t predict the outcome.

This trait can influence Emergency Managers’ work in two ways: 1. storm prediction is inherently ambiguous, which makes our jobs harder, and 2. ambiguity from authorities causes people to hesitate, putting them in danger.

1. Storm prediction is ambiguous

Below is a comparison of American (blue) and European (red) computer models predicting the path of Hurricane Irma. The darkest lines are the averages. (Thanks to the Washington Post for this picture and many other excellent ones.) As you can see, there is a limit to how finely science can predict a hurricane’s progress. Imagine you’re the governor of Florida. Do you evacuate Jacksonville?

Irma prediction model
Group of simulations from American (blue) and European (red) computer models from Friday night [9/8/17]. Each colored strand represents a different model simulation with slight tweaks to initial conditions. Note that the strands are clustered together where the forecast track is most confident, but they diverge where the course of the storm is less certain. The bold red line is the average of all of the European model simulations, while the bold blue line is the average of all the American model simulations. (StormVistaWxModels.com)
Fortunately for the real Governor of Florida, storm prediction has vastly improved since the deadliest storm in American history, the Galveston Hurricane of 1900 (this was before they started naming storms). At the time, American meteorologists had a poor understanding of how storms behaved in the ocean. Though the more experienced Cuban meteorologists warned of an incoming hurricane, the message was ignored and no one evacuated. Surging waters killed 8,000 of Galveston’s 37,789 residents, or about 20% of the population.

After World War II, “the U.S. still used pretty simple forecasting tools. Airplanes took rough rides into these tempests, found the storm’s center, and then returned every six hours to find the center once again,” reports Popular Science. The U.S. launched its first weather satellite in 1960, and the first satellite images were broadcast on television in the 1970s.

The last decade or so has seen even greater improvement in predictions through better satellite technology and computer modeling. The Natural Hazards Review estimates that weather satellites have prevented up to 90% of the deaths that would have occurred had meteorologists not had satellites available. NOAA reports that its errors in storm tracking have dropped by 50% in the last 15 years, and in the last 5 years NOAA has extended its advance notice by 12 hours; public officials now have 36 hours of warning. Without these improvements, weather experts estimate that 10,000-20,000 people would have been killed in Hurricane Katrina, instead of the actual 1,200. Because of storm tracking, only 15% of New Orleans’ population was still in the city.

The bad news is that there is still ambiguity in storm tracking; for instance, scientists still have a hard time judging the intensity of a storm. The good news is that the ambiguity is far less than it used to be.

2. Ambiguity from authorities can cause inaction.

The ambiguity from storm prediction can creep into the language used by public leaders, which directly causes people to hesitate to take action or to disregard warnings. Studies show that people use multiple sources of information when trying to make a decision, and that they are more likely to act when a) sources agree and b) information is consistent over time.

Let’s compare the evacuation orders from Hurricane Harvey and Hurricane Irma. Evacuation messaging for Hurricane Irma was consistent and forceful and Florida evacuated smoothly. On the other hand, Texas officials have been criticized by some for their weak and inconsistent evacuation directions.

Before Hurricane Harvey, Houston Mayor Sylvester Turner told residents to shelter in place. Meanwhile, Texas Governor Greg Abbott said, “If you have the ability to evacuate and go someplace else for a little while, that would be good.” This mismatch in messaging caused many residents to stay put. In Mayor Turner’s defense, he was expecting flooding rather than high winds, and driving through flooded streets is far more dangerous than staying in your house. The two public leaders judged the ambiguous weather data differently from their different vantage points.

Pic courtesy of NPR

Additionally, Gov. Abbott’s “evacuation order” seems weak. “If you can…that would be good” sounds like a suggestion on par with “If you could get me butter at the store, that would be good.” At first, I was frustrated because I assumed Gov. Abbott was just a bad public speaker. “Do you want people to evacuate or not?!” I yelled at the TV. (Please forgive me, Mr. Governor.) But after reflection, I think his message was ambiguous because it had to be. Here are the facts I imagine were in Abbott’s mind: 1. I want you to evacuate. 2. Evacuation causes traffic jams; we all remember the horror of the 2005 evacuation from Hurricane Rita, the largest evacuation on record. 3. Smart people are telling me that this could just be rain, in which case I don’t want millions of people flooded and drowning on the highways. 4. If I explicitly call for voluntary evacuation, people might evacuate from safe areas, blocking the roads for people trying to evacuate from dangerous areas. Poor Gov. Abbott. Not only is there ambiguity arising from the limitations of science and from different vantage points, but there is ambiguity in messaging because of conflicting motivations.

Fortunately, evacuations for Hurricane Irma went smoothly. We could argue, as Alan Bernstein, spokesperson for Houston Mayor Turner, does, that this was due to Irma’s certainty. He said to NPR, “Irma is totally different. It is forecast for a direct hit on populous areas, bringing highly destructive winds and perhaps heavy coastal destruction. That was not the case here, and Mayor Turner would not second-guess an evacuation order for Florida.”


All I can say is: Thank God for better storm tracking.


Humans and Information: Managing a herd of cats.

I was talking to a project manager the other day about our common struggle: information systems management. We argued the various pros and cons of software designed to organize teams and information (’cause we’re nerds). As our conversation progressed over issues of human error and information tagging, I got to thinking about the similarities of maintaining knowledge and maintaining groups.

Group dynamics and information sharing are inextricably linked in a number of ways. On top of that–or maybe because of that–maintaining groups and information systems takes a lot of the same resources and methodology. Here are some links between the two:

Link #1: Group relationships (i.e. ‘cohesion’) affect how information is shared.

Rupert Brown says in Group Processes (2000) that the more tightly bonded group members feel to one another, the more likely they are to share information—whether it’s useful or not. However, tightly cohesive groups tend to be more isolated from the environment, so their information can become repetitive or outdated. On the other hand, loosely associated groups tend not to share all information with everyone, but the information they do have tends to be unique and up to date because of the members’ contact with non-group members. Therefore, managing the group’s dynamics directly influences the flow of information throughout the group.

Side note: On the flip side of the coin, we might be able to say that information directly influences group relationships. If young adult dramedies have taught me anything, it’s that a rumor can make or break a friendship.

Link #2: More people affect group maintenance. More information affects system maintenance.

Network theory (and common sense) states that the more nodes a network has, the more complex the network is. Just think about how your family’s dynamics changed after your sister married that guy. Adding a node (brother-in-law) to Thanksgiving dinner changed the dynamic of the network (family). Moreover, larger networks with more complex arrangements require more resources in order to maintain them. For information systems, this might mean more people or more time spent on data entry. For groups, it might mean more time and energy spent on group-building activities, managing rumors, or resolving conflict, etc.
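The growth in complexity described above can be made concrete: in a fully connected network, the number of pairwise relationships grows quadratically with the number of nodes. Here is a minimal sketch using the standard "n choose 2" formula (the family-size numbers are just illustrative):

```python
# Number of possible pairwise relationships in a fully connected
# network of n nodes: n * (n - 1) / 2, i.e. "n choose 2".
def pairwise_links(n: int) -> int:
    return n * (n - 1) // 2

# Adding one node (the new brother-in-law) to a 5-person family
# creates 5 new relationships to maintain, not just 1.
print(pairwise_links(5))  # 10
print(pairwise_links(6))  # 15
```

Note that each added node adds as many new links as there are existing nodes, which is why maintenance costs climb faster than headcount.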

Side note: I suspect that the larger and more complex the network, the more inertia it develops, which is why very large groups (like governments) take so long to respond to changes in the environment (like a world-wide economic downturn).

Link #3: Information systems management is necessarily human systems management.

Collaboration not only requires that we share information, but that we share information in a way we can all recognize, access, and manipulate. Each team member must be trained in a standardized method for handling group information. We all have to use the same file-naming system, the same date system, the same tags. Furthermore, as we alluded to in Link #2: a) the more types of information there are to be handled, the more complex the system for managing it becomes, and b) the more people handling the information, the more complex the system for managing it becomes. Information management is directly dependent on group management.
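To make the "same file-naming system, same date system, same tags" idea concrete, here is a minimal sketch of a validator for a hypothetical team convention (the `YYYY-MM-DD_project_description.ext` pattern is an invented example, not a real standard):

```python
import re

# Hypothetical team convention, e.g. "2017-09-08_irma_evac-briefing.pdf":
# ISO date, project name, description, extension, all lowercase.
NAME_PATTERN = re.compile(
    r"^\d{4}-\d{2}-\d{2}_[a-z0-9]+_[a-z0-9-]+\.[a-z]+$"
)

def follows_convention(filename: str) -> bool:
    """Return True if a filename matches the team's naming rule."""
    return NAME_PATTERN.match(filename) is not None

print(follows_convention("2017-09-08_irma_evac-briefing.pdf"))  # True
print(follows_convention("Final_FINAL_v2.docx"))                # False
```

Automating the check is cheap; the hard part, as the next paragraph notes, is the human side of getting everyone to follow it.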

This link seems to be where project managers and emergency managers spend so much of their time. It seems like we’re all struggling to get everyone else to manage information correctly.

Link #4: The person who manages group life also tends to manage information flow. 

It’s easier to spot in small, informal groups, but in every group there is a gatekeeper. A gatekeeper is a person who manages access to benefits which they do not own: for instance, access to the boss, a spot on the agenda, or illicit information (like rumors or secrets, which they may trade for more political capital).

I, personally, like to think of the gatekeeper as someone who manages the group’s Transactive Memory System (TMS) which is a fancy sociological term for knowing who knows what. Usually, the gatekeeper is well connected in the group and–especially if they’ve been there a long time–generally knows who knows what. They are a valuable resource for members as they can direct them to those members with the expertise or connections they need. Gatekeepers control information flow in the group in a very direct way.

Here’s where I go out on a theoretical limb. We’ve often experienced how information gets stymied during a crisis, and a lot of research (including my own) is focused on how to open the channels of communication. What if part of the problem of information flow is that the gatekeeper gets overwhelmed by requests, literally or figuratively? What if they don’t prioritize information correctly and let something important drop? What if they can’t help the group efficiently because there’s too much noise (i.e. the opposite of signal) to process requests for information/expertise?

So the solution might be to decentralize gatekeeping duties across the whole group (or at least more members), but then you run into the increased maintenance cost of more people handling information.  Hmm….

Conclusion: 

There seems to be an intimate association between handling group members and handling information. They take a lot of the same resources and processes, and the costs, in terms of time and effort, of maintaining each seem to be linked. Each individual may have a limit to how much information they can process at one time, necessitating collaboration with other individuals. But collaboration is inherently more costly as networks increase in complexity. Therefore, managers must cleverly walk the delicate line between too little and too much information, between too few and too many team members.

So if you ever feel like the system should behave better than it does, keep in mind: you’re actually managing two related yet distinct systems. Problems in one might be symptoms of problems in the other.