The New York Times just published a fascinating feature revealing the story that played out in the background during the Notre-Dame Cathedral fire. It’s a compelling look behind the scenes, from the moment smoke was first detected to the moment the blaze was extinguished.

It’s also an infuriating look into the outcomes of bad design—and more crucially, the instinct to cast blame on the people who, themselves, are suffering from a design that failed them.

Casting Blame for Notre-Dame

As described by the NYT, a third-party security agency was responsible for monitoring the cathedral’s smoke alarm system. The system used a network of more than 160 aspirating detectors—sensitive devices that sip air to detect smoke—connected to a panel responsible for reporting alerts.

According to the NYT feature, this alert interface provided critical information in the form of a “shorthanded description of [one of four zones],” along with an abbreviated code to single out a specific detector: in this case, ZDA-110-3-15-1.

The employee responsible for monitoring the system had trouble interpreting and conveying the correct information to the fire security employees within the cathedral, who then investigated the wrong location.

The result: a half-hour delay between the initial alert and the calling of the fire department.
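To make the failure concrete, here’s a minimal sketch of the alternative, assuming a simple lookup table. The detector ID is from the NYT report, but the zone and location names are invented, and this is in no way the actual panel software:

```python
# Hypothetical sketch: translating a raw detector code into an alert a
# person can act on. The detector ID comes from the NYT report; the zone
# and location descriptions below are invented for illustration.

DETECTOR_LOCATIONS = {
    "ZDA-110-3-15-1": {
        "zone": "Attic, nave level",
        "location": "roof framework near the spire",
    },
}

def render_alert(detector_id: str) -> str:
    """Render a smoke alert that names a place, not just a code."""
    info = DETECTOR_LOCATIONS.get(detector_id)
    if info is None:
        # Fall back to the raw code, but still state plainly what happened.
        return f"SMOKE DETECTED at unmapped detector {detector_id}"
    return (
        "SMOKE DETECTED\n"
        f"  Zone:     {info['zone']}\n"
        f"  Location: {info['location']}\n"
        f"  Detector: {detector_id}"
    )

# What the operator reportedly had to work from: "ZDA-110-3-15-1"
print(render_alert("ZDA-110-3-15-1"))
```

The point isn’t this particular implementation. It’s that the translation from code to place happened in an operator’s head, under pressure, instead of in the software.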

We hear from Glenn Corbett, an associate professor of fire science from New York, who says:

“You have a system that is known for its ability to detect very small quantities of smoke,” Mr. Corbett said. “Yet the whole outcome of it is this clumsy human response. You can spend a lot to detect a fire, but it all goes down the drain when you don’t move on it.”

We also learn that there’s an ongoing battle over who’s responsible for the blaze:

The miscommunication that allowed the fire to rage unchecked for so long is now the source of a bitter dispute over who is responsible.

This is a frustrating response. It casts blame on the “clumsy human response,” and faults the individuals doing their best to respond to a poorly designed system for its outcome. The headlines, the reports, and the conflicts all imply that a person bungled their job, and that their personal, individual failure is at fault.

The person didn’t fail. The design failed them.

And this isn’t the first time we’ve seen this happen.

Remember Hawaii?

Casting Blame for Panic in Hawaii

In 2018, Hawaiians were left terrified by an emergency alert delivered to their phones that warned of an inbound ballistic missile. They were told to seek immediate shelter—and for emphasis, told that it was not a drill.

It took more than half an hour before the correction was issued.

It took just hours before blame was assigned.

From The Washington Post: “Hawaii Missile Alert: How one employee ‘pushed the wrong button’ and caused a wave of panic”

Hawaii Gov. David Ige (D) … said it had been “a mistake made during a standard procedure at the changeover of a shift and an employee pushed the wrong button.”

It didn’t take long before someone leaked a photo of the interface used to deliver the alert.

This story is complicated by the revelation that the employee in question may have actually believed an attack was imminent. However, the design of the overall system, from the UI to the copywriting to the checks and balances, failed each and every individual involved in the situation.
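As a thought experiment, and emphatically not a description of Hawaii’s actual system, consider what one of those checks and balances might look like: a confirmation step whose copy spells out the consequences. A minimal sketch, with invented names and message text:

```python
# Hypothetical sketch of a confirmation step for a live-vs-drill alert.
# All function names and message copy here are invented for illustration.

def send_to_phones(message: str) -> None:
    # Stand-in for the real broadcast mechanism.
    print(f"ALERT SENT: {message}")

def confirm_and_send(is_drill: bool) -> None:
    audience = "internal test recipients" if is_drill else "every phone in the state"
    expected = "DRILL" if is_drill else "LIVE"
    print(f"This will send a BALLISTIC MISSILE alert to {audience}.")
    answer = input(f"Type {expected} to confirm: ")
    if answer.strip().upper() != expected:
        print("Cancelled: confirmation did not match.")
        return
    prefix = "[TEST] " if is_drill else ""
    send_to_phones(prefix + "BALLISTIC MISSILE THREAT INBOUND TO HAWAII. "
                            "SEEK IMMEDIATE SHELTER.")
```

Forcing the operator to restate, in plain words, whether the send is a drill is a crude check, but it moves the drill-versus-live distinction out of a cryptic menu item and into the interaction itself.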

This isn’t a new problem. We’ve been casting blame on people that we’ve set up to fail for a long time. But human performance is largely an outcome of design, and poor design results in poor performance.

One of my favourite stories about human-centered design goes all the way back to World War II, at the start of serious consideration of human factors in design.

Casting Blame for Pilot Error

During World War II, researchers were called in to investigate a rise in what were being called “pilot errors.” The assumption of the time was that “humans could be trained to fit a design” (sound familiar?), and this drove the changes engineers made as they redesigned airplane cockpits.

Photo from Wikimedia Commons

As the story goes, the engineers indignantly defended their changes, even as reports of pilot error rolled in. They said, “Pilots are very intelligent, highly trained, and had already shown they could adapt to [the] changes.”

However, they’d swapped the locations of the throttle and ejection handles, and under the stress of being shot at while flying at high altitude, the pilots often reverted to earlier, automatic behaviour and inadvertently ejected themselves from their planes [1].

Photo from Wikimedia Commons

The pilots didn’t fail. The design failed the pilots.

Design is Both a Process and an Outcome

Let me be clear: a poor design outcome—like Hawaii’s missile alert interface—is distinct from the process of design. Much like we shouldn’t blame people for the outcomes of their interaction with poor design, we must remember that poor design outcomes are rarely the fault of any one person.

Design is a process that involves many people making many decisions—or non-decisions—in a broad context with many moving parts, implications, obligations, and most of all, humans.

People will get things wrong during design and as a result of design.

By the very nature of people designing for people, things will go wrong.

Instead of finding people to blame, find ways to improve design.


[1]: This story comes from the remarkable “The UX Book” by Rex Hartson and Pardha S. Pyla, from their introduction on p. 39. I’ve paraphrased their story here.