Commentary

The Communications Failures Lessons of Three Mile Island

Just about all the experts agree that Three Mile Island (TMI) was not a serious accident. That doesn’t mean it wasn’t a serious screw-up. Things went wrong that should never go wrong. When they pumped the accident conditions through the Babcock and Wilcox (B&W) simulation of the TMI plant, they got a total core meltdown and a genuine catastrophe; fortunately, reality was more conservative than the B&W simulation. It’s a little like a drunk successfully crossing a highway blindfolded. In human health terms, nothing much happened at TMI, but awful things almost happened.

TMI was by no means the only near miss in the history of nuclear power. (The frequency of near misses and the infrequency of real disasters—Chernobyl being the only one we know about for sure—signifies either that nuclear power is an intolerably dangerous technology and we’re living on borrowed time, or that “defense in depth” works and a miss is as good as a mile.) But TMI was the only near miss that captivated public attention for weeks, is widely misremembered as a public health catastrophe, is still a potent symbol of nuclear risks, and, as a result, has had devastating repercussions for the nuclear power industry.

Pay Attention to Communications

What went wrong at TMI—really, really wrong? Communication. Yet communication professionals were minor players at TMI.

I asked Jack Herbein, the Metropolitan Edison (MetEd) engineering vice president who managed the accident, why he so consistently ignored the advice of his public relations (PR) specialist, Blaine Fabian. (Risk communication hadn’t been invented yet.) He told me, “PR isn’t a real field. It’s not like engineering. Anyone can do it.” That attitude, I think, cost MetEd and the nuclear power industry dearly. And that attitude continues to dominate the nuclear industry, contributing to one communication gaffe after another. Nuclear power proponents keep shooting themselves in the foot for lack of risk communication expertise. (This observation is obviously a little self-serving, as I sell risk communication training, but I think it’s also on target.) Although risk communication skills can be learned, they’re not bred in the bone—certainly not for the average nuclear engineer.

Err on the Alarming Side

In the early hours and days of the TMI accident, nobody knew for sure what was happening. That encouraged MetEd to put the best face on things and to make the most reassuring statements it could given what was known at the time. So as the news got worse, MetEd had to keep going back to the public and the authorities to say, in effect, “It’s worse than we thought.”

This violated a cardinal rule of crisis communication: Always err on the alarming side. Make your first communication sufficiently cautious that later communications are likely to take the form of “It’s not as bad as we feared” rather than “It’s worse than we thought.” In the 25 years since TMI, I have seen countless corporations and government agencies make the same mistake. Its cost: the source loses all credibility. Because the source is obviously underreacting, everybody else tends to get on the other side of the seesaw and overreact.

That’s why Pennsylvania Governor Dick Thornburgh ordered an evacuation of pregnant women and preschool children. MetEd was saying that the amount of radiation escaping the site didn’t justify any evacuation—and MetEd, it turns out, was right. But MetEd had been understating the seriousness of the accident from the outset. When the head of the Pennsylvania Emergency Management Agency (PEMA) misinterpreted a radiation reading from a helicopter flying through the plume, thinking it was probably an offsite reading of exposures reaching populated areas, Thornburgh didn’t even check with the no-longer-credible utility (which could have told him PEMA had misunderstood the situation). He decided that it was better to be safe than sorry and ordered the evacuation.

In contrast to MetEd, the Pennsylvania Department of Health adopted an appropriately cautious approach. The Department of Health was worried that radioactive iodine-131 (I-131) might escape from the nuclear plant, be deposited on the grass, get eaten by dairy cattle, and end up in local milk. Over a two-week period, health officials issued several warnings urging people not to drink the milk. Meanwhile, they kept doing assays of the milk without finding any I-131. The Department of Health’s announcements moved slowly from “there will probably be I-131 in the milk” to “there may be I-131 in the milk” to “there doesn’t seem to be I-131 in the milk, but let us do one more round of testing just to be sure.”

By the time the Department of Health declared the milk safe to drink, virtually everyone believed it. While the caution hurt the dairy industry briefly, the rebound was quick. Health officials were seen as looking out for people’s health more than for the dairy industry’s short-term profits. This should serve as a model for bovine spongiform encephalopathy (also known as mad cow disease) and the beef industry, for severe acute respiratory syndrome (SARS) and the travel industry, and for avian flu and the poultry industry.

Don’t Lie; Don’t Even Tell Half-Truths

In general, companies and government agencies try hard not to lie outright, but they usually feel entitled to say things that may be technically accurate, but misleading—especially in a crisis when they are trying to keep people calm. Ethics aside, the strategy usually backfires. People learn the other half of the truth, or just sense that they aren’t being leveled with, and their anxiety is exacerbated. Panic is rare in crisis situations; people often feel panicky but usually manage to act rationally and even altruistically. But panic is paradoxically likelier when the authorities are being less than candid in their effort to avert panic.

For example, the nuclear power plant in central Pennsylvania was in deep trouble. The emergency core cooling system had been mistakenly turned off; a hydrogen bubble in the containment structure was considered capable of exploding, which might breach the core vessel and cause a meltdown. In the midst of the crisis, when any number of things were going wrong, MetEd put out a news release claiming that the plant was “cooling according to design.”

Months later I asked the PR director how he could justify such a statement. Nuclear plants are designed to survive a serious accident, he explained. They are designed to protect the public even though many things are going wrong. So even though many things were going wrong at TMI, the plant was, nonetheless, “cooling according to design.” Needless to say, his argument that he hadn’t actually lied did not keep his misleading statement from irreparably damaging the company’s credibility.

Expect the Media to Over-Reassure

In ordinary times, journalists tend to make the news as dramatic as possible; their sensationalist bias is built in. But not in a crisis—that’s when journalists ally with their sources in a misguided effort to keep people calm by being overly reassuring.

The Kemeny Commission (the U.S. government commission set up to investigate TMI) conducted a content analysis of network, wire service, and major newspaper coverage during the first week of the 1979 TMI accident. The commission’s expectations of sensationalism were not confirmed. Of media passages that were clearly either alarming or reassuring in thrust, 60% were reassuring. If you stick to the technical issues, eliminating passages about inadequate flow of information and general expressions of fear from local citizens, the preponderance of reassuring over alarming “technical” statements was 73% to 27%. It didn’t seem that way at the time, of course, for several reasons.

  • Frightened people pick up on negative information more than on positive information. Vincent Covello, director of the Center for Risk Communication in New York, argues that in a crisis it takes three pieces of good news to balance one piece of bad news.
  • The news that something previously assumed safe may or may not be hazardous naturally strikes people as alarming, almost regardless of the amount of attention paid to the two sides. (Imagine reading this evening that scientists disagree over whether your favorite food is carcinogenic.) Sociologist Allan Mazur found that public fearfulness about risky new technologies is proportional to the amount of coverage, not to its character. TMI was a big, big story; even if the content was reassuring, the amount of content was alarming.
  • Most importantly, overly reassuring content is alarming. The public, especially the local public, could tell that the authorities were deeply worried and thoroughly bewildered. In that context, seeing MetEd on TV insisting that the plant was cooling according to design and that everything was under control certainly made things worse.

Reporters at TMI weren’t averse to accusing their sources of withholding information. But they were reluctant to report—reluctant even to notice—how often their sources didn’t know what was going on themselves and how frightened their sources were about what might happen next.

Keep It Simple

The need for simple explanations of complex phenomena isn’t just an axiom of crisis communication; it’s fundamental to any sort of communication. But two things change in a crisis. First, audiences are less tolerant of complexity when they’re upset. Apathetic people just stop listening when they can’t understand what’s being said; interested people ask for clarification. But frightened or angry people decide you’re trying to con them, and therefore become more frightened and more angry.

The second change is that sources tend to speak more complexly when they’re upset. Some of this is unconscious; your anxiety makes you hide behind big words and fancy sentences. Some of it is intentional. Nuclear Regulatory Commission (NRC) officials at TMI were worried (mistakenly, as it turned out) that a hydrogen bubble in the containment structure might explode and cause a meltdown. When they shared this possibility with journalists, they did it in such polysyllabic prose that reporters thought they were denying it, not acknowledging it.

The level of technical jargon was actually higher at TMI when the experts were talking to the public and the news media than when they were talking to each other. The transcripts of urgent telephone conversations between nuclear engineers were usually simpler to understand than the transcripts of news conferences. They said things to each other like, “It looks like we’ve got a load of core damage,” then made the same point to the media in phrases so technical that not one reporter got the message.

To be sure, jargon is a genuine tool of professional communication, conveying meaning (to those with the requisite training) precisely and concisely. But it also serves as a sort of membership badge, a sign of the status difference between the professional and everyone else. And especially in a crisis, it’s a way to avoid looking scared and avoid communicating scary information.

Pay Attention to Outrage

Reporters are a pretty thick-skinned group when it comes to danger—the sort of people who automatically drive toward the scene of any disaster. But they were frightened at TMI. It’s one of the few times I have ever witnessed a roomful of reporters rush a press secretary and demand to be moved further from the story.

Local citizens, obviously, were even likelier to have found the accident terrifying (though it is worth noting that, as usual, there was no panic). The biggest source of outrage at TMI was undoubtedly mistrust—a growing sense that MetEd executives for sure, and maybe NRC officials as well, weren’t saying everything they knew. (The sense that they didn’t know everything they should came later. Officials could have reduced post-crisis recriminations by acknowledging their uncertainty and all the things they wished they knew but didn’t.) As it usually does in crisis situations, the mistrust fed the fear. But there were plenty of other outrage components in play at TMI, including knowability, control, dread, and memorability.

Knowability. Expert disagreement is an aspect of knowability that generates even more outrage and fear than garden-variety uncertainty—and expert disagreement is rampant over the health effects of low levels of radiation. Some experts claim even very small exposures can lead to cancer; others argue that small exposures actually provide health benefits (the so-called hormesis hypothesis).

Another aspect of radiation’s knowability problem is its undetectability. Many reporters at TMI wore radiation monitors, a privilege few ordinary citizens had. Even so, the reporters were nervous. One told me he’d be a lot more comfortable if radiation were purple instead of invisible. Another, a veteran war correspondent, noted: “In a war you worry that you might get hit. The hellish thing here is worrying that you already got hit.”

Control. One of the most important—and difficult—ways to help people cope with a crisis is to offer them things to do. Reporters were busy at TMI, which kept their fears at bay. Local residents, on the other hand, had little to do but follow the media and stew. That feeling of complete powerlessness generated a lot of extra fear. One possibility that was considered and rejected was to distribute potassium iodide (KI). KI floods the thyroid with stable iodine, and if much radioactive iodine had been emitted at TMI (as it turns out there wasn’t), the KI could have prevented some thyroid cancers. But the real issue was one of communication. Would distributing KI scare people by implying there might be serious radiation releases, or would it reassure people by giving them something to do to protect themselves? The former argument won the day, and the KI stayed in the warehouse.

Dread. Cancer is an especially dreaded way to die. And among carcinogens, radiation is an especially dreaded source. Experts have calculated that particulates and other pollutants normally released into the air around TMI 25 years ago were deadlier than the amount of radiation actually released during the accident. By shutting down some factories temporarily, therefore, the accident may even have improved local public health. Despite these data, I still get two or three phone calls and e-mails a year from people who live near TMI, or are thinking of moving to the area, asking my advice on whether it’s safe. And many are still convinced it isn’t.

Memorability. Nuclear disaster has been a feature of science fiction since the early 1950s. Almost everyone who lived through the TMI accident had already seen countless nuclear reactors run amok—in movies, in novels, and in comic books. So it was easy to believe a meltdown was around the corner. It didn’t help that The China Syndrome, a movie about a nuclear power plant disaster, had just opened. Harold Denton, the senior manager the NRC sent to the site, took an evening off to go see the movie in Harrisburg, Pa.; a few hundred reporters (including me) went with him.

Get the Word Out

Most government agencies and corporations respond to crisis situations by constricting the flow of information. Terrified that the wrong people may say the wrong things, they identify one or two spokespeople and decree that nobody else is to do any communicating. In an effort to implement this centralized communication strategy, they do little or nothing to keep the rest of the organization informed.

There is certainly a downside to authorizing lots of spokespeople; the mantra of most risk communication experts is to “speak with one voice.” But I think the disadvantages of the one-voice approach outweigh the advantages. This approach almost always fails, as it failed at TMI. Reporters took down the license plate numbers of MetEd employees, got their addresses, and called them at home after their shifts.

Inevitably, many talked—though what they knew was patchy and often mistaken. The designated information people for the NRC and the utility, meanwhile, had trouble getting their own information updates; those in the know were too busy coping with the accident to brief them. (The lesson here is that there should be technical experts at the scene whose designated job is to shuttle between the people who are managing the crisis and the people who are explaining it.) The state government felt its own information was so incomplete that Press Secretary Paul Critchlow asked one of his staff to play de facto reporter in an attempt to find out what was going on so Critchlow could tell the media and the governor.

While the utility and the federal government tried to speak with one voice, the local anti-nuclear movement stopped speaking altogether. During the accident, hundreds of reporters called the Harrisburg office of TMI Alert, the area’s major anti-nuke group. They got a recorded message explaining that the staff had left town for their own safety.

In today’s world of 24/7 news coverage and the Internet, the information genie is out of the bottle. If official sources withhold information, we get it from unofficial sources; if official sources speak with one voice, we smell a rat and seek out other voices . . . and find them. But crisis information wasn’t controllable 25 years ago in central Pennsylvania either. As my wife and colleague Jody Lanard likes to point out, even in the pre-Gutenberg era, everyone in medieval villages knew when troubles were brewing. The information genie never was in the bottle. Keeping people informed and letting them talk is a wiser strategy than keeping them ignorant and hoping they won’t.

—Peter M. Sandman ([email protected]) is professor of human ecology at Rutgers University and professor of environmental and community medicine at the Robert Wood Johnson Medical School. For more on Dr. Sandman’s approach to risk communication, see www.psandman.com. This article is based on one published in Safety at Work (April 2004), www.safetyatwork.biz. Published with permission.