WHEN hundreds of JetBlue passengers were left stranded during a harsh Valentine’s Day ice storm, it was because the company was forced to cancel more than 250 of its 505 daily flights. In total, JetBlue issued $10 million in refunds on canceled flights and $16 million in travel vouchers for delayed passengers.
JetBlue officials called their delays “unacceptable,” and have made substantial efforts to repair the company’s image. The company published a full-page ad apologizing for the delays in 15 cities and 20 newspapers. JetBlue founder and CEO David Neeleman posted a video apology on YouTube. The company even unveiled a “Customer Bill of Rights” that guarantees graduated compensation for late flights.
Not too far from airline passengers stranded in airports, hundreds of Pennsylvania motorists were stranded in their vehicles after the same ice storm caused a string of accidents and a 50-mile backup on a major interstate highway. People reported being stranded for up to 20 hours without food, water or heat, and the National Guard had to be called in to help the trapped drivers.
Pennsylvania Governor Ed Rendell responded by holding a news conference and apologizing for the state’s failed response. He called their effort “unacceptable” and called for an investigation into how state agencies responded.
“It was not a good day for state government,” he admitted. “As the chief executive of the commonwealth, I take full responsibility for what appears to be a complete breakdown of communications and personally apologize to anyone who was stranded.”
When the government leaves us stranded on the highways, we resignedly accept that “it could have been worse.” But if an airline leaves us stranded on the tarmac, we yell and scream until we get a refund. So why do we demand so much from the providers of our second most popular form of transportation, and so little from the providers of our first?
University of Pennsylvania psychologist Jon Baron has shown that people tend to rate harmful omissions as less immoral – less ‘bad’ as decisions – than harmful commissions. Holding intentions, motives, and consequences constant, when something bad happens people prefer it to be a result of someone’s inaction (omission), rather than a consequence of someone’s direct action (commission).
One cause of this bias toward omission is the exaggeration effect, the finding that subjects feel more regret when bad outcomes result from action than when they result from inaction. The inverse is also true: people are happier when positive outcomes are the result of action rather than inaction.
Loss aversion is another psychological mechanism that contributes to omission bias. We perceive gains as less good than we perceive equivalent losses as bad. In real terms, this means we are more upset about losing $100 than we are happy about gaining the same amount. In terms of the Valentine’s Day ice storm, we are more upset about losing an hour to a traffic delay than we are happy about gaining an hour in transit.
Another reason to expect omission bias is that we tend to perceive someone as less responsible when there is a plethora of causes. “This can be an illusion,” notes Baron in his paper on omission bias. “What matters for decision making is the contingency of various outcomes on the actor’s options.” In other words, we can lose sight of the consequences of inaction if we start to assign blame to various causes.
Overgeneralization is a further explanation for omission bias. In the general sense, thoughtlessness and failure to act do not seem irrational or immoral. But when put into specific context — i.e. when thoughtlessness results in hundreds of people being stranded on a highway — one gets a better understanding of the rationality and morality of the decision.
Whatever the cause, omission bias has a very real impact on society. It forms the psychological underpinnings for arguments about morals and rationality. Omission bias is why withholding the truth is not as bad as lying, and why letting somebody die is not as bad as killing him.
In relation to public affairs, it has a doubly negative impact: (1) it induces government officials to make irresponsible decisions, and (2) it encourages the public to accept those bad decisions.
TO ILLUSTRATE how bias can affect governmental decision making, consider the following thought experiment.
You are the head of emergency management for the state of Pennsylvania. Your team has concluded that there is a 50 percent chance that a disastrous storm is going to hit the state. Your team has presented you with two options, both with potentially negative outcomes:
(1) Warn the state departments of Transportation, Health, State Police, Public Welfare and General Services and prepare an emergency plan. If you are wrong and the storm doesn’t hit (a 50 percent chance), 1,000 state workers will waste 50 hours each by preparing for the storm in vain.
(2) Wait to warn state departmental officials and don’t prepare an emergency plan. If you are wrong and the storm does hit (again, a 50 percent chance), 5,000 people will waste 10 hours each as they sit piled up on the highway.
In terms of consequences, the two decisions are equivalent: if the forecast is wrong, each wastes 50,000 man-hours, so a rational official should be indifferent between the options. Nevertheless, subjects routinely choose the second option in experiments structured like this one, leaving ordinary citizens to suffer the consequences of government inaction.
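The expected-cost arithmetic behind this thought experiment can be sketched in a few lines of Python. The figures are those given above; the 2x “action weight” used to model omission bias at the end is an illustrative assumption, not a measured quantity:

```python
# Expected-cost comparison for the two emergency-planning options
# in the thought experiment. Probabilities are the stated 50 percent.

P_STORM = 0.5

def expected_cost(prob_bad, people, hours_each):
    """Expected man-hours wasted if the 'wrong' outcome occurs with prob_bad."""
    return prob_bad * people * hours_each

# Option 1: prepare; effort is wasted only if the storm misses (50%).
cost_prepare = expected_cost(1 - P_STORM, 1_000, 50)  # 25,000 expected man-hours

# Option 2: wait; drivers are stranded only if the storm hits (50%).
cost_wait = expected_cost(P_STORM, 5_000, 10)         # 25,000 expected man-hours

assert cost_prepare == cost_wait  # a rational planner is indifferent

# A crude model of omission bias: losses caused by one's own action
# are weighted more heavily than equivalent losses from inaction.
# (The 2x weight is an illustrative assumption, not a measured value.)
ACTION_WEIGHT = 2.0
felt_cost_prepare = ACTION_WEIGHT * cost_prepare
felt_cost_wait = 1.0 * cost_wait

print(felt_cost_prepare > felt_cost_wait)  # True: waiting "feels" cheaper
```

The point of the toy model is that even when the objective expected costs are identical, any extra psychological weight on action-caused losses tips the decision toward inaction.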
In regard to the Valentine’s Day ice storm, the government’s failure to act resulted in consequences far worse than the equivalent cost of taking preventative action. As with Hurricane Katrina, simple emergency planning would have gone a long way toward preventing such a bad outcome. Either the government underestimated the cost of inaction, or it overestimated the cost of action. Since it is universally uncommon for government to underestimate the cost of action (it usually runs millions over budget), we can assume that government is underestimating the cost of inaction.
Unfortunately, the Valentine’s Day ice storm and Hurricane Katrina are not the only examples of how government susceptibility to omission bias harms society. When a potentially beneficial drug lingers unapproved in the FDA, omission bias hurts society. When the Supreme Court refuses to hear a case because it’s too controversial in nature, omission bias hurts society. All over government, and all around the world, omission bias routinely causes government to make irresponsible decisions and encourages us to accept them.
JetBlue has proved that large organizations are capable of reform. Public apologies, promises to do better and the new Customer Bill of Rights are all mechanisms JetBlue is using to safeguard itself against omission bias. The government would do well to take notes, as each measure strengthens the company’s incentive to make better decisions.
By acknowledging its inaction, the government has taken the first step toward protecting citizens from the harmful effects of omission bias. But to make that protection permanent, the government needs to make a substantial effort to bind itself to action. We need more than just “an investigation”; we need safeguards. While this doesn’t mean the government needs its own Customer Bill of Rights, as JetBlue unveiled, it does mean that something must be done before the next big storm hits.
Taylor W. Buley is author of “The Fresh Politics Reader” and is the founding editor-in-chief of the Pennsylvania Independent, a student newspaper at the University of Pennsylvania.
Source: AFF Doublethink Online | Joseph Hammond