The Consequences People Don’t See Coming (But Probably Should)
CMU research reveals how “consequence neglect” leads to predictable surprises in policy, leadership and everyday life
At the 2024 Paris Olympics, organizers opted for an eco-friendly cooling system in athlete housing, skipping central air in favor of a low-energy alternative. Countries responded by bringing their own air conditioning units — a move that undercut the original environmental goal. According to researchers at Carnegie Mellon University’s Dietrich College of Humanities and Social Sciences, this misstep is not an outlier, but part of a common thinking error known as “consequence neglect.”
In a paper published today in PLOS ONE, Christopher Rodriguez, a graduate student in behavioral decision research, and his adviser, Daniel Oppenheimer, a professor of social and decision sciences, said that people often focus too narrowly on solutions to the problem at hand and overlook foreseeable negative outcomes.
Solutions can cause more problems.
But they don’t have to, Oppenheimer said. Recognizing consequence neglect could change decision-making at all levels, from everyday choices to national policy.
“When people are making policy, it is not natural to think of the consequences because you're focused on the problem that you're trying to solve, not the other things that are going on,” he explained. “This research matters because it'll help us develop better policies with fewer side effects or negative consequences.”
Think twice or pay the price
The researchers said consequence neglect can happen to anyone — team leaders, policymakers, teachers, parents, students and business owners, to name a few. They wanted to see what happened when people were asked to take time to specifically think about repercussions.
Participants in their study first rated the effectiveness of proposed solutions to six everyday problems (e.g., curbing excessive college drinking) using a Likert scale. Next, they generated their own solution to a seventh problem, office supply theft, and rated its effectiveness.
Finally, participants did a “consequence generation task,” listing two positive and two negative consequences for each policy, including their own, and re-evaluated all seven solutions.
That changed things, Rodriguez said.
“Having participants explicitly sit down and think of those consequences, regardless of whether they were the creators of the policy, really made them think differently. People are capable of thinking of the consequences, but without being prompted to, oftentimes, they simply never try and those consequences go neglected,” he said.
For example, when participants thought of potential negative consequences of a policy that completely banned alcohol on campus, they realized that more students might drive off campus to drink, putting them at additional risk.
How can a surprise be predictable?
The researchers have identified consequence neglect as one driver of predictable surprises: events that catch people off guard even though they could have been anticipated.
“There are some psychological mechanisms at play with predictable surprises, like climate change,” Rodriguez said. “Our research shows that consequence neglect is one reason why they happen.”
Oppenheimer said there are actions policymakers and team leaders can take every day to prevent unintended results, like taking a few minutes to do the consequence generation task. Even something like hiring an employee who likes to play the devil’s advocate can be beneficial.
“It often isn’t very hard to think of preventable consequences of your actions, but most of the time we aren’t in the habit of doing so, because that’s not what we’re trying to do — we’re too focused on the problem we’re trying to solve to think about what the side effects of the solution will be. But a devil’s advocate’s job is to poke holes in the plan, and if that’s what we’re trying to do, we often have no trouble identifying undesirable outcomes that are easy to forestall.”
Everyone has tunnel vision sometimes
Foreseeable consequences might come to mind when watching the news, Oppenheimer and Rodriguez said. They pointed to Brexit, discussions about privatizing the United States Postal Service and the withdrawal of funds from the United States Agency for International Development as examples.
But Oppenheimer said it’s important to remember that consequence neglect is a cognitive bias that affects everyone.
“What we are demonstrating is a human phenomenon, not a liberal phenomenon, not a conservative phenomenon. Either side will be able to find countless examples of how the other side screwed up by not considering the consequences,” he said. “For anyone, taking time to consider what will happen would lead to markedly better policy being made and implemented, and I think the world would be better for that.”