Wednesday, July 9, 2014

The facts and why we don't believe them

"Facts do not cease to exist because they are ignored." — Aldous Huxley

IT IS BOTH frightening and mind-boggling to me that, as a species, human beings could go from possessing the scientific, medical, engineering and mathematical knowledge that the Egyptians, Greeks, Romans, Arabs, Persians, Chinese and other cultures amassed over thousands of years to the ignorance, superstition and all-around mud-wallowing of the Dark Ages.

I don't doubt that we're capable of going there again if the separation of church and state isn't absolute, so as to preserve (some might say "arrive at") evidence-based science, education and public policy.

Unfortunately, facts don't seem to make much of a difference. Below is a piece written by Brendan Nyhan for The Upshot series in The New York Times.

And below that is a piece I shared with you back in August 2012 from The New York Times Magazine about how we can possibly influence those whose beliefs run counter to proven fact.

When Beliefs and Facts Collide

By Brendan Nyhan 

July 5, 2014

Do Americans understand the scientific consensus about issues like climate change and evolution?

At least for a substantial portion of the public, it seems like the answer is no. The Pew Research Center, for instance, found that 33 percent of the public believes “Humans and other living things have existed in their present form since the beginning of time” and 26 percent think there is not “solid evidence that the average temperature on Earth has been getting warmer over the past few decades.” Unsurprisingly, beliefs on both topics are divided along religious and partisan lines. For instance, 46 percent of Republicans said there is not solid evidence of global warming, compared with 11 percent of Democrats.

As a result of surveys like these, scientists and advocates have concluded that many people are not aware of the evidence on these issues and need to be provided with correct information. That’s the impulse behind efforts like the campaign to publicize the fact that 97 percent of climate scientists believe human activities are causing global warming.

In a new study, a Yale Law School professor, Dan Kahan, finds that the divide over belief in evolution between more and less religious people is wider among people who otherwise show familiarity with math and science, which suggests that the problem isn’t a lack of information. When he instead tested whether respondents knew the theory of evolution, omitting mention of belief, there was virtually no difference between more and less religious people with high scientific familiarity. In other words, religious people knew the science; they just weren’t willing to say that they believed in it.

Mr. Kahan’s study suggests that more people know what scientists think about high-profile scientific controversies than polls suggest; they just aren’t willing to endorse the consensus when it contradicts their political or religious views. This finding helps us understand why my colleagues and I have found that factual and scientific evidence is often ineffective at reducing misperceptions and can even backfire on issues like weapons of mass destruction, health care reform and vaccines. With science as with politics, identity often trumps the facts.

So what should we do? One implication of Mr. Kahan’s study and other research in this field is that we need to try to break the association between identity and factual beliefs on high-profile issues – for instance, by making clear that you can believe in human-induced climate change and still be a conservative Republican like former Representative Bob Inglis or an evangelical Christian like the climate scientist Katharine Hayhoe.

But we also need to reduce the incentives for elites to spread misinformation to their followers in the first place. Once people’s cultural and political views get tied up in their factual beliefs, it’s very difficult to undo regardless of the messaging that is used.

It may be possible for institutions to help people set aside their political identities and engage with science more dispassionately under certain circumstances, especially at the local level. Mr. Kahan points, for instance, to the relatively inclusive and constructive deliberations that were conducted among citizens in Southeast Florida about responding to climate change. However, this experience may be hard to replicate – on the Outer Banks of North Carolina, another threatened coastal area, the debate over projected sea level rises has already become highly polarized.

The deeper problem is that citizens participate in public life precisely because they believe the issues at stake relate to their values and ideals, especially when political parties and other identity-based groups get involved – an outcome that is inevitable on high-profile issues. Those groups can help to mobilize the public and represent their interests, but they also help to produce the factual divisions that are one of the most toxic byproducts of our polarized era. Unfortunately, knowing what scientists think is ultimately no substitute for actually believing it.

How to Move a Mind

By Maggie Koerth-Baker

August 19, 2012

Forget for a minute everything you know about politics. Barack Obama now openly supports gay marriage. Mitt Romney now opposes roughly the same kind of health care reform he fought for as governor of Massachusetts. What if they weren’t two politicians calculating how to win an election but instead just two guys who changed their minds? They didn’t “flip-flop”; they experienced, as social scientists say, an attitude change, the way any of us do when we become a vegetarian or befriend a neighbor we used to hate or even just choose to buy a new brand of toothpaste.

In the last decade, psychologists have focused increasing attention on moral attitudes. Jonathan Haidt, professor of psychology at the Stern School of Business at New York University and author of “The Righteous Mind,” told me that researchers have been especially interested in the way emotions and attitudes interact. Moral attitudes are especially difficult to change, Haidt said, because the emotions attached to those preferences largely define who we are. “Certain beliefs are so important for a society or group that they become part of how you prove your identity,” he said. “It’s as though we circle around these ideas. It’s how we become one.”

We tend to side with people who share our identity — even when the facts disagree — and calling someone a flip-flopper is a way of calling them morally suspect, as if those who change their minds are in some way being unfaithful to their group. This is nonsense, of course. People change their minds all the time, even about very important matters. It’s just hard to do when the stakes are high. That’s why marshaling data and making rational arguments won’t work. Whether you’re changing your own mind or someone else’s, the key is emotional, persuasive storytelling.

In 2006, researchers from Ohio State University and Colorado State University demonstrated that a well-written TV drama can change the political opinions of college students. They split 178 students into two groups. One watched a crime show that told a persuasive story about the value of the death penalty. The other group watched a different, unrelated drama. Afterward, both groups were interviewed about their personal beliefs and their opinions on the death penalty. The students who watched the crime show were more likely to support the death penalty. In fact, support for the death penalty was about the same whether those students self-identified as liberal or conservative. That wasn’t true among the students who watched the other show. There, political ideology strongly predicted their opinions on the death penalty.

Timothy Wilson is a psychology professor at the University of Virginia and the author of the book “Redirect,” about how we change our minds and behavior. Stories are more powerful than data, Wilson says, because they allow individuals to identify emotionally with ideas and people they might otherwise see as “outsiders.” Wilson says researchers speculate that children who grew up seeing friendly gay people on TV will be more likely to support gay marriage as adults, regardless of other political affiliations and religious beliefs. Once you care about a character, Wilson says, you can find a way to fit them into your identity.

Our identities, of course, are also stories we tell ourselves about ourselves. In some cases — if we want to think of ourselves as thoughtful and open-minded — we can adopt identities that actually encourage flip-flopping. This is why juries function, and it’s what places pressure on scientists to form opinions based on reliable data. In 2009, the Oregon Legislature mandated the creation of the Oregon Citizens’ Initiative Review, panels made up of random residents assigned to review and assess ballot initiatives in “citizens’ statements.” The panelists know they’re expected to base their opinions on hard evidence, and this expectation becomes part of their temporary identity.

Under those conditions, says John Gastil, professor of communication arts and sciences at Penn State, facts suddenly matter. He points to Measure 73, a widely popular mandatory sentencing initiative, which the citizens’ panel voted against, 21 to 3. The panelists felt obligated to consider the measure more carefully than they otherwise would have, Gastil says, so they noted the high costs and thought about people who might be unfairly punished. Only a minority of voters knew the panel existed, so the measure still passed — though by a smaller margin than expected. In a study he performed on the public response to Measure 73, Gastil found that the panel’s opinion significantly changed the minds of those people who read its findings. “You got a shift from two-thirds in favor to two-thirds against just by reading the report,” Gastil says.

Simply having to articulate why you believe what you do can also end up changing your attitude. Timothy Wilson and his colleagues showed posters to people and asked some of them to explain why they liked or disliked the images. Then they allowed every participant to take one poster home. The people who had to explain their preferences chose the poster they most easily justified liking. But when they were called a week later, those same people were least satisfied with their decision. “They talked themselves into choosing something they really didn’t like that much,” Wilson says, noting that if you have to explain your preferences, you’re likely to adopt an attitude that makes sense to your interlocutor, even if it conflicts with your emotions.

Even when we do change our minds, we often convince ourselves that we haven’t. Wilson points to the work of Michael Ross, professor emeritus of social psychology at the University of Waterloo in Canada. Since the 1970s, Ross has been studying autobiographies and has found that authors largely distort their pasts, depending on what point of their story they want to emphasize. The end result, it always turns out, was where they were heading all along.

All of this can help explain how a couple of politicians might change their minds but feel they haven’t actually changed their beliefs. Obama has said conversations with his daughters about friends of theirs with gay parents affected his thinking. Romney has had to spend a lot of time explaining his beliefs and past decisions to groups of very conservative voters, who weren’t inclined to accept him as part of their tribe, and perhaps that process has genuinely led him to question his thinking on health care. Of course, both men could just be pandering for votes.

We change our minds for utilitarian reasons, too. Especially pragmatic thinkers will consider whether an idea is feasible before they adopt it, says Michael Slater, professor of communication and behavioral sciences at Ohio State University.

But even in Washington, understanding the power of stories could go a long way toward bridging gaps that only get bigger when we expect those who disagree to rationally accept data and evidence. “We fight it out by throwing arguments at each other and are upset when they have no effect,” Haidt says. “It makes us accuse our opponents of bad faith and ulterior motives. But the truth is that our minds just aren’t set up to be changed by mere evidence and argument presented by a ‘stranger.’ ”
