Minneapolis
A Note on What People Think They Know
Obviously, what we are seeing in Minneapolis includes tragedy and horror, and what I am about to explore is not, by any means, the most important thing about it. Not close. Still, it is an important thing.
We are going to say a few things about confirmation bias, motivated reasoning, asymmetrical updating, and echo chambers, and also about what to do about them. (In terms of current events: Things are changing rapidly, of course, so we will try to stick to general principles.)
1
What do people know? If you see a football game, are the referees biased? Are they favoring the New England Patriots, or are they biased against them?
(My own view: They are biased against them. Poor, heroic Patriots, having to overcome biased referees! But then again, I am a Patriots fan. Note: This is just for purposes of illustration. I don’t really think the referees are biased against the Patriots. At least not always.)
Suppose you have a view about something - say, the minimum wage, ICE, Ukraine, climate change, or social media use. Now suppose you receive some information that fortifies your belief. How will you update your views, having received that information?
Alternatively, suppose you receive some information that undermines your belief. How will you update your views, having received that information?
One possibility is that you will believe information that supports your view, and dismiss information that undermines your view. This is confirmation bias. Confirmation bias might in turn be a product of motivated reasoning. People believe what they want to believe. They are motivated to have their views confirmed. They like it when their views are confirmed.
(Don’t you? Most people do. Parenthetical note: I was lucky enough to work for many years with Danny Kahneman, an expert on this topic. He liked it when his views were disconfirmed. He found that energizing.)
2
A few years ago, I participated in a research project on how people update their beliefs when they receive new information about climate change. (The paper can be found here: https://scholarship.law.cornell.edu/cgi/viewcontent.cgi?article=4736&context=clr)
The central findings are consistent with confirmation bias. In brief:
People who are not so worried about climate change updated their views when they received information suggesting that the problem is less serious than they thought - but did not update at all when they received information suggesting that the problem is more serious than they thought.
By contrast, people who are keenly worried about climate change updated a lot when they received information suggesting that the problem is more serious than they thought - but they did not update much when they received information suggesting that the problem is less serious than they thought.
From the abstract:
People are frequently exposed to competing evidence about climate change. We examined how new information alters people’s beliefs. We find that people who are not sure that man-made climate change is occurring, and who do not favor an international agreement to reduce greenhouse gas emissions, show a form of asymmetrical updating: They change their beliefs in response to unexpected good news (suggesting that average temperature rise is likely to be less than previously thought) and fail to change their beliefs in response to unexpected bad news (suggesting that average temperature rise is likely to be greater than previously thought). By contrast, people who strongly believe that man-made climate change is occurring, and who favor an international agreement, show the opposite asymmetry: They change their beliefs far more in response to unexpected bad news (suggesting that average temperature rise is likely to be greater than previously thought) than in response to unexpected good news (suggesting that average temperature rise is likely to be smaller than previously thought).
Ok then.
3
Something like the climate change experiment can be found in the real world every day. People receive information. They might live in an echo chamber, in which case they will update on the basis of what they learn there, which will tend to confirm their beliefs. Or they might be exposed to diverse inputs, and they might give weight to whatever confirms their initial beliefs, and dismiss what contradicts their initial beliefs.
4
I have been emphasizing confirmation bias and motivated reasoning, but there are two important supplemental points.
First: You might learn something that does not confirm your beliefs, but that nonetheless makes you happy. You might learn, for example, that you are more competent at a task than you had believed; that information is disconfirming, but it is also pleasing. What then? We have some evidence that people will be inclined to credit pleasing information, even if it is disconfirming. (https://pmc.ncbi.nlm.nih.gov/articles/PMC5536309/)
Second: Asymmetrical updating can occur without confirmation bias or motivated reasoning. People might be good Bayesians, rationally updating given their priors. I think that dropped objects fall; I will credit information that is consistent with that firm belief and dismiss (apparent) information that goes the other way. I think that Labrador Retrievers are great; I will credit information that is consistent with that very firm belief and dismiss (apparent) information that goes the other way.
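To see how that can be rational, here is a minimal sketch in Python. (This is my illustration, not anything from our paper, and all of the numbers are made up.) The agent applies Bayes' rule to two things at once: the hypothesis H and the reliability of whoever is reporting on H. The idea that the agent is also uncertain about the source is an assumption added for the illustration.

```python
def update(prior_h, prior_reliable, report_supports_h,
           p_match_if_reliable=0.9, p_match_if_unreliable=0.5):
    """Posterior P(H) and posterior P(source is reliable) after one report."""

    def likelihood(h_true, reliable):
        # Assumption: a reliable source matches the truth 90% of the time;
        # an unreliable source is a coin flip.
        p_match = p_match_if_reliable if reliable else p_match_if_unreliable
        return p_match if report_supports_h == h_true else 1 - p_match

    # Joint probability of each (H, reliability) combination and the report.
    joint = {}
    for h in (True, False):
        for r in (True, False):
            p_h = prior_h if h else 1 - prior_h
            p_r = prior_reliable if r else 1 - prior_reliable
            joint[(h, r)] = p_h * p_r * likelihood(h, r)

    total = sum(joint.values())
    post_h = (joint[(True, True)] + joint[(True, False)]) / total
    post_reliable = (joint[(True, True)] + joint[(False, True)]) / total
    return post_h, post_reliable

# One report contradicting H, received by an agnostic and by a firm believer.
for prior_h in (0.5, 0.999):
    post_h, post_rel = update(prior_h, prior_reliable=0.7,
                              report_supports_h=False)
    print(f"prior P(H)={prior_h:.3f} -> posterior P(H)={post_h:.3f}, "
          f"P(source reliable)={post_rel:.2f}")
```

Run it and the agnostic drops from 0.50 to about 0.22 after a single contradicting report, while the firm believer moves only from 0.999 to about 0.996 - and concludes, reasonably, that the source is probably unreliable (0.70 falls to about 0.32). Same rule, different priors.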
The findings in our climate study are consistent with motivated reasoning, but the asymmetrical updaters, in our experiment, might also be weighting new information in a way that reflects their priors.
5
Consider some current issues in this light. Was the 2020 election stolen? What exactly has been happening in Minneapolis? Are tariffs a good idea? Should the United States leave the Paris climate agreement?
We are seeing a lot of asymmetrical updating on those questions, and asymmetrical updating helps promote polarization. That was especially visible in the immediate aftermath of the recent horrors in Minneapolis.
As I write, initial polarization seems to be diminishing, at least with respect to what has happened in Minneapolis. There seems to be mounting agreement (not close to complete, to be sure; but mounting). And indeed, there are some general points about how to combat asymmetrical updating:
(1) Purely factual corrections from trustworthy sources can make all the difference. Note the importance of the source. (https://link.springer.com/article/10.1007/s11109-018-9443-y) The source can be known to be objective - or known to share the priors and the commitments of those who are doubtful.
(2) Confidence can help. People often use a “confidence heuristic”; they trust those who are clear about what’s right. (https://psycnet.apa.org/fulltext/2018-41338-001.html) Note that this is subject to (1) above.
(3) “Surprising validators” matter. (See https://www.nber.org/papers/w18975) If, for example, someone known to be enthusiastic about deregulation says that climate change is a serious problem, people who are inclined not to worry about climate change might update a fair bit.
(4) Identity matters. People might define themselves in certain terms, and information from those who share that identity will carry more weight than information from those who do not.
With respect to recent events in Minneapolis, (1), (2), (3), and (4) are having an impact. There are broader lessons here.

