“The DNA is a Match”: Confirmation Bias

Podcast

We have access to more information than ever before, but are we making better decisions? Learn how a pervasive bias skews how we interpret the data in front of us.

Transcript

NARRATOR: What would you think if you watched a forensic expert get on the witness stand and say 5 little words:

CLIP: “The DNA is a match”?

NARRATOR: The first thought that went through my mind? Got ‘em.

DNA is the gold standard of forensic evidence; indisputable, right? I mean, we’re talking about the stuff we send out to the lab to find out if we’re 0.2% Italian (or in my neighbor’s case: 2% Neanderthal)! But in 2011, neuroscientist Itiel Dror and biologist Greg Hampikian teamed up to show that even DNA analysis can be vulnerable to our blind spots.

Today, we’re talking about one of the strongest biases out there: confirmation bias. Welcome to Outsmarting Implicit Bias.

Imagine this: you’re a forensic analyst on a criminal case.

CLIP: “Fingerprint check on Panacek.”

NARRATOR: The prosecutors tell you they have testimony from one of the suspects…

CLIP: “I will give you immunity if you testify.”

NARRATOR: …and he’s revealed everyone else who was involved in the crime. The only snag? Unless you can place at least one of the other people at the scene, the judge is probably throwing this testimony out.

So: those are the stakes. You have the DNA mixture from the crime (that means it includes more than one person’s DNA). And you have DNA from the prime suspects. So. Is there a match?

These are details from a real criminal case. And the real analysis team… did find a match! Suspect 3 was there. “Got ‘em”, right?

Not exactly. Years later, Dror and Hampikian sent the same DNA to 17 new experts: Did they see Suspect 3’s DNA in the mixture?

16 out of 17 experts said no.

This is the kind of situation where I think it’s fair to ask “What the heck happened?”

16 “nos” seems like the opposite of a “gotcha” moment. And the original analysis team was qualified and competent — how did they see something so different?

The answer: confirmation bias. Our tendency to search for or interpret new information in a way that supports what we already believe.

You see, the one difference between the original analysts and the new ones was what they thought was true. The original team knew that there was incriminating testimony; they knew that without a DNA match, the accused could go free. The new analysts didn’t. So they were looking at the data with untainted eyes.

And while DNA analysis is basically error-proof when it comes to single-source samples, when it comes to DNA mixtures, there’s a lot more ambiguity. Is that really an allele, or just a blip in the data? Is there really enough information to draw a conclusion, or is the sample too degraded or complex to even try?

CLIP: “The blood and DNA tests came back inconclusive.”

NARRATOR: These are the kinds of decision points where bias can come in – even in honest people who are trying to be accurate!
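To make those decision points concrete, here’s a minimal Python sketch (the peak heights and thresholds are invented for illustration; this is not a real forensic protocol). A peak far above or below a threshold is an easy call, but a peak in the gray zone between them is exactly where an expectation about “what should be there” can tip the interpretation.

```python
# Toy illustration only: NOT a real forensic protocol. The peak
# heights and thresholds are invented for this example.
# In mixture interpretation, the analyst decides whether a small peak
# in the signal is a true allele or just noise.

ANALYTICAL_THRESHOLD = 50    # below this, treat the peak as noise
CLEAR_CALL_THRESHOLD = 150   # above this, confidently call an allele

def interpret_peak(height_rfu: int) -> str:
    """Classify a peak height (in relative fluorescence units)."""
    if height_rfu < ANALYTICAL_THRESHOLD:
        return "noise"
    if height_rfu < CLEAR_CALL_THRESHOLD:
        return "ambiguous"   # the gray zone where judgment enters
    return "allele"

# Hypothetical peaks at one locus of a mixed sample:
for name, height in {"peak A": 850, "peak B": 310, "peak C": 62}.items():
    print(f"{name}: {height} RFU -> {interpret_peak(height)}")
```

On these made-up numbers, peaks A and B are clear alleles, while peak C lands in the ambiguous zone: two honest analysts could reasonably call it either way, and that’s where context starts to matter.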

Confirmation bias can affect all of us… and what makes it so hard to outsmart is that we think we’re finding data that proves us right! For instance: let’s say you just know that a colleague didn’t deserve that promotion. You see him make a little mistake and aha! Proof, right? …even if it’s a slip you would have forgiven in anyone else.

What’s more, researchers at Stony Brook University found that when our opinions are challenged – for instance, if you have to read both sides of a debate about gun control or affirmative action – our original beliefs only get stronger after we hear the opposite side. This is bizarre!

And it happens because we pick and choose. We focus on the strengths of the arguments we like, and the flaws of the arguments we don’t. And that leaves us feeling even more firm in our convictions.

There are so many ways the confirmation bias can affect us: Maybe you want to invest in a new business venture, and that’s making you downplay the risks. Maybe you’re completely smitten by a new crush, so you miss seeing all the red flags. All this tells us that if we want to outsmart our minds, we have to learn how to switch from confirming an idea to putting it to the test.

Here are a few tips:

The desire to confirm can give us tunnel vision, but hindsight is 20/20. So get used to conducting premortems. That is: imagine that the unthinkable HAS happened.

CLIP: “Breaking headlines…”

NARRATOR: The company did go under.

CLIP: “…filing for bankruptcy…”

NARRATOR: The relationship did implode.

CLIP: “…the two reportedly broke up…”

NARRATOR: What killed it? What signs did you miss? Working backwards like this pushes you to find the clues that fell into your blind spot.

Another solution for the hypochondriacs among us: rethink what you’re typing into that search bar. “Do I have cancer?” is only going to confirm your worst fears, not give you the most likely answers. Instead, plug in your symptoms or, better yet, consult a professional.

Finally, reframe your goals. Don’t make half-hearted pro-con lists that just tell you what you want to hear. Whether it’s a personal decision or an analysis for a Supreme Court case, make sure everyone knows “success” means making the best decision, and not just the one you think is right.

Outsmarting Implicit Bias is a project founded by Mahzarin Banaji, devoted to improving decision-making using insights from psychological science. The team includes Olivia Kang, Evan Younger, Kirsten Morehouse, and Mahzarin Banaji. Research Assistants include Moshe Poliak and Megan Burns. Music by Miracles of Modern Science. Support comes from Harvard University, PwC, and Johnson & Johnson. For references and related materials, visit outsmartinghumanminds.org.


Dive deeper

Extra materials if you want to learn more

Links

Have you ever taken a personality test such as the Myers-Briggs Type Indicator? You answer a list of questions, and in the end you get four letters that seem to describe you to a “T”. Here’s the problem: those letters apply to lots of people. Like weekly astrological predictions, these general descriptions feel as though they confirm our own personality and experiences, and we don’t realize they would seem equally real to everybody. And as we keep reading, we keep confirming: ignoring the imprecisions and delighting in the accuracies. Personality tests are fine as a fun activity, but we can run into issues when we use them as hiring and promotion tools. Read more at the New York Times.

The sequence 2, 4, 8 follows a rule. Can you figure out what it is? Try to outsmart the confirmation bias with this demo from the New York Times.
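If you’re curious why that demo is so hard to beat, here’s a small Python sketch of its underlying logic (the hidden rule below is the one from Wason’s classic task, and I’m assuming the NYT version works the same way). Guesses chosen to confirm a hypothesis all come back “yes”, so they teach us nothing; only a test designed to fail under our hypothesis actually reveals the rule.

```python
def hidden_rule(seq):
    """The actual rule: each number is strictly larger than the one before."""
    return all(a < b for a, b in zip(seq, seq[1:]))

# Confirmation-style testing: every guess fits our "each number doubles"
# hypothesis, so every answer comes back True and we learn nothing new.
for guess in [(3, 6, 12), (10, 20, 40), (1, 2, 4)]:
    print(guess, "->", hidden_rule(guess))       # True, True, True

# Only tests that would FAIL under our hypothesis are informative:
print((1, 2, 3), "->", hidden_rule((1, 2, 3)))   # True: doubling was never the rule
print((8, 4, 2), "->", hidden_rule((8, 4, 2)))   # False: ascending order is what matters
```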

“We all have a few people in our social media networks who share ridiculous things [… b]ut most of us have also encountered well-informed, sane people who share articles that are blatantly incorrect propaganda. Why does this happen?” Read more in Jeff Stibel’s “Fake news: How our brains lead us into echo chambers that promote racism and sexism” (USA Today).

Warren Buffett is one of the most successful investors in history. His secret? He “acknowledges that even his decisions could be swayed by [confirmation bias] – an important first step – and then gives voice to opinions that contradict his own.” Read more about how to think (and invest) like Buffett from Forbes.

“It’s easy to assume that presenting factual information will automatically change people’s minds, but messages can have complex, frustrating persuasive effects […] It turned out that climate change skeptics – whether politically conservative or liberal – showed more resistance to the stories that mentioned climate change. Climate change themes also made skeptics more likely to downplay the severity of the disasters. At the same time, the same articles made people who accept climate change perceive the hazards as more severe.” Read more from Professor Ryan Weber’s “Extreme weather news may not change climate change skeptics’ minds” at The Conversation.

References

Dror, I. E., & Hampikian, G. (2011). Subjectivity and bias in forensic DNA mixture interpretation. Science & Justice, 51(4), 204-208.

Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129-140.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220.

Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50(3), 755-769.

Bodell, L. (2016). Kill the Company: End the Status Quo, Start an Innovation Revolution. Routledge.

Credits

“The DNA is a Match”: Confirmation Bias was created and developed by Olivia Kang, Kirsten Morehouse, Evan Younger, and Mahzarin Banaji. Research and Development Assistants for this episode include Moshe Poliak and Megan Burns. Outsmarting Implicit Bias is supported by Harvard University, PwC, and Johnson & Johnson.

Narration by Olivia Kang

Sound Editing & Mixing by Evan Younger

Music by Miracles of Modern Science

Artwork by Olivia Kang and Megan Burns