Fine-Tune Your B.S. Detector: You’ll Need It
Do you have a good B.S. detector? You need one in our digital age.
The skill of spotting false information—rubbish, nonsense and, yes, fake news—is so important these days that scientists have begun serious research on it. They’re attempting to quantify when and why people spread it, who is susceptible to it, and how people can confront it.
This month in Atlanta, at the annual conference of the Society for Personality and Social Psychology, a group of psychologists and other scientists presented a symposium on their research. The title? “Bullshitting: Empirical and Experiential Examinations of a Pervasive Social Behavior.”
B.S. is a form of persuasion that aims to impress the listener while showing a blatant disregard for the truth, the researchers explained. It can involve language, statistics and charts, and it appears everywhere from politics to science. This definition closely follows the one presented by the philosopher and Princeton emeritus professor Harry Frankfurt in his now-classic 2005 book “On Bullshit.” Dr. Frankfurt explored how B.S. differs from lying: liars know the truth and push it aside, while B.S.ers don’t necessarily care about the truth at all.
Of course this isn’t new. But false information moves faster and farther these days, thanks to social media. A new study by researchers at the Massachusetts Institute of Technology, published earlier this month in the journal Science, analyzed the spread of 126,000 rumors tweeted by 3 million people over more than 10 years and found that false news spreads faster than the truth. “We have reached epidemic levels of information pollution, and we need to do something about it,” says Jevin West, a professor of information science at the University of Washington. Dr. West co-created a class launched last year at the university, “Calling Bullshit,” that teaches students how to spot and refute arguments built on manipulated data, such as misleading statistics and charts.
More than 60 schools have requested permission to use the materials to set up classes of their own, Dr. West says.
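To see what that kind of refutation involves, consider one of the simplest ways a chart can mislead: truncating the vertical axis so a tiny difference looks dramatic. The short Python sketch below plots the same two numbers twice, once with the axis starting at zero and once truncated; the values and labels are invented for illustration and are not taken from the course materials.

```python
# Two nearly identical values, plotted honestly and with a truncated y-axis.
# The numbers are made up for illustration.
import matplotlib.pyplot as plt

labels = ["Product A", "Product B"]
values = [50.2, 50.9]  # a difference of roughly 1%

fig, (honest, misleading) = plt.subplots(1, 2, figsize=(8, 3))

# Honest version: axis starts at zero, so the bars look nearly equal.
honest.bar(labels, values)
honest.set_ylim(0, 60)
honest.set_title("Axis starts at zero")

# Misleading version: axis starts at 50, so the same gap looks enormous.
misleading.bar(labels, values)
misleading.set_ylim(50, 51)
misleading.set_title("Truncated axis")

plt.tight_layout()
plt.show()
```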
Some people spread false information unknowingly. But others simply don’t care if what they’re posting is untrue, Dr. West says, and pass along the information as a way to signal their views and values to their group. Philosophers call this tribal epistemology.
Website algorithms often favor salacious stories. (YouTube came under fire last month for the way its recommendation algorithm promotes conspiracy-theory videos aimed at viewers on both the left and the right.) And millions of bots—computer programs that can appear to be real people—also spread false information across the internet.
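The researchers don’t describe any platform’s actual code, but the basic dynamic is easy to sketch: if a feed ranks posts purely by predicted engagement, sensational items float to the top regardless of accuracy. The toy Python example below uses invented headlines and scores to illustrate the point; it is not a real recommendation system.

```python
# Toy feed ranking: sort stories by predicted engagement alone.
# Headlines and scores are invented for illustration; this is not
# any real platform's ranking code.
from dataclasses import dataclass

@dataclass
class Story:
    headline: str
    predicted_clicks: float  # stand-in for an engagement prediction
    accurate: bool

stories = [
    Story("City council approves routine budget", 0.02, True),
    Story("You won't believe what this celebrity did", 0.31, False),
    Story("The secret cure THEY don't want you to see", 0.44, False),
    Story("New study replicates earlier finding", 0.05, True),
]

# Ranking by engagement alone rewards the sensational items,
# whether or not they are accurate.
feed = sorted(stories, key=lambda s: s.predicted_clicks, reverse=True)
for s in feed:
    print(f"{s.predicted_clicks:.2f}  accurate={s.accurate}  {s.headline}")
```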
When do people typically use B.S.? Two studies published online together this month in the Journal of Experimental Social Psychology show that people tend to spread it when they feel obligated to have an opinion about something they know little about, and when they feel they aren’t going to be challenged on it. In one of the studies, students who were asked to write down their views on affirmative action, nuclear weapons and capital punishment admitted that half of what they wrote was B.S. But those who were told beforehand that they would have to defend these views afterward to a sociology professor with an opposing view stuck to the truth.
“If you expect no one to challenge you on your opinion, you can B.S. it up all you like,” says John Petrocelli, a social psychologist and associate professor of psychology at Wake Forest University in Winston-Salem, N.C. “I call this the Ease of Passing Bullshit Hypothesis.”
Dr. Petrocelli’s research also shows that B.S. can help strengthen a weak argument when the speaker owns up to it. “You get a bump in your persuasion when you frame your weak argument as B.S., when you say ‘I don’t really care what the research shows,’ ” he says. But the reverse holds as well: B.S. will weaken a strong argument.
When are we most susceptible to believing B.S.? When we’re tired, research shows. But we are also more prone to believing misinformation when it comes from someone who shares our views. In not-yet-published research, Dr. Petrocelli took sentences from The New-Age Bullshit Generator, a website that creates quotes that sound profound but mean nothing (“Hidden meaning transforms unparalleled abstract beauty,” for instance), and attributed half of them to conservative political figures and half to liberal ones. When he showed these to people, they were more likely to rate the sentences as profound when they thought they came from a like-minded politician. “Basically, if you agreed with their attitude, it was great stuff, but if you didn’t, it was propaganda,” he says.
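Generators of this kind work by slotting vague buzzwords into a grammatical template, so the result parses as a sentence without saying anything. Here is a minimal Python sketch of the idea; the word lists and template are invented for illustration and are not the actual site’s code.

```python
# Minimal pseudo-profound sentence generator.
# Vocabulary and template are invented; the real sites use their own lists.
import random

adjectives = ["hidden", "unparalleled", "infinite", "abstract", "timeless"]
nouns = ["meaning", "awareness", "potential", "beauty", "consciousness"]
verbs = ["transforms", "transcends", "gives rise to", "unfolds into"]

def pseudo_profound_sentence():
    """Slot random buzzwords into a fixed grammatical template."""
    return "{} {} {} {} {}.".format(
        random.choice(adjectives).capitalize(),
        random.choice(nouns),
        random.choice(verbs),
        random.choice(adjectives),
        random.choice(nouns),
    )

if __name__ == "__main__":
    for _ in range(3):
        print(pseudo_profound_sentence())
```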
When people hear a false claim repeated even just once, they are more likely to let it override their prior knowledge on the subject and believe it, according to two studies published together in October 2015 in the Journal of Experimental Psychology. Psychologists say this is an example of the “illusory truth effect,” which shows that repeated statements are thought to be more true than statements heard for the first time.
“We think it happens because it is easier to process information the second time you hear it,” says Lisa Fazio, a psychologist and assistant professor of psychology and human development at Vanderbilt University, who was the lead author on the study. “It is also time-consuming and difficult to access our previous knowledge, so we often go with information that is close enough.”
Finally, we fall for B.S. more often when we fail to think analytically. In four studies published together in November 2015 in the journal Judgment and Decision Making, psychologists researched people’s vulnerability to what they call “pseudo-profound bullshit.” Subjects completed tests to measure their ability to think analytically and then viewed quotes from people such as poet T.S. Eliot as well as made-up sentences generated by two websites, The Wisdom of Chopra and The New-Age Bullshit Generator. The research showed that people who were more skeptical and analytical were less likely to find the B.S. quotes profound yet still likely to find meaning in the real quotes. Follow-up research showed that when subjects were told ahead of time that some sentences were made up, they became more adept at spotting them, but analytical people still did this better.
“A lot of the problems with B.S. are because people are not bothering to think as much as they perhaps ought to,” says Gordon Pennycook, a postdoctoral fellow at Yale University who was the lead researcher on the study when he was a graduate student at the University of Waterloo.
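In studies of this kind, a participant’s receptivity to pseudo-profound B.S. is typically scored as the average profundity rating he or she gives the generated sentences, which can then be compared with the ratings given to genuine quotations. The sketch below shows that scoring with hypothetical ratings on a 1-to-5 scale; the numbers are invented, not data from the published studies.

```python
# Score one participant's receptivity to pseudo-profound B.S.:
# the mean profundity rating (1 = not at all profound, 5 = very profound)
# given to generated sentences, compared with the mean for real quotes.
# The ratings below are hypothetical, not data from the published studies.
from statistics import mean

ratings_generated = [4, 3, 5, 4, 2, 4]    # ratings of machine-generated sentences
ratings_real_quotes = [4, 5, 4, 3, 5, 4]  # ratings of genuine quotations

bs_receptivity = mean(ratings_generated)
real_profundity = mean(ratings_real_quotes)

print(f"B.S. receptivity score: {bs_receptivity:.2f}")
print(f"Mean rating of real quotes: {real_profundity:.2f}")
# A skeptical, analytical rater would show a low receptivity score
# while still rating the genuine quotes as meaningful.
```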
HOW CAN YOU SPOT B.S.?
Check the source. Is this person an expert or in a position to know the information? Why is he or she telling me? What does the person have to gain? “Sometimes it’s just a coolness factor,” says Jevin West, a professor of information science at the University of Washington, who teaches a class in how to spot B.S. in data.
If it sounds too good to be true, it probably is. Remember that we all suffer from confirmation bias: we’re more likely to believe something that confirms what we already think or want. “It’s hardest to spot B.S. we agree with,” says Gordon Pennycook, a postdoctoral fellow at Yale who studies B.S. “Question it if it supports your own beliefs.”
Ask questions. Research shows people are more likely to B.S. when they feel they can get away with it. “Ask them simply: ‘Why do you think that? How do you know that is true?’” says John Petrocelli, a social psychologist and associate professor of psychology at Wake Forest University in Winston-Salem, N.C., who studies B.S. “This will get them thinking critically.”
Don’t trust your gut. People who pause and think about whether information is true are better able to detect false information, research shows. “Rely on your prior knowledge,” says Lisa Fazio, an assistant professor of psychology and human development at Vanderbilt University.
Ask for evidence. This is different from an explanation, which people can continue to spin. Facts don’t lie, but check them to make sure they are real.
Pay attention to people who discount evidence. “I don’t care what the experts say” is a red flag that the person is using B.S.
Stay offline when you’re tired. Research shows we’re more vulnerable to false claims when our cognitive resources—that is, brain power—are depleted.