Post by ck4829 on Jan 24, 2017 13:04:08 GMT
Facebook was supposed to connect us to new ideas. Instead, sharing on the social platform may polarize us and lead to the dissemination of fake news.
Facebook users are creating and reinforcing confirmation biases and filter bubbles, according to a new study published in the Proceedings of the National Academy of Sciences. The team of researchers analyzed the spread of two types of content on Facebook: conspiracy theories and scientific information.
“The massive diffusion of sociotechnical systems and microblogging platforms on the World Wide Web creates a direct path from producers to consumers of content, i.e., allows disintermediation, and changes the way users become informed, debate, and form their opinions,” the researchers wrote in their abstract.
The researchers went on to note that this environment can foster confusion and encourage “speculation, rumors and mistrust.”
One of the researchers pointed out that "confirmation bias" is actually one of the main motivations for sharing content on the platform.
"Our analysis showed that two well-shaped, highly segregated, and mostly non-interacting communities exist around scientific and conspiracy-like topics," paper co-author Alessandro Bessi, a postdoctoral researcher with the Information Science Institute at the University of Southern California, told CNN. "Users show a tendency to search for, interpret, and recall information that confirm their pre-existing beliefs."
The study may shed some light on Facebook’s recent “fake news” problem. Since the election, Facebook has received harsh criticism for allowing fake news stories to circulate on the site, with some saying that fake stories playing to confirmation bias influenced the outcome of the U.S. presidential election. According to BuzzFeed, the top fake election news stories saw more engagement on Facebook than the top stories from major media outlets during the final three months of the campaign.
So what’s being done about the problem? Facebook is working to make it simpler for people to report misinformation. The Menlo Park-based company is partnering with fact-checking organizations, showing warnings on stories reported as fake, and vetting the stories that appear in its related articles section to make it more difficult for people to profit from fake news.
www.bizjournals.com/sanjose/news/2017/01/23/facebook-could-be-polarizing-us.html