Facebook data scientists have published a study in the journal Science finding that the social network doesn't completely isolate its users from opposing political viewpoints.
In an effort to explore how people consume news shared by friends of different ideological leanings, Facebook's researchers pored over millions of URLs shared by its U.S.-based users who identify themselves in their profiles as politically liberal or conservative. The work, which sheds more light on how we glean information from our ever-growing, technologically enhanced tangles of social connections, was published in a paper in Science.
Eytan Bakshy, a research scientist on Facebook’s data science team and coauthor of the paper, says the group found that Facebook’s News Feed algorithm only slightly decreases users’ exposure to news shared by those with opposing viewpoints.
The work comes more than three years after Bakshy and other researchers concluded that while you’re more likely to look at and share information with your closest connections, most of the information you get on Facebook stems from the web of people you’re weakly connected to—refuting the idea that online social networks create “filter bubbles” limiting what we see to what we want to see (see “What Facebook Knows”).
However, Bakshy says, the previous research, published in 2012, didn’t directly measure the extent to which you’re exposed to information from people whose ideological viewpoints are opposite from yours.
In an effort to sort that out, researchers looked at anonymized data for 10.1 million Facebook users who define themselves as liberal or conservative, and seven million URLs for news stories shared on Facebook from July 7, 2014, to January 7, 2015. After using software to identify URLs pointing to "hard" news stories (pieces focused on topics like national news and politics) that were shared by at least 20 users with a listed political affiliation, the researchers labeled each story as liberal, neutral, or conservative, based on the average political leaning of those who shared it.
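The labeling step described above can be sketched roughly as follows. This is a minimal illustration, not the paper's actual pipeline: the leaning scale (-1 for liberal, +1 for conservative), the neutral band, and the function names are all assumptions; only the 20-sharer threshold and the average-of-sharers idea come from the article.

```python
from collections import defaultdict

def label_stories(shares, user_leaning, min_sharers=20):
    """Label each URL by the average leaning of its politically identified sharers.

    shares: list of (url, user_id) pairs
    user_leaning: dict mapping user_id -> leaning score in [-1, 1]
                  (-1 = liberal, +1 = conservative; an assumed scale)
    """
    sharers = defaultdict(list)
    for url, user in shares:
        if user in user_leaning:  # only count users with a listed affiliation
            sharers[url].append(user_leaning[user])

    labels = {}
    for url, scores in sharers.items():
        if len(scores) < min_sharers:
            continue  # too few politically identified sharers to label reliably
        avg = sum(scores) / len(scores)
        if avg < -0.1:            # assumed cutoff for the neutral band
            labels[url] = "liberal"
        elif avg > 0.1:
            labels[url] = "conservative"
        else:
            labels[url] = "neutral"
    return labels
```

A story shared almost exclusively by self-identified liberals would average close to -1 and be labeled liberal, while a story with too few politically identified sharers is simply left unlabeled.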
Researchers found that 24 percent of the "hard" stories shared by liberal Facebook users' friends were conservatively aligned, while 35 percent of the "hard" stories shared by conservative Facebook users' friends were liberally aligned, an average of 29.5 percent exposure, overall, to content from the other side of the political spectrum.
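As a quick arithmetic check, the 29.5 percent figure is the unweighted mean of the two groups' exposure rates (assuming, as the wording suggests, a simple average rather than one weighted by group size):

```python
liberal_exposure = 24       # percent of liberals' friend-shared hard news that leans conservative
conservative_exposure = 35  # percent of conservatives' friend-shared hard news that leans liberal

overall = (liberal_exposure + conservative_exposure) / 2
print(overall)  # 29.5
```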
The researchers also looked at the impact of Facebook’s News Feed ranking algorithm on the kind of news you see. Bakshy says that overall, the algorithm reduces users’ exposure to content from friends who have opposing viewpoints by less than 1 percentage point—from 29.5 percent to 28.9 percent.
And when it came down to what users ended up actually reading, researchers report that conservatives were 17 percent less likely to click on liberally aligned articles than other “hard” stories in their news feeds, while liberals were 6 percent less likely to click on conservatively aligned articles presented to them.