A team of Facebook data analysts has recently published the results of their research on users’ political affiliation and the alleged filter bubble the social media giant is creating through its algorithms.
The study, "Exposure to ideologically diverse news and opinion," was published in the journal Science after close peer review by scientists not affiliated with Facebook.
The research was conducted on a pool of 10.1 million users who had declared a political affiliation on their profiles. To be included in the study, users had to have been active on the site over a six-month period in 2014, logging in at least four out of seven days a week, and to have clicked on at least one political news link during the last six months of 2014.
The results of the study, although contested by some, bear great significance for how social media shapes preferences and the political debate.
Firstly, the idea behind the study was to dispel the accusation that Facebook creates a filter bubble through the way its algorithms work. It became clear that while the filters do affect what users see, individual choice is just as important. A user's network of friends contains, on average, 23 percent of people with opposing political views. This translates into a 29 percent chance that what users see in Facebook's news feed conflicts with their own affiliation.
Eytan Bakshy, one of the data scientists on Facebook's team, commented:
“You would think that if there was an echo chamber, you would not be exposed to any conflicting information, but that’s not the case here”.
So, the flattening and polarizing of the political debate is not happening. That should ease the minds of political science scholars, who have long debated the effects of social media on democracy and a balanced political debate. The greatest fear expressed was that of the creation of an echo chamber, which is synonymous with the polarization of political debate, leaving no room for interaction or clashes with opposing views.
According to the study, that is not the case. In the pool of 10.1 million users observed by Facebook's data science team, liberals showed more reluctance toward conservative friends and posts, while conservatives appeared more open to exposure. Of the stories featured in the news feeds of the liberal camp, 22 percent were conservative; on the other hand, 33 percent of the news presented to conservative users was of a liberal bent. The likelihood that liberals would click on a story opposing their views was calculated at 6 percent, while for conservatives the chance of clicking on ideologically challenging news was calculated at 17 percent.
Therefore, the filter bubble is real: what you like, click, or comment on Facebook creates a bias toward the subjects each user is most attracted to. At the same time, those are not the only factors to take into account. If you feel your news feed is an echo chamber, find more friends, as their preferences and declared interests, particularly political affiliation, raise the chance of exposure to a larger variety of news.
Image Source: /australianfacebooklikes