A study commissioned by Facebook shows that people who use it tend to avoid seeing opinions that contradict their own. The study, “Exposure to ideologically diverse news and opinion on Facebook,” was published in Science.
[Read the entire report here.]
So what else is new?
Have you ever noticed that the posts in your Facebook News Feed seem to say similar things and return to the same subjects over and over again? This should not surprise you.
This is just like what happens with Google searches. You have probably noticed that when you type a word or part of a name into the search box, Google offers to fill in the rest, giving a couple of suggestions. The same thing happens when you type a web address into Google Chrome. Type just “Jewish busini” and Chrome will fill in the rest for you.
This is a convenience that most people appreciate, though some find it invasive. It happens because Google remembers what you have searched for in the past, and its algorithms take care of the rest.
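The idea behind this kind of suggestion feature can be illustrated with a minimal sketch: match what you have typed so far against your past queries and offer the most frequent matches first. This is a hypothetical toy, not Google's actual algorithm, and the function name and history data are invented for illustration.

```python
# Toy history-based autocomplete: suggest past queries that start with
# the typed prefix, most frequent first. (Illustrative only — not how
# Google or Chrome actually rank suggestions.)
from collections import Counter

def suggest(history, prefix, k=2):
    """Return up to k past queries beginning with `prefix`,
    ordered by how often they were searched."""
    counts = Counter(q.lower() for q in history)
    matches = [(q, n) for q, n in counts.items()
               if q.startswith(prefix.lower())]
    # Sort by descending frequency, then alphabetically to break ties.
    matches.sort(key=lambda item: (-item[1], item[0]))
    return [q for q, _ in matches[:k]]

history = ["jewish business news", "jewish business news", "weather today"]
print(suggest(history, "jewish busi"))  # ['jewish business news']
```

The same basic loop — remember past behavior, rank by it, surface the top matches — is what makes these features feel both convenient and, to some, invasive.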
The study states that “exposure to news and civic information is increasingly mediated by online social networks and personalization.” In other words, our use of online social networking may actually be limiting our exposure to new things, rather than expanding it.
“Information abundance provides individuals with an unprecedented number of options, shifting the function of curating content from newsroom editorial boards to individuals, their social networks, and manual or algorithmic information sorting. While these technologies have the potential to expose individuals to more diverse viewpoints, they also have the potential to limit exposure to attitude-challenging information,” says the study.
So the opposite of what we had hoped for is what is actually occurring.
The study examined how 10.1 million U.S. Facebook users interact with socially shared news. It also examined the extent to which heterogeneous friends could potentially expose individuals to cross-cutting content. It then quantified the extent to which individuals encounter comparatively more or less diverse content while interacting via Facebook’s algorithmically ranked News Feed, and further studied users’ choices to click through to ideologically discordant content. Compared to algorithmic ranking, individuals’ choices about what to consume had a stronger effect in limiting exposure to cross-cutting content.
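The measurement the study describes can be sketched as a toy calculation: at each stage of the pipeline (stories shared by friends, stories surfaced by the News Feed ranking, stories actually clicked), compute the share that is "cross-cutting" — ideologically discordant with the user. The numbers below are hypothetical and chosen only to illustrate the study's qualitative finding that individual choice narrows exposure more than the ranking algorithm does; they are not the study's data.

```python
# Toy illustration of the study's measurement idea (not its actual data
# or code): track the share of cross-cutting stories — stories whose
# ideology differs from the user's — at each stage of the pipeline.

def cross_cutting_share(user_ideology, stories):
    """Fraction of stories whose ideology differs from the user's."""
    if not stories:
        return 0.0
    discordant = sum(1 for s in stories if s["ideology"] != user_ideology)
    return discordant / len(stories)

# Hypothetical counts for one liberal user:
shared_by_friends = [{"ideology": "conservative"}] * 24 + [{"ideology": "liberal"}] * 76
after_ranking     = [{"ideology": "conservative"}] * 22 + [{"ideology": "liberal"}] * 78
clicked           = [{"ideology": "conservative"}] * 2  + [{"ideology": "liberal"}] * 18

for label, feed in [("shared by friends", shared_by_friends),
                    ("after News Feed ranking", after_ranking),
                    ("actually clicked", clicked)]:
    print(f"{label}: {cross_cutting_share('liberal', feed):.0%} cross-cutting")
```

In this made-up example the ranking step trims cross-cutting content only slightly (24% to 22%), while the user's own clicks cut it far more (22% to 10%) — the shape of the effect the study reports.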