Is Facebook an echo chamber? If so, they're OK with that

If you're only seeing stories in your Facebook News Feed about Republican politics or progressive movements, that's because of the friends you have and the posts you comment on.

It's a bubble that is now encircling many of us on social media, keeping us from seeing anything with which we disagree, and it has many observers and experts worried about its potential effects on society.

But Facebook feels no obligation to burst your bubble.

The company's executives say they feel no responsibility to algorithmically adjust your News Feed to show you opposing points of view.

Their responsibility, Facebook VP Adam Mosseri told me earlier this week, is to "make sure you see stories that you're interested in."

This week the company published its "News Feed Values" for the first time, and its #1 value is "keeping you connected to the people, places and things you want to be connected to."

Key word: "Want."

Facebook wants to give you more of what you want so that you'll spend more time on Facebook.

This issue fascinates me as both a consumer and producer of news. It goes by various names. Author Eli Pariser wrote a whole book about the "filter bubble" a few years ago. He said the Internet's personalization features create "a unique universe of information for each of us" that can distort "our perception of what's important, true and real."

Others call this concept a silo or a digital echo chamber. The same Internet that makes it so easy to access a head-spinning variety of viewpoints also makes it easy to wall yourself off and only hear things you agree with.

Some critics blame Facebook for this, along with fellow tech giants like Google and Twitter. They say the companies have a civic duty to pay attention to this issue and to adjust their algorithms accordingly.

The conversations about Donald Trump and Hillary Clinton really seem to be happening on separate planets. Many of the stories from partisan news sources are misleading -- some are downright wrong. (The Wall Street Journal recently visualized the issue with "Blue Feed, Red Feed," a tool that shows the vast differences between the liberal and conservative worlds on Facebook.)

When I'm asked about this issue, typically at wonky panel discussions, I say that I'd like Facebook to show me and you -- every user -- MANY points of view. If your News Feed displays five liberal-leaning stories in a row, because most of your friends are sharing stories from The Nation and Salon, I think the algorithm should force a few conservative-leaning stories into the mix. And vice versa.
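For illustration only, here is a toy sketch of what such a re-ranking rule could look like. The lean labels, the Story type, and the five-story window are my assumptions, not anything Facebook has described:

```python
# Toy sketch of a "burst the bubble" re-ranking rule. The lean labels and
# the five-story window are illustrative assumptions, not Facebook's design.
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    lean: str  # assumed labels: "liberal", "conservative", or "neutral"

def diversify(ranked: list[Story], window: int = 5) -> list[Story]:
    """If `window` consecutive stories share one political lean, pull the
    next differently leaning story up to break the run."""
    feed = list(ranked)
    i = 0
    while i + window <= len(feed):
        leans = {s.lean for s in feed[i:i + window]}
        if len(leans) == 1 and "neutral" not in leans:
            # Find the next story with a different lean and move it up.
            for j in range(i + window, len(feed)):
                if feed[j].lean not in leans:
                    feed.insert(i + window - 1, feed.pop(j))
                    break
        i += 1
    return feed
```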

This is, admittedly, way easier said than done. It is also an editorial stance -- and Facebook has an allergy to anything that smacks of human editing.

Last month's controversy over alleged liberal bias in the site's trending topics box has heightened the company's sensitivity about this. Anonymous former workers said they witnessed conservative news stories being suppressed from the box. Facebook was unable to verify the claims, but the uproar showed how intensely the company is scrutinized.

But it is a pressing issue -- one that was brought up again this week in the wake of the Brexit vote in the United Kingdom. Several commentators said the "filter bubble" stoked support for the leave campaign and failed to present voters with a balanced set of facts.

British entrepreneur and writer Tom Steinberg wrote on Facebook after the result, "I am actively searching through Facebook for people celebrating the Brexit leave victory, but the filter bubble is SO strong, and extends SO far into things like Facebook's custom search that I can't find anyone who is happy *despite the fact that over half the country is clearly jubilant today* and despite the fact that I'm *actively* looking to hear what they are saying."

He cited this experience to urge Facebook CEO Mark Zuckerberg and other tech leaders to "do something about this."

When I interviewed Mosseri a few days later, he was clear about the company's view.

"It's not our place to decide what people should read" or to push a particular political view, he said.

What about at least surfacing an intentionally wide assortment of stories?

"That would be stepping into an editorial role," Mosseri said.

When I tested various scenarios on him, Mosseri asked, "How would it scale?" In other words, even if Facebook wanted to take steps in this direction, how would the algorithms account for political ideologies and parties around the world?

He did say, though, that "we take this seriously" and "we continue to study it."

The company studied American users who publicly list a political affiliation and found that, on average, 25% of their friends list a different affiliation -- a sign that those users are exposed to at least some opposing points of view.

Describing a hypothetical Democratic uncle, Mosseri said, "we know he sees posts from Republican friends. We know he sees posts from his friends from publications that he doesn't follow yet."

But if this uncle only likes and comments on anti-GOP posts, he's going to see fewer and fewer pro-GOP posts over time.
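A toy model shows how that one-sided engagement compounds over time. The per-session boost and decay factors here are purely illustrative assumptions, not Facebook's actual ranking math:

```python
# Toy model of the engagement feedback loop: engaging only with anti-GOP
# posts boosts that lean's ranking weight while the other lean's decays.
# The boost and decay factors are illustrative assumptions.
weights = {"anti_gop": 1.0, "pro_gop": 1.0}

def session(engaged_with: str, boost: float = 1.1, decay: float = 0.9) -> None:
    for lean in weights:
        weights[lean] *= boost if lean == engaged_with else decay

for _ in range(10):  # ten sessions of exclusively anti-GOP engagement
    session("anti_gop")

print(weights)  # pro_gop has decayed to roughly 0.35 of its starting weight
```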

Facebook's newest changes to the News Feed put a greater emphasis on posts from friends at the expense of posts from pages run by news outlets and other brands, which may exacerbate the issue.

Bottom line, Mosseri said: "I think that what happens on Facebook is largely reflective of what happens in the real world."

But if you're waiting for Facebook to take steps toward a less polarized world, well, keep waiting.

"They have enshrined the individual user's choices as a more important filter factor than anything like 'exposure to mixed points of view,' or a 'rounded sense of the debate,'" New York University journalism professor Jay Rosen said. "That's just another example of how Facebook is taking over the cultural territory journalists once held and bringing different priorities to it."

Rosen also noted Facebook's renewed emphasis on giving users controls to customize their feeds. "If your liking is 'lots of different points of view' there should -- in theory -- be controls that allow for that," he said.
