Although Facebook is well beyond seeing Twitter as a formidable rival, it still aims to learn a thing or two from what Twitter does best: real-time news. It’s an area that Facebook has tried to attack on a few different fronts over the years. For example, in 2016 the social media giant focused extensive resources on rolling out and promoting Facebook Live, its live video platform.

Facebook Live has the potential to succeed as both a viral-friendly spreader of cat videos and an up-to-the-moment source for developing news. So far, it has done more of the former than the latter (see: Chewbacca Mom).

It’s worth noting that there is still plenty of time for the more serious Facebook Live content creators to build large audiences, and some might already be showing signs of it.

Until then, Trending will play a major role in helping Facebook siphon users seeking real-time content away from Twitter. After all, Trending would appear uniquely equipped to do just that, with its vertical layout and auto-refreshing set of topics.

So how has that gone so far? Well, not great.

Facebook first received widespread criticism for Trending back in the spring, as rumors circulated that the platform’s paid human curators were instructed to actively suppress political stories from sources that leaned to the conservative side.

As expected, criticisms from the political right were the loudest and most inflammatory. In the wake of the controversy, Mark Zuckerberg was forced to hold a face-saving meeting with GOP politicians in an attempt to assuage their concerns about bias in Trending. Aside from highlighting the young CEO’s growing political savvy, the meeting revealed little about how much of the alleged suppression Zuckerberg actually acknowledged. Either way, the politicians left confident in his trustworthiness and in Facebook’s potential as a connecting force for good in the world.

Inherent bias has become an unfortunate inevitability in American mainstream news. Biases exist on both sides of the spectrum, and the onus is on the public to parse through and cobble together an impartial version of events.

The problem with Facebook falling into this same trap is that, as a company, it went to great lengths to present itself as an antidote to that problem, not Exhibit A of it. In Facebook’s own Help Center, Trending is described as a platform for:

Topics that have recently become popular on Facebook. Trending topics are based on factors including engagement, timeliness, Pages you’ve liked, and your location. Our team is responsible for reviewing trending topics to ensure that they reflect real world events.

While the last sentence does admit some level of human curation, the overall message is clear: Trending is meant to show topics based on objective criteria. It seems very much like that promise was not kept.
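Taken literally, the criteria in that Help Center description suggest a ranking function along the following lines. This is purely an illustrative sketch: none of the field names, weights, or decay constants here come from Facebook.

```python
from dataclasses import dataclass
import math

@dataclass
class Topic:
    name: str
    mentions: int         # engagement: posts and shares about the topic
    first_seen: float     # unix timestamp of the earliest mention
    page_affinity: float  # 0..1 overlap with Pages this user has liked
    local_share: float    # 0..1 fraction of mentions near the user's location

def trending_score(topic: Topic, now: float, half_life_hours: float = 6.0) -> float:
    """Blend engagement, timeliness, liked Pages, and location into one score."""
    age_hours = (now - topic.first_seen) / 3600.0
    freshness = 0.5 ** (age_hours / half_life_hours)  # timeliness: exponential decay
    personalization = 1.0 + topic.page_affinity + topic.local_share
    # log1p damps raw mention counts so a viral spike doesn't drown everything else
    return math.log1p(topic.mentions) * freshness * personalization
```

Under a scheme like this, an hour-old topic outranks a day-old one at equal engagement, and a topic tied to Pages the user likes gets a personalized boost, which matches the factors Facebook names without claiming to reproduce its actual math.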

One of the foundational pillars of Facebook’s success has been the idea that, as a social network, it evolves as its users do, reflecting our activity back to us in a package that facilitates healthy interaction. Facebook’s involvement in that process was meant to take the form of a benevolent chaperone, clamping down on hate speech and cyberbullying, but otherwise allowing the content to flow organically. If it was, in fact, suppressing political coverage from one side of the aisle, it was betraying those ideals in a major way.

The controversy didn’t build into any kind of existential threat to Facebook as a whole (it’s hard to imagine what that would look like, at this point) but it did raise legitimate questions about trust and bias.

An Equal And Opposite Problem

In response to the suppression controversy, Facebook opted to reduce the role of its hired human curators. The move was intended to project greater transparency and impartiality: if the stories are chosen strictly on an algorithmic basis, how could Facebook be blamed for any of the perspectives represented (or not represented)? Ideally, the stories are merely representative of the organic conversation occurring across the platform.

But there are inherent problems with an algorithm-based editorial system, as well. As WIRED points out,

How could Facebook, whose algorithms make billions of what are ultimately editorial decisions every day, ever possibly prove that it’s being fair? How could Facebook ever show all its work?

Before Facebook even troubles itself with solving the problem of “showing its work,” it would be wise to solve more immediate issues, like the basic functionality of the algorithm-based Trending.

Since the role of human curators was reduced, unsavory stories have been popping up. The algorithm has shown a tendency to misinterpret satire as earnest journalism, mislabel stories with keywords that aren’t the main topic, and accept tabloid-style articles from disreputable sources.

One egregious example involved an untrue story claiming that Fox News host Megyn Kelly was fired for secretly supporting Hillary Clinton for the presidency. Trending’s algorithm pulled the story despite the fact that it came from a source that is essentially a tabloid.

Responding to a request for comment from the Wall Street Journal, a Facebook spokeswoman said that “the story met standards because there was a sufficient number of articles about it.”

While it’s likely her response was an oversimplification – surely the algorithm is more complex than merely bean-counting how many mentions a story has received – it indicates that Facebook may be significantly further from solving the problem than it would like to be.
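If the spokeswoman’s description were taken at face value, the check would look like the first function below; the second shows one common way to make such a count harder to game, by discounting articles from low-credibility sources. Every domain name, weight, and threshold here is invented for illustration, not drawn from Facebook’s system.

```python
# Hypothetical source-credibility weights (all names and values are made up).
CREDIBILITY = {
    "major-newspaper.com": 1.0,
    "regional-daily.com": 0.8,
    "tabloid-aggregator.net": 0.1,
}

def passes_bare_count(sources: list[str], threshold: int = 5) -> bool:
    # "A sufficient number of articles": every article counts equally,
    # regardless of who published it.
    return len(sources) >= threshold

def passes_weighted_count(sources: list[str], threshold: float = 3.0) -> bool:
    # Weight each article by its source's credibility; unknown sources
    # get a low default, so a swarm of tabloid reposts can't clear the bar.
    return sum(CREDIBILITY.get(s, 0.2) for s in sources) >= threshold

# Six copies of the same story from one tabloid clear the bare count
# but fail the weighted one.
hoax = ["tabloid-aggregator.net"] * 6
```

The point of the sketch is not that a credibility table solves the problem, but that a threshold on raw article counts, which is what the quoted explanation describes, is exactly the kind of check a tabloid echo can satisfy.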

What A Solution Might Look Like

Speaking of solving the problem, what would that even look like? Like it or not, Facebook has a responsibility to the large bodies of users on both sides of the political spectrum. The controversy around suppressing conservative-leaning articles did it no favors in terms of building trust.

On the flip side, Facebook also loses trust when its algorithm gets fooled by satirical and/or illegitimate news sources.

The good news for the social media giant is that it has about a decade of practice walking the fine line between human subjectivity and algorithmic inflexibility, thanks to the growth and evolution of its News Feed.

Over the years, Zuckerberg has talked at length about how the goal of the News Feed is to deliver content to users in order of relevance. Each individual, through their likes, comments, and general engagement, builds up a profile that Facebook’s algorithms internalize and cater to.
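That profile-building idea can be sketched roughly as follows. The action weights and topic tags are invented for illustration; Facebook’s actual signals are far more numerous and opaque.

```python
from collections import Counter

# Hypothetical weights: heavier actions signal stronger interest.
ACTION_WEIGHT = {"like": 1.0, "comment": 2.0, "share": 3.0}

def build_profile(actions):
    """actions: iterable of (action_type, topic) pairs from a user's history."""
    profile = Counter()
    for action, topic in actions:
        profile[topic] += ACTION_WEIGHT.get(action, 0.5)
    return profile

def relevance(story_topics, profile):
    # A story's relevance is the summed profile weight of its topics,
    # so feeds sort stories by how well they match accumulated engagement.
    return sum(profile.get(t, 0.0) for t in story_topics)
```

The same mechanism also illustrates the echo-chamber risk discussed below: a pure engagement-match ranking only ever reinforces the topics a user already interacts with, unless some counterweight deliberately surfaces sources outside the profile.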

For Trending, Facebook will need to find a way to combine that penchant for automatic self-correction with a balanced range of news sources, so as not to become an echo chamber for users’ already-held beliefs.

It’s not an easy thing to achieve. But as long as Facebook fights to be the platform on which we get our real-time news, a large part of its credibility will be riding on pulling it off.