Daily Blog Email
A story broke in the last couple of days about Facebook’s curation of, and possible suppression of conservative news within, its “Trending” sidebar. A number of former Facebook employees spoke anonymously with tech site Gizmodo:
Facebook workers routinely suppressed news stories of interest to conservative readers from the social network’s influential “trending” news section, according to a former journalist who worked on the project. This individual says that workers prevented stories about the right-wing CPAC gathering, Mitt Romney, Rand Paul, and other conservative topics from appearing in the highly-influential section, even though they were organically trending among the site’s users.
Several former Facebook “news curators,” as they were known internally, also told Gizmodo that they were instructed to artificially “inject” selected stories into the trending news module, even if they weren’t popular enough to warrant inclusion—or in some cases weren’t trending at all. The former curators, all of whom worked as contractors, also said they were directed not to include news about Facebook itself in the trending module.
However, as my friend Marty Duren pointed out in his blog post downplaying the curation of the Trending feed, other anonymous former Facebook content curators denied these claims in CNNMoney:
Three former Facebook workers who spoke with CNNMoney echoed what some of Gizmodo’s other sources said — that personal biases might creep into the day-to-day “trending” work, but they never detected institutional bias for or against conservative information.
To the contrary, “the guidance from management was that it should be objectively representing what people are talking about, regardless of political views,” one of the sources told CNNMoney.
Another source, a former trending topic curator, said the teams prized transparency and open communication. While some stories that were trending across Facebook were blocked from appearing in the box, the intent was to weed out spam and other objectionable content.
In what could hardly be called the most shocking PR move of the year, Facebook has denied claims that it curates its Trending column. Of course it denies them. Did anyone expect Facebook to come out and admit that yes, occasionally, it blocks content it may not like? No, no one did. That would be very un-Facebook-like.
In short: we don’t know. Could the Gizmodo sources who claim that Facebook blocks content it doesn’t like, conservative content or content that reflects poorly upon Facebook, just be a bunch of disgruntled former employees who want to get back at the company that they left? Yes, this could be the case.
Along the same lines, could the CNNMoney sources who claim Facebook does no such thing be loyalist former employees who, perhaps, left the company on better terms and would prefer not to throw it under the bus? Yes, this also could be the case.
Is Mark Zuckerberg an internet dictator bent on creating a fake reality in order to brainwash the billions of people on Facebook? Possibly—I mean have you seen The Social Network?—but probably not.
Until we have some sort of hard evidence to support claims of blocking or not blocking content, we can’t really say. But, as I posted on Facebook the other day, I’ve long thought Facebook curates this Trending column, and here’s the bottom line:
Facebook DOES curate the trending column, even if it’s not BLOCKING content, and that’s BAD.
Seth Godin, Gary Vaynerchuk, and plenty of others who study internet trends say that the two most valuable assets today, especially when looking at social media, are these: 1) attention and 2) trust.
Why is Facebook the most powerful website in the universe? Because it attracts more attention than anything else, and people trust what is on it—primarily because they trust their friends who share the content on it.
Even if Facebook is not doing what the Gizmodo sources claim, and even if it is not suppressing certain political content or stories that make Facebook look bad, it does curate the content, and that’s a problem. But why? Aren’t all news outlets biased in what they do or do not report? Of course, but here’s the problem:
Facebook claims that the topics in the Trending column are the most popular topics on Facebook, so when it slides in local stories or stories that may interest you (which it does), it is no longer delivering what it claims to be delivering.
I’ve been suspicious of Facebook’s Trending column since the beginning because I know that Facebook will do just about anything to 1) make more money and 2) tell its users what they want to hear. So, since the column was introduced in 2014, I’ve collected a few examples of when Facebook was delivering me content that it either 1) wanted to promote and almost certainly wasn’t “trending” or 2) thought I would want to see. Here are two such examples:
1) Mark Driscoll Resigns
Mark Driscoll resigning his post at Mars Hill Church was a big deal, but it was not a nationally-Facebook-trending big deal. Because of my interest in Driscoll and my work, Facebook almost certainly delivered that to my Trending column because it knew I would be interested in it or because it was trending among my friends or my friends’ friends.
2) Salesforce.com Introduces New Product
Ok, you could maybe make the case that the Driscoll news was nationally trending, though I would disagree. You absolutely cannot make the case that Salesforce.com’s new data analytics program was trending nationally on Facebook. However, I can show you that Facebook is cozy with Salesforce. Adweek reported on October 12, 2014, one day before that screenshot above:
Facebook is laying the infrastructure for mobile advertising in much the same way Google built its business on top of desktop more than a decade ago. The social network recently launched an ad server on top of an ad network while it also develops a video ad platform from its purchase of Live Rail. It’s an ad stack that is bolstered by Facebook’s unique position of having a massive 1.3 billion member user base—from which it can glean massive amounts of valuable knowledge and data.
However, Facebook chose to make that info useful to a select number of preferred marketing partners at the late September launch of the Atlas ad server. Present were big names like Omnicom Media Group, SalesForce and SHIFT, all of whom appeared to be in a more special position than other longtime partners. They’ve been given first dibs on connecting to the new Atlas ad server, allowing them to take advantage of Facebook’s data to target and measure their ad campaigns for brands. Meanwhile, a number of other advertising technology partners were introduced to Atlas only after launch.
I tweeted this at the time:
No, Facebook. This is not trending, as much as you love Salesforce. pic.twitter.com/0oXt2Tfyj2
— Chris Martin (@ChrisMartin17) October 13, 2014
Naturally, Salesforce’s new data product, Wave, would be important to Facebook for a number of reasons: the Wave announcement was made in San Francisco, and Facebook has an interest in the product. Why not throw its friends a bone and make it look like their new product is trending on Facebook?
I can think of probably a dozen times the Trending column has delivered me Nashville- or Tennessee-specific news that would not have been trending nationally on Facebook, but I did not snag screenshots of those occurrences, so you’ll just have to trust me that they pop up from time to time. If I see one, I’ll screenshot it and add it here. My wife watches the Trending column religiously, and I usually notice a local story when she texts me, “Did you see __Nashville event__ is trending on Facebook?”
In sum, Facebook may not be squelching conservative news outlets in some sort of witch hunt, but I think it’s safe to say this:
What Facebook says is “Trending” may not actually be trending, but may be what Facebook wants to make you think is trending.
This presents many problems. Here are three:
1) The Trending Column Has Massive Influence

I tweeted about a year ago:
I think in 3-5 years, we are going to look back at the Facebook "Trending" sidebar and realize its MASSIVE influence. Everyone reads it.
— Chris Martin (@ChrisMartin17) March 31, 2015
And it’s true—I stand by that claim. The Trending column on Facebook is perhaps the most valuable inch of real estate on the internet. Billions of people have the ability to see that column, and when its content is determined by a group of people at Facebook rather than by what is getting the most shares, that group of people wields a tremendous amount of power.
My friend Marty Duren, who wrote that this curated column matters little, is wrong. He writes:
Frankly, if you’ve been using Facebook’s trending stories as your main source of news, your biggest problem isn’t that Facebook secretly manipulated the trending stories and may have misled you about what stories were actually trending. It’s that you’ve been using Facebook’s trending stories as your main source of news.
I agree with Marty: Facebook’s Trending column should not be your main source of news. However, I think Marty is wrong in his assessment because, while it is silly to use Facebook as your main source of news, most of us do, which makes this a matter of great, not little, importance.
In June of 2015, Pew Research Center released data about how different generations of Americans consume news. Guess which outlet came out on top for everyone but Boomers.
When 61% of the largest generation in the history of the country is consuming news via a single platform, it matters what that platform says is or is not trending.
People consume news via Facebook, and while that may be silly, it is nonetheless a reality, and makes the content of the Trending column much more important than what is talked about on the nightly news. Because let’s be honest, much of what is talked about on TV news today is sourced from what is trending on Facebook or other social media platforms.
2) Facebook Can Decide What Counts as “Popular”

As demonstrated above, Facebook has the ability to decide what it wants to present as “popular” even if millions of people are not talking about it on Facebook. Even if Facebook isn’t suppressing views with which it disagrees, I find it problematic that it can promote the obscure release of an analytics product (like Salesforce’s) and act like that’s “popular.”
The main problem lies here: Facebook presents its Trending column as a user-generated, organically curated selection of content when, in fact, it is not. The minute Facebook curates a column of content it claims is “trending,” it deceives its users.
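To make the concern concrete, the kind of curation described above (and alleged in the Gizmodo report) can be sketched in a few lines. This is a hypothetical illustration, not Facebook’s actual system; all topic names, share counts, and function names here are invented.

```python
# Hypothetical sketch of a curated "trending" pipeline: start from organic
# share counts, drop blacklisted topics, inject promoted ones, and present
# the result as "trending". Every name and number below is invented.

def curated_trending(share_counts, blacklist, injected, top_n=5):
    """Return a 'trending' list that may not reflect organic popularity."""
    # 1. Rank topics by how often users actually shared them.
    organic = sorted(share_counts, key=share_counts.get, reverse=True)
    # 2. Suppress topics the curators do not want shown.
    visible = [t for t in organic if t not in blacklist]
    # 3. Put injected topics at the top, whether or not they trended.
    return (injected + [t for t in visible if t not in injected])[:top_n]

shares = {"CPAC": 900, "Mitt Romney": 700, "Local story": 50, "Salesforce Wave": 5}
print(curated_trending(shares, blacklist={"CPAC"}, injected=["Salesforce Wave"]))
# The injected topic appears first; the suppressed one never appears at all.
```

The point of the sketch is that the reader sees only the final list; nothing in the output distinguishes an organically popular topic from an injected one.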
3) The Trending Column Feeds the Facebook Echo Chamber

I’ve written at length about the problems with the Facebook echo chamber. Basically, it is in Facebook’s best interest to tell you what you want to hear, so it is going to deliver content based on what you like. Unless you are diligent about following people with whom you disagree on Facebook, the content that fills your News Feed will primarily affirm your views and encourage you in them. This keeps you using Facebook, and that helps Facebook make more money.
Here’s an example from this morning. I was searching for some video game news yesterday, particularly about a game called “League of Legends.” Check this out from my Trending column this morning:

I have a feeling neither League of Legends nor Donkey Kong is trending for most of you. But Facebook knows I might be interested in that content, so it delivers it to me and calls it “Trending.”
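The filtering I’m describing can be sketched as a toy ranking function: score each story by how much it overlaps with topics the user has already engaged with, and serve only the best matches. This is an invented illustration of a “filter bubble,” not Facebook’s real algorithm; every name and topic label here is hypothetical.

```python
# A minimal, invented sketch of a "filter bubble": rank stories by overlap
# with topics the user already engaged with, so the feed keeps serving more
# of the same and quietly drops everything else.

def personalized_feed(stories, liked_topics, top_n=2):
    """Rank stories by how many of the user's liked topics they match."""
    def affinity(story):
        return len(set(story["topics"]) & liked_topics)
    return sorted(stories, key=affinity, reverse=True)[:top_n]

stories = [
    {"title": "League of Legends update", "topics": {"gaming", "esports"}},
    {"title": "Donkey Kong speedrun record", "topics": {"gaming"}},
    {"title": "Senate passes budget bill", "topics": {"politics"}},
]
liked = {"gaming", "esports"}
for story in personalized_feed(stories, liked):
    print(story["title"])
# The gaming stories are served; the politics story never surfaces.
```

In this toy version, a user who liked gaming content simply never sees the political story, which is the dynamic that makes an engagement-driven feed self-reinforcing.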
I wrote last June:
Facebook wants you to know that you’ve only got yourself to blame for the lack of diversity in views on your News Feed. The social network has recently conducted a study to find out why people mostly see posts that mirror their own beliefs and to find out if a “filter bubble” is to blame. “Filter bubble” is what you call the situation wherein a website’s algorithm shows only posts based on what you clicked (or Liked) and commented on. For this particular study, the company used anonymous data from 10.1 million Facebook users who list their political affiliations on their profiles. Researchers monitored “hard news” links posted on the website and looked at whether they were posted by conservatives, liberals or moderates.
Facebook’s echo chamber, along with the popular “Trending” sidebar launched last winter, gives Facebook an inordinate amount of power to decide what is and is not news.
My wife and I joke about this regularly because she has never been one to care much about the news, especially when it comes to politics. I am fascinated with the news and am constantly aware of what’s going on. BUT, if Facebook has it in its sidebar, she knows about it. Remarkable.
The question we have to ask, and likely the one you’re asking, is, “Why does this even matter?”
That’s a great question. Let’s answer it.
The Facebook Echo Chamber matters because if Millennials are primarily consuming news via Facebook, and Facebook only serves us content we like, we’re only ever going to be seeing content we like, and that’s a problem.
Facebook filtering, especially when it comes to the consumption of political news as explained above, promotes further ideological polarization, which is a horrible outcome.
The ability to decide what is or isn’t popular is problematic. Whether we are creating the echo chamber for ourselves or Facebook is creating it for us, it is unhelpful to present content as “Trending” when, in fact, it is not. Consuming news via Facebook is unwise, but most of us are doing it, and the more Facebook has the ability to decide what we do or do not see, the bigger problem this will become.