
I’ve Got Concerns With Facebook’s Content Governance Plan

It is pretty clear Facebook and, specifically, Mark Zuckerberg have put a lot of thought into what to do with the platform's content. As the company has gotten so big, it has become an easy target for critics. Facebook is dominant in social media and, frankly, a lot of media outlets handed their traffic growth over to Facebook. When that did not pan out, they decided to nurse their grievances with regular and sustained attacks on the platform. Many a journalist who suffered a “pivot to video” has an axe to grind, and we should not forget that this shapes a lot of the media coverage of Facebook these days. Likewise, a lot of what happens on and with social media is so unknown that people fear it.

But it is clear as well that Facebook is grappling with a bunch of unknowns related to its content and how to police it, as Mark Zuckerberg himself outlines in a pretty lengthy piece about the steps the platform will take moving forward. You can read it here.

Essentially, Facebook is working with various countries on regulation of the platform; intends to create an independent third-party group to make content decisions for it; and intends to use algorithms to degrade content that is acceptable under Facebook’s terms of service but that approaches the line of being unacceptable. I put them in that order because I want to deal with them in that order.

First, there are plenty of countries that do not value our First Amendment, and Facebook has to work with those countries. I think it should be applauded for proactively seeking to work with countries like France to find a favorable solution to that country’s concerns. That, however, leads me to the second issue.

Facebook intends to create an independent third-party body that some are already referring to as the “Facebook Supreme Court.” From Zuckerberg’s post:

In the next year, we’re planning to create a new way for people to appeal content decisions to an independent body, whose decisions would be transparent and binding. The purpose of this body would be to uphold the principle of giving people a voice while also recognizing the reality of keeping people safe.

I believe independence is important for a few reasons. First, it will prevent the concentration of too much decision-making within our teams. Second, it will create accountability and oversight. Third, it will provide assurance that these decisions are made in the best interests of our community and not for commercial reasons.

These are uncharted waters, and I have a lot of concerns. Will there be one group, or one group per country? The United States is pretty unique these days in the liberality of its speech protections, as we have freedom of speech enshrined in our Constitution. One need look no further than the United Kingdom or even Canada to see the police power of the state being imposed on people merely for posting something on Facebook.

I think I would have to immediately delete my account if I knew a Saudi or Chinese national was given the final say in what an American could post to Facebook.

As to the overarching issue, given the ongoing media storm against Facebook, I do not blame it for handing off its internal content-policing role to an external group. But if it is handing that role to people from nations and regimes that do not protect free speech, count me out. I highly encourage the organization to have country-specific groups.

The other concern I have with Facebook’s plan is the idea of downgrading content that does not violate Facebook’s standards, but comes close. From Zuckerberg’s post:

Our research suggests that no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average — even when they tell us afterwards they don’t like the content. This is a basic incentive problem that we can address by penalizing borderline content so it gets less distribution and engagement.

Facebook is already routinely accused of shadow banning individuals and groups, many of them conservative. This plan is only going to escalate those accusations, and it looks like an unforced error. The company wants to design an algorithm that is as unbiased as possible to do the downgrading, but I think this is asking for trouble.

The thing I note is that Zuckerberg writes repeatedly that people engage with this sort of content even though they don’t like it. Perhaps instead of using an algorithm to downgrade that content so it gets less traction, make it easier for people to make the content disappear on their own. Instead of actively using an algorithm to hide content, create an algorithm that lets people choose to hide incendiary content shared or created by their friends and family. We all have that crazy friend who pushes conspiracies. Give me an easy-to-find, visible button that lets me decide I don’t want to see Uncle George’s crazy-ass conspiracies or incendiary political posts.

Heck, perhaps add a bomb button to posts. When users see something that offends them, let their first reaction be to press the bomb button and blow the post out of their feed instead of incentivizing comments and flame wars. Let the user who engages with incendiary content because they want to push back feel like they’ve accomplished something more meaningful by eradicating the post from their timeline.

That way Facebook is providing me a tool I can use without Facebook making the choice for me. Facebook would be empowering its users and letting us maintain relationships with friends and family who have views we prefer not to see. Facebook would be allowing us some measure of control on a platform where, increasingly, people feel like they lack control.

That, ultimately, is what this is all about. Facebook wants to maintain a healthy community. But as social media allows and encourages oversharing, we all see aspects of friends and family we get to ignore in the real world. We feel like we lose control. Help us ignore what we think is garbage online by letting us choose to ignore it, instead of choosing for us.

Facebook is in a difficult position. Angry reporters with axes to grind after failed pivots to video want to bring down the company. Facebook internally continues to have numerous employee problems that cause conservatives to have legitimate distrust of the organization (see here and here for example). Progressives are angry that conservative groups are involved in content review at Facebook. The company is, in large part, a victim of its own success.

Instead of trying to fix problems, Facebook should let us fix problems for ourselves. That way it can provide a solution while also placing responsibility with its users, as it should. The alternative is going to be more complaining from more people, more distrust of how it regulates content, and more bad news. Give users the tools to clean up what they see and let it be on them if they don’t use those tools. But don’t rely on people without a sense of free speech to regulate speech, and don’t rely on human-designed algorithms to shadow ban or degrade content when the humans designing those algorithms get their panties in a wad over a Facebook employee, on his own time, showing support for a friend at a Supreme Court confirmation hearing.

Disclosure: I am a Facebook shareholder.

About the author

Erick Erickson
