Mark Zuckerberg’s manifesto for tackling inappropriate content

November 19, 2016 - Lima, Peru - Facebook founder and CEO Mark Zuckerberg participates in the APEC CEO Summit, part of the Asia-Pacific Economic Cooperation forum, on November 19 in Lima, Peru. (Ernesto Arias/EFE/Zuma Press/TNS)

Facebook landed in hot water last year for taking down an iconic Vietnam War photo showing a naked girl running from a napalm attack and deactivating the social media accounts of Korryn Gaines, who was fatally shot by police officers during a standoff near Baltimore.

But calls about what content to take down or leave up aren’t always black-and-white for the social media giant. Views on nudity, for example, can differ widely depending on whether a user is in Europe, the Middle East or Asia.

Now Facebook CEO Mark Zuckerberg is considering giving users more control over the content they find offensive or inappropriate.

“The idea is to give everyone in the community options for how they would like to set the content policy for themselves. Where is your line on nudity? On violence? On graphic content? On profanity? What you decide will be your personal settings,” Zuckerberg wrote in a 5,800-word letter that laid out the company’s global ambitions. “We will periodically ask you these questions to increase participation and so you don’t need to dig around to find them. For those who don’t make a decision, the default will be whatever the majority of people in your region selected, like a referendum.”

Facebook would only take down content “if it is more objectionable than the most permissive options allow.”

“Within that range, content should simply not be shown to anyone whose personal controls suggest they would not want to see it, or at least they should see a warning first,” he added. “Although we will still block content based on standards and local laws, our hope is that this system of personal controls and democratic referenda should minimize restrictions on what we can share.”
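Zuckerberg’s letter stops short of technical detail, but the mechanics he describes can be sketched: per-category personal thresholds, a regional-majority default for users who never choose, and a hide-or-warn decision for each post. The Python below is a minimal, hypothetical illustration of that logic; every function, category name and rating scale in it is an assumption made for illustration, not anything Facebook has published.

```python
from collections import Counter

# Hypothetical intensity scale, least to most objectionable.
LEVELS = ["none", "mild", "explicit"]
CATEGORIES = ["nudity", "violence", "graphic_content", "profanity"]


def regional_default(region_votes):
    """Per-category default = the majority choice in the user's region,
    the referendum-style fallback for people who never set their own line."""
    return {cat: Counter(votes).most_common(1)[0][0]
            for cat, votes in region_votes.items()}


def decide(content_ratings, user_thresholds, defaults):
    """Return 'hide', 'warn', or 'show' for one piece of content.

    A real system would first apply hard blocks for content violating
    global standards or local law, regardless of personal settings."""
    verdict = "show"
    for cat in CATEGORIES:
        rating = LEVELS.index(content_ratings.get(cat, "none"))
        limit = LEVELS.index(user_thresholds.get(cat, defaults[cat]))
        if rating > limit + 1:
            return "hide"      # far past this user's line
        if rating == limit + 1:
            verdict = "warn"   # just past it: show behind a warning
    return verdict


# Example: a region whose majority tolerates mild violence but no nudity.
votes = {
    "nudity": ["none", "none", "mild"],
    "violence": ["mild", "mild", "none"],
    "graphic_content": ["none", "none", "none"],
    "profanity": ["mild", "explicit", "mild"],
}
defaults = regional_default(votes)

post = {"violence": "explicit", "profanity": "mild"}
print(decide(post, {}, defaults))                    # warn
print(decide(post, {"violence": "none"}, defaults))  # hide
```

In this sketch, a personal setting overrides the regional default only for categories the user has actually answered, mirroring the letter’s description of periodic prompts with a referendum-style fallback for everyone else.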

With nearly 2 billion users around the world, it has become less feasible for Facebook to have one set of rules for its entire community, Zuckerberg said. And with the company’s push into live video, those decisions are getting tougher as users broadcast shootings and even suicides in real time.

“Just as it’s a bad experience to see objectionable content, it’s also a terrible experience to be told we can’t share something we feel is important,” he wrote. “This suggests we need to evolve towards a system of personal control over our experience.”

The new system the company envisions isn’t fully developed yet, he noted.

The tech firm has admitted before that it made mistakes in enforcing its community standards. It reversed its decision on the “Napalm Girl” photo, for example, which it had initially removed for violating its rules on child nudity.

“This has been painful for me because I often agree with those criticizing us that we’re making mistakes,” he wrote. “These mistakes are almost never because we hold ideological positions at odds with the community, but instead are operational scaling issues.”