The Mercury News

Why did Facebook take down your post?

- By Levi Sumagaysay lsumagaysay@bayareanewsgroup.com

Facebook, which has faced many high-profile controversies over policing content on its massive platform, has revealed its internal community standards for the first time.

The company gives a glimpse into how it decides what to do about threats of violence, depiction of crime, sales of drugs and firearms, and exploitation of children. The standards also address hate speech, nudity and sex, intellectual property and what Facebook calls false news.

“The guidelines will help people understand where we draw the line on nuanced issues,” said Monika Bickert, vice president of global policy management, in a blog post Tuesday. She added that making the standards’ details accessible will help inform user and expert feedback.

For example, Facebook has always said it does not allow terrorists, murderers and hate organizations. Now we know the company defines a hate organization as one with “three or more people that is organized under a name, sign, or symbol and that has an ideology, statements, or physical actions that attack individuals based on characteristics, including race, religious affiliation, nationality, ethnicity, gender, sex, sexual orientation, serious disease or disability.”

On nudity, an area where Facebook users might feel the company has been inconsistent in its enforcement, the standards prohibit posts that show sexual intercourse, genitalia, erections and exposed female nipples “except in the context of breastfeeding, birth giving and after-birth moments, health (for example, post-mastectomy, breast cancer awareness, or gender confirmation surgery), or an act of protest.”

A couple of years ago, Facebook notoriously removed a post of the famous Pulitzer Prize-winning photo showing a naked girl fleeing a napalm attack. The company reversed course after widespread outcry.

Facebook has also been embroiled in different controversies over removing images of breastfeeding women.

More recently, false information on Facebook has been said to be a factor in violence between Buddhists and Muslims in Sri Lanka. Officials and other people there complain that Facebook ignored requests to delete content and to establish a direct line of communication with the company as the violence progressed, The New York Times reported over the weekend.

During CEO Mark Zuckerberg’s congressional testimony a couple of weeks ago, which focused mostly on Facebook’s latest privacy scandal, lawmakers repeatedly asked him about the company’s content-policing practices. Some legislators expressed concern about the sale of opioids on the social network, while others questioned whether the company is biased against conservatives.

“Balancing free speech and safety is a challenge both on and off Facebook,” Zuckerberg said in a Facebook post Tuesday. “We’ll continue working hard to get this right for our community.”

In addition to being more transparent about its guidelines, the company is expanding its appeals process so users who feel their content has been removed mistakenly can ask for a review.

Facebook, which has more than 2 billion users worldwide, has 7,500 people reviewing content in more than 40 languages, Bickert said. The number of reviewers is 40 percent higher than this time last year, she said.

THE ASSOCIATED PRESS ARCHIVES: For the first time, Facebook made public on Tuesday its detailed guidelines for determining what it will and won’t allow on its service.
