Why did Facebook take down your post?
Facebook, which has faced many high-profile controversies over policing content on its massive platform, has revealed its internal community standards for the first time.
The company gives a glimpse into how it decides what to do about threats of violence, depiction of crime, sales of drugs and firearms, and exploitation of children. The standards also address hate speech, nudity and sex, intellectual property and what Facebook calls false news.
“The guidelines will help people understand where we draw the line on nuanced issues,” said Monika Bickert, vice president of global policy management, in a blog post Tuesday. She added that making the standards’ details accessible will help inform user and expert feedback.
For example, Facebook has always said it does not allow terrorists, murderers or hate organizations on its platform. Now we know the company defines a hate organization as one with “three or more people that is organized under a name, sign, or symbol and that has an ideology, statements, or physical actions that attack individuals based on characteristics, including race, religious affiliation, nationality, ethnicity, gender, sex, sexual orientation, serious disease or disability.”
On nudity — an area where Facebook users might feel the company has been inconsistent in its enforcement — the standards prohibit posts that show sexual intercourse, genitalia, erections, or exposed female nipples “except in the context of breastfeeding, birth giving and after-birth moments, health (for example, postmastectomy, breast cancer awareness, or gender confirmation surgery), or an act of protest.”
A couple of years ago, Facebook famously removed a post of the Pulitzer Prize-winning photo showing a naked girl fleeing a napalm attack. The company reversed the decision after widespread outcry.
Facebook has also been embroiled in different controversies over removing images of breastfeeding women.
More recently, false information
on Facebook has been blamed as a factor in violence between Buddhists and Muslims in Sri Lanka. Officials and residents there complained that Facebook ignored their requests to delete content and to establish a direct line of communication with the company as the violence progressed, The New York Times reported over the weekend.
During his congressional testimony a couple of weeks ago — mostly about Facebook’s latest privacy scandal — lawmakers repeatedly asked CEO Mark Zuckerberg about Facebook’s content-policing practices. Some legislators expressed concern about the sale of opioids on the social network, while others questioned whether the company is biased against conservatives.
“Balancing free speech and safety is a challenge
both on and off Facebook,” Zuckerberg said in a Facebook post Tuesday. “We’ll continue working hard to get this right for our community.”
In addition to making its guidelines more transparent, the company is expanding its appeals process so users who believe their content was mistakenly removed can request a review.
Facebook, which has more than 2 billion users worldwide, has 7,500 people reviewing content in more than 40 languages, Bickert said. The number of reviewers is 40 percent higher than this time last year, she said.