A coalition of 86 digital rights and civil society organizations has written an open letter to Facebook requesting that the company allow users to appeal whenever their posts are removed, an option they currently do not have.
Facebook has faced sustained criticism over its content takedown and account deactivation appeals process, which has grown alongside the company's expanding content moderation practices.
The groups, which include the Electronic Frontier Foundation (EFF), the American Civil Liberties Union (ACLU), and the Digital Rights Foundation, address CEO Mark Zuckerberg directly, asking him to give the site's users an option to defend themselves against potentially unwarranted censorship.
If Facebook removes a photo, video, or post, it alerts the user, and the notification includes an option to request an additional review. Facebook says this triggers a review by a person on its team, which typically takes 24 hours.
Facebook's Community Standards:
Facebook says its Community Standards are rooted in three specific principles: safety, voice, and equity. The guidelines, published on Facebook's Community Standards site, are divided into six categories:
- Violence and Criminal Behavior.
- Safety.
- Objectionable Content.
- Integrity and Authenticity.
- Respecting Intellectual Property.
- Content-Related Requests.
According to Monika Bickert, Facebook's head of global policy management, the Community Standards were developed by subject matter experts located in 11 offices around the world.
Bickert's comment on the team behind the policies:
“Many of us have worked on the issues of expression and safety long before coming to Facebook. I worked on everything from child safety to counter terrorism during my years as a criminal prosecutor, and other team members include a former rape crisis counselor, an academic who has spent her career studying hate organizations, a human rights lawyer, and a teacher,” writes Bickert.
Finally, the groups request that Facebook reveal more data about content takedowns, including how much content is censored, which guidelines it allegedly violated, and how many posts were removed in error.
The suggestions are essentially an application of the Santa Clara Principles, a set of rules developed by several of the letter's cosigners to help tech companies improve their moderation policies. As the EFF puts it:
“The plain language, detailed guidelines call for disclosing not just how and why platforms are removing content, but how much speech is being censored.”