Facebook has announced the independent oversight board that will moderate content posted on the social media platform.
In November 2018, Facebook committed to creating an independent oversight board to review the platform's decisions about what content should be removed or remain online.
The 20-member oversight board was announced in a New York Times opinion piece by the four co-chairs.
According to the post, the board will focus on the most challenging content issues for Facebook, including hate speech, harassment, and protecting people’s safety and privacy.
“Social media affects people’s lives in many ways, good and bad. Right now, as the world endures a health crisis, social media has become a lifeline for many people, providing valuable information and helping families and communities stay connected,” the co-chairs wrote. “... We know that social media can spread speech that is hateful, harmful and deceitful. In recent years, the question of what content should stay up or come down on platforms like Facebook, and who should decide this, has become increasingly urgent.”
According to The Times, the board will make final and binding decisions on whether specific content should be allowed on or removed from Facebook and Instagram.
The 20-member board is expected to grow to 40 members and is scheduled to become operational this year.
Members of the oversight board come from different professional, cultural and religious backgrounds and hold a range of political viewpoints, according to The Times.