On a morning in May, 2019, forty-three lawyers, academics, and media experts gathered in the windowless basement of the NoMad New York hotel for a private meeting. The participants had all signed nondisclosure agreements. Facebook had convened the group to discuss the Oversight Board, a sort of private Supreme Court that it was creating to help govern speech on its platforms. The room was laid out a bit like a technologist's wedding, with a nametag and an iPad at each seat, and large succulents as centerpieces. There were also party favors: Facebook-branded notebooks and pens. "Clap if you can hear me," the moderator, a woman dressed in a black jumpsuit, said. I sneaked in late and settled near the front.

The board has since issued its first decisions. The five cases - which Facebook will use as precedents to decide similar cases - included a decision to remove a post that pejoratively implied Muslims were inferior, a breast cancer awareness post that depicted female nipples, a post that allegedly quoted a German Nazi leader, and a post that falsely claimed a cure for COVID-19 exists. Facebook's Vice President of Content Policy, Monika Bickert, said the company will "take to heart" the board's suggestions. "Their recommendations will have a lasting impact on how we structure our policies," she said. The board upheld just one of Facebook's decisions, which removed a Russian-language post that used an ethnic slur against Azerbaijanis.

The board's first ruling involved a user in Myanmar who posted in Burmese, questioning the lack of response by Muslims to the treatment of Uyghur Muslims in China. The post implied that there was something wrong with Muslim men. After Facebook removed the post, the Oversight Board determined that Facebook's initial translation may have been inaccurate and ruled that while the statement was pejorative toward Muslims, it did not rise to the level of hate speech.

In its second ruling, the board upheld Facebook's decision to remove a post that used Russian-language wordplay which Facebook said could be understood as an ethnic slur. After commissioning an independent linguistic analysis, the Oversight Board determined that the word was in fact a dehumanizing label for Azerbaijani people. The board wrote, "In light of the dehumanizing nature of the slur and the danger that such slurs can escalate into physical violence, Facebook was permitted in this instance to prioritize people's 'Safety' and 'Dignity' over the user's 'Voice.'"

A third ruling restored an Instagram post removed by an automated system for violating the company's standards on nudity. The post, by a user in Brazil, aimed to raise awareness of breast cancer symptoms and included photos of female breasts and nipples that showed signs of cancer. The Oversight Board wrote that after it selected the case, Facebook restored the post, called the removal a technical error, and asked the board not to hear the case. The board disagreed, writing that the case was important. "The incorrect removal of this post indicates the lack of proper human oversight which raises human rights concerns," the board wrote. It urged Facebook to change its policies to notify users when their content is moderated by automated systems, and to allow users to appeal certain moderation decisions to humans.

In its fourth decision, the board elected to restore a post that incorrectly attributed a quote to Nazi Germany leader Joseph Goebbels. Facebook's policy is to treat quotes attributed to dangerous individuals as expressions of support unless the user adds context to suggest they condemn that individual. The user shared the quote without context, but later told the board that the intent was to condemn Goebbels and draw a comparison between the sentiment in the quote and Trump's presidency. The board determined, however, that those policies were not clearly outlined to the public, and that this user was not told which policy their post violated. The board advised Facebook to better notify users of its rules and to more clearly designate which organizations and individuals can be considered "dangerous."

The board's final decision restored a post that criticized France's health strategy and falsely claimed that a cure for COVID-19 exists. The post criticized a French regulatory agency for refusing to authorize hydroxychloroquine for use against COVID-19 - a drug touted by some, including Mr. Trump, but not proven to prevent COVID-19. The board determined that while the post made false claims about a COVID cure, it should be restored because it did not pose imminent harm, a crucial element of Facebook's policies on misinformation.