NEW YORK (AP) — The Latest on Facebook’s content moderation practices (all times local):

3:50 p.m.

Facebook says it’s setting up an independent body for people to appeal decisions to remove — or leave up — posts that may violate its rules.
CEO Mark Zuckerberg said Thursday that the company wants to have such a review body by the end of next year. Appeals are currently handled internally.
Facebook employs thousands of people to review posts, photos, comments and videos for violations. Some violations are also detected without human reviewers, using artificial intelligence. But Zuckerberg says creating an independent appeals body will prevent the concentration of “too much decision-making” within Facebook.
Facebook has faced accusations of bias against conservatives — something it denies — as well as criticism that it does not go far enough in removing hateful content.
The move comes as the company is releasing its latest report on how it’s been enforcing its community standards, which ban things like hate speech and nudity.
Facebook says it’s making progress on deleting hate speech, graphic violence and other violations of its rules and detecting problems before users see them.
The company released its second report Thursday detailing how it enforces community standards on hate, nudity and other posts. Compared with its May report, Facebook says it has doubled the amount of hate speech it detects proactively, before users report violations. The new report covers April to September.
The report comes a day after The New York Times published an extensive article about what it described as the company’s strategy of “delay, deny and deflect” in dealing with crisis after crisis over the past two years. That included hiring a Washington public relations firm, Definers, to discredit opponents. Facebook said Thursday it has cut ties with the firm.