Meta’s Oversight Board wants to see radical changes to the company's VIP moderation system

Meta Platforms Inc.’s Oversight Board said today that it’s time to change the moderation system in which certain high-profile users on Facebook and Instagram receive preferential treatment.

Last year, while Meta was being put through the wringer over accusations that it placed profits before the common good, it was revealed that millions of users on its platforms had been granted special consideration in how their content was moderated. The system, called “cross-check,” apparently gave high-profile users, such as celebrities, politicians, and generally anyone with a huge following, an effective exemption from Meta’s usual moderation policies.

At the time, the Oversight Board said that such a system should not have been implemented in secrecy. Meta responded by saying it has always had “layers” of moderation. Still, such secrecy and the very idea of Meta having a cast of VIPs on its platforms did not go down well with much of the public.

The Oversight Board explained in a post published today that because of the enormous number of posts flagged by an algorithm each day, the company cannot have every post reviewed by a human. That’s why Meta has the cross-check system: to ensure that people with large followings do not have content taken down by mistake. Most people’s posts would be removed as soon as they were flagged, but not the VIPs’. The board explained, though, that this had more to do with money than morality.

“While Meta told the Board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns,” it wrote today. “The Board understands that Meta is a business, but by providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm.”

It also said that Meta hasn’t even bothered to analyze the cross-check system to ascertain if it allowed damaging or dangerous posts to remain in public view. It added that Meta should “take steps to mitigate the harm caused by content left up during additional review, and radically increase transparency around its systems.”

The board said the system was flawed, although it agreed that at times such a system might be in the public’s interest. If a political leader’s post were mistakenly taken down, for instance, Meta would surely take some flak. Nonetheless, the board said it becomes problematic when “journalists and civil society organizations” aren’t given the same protection.

The board said there is still a lack of transparency around the cross-check system, which needs to change. It said accounts enrolled in such a system should be marked as such, and that Meta should publish key findings about the system so people know how effective it is.

Photo: Dima Solomin/Unsplash
