Facebook is to create a court-style board to rule on site content © AFP

When Facebook said it would create a court-style board to rule on site content, the social media group’s expansive influence on society was again drawn into the spotlight.

The announcement in September — shortly after a controversial plan by the company to launch a global virtual currency — promises to let Facebook users challenge its decisions.

The move was widely seen as an attempt to appease critics and fend off charges of political bias and censorship.

Despite employing 15,000 content reviewers, Facebook has been accused of being too slow to remove illegal content, such as live footage from terrorist shootings, and of inconsistent decisions about acceptable content for its site.

“Lawmakers are clearly gunning for Facebook,” says Phillip Souta, a technology specialist at law firm Clifford Chance. “Facebook is saying: ‘We get it, we’re doing what we can about contentious content. Give us some time.’ ”

Like other large social media companies, Facebook is spending more time and money monitoring and censoring extreme messages, photos and videos published by its 2.45bn users.

Facebook’s “oversight board” is due to start deliberating next year, and is being presented as the most ambitious step yet by Big Tech to create an impartial system for content arbitration. The board, whose 40 members will include no Facebook employees and will be paid through a trust funded by the company, has been tasked with balancing freedom of speech against protection from unlawful content and hate speech.

Moderating content is problematic for all big social media platforms, says Sarah Roberts, assistant professor of information studies at University of California, Los Angeles. “They wish they could be out [of content moderation]. But that’s not going to happen and if anything, it’s going to increase.”

Technology and Society illustration © Øivind Hovland

When a Facebook user disagrees with how the company enforces its rules, the user will be able to request an appeal to the oversight board.

Facebook says the board will be independent, and that its content decisions will be binding in most cases — the exception being potential violations of local law — even if Facebook disagrees. The board is likely to consider serious cases that have a wider “real-world impact”, Facebook says.

Potential cases could include disagreement with Facebook’s decision to remove content judged to be hate speech or bullying; removing a page promoting terrorism; or allowing nudity on the site because the content is deemed newsworthy. “We make millions of content decisions every week,” wrote Mark Zuckerberg, Facebook founder and chief executive. “I don’t believe private companies like ours should be making so many important decisions about speech on our own.”

Experts have mixed views on the board. Dia Kayyali of Witness, a group that helps people use video and technology to protect human rights, says the board is a “serious attempt” to make decisions about Facebook content.

Tommaso Valletti, professor of economics at Imperial College Business School and former chief economist of the European Commission, says the board is a “PR stunt” that at best will “take the blame on Facebook’s behalf”.

Facebook should be overseen by governments and regulators, not by a body the company itself has chosen and whose function depends on Facebook’s continued existence, Mr Valletti says.

Wolfie Christl, a technology researcher and digital rights activist, is also unimpressed by the board. “Facebook may use it as a cheap tool to outsource responsibility and shield its business practices from real democratic oversight,” he says.

Alexander Brown, a reader in political and legal theory at the University of East Anglia who is writing a report about online hate speech for the Council of Europe, says the move is a positive one. Facebook’s commitment to abide by the board’s rulings is unprecedented and will create accountability and transparency in content moderation, he says.

However, Facebook’s decision that the board will not consider cases of potentially illegal content could limit its scope and usefulness, he says.

This distinction is a “missed opportunity” and “perplexing”, Mr Brown says.

Some experts say Facebook’s oversight board may be an attempt to pre-empt government regulation of content on social networks. In Europe, for example, Facebook is under pressure to remove illegal content faster or be fined.

Bernie Hogan, senior research fellow at the Oxford Internet Institute, says there could be a conflict of interest if a board decision is against Facebook’s commercial interests.

For example, if it ruled that Facebook was correct to publish news and images about anti-Beijing protests in Hong Kong, there could be a backlash against Facebook from Chinese authorities.

Facebook says a law firm is screening potential board members for “actual or perceived conflicts of interest” that could compromise their decisions.

However, some experts are reserving their judgment. “I’ll be convinced by Facebook’s [oversight] board when it makes a decision on content that slows Facebook’s growth,” Mr Hogan says.
