Wednesday, June 29, 2022

How Facebook Undermined the Oversight Board


Shreya Christina (cafe-madrid.com)

Today, let’s talk about the most notable conflict to date between Meta and its Oversight Board, an independent organization the company founded to help it navigate the toughest questions related to policy and content moderation.

Since before the board was even created, it has faced criticism that it primarily serves a public relations function for the company formerly known as Facebook. The board relies on Meta for funding, it has a contractual relationship with the company governing the use of user data, and its founding members were handpicked by the company.

Contributing to the perception that it is primarily a PR project is the fact that Meta and the board have rarely clashed so far. In the first quarter of its existence, the board made 18 recommendations to Meta, and the company implemented 14. And while it often rules against Facebook’s content moderators, ordering deleted posts to be restored, none of those reversals has sparked any significant controversy. (From Facebook’s perspective, the more the board overrules it, the more credible the board appears, and the more blame it can absorb for unpopular calls.)

That’s what made this week’s statements, published by both sides, so remarkable.

After the Russian invasion of Ukraine in February, Meta asked the board to issue an opinion on how to moderate wartime content. The conflict had raised a series of difficult questions, including under what circumstances users can post photos of dead bodies, or videos of prisoners of war criticizing the conflict.

And in the most prominent content moderation issue of the invasion yet, Meta decided to temporarily allow calls for violence against Russian soldiers, Vladimir Putin, and others.

All of this raised important questions about the balance between free speech and user safety. But after asking the board to weigh in, Meta changed its mind and asked the board not to say anything at all.

From the company’s blog post:

Late last month, Meta withdrew a request for a policy advisory opinion (PAO) related to the Russian invasion of Ukraine, which had previously been referred to the Oversight Board. This decision was not taken lightly – the PAO was withdrawn due to ongoing safety and security concerns.

Although the PAO has been withdrawn, we stand by our efforts related to the Russian invasion of Ukraine and believe we are taking appropriate steps to protect speech and balance the lingering security concerns on the ground.

In response, the board said in a statement that it was “disappointed” by the move:

While the Board understands these concerns, we believe the request raises important issues and are disappointed by the company’s decision to withdraw it. The Board also notes that the withdrawal of this request does not diminish Meta’s responsibility to carefully consider the ongoing content moderation issues arising from this war, which the Board continues to monitor. Indeed, the importance for the company of defending freedom of expression and human rights has only increased.

Both statements were extremely vague, so I spent a day talking to people familiar with the case who could tell me what had happened. Here is what I learned.

One of the most disturbing trends of the past year has been the way authoritarian governments in general, and Russia in particular, have used intimidation of workers on the ground to force platforms to do their bidding. Last fall, Apple and Google both removed from their respective stores an app that would have allowed anti-Putin forces to organize before an election. In the aftermath, we learned that Russian agents had personally threatened their employees with jail time or worse.

The lives of those workers — and their families — have only gotten harder since Putin’s invasion. The country has passed draconian laws prohibiting truthful discussion of the war, and the combination of those laws and United States and European sanctions has forced many platforms to withdraw services from Russia entirely.

In the wake of Meta’s decision to allow calls for violence against the invaders, Russia said Meta was engaged in “extremist” activities. That could potentially put hundreds of Meta employees in jail. And while the company has now successfully moved its employees out of the country, the extremist designation could mean they will never return as long as they work at Meta. It could also mean that workers’ families still in Russia face persecution.

There is precedent for both outcomes under Russia’s extremism laws.

So what does the Oversight Board have to do with it?

Meta had asked for a fairly broad opinion about its approach to moderation and Russia. The board has already shown a willingness to make far-reaching policy recommendations, even in smaller cases submitted by users. After seeking advice, the company’s legal and security teams became concerned that anything the board said could be used in some way against employees or their families in Russia, now or in the future.

Technically, the Oversight Board is a separate entity from Meta. But many Westerners still refuse to recognize that distinction, and the company’s lawyers feared Russia wouldn’t either.

All of this is compounded by the fact that tech platforms have so far received little to no support from the United States or the European Union in their struggle to keep key communications services in Russia and Ukraine running. It’s not clear to me what Western democracies could do to allay platforms’ fears about how Russia would treat workers and their families. But conversations with executives at several major tech companies over the past year have made it clear that they all feel like they’re on their own.

That said, the news still deals a blow to the Oversight Board’s already fragile credibility — and arguably reduces its value to Facebook. The company spent several years and $130 million establishing an independent body to advise it on policy issues. Asking that body for advice – advice that wouldn’t even be binding on the company – and then deciding too late that such advice could be dangerous casts doubt on the point of the whole enterprise. If the only role of the board is to handle the easy questions, why bother?

Facebook and the board declined to comment to me beyond their statements. It’s fair to note that despite the reversal here, the company has resisted Russia in a number of key ways, including sticking to its decision to let Ukrainians call for Putin’s death. Meta could have caved to Russia after that and chose not to.

At the same time, we are once again seeing Facebook executives fail to understand risks and public perception at a critical moment. Russia has been threatening platform workers since at least last September. Whatever danger there was to employees and their families existed long before Facebook asked its board for advice. Realizing that only weeks later… well, talk about a mistake.

I’m on record saying that the Oversight Board has changed Facebook for the better. And when it comes to authoritarians threatening platform workers, tech companies have alarmingly few options at their disposal. Russia, in this as in so many other situations, presented a true no-win scenario.

But that doesn’t mean the episode won’t do collateral damage to both Meta and its board. Critics have always feared that if the stakes ever got high enough, Facebook would blink and decide to make all the relevant decisions itself. Then Vladimir Putin went and invaded his neighbor, and the critics were proven right.
