
Inside Meta’s Oversight Board: two years of pushing boundaries

    The matter eventually came up at a March 2022 meeting with Clegg, who seemed taken aback by the board members’ frustration. He promised to break the logjam, and a few weeks later the board finally got the tool it should have had from the start. “We had to fight them to get it, which was mind-boggling,” said Michael McConnell, a Stanford law professor who is one of the board’s co-chairs. “But we succeeded.”

    That skirmish had hardly been settled when another incident roiled the waters. When Russian troops invaded Ukraine last February, Facebook and Instagram were quickly flooded with questionable, even dangerous content. Messages promoting violence, such as “death to the Russian invaders,” were a clear violation of Meta’s policies, but banning them could suggest that the company was siding with those invaders. In March, Meta announced that it would temporarily allow such violent speech in this exceptional case. It turned to the board for support and requested a policy advisory opinion. The board accepted the request, eager to weigh in on the human rights issues involved. It issued a statement and made arrangements to brief reporters on the upcoming case.

    But just before the board announced its new case, Meta abruptly withdrew the request. The stated reason was that an investigation could put some Meta employees at risk. The board formally accepted the explanation but challenged it in private meetings with the company. “We made it very clear to Meta that it was a mistake,” said Stephen Neal, the chairman of the Oversight Board Trust, noting that if security had really been the concern, that would have been apparent before Meta asked for the policy advisory opinion.

    When I asked whether Neal suspected that the board’s opponents wanted to prevent it from meddling in a hot-button issue, he didn’t deny it. In what appeared to be an implicit rebuke, the board later took up a case that addressed exactly the issues raised by Meta’s withdrawn request. It concerned a Russian-language post by a Latvian user showing a body, presumed dead, lying on the ground, and quoting a famous Soviet poem that reads: “Kill the fascist so he will lie on the ground’s backbone … Kill him! Kill him!”

    Other members also noted the mixed feelings within Meta. “There are plenty of people in the company that we’re more of an annoyance to,” McConnell says. “Nobody really likes people looking over their shoulder and criticizing.”

    Since the board members are seasoned figures who were probably chosen in part because they are not bomb-throwers, they are not the type to declare war on Meta. “I don’t approach this job thinking that Meta is bad,” said Alan Rusbridger, a board member and former editor of The Guardian. “The problem they are trying to solve is one that no one on Earth has ever tried to solve before. On the other hand, I think there’s been a pattern of them having to be dragged kicking and screaming into giving us the information we’re looking for.”

    There are worse things than no information. In one case, Meta gave the board the wrong information, which may soon lead to the board’s most damning decision yet.

    During the Trump case, Meta researchers had mentioned to the board a program called Cross Check. It essentially gave special treatment to the accounts of certain politicians, celebrities, and the like. The company characterized it to the board as a limited program involving only “a small number of decisions.” Some board members saw it as inherently unfair, and in their recommendations in the Trump case, they asked Meta to compare the error rates in its Cross Check decisions with those on ordinary posts and accounts. In short, the members wanted to make sure that this odd program wasn’t a get-out-of-jail-free card for the powerful.