Does Facebook’s content policy vary depending on who you are? Read on to find out!
On September 13, 2021, the Wall Street Journal (WSJ) published a detailed investigation into Facebook’s content policies titled “The Facebook Files”. What jumped out from this investigation was the “XCheck”, or cross-check, program used by Facebook, which allegedly gives some of the company’s high-profile users special treatment.
The list of those allowed to override Facebook’s rules includes former U.S. President Donald Trump, U.S. Senator Elizabeth Warren and Facebook founder Mark Zuckerberg. Known as the cross-check list, it covered 5.8 million people as of 2020.
To get clarity on what this figure means, let’s try to understand what this cross-check system is and what it looks like in practice.
Keeping Facebook safe with cross-checking
In 2018, Facebook published a detailed post that read, “we want to make clear that we remove content from Facebook, no matter who posts it, when it violates our standards”. It explained that no group receives special protection, no matter what side of the political spectrum it falls on.
However, despite what Facebook might say, its cross-checking system means that content from certain pages and profiles is given an additional review. This is done to ensure that content regulation policies have been applied accurately. The company gave the example of a civil rights activist (whose name was not mentioned in its statement) whose content was cross-checked so that his attempts at raising awareness about hate speech would not be removed from the platform. Content put out by media organizations like the BBC, The Verge and Channel 4 is also cross-checked under Facebook’s policy.
Facebook and content regulation
In the same year, Facebook also released its 27-page list of community standards. The exhaustive list covers a wide range of topics, from hate speech to sexual abuse, that are prohibited on the platform.
These community standards came soon after Zuckerberg was called to testify before the U.S. Congress about Facebook’s alleged involvement with the British consulting firm Cambridge Analytica. The firm had harvested the data of millions of Facebook users and used it to bolster the election campaigns of Ted Cruz and Donald Trump in the 2016 U.S. presidential election.
“It’s clear now that we didn’t do enough to prevent these tools from being used for harm. That goes for fake news, foreign interference in elections and hate speech, as well as developers and data privacy,” Zuckerberg said to Congress in April 2018, taking responsibility for the data scandal.
Facebook attempted to be more transparent with its users after concerns over data misuse flared up. With these new community standards, Facebook began informing users what their content was being flagged for.
Whitelisting
WSJ details how some high-profile users’ content is exempt from Facebook’s guidelines. The newspaper cited the example of Brazilian football player Neymar, who responded to rape accusations in 2019 by posting the accuser’s nude photos and conversations on Facebook.
Under Facebook’s community standards on the sexual exploitation of adults, the post should have been taken down. Instead, it stayed up and was viewed by over 56 million users before being removed. Neymar’s account received only a single strike and was not deleted, as Facebook’s standard protocol would have required.
WSJ dubbed this favoritism “whitelisting”. An internal Facebook review of the incident found that whitelisting poses “numerous legal, compliance, and legitimacy risks for the company and harm to our [Facebook’s] community”. The review added that this favoritism is not publicly defensible.
Putting out media fires
Facebook’s communications director Andy Stone took to Twitter the same day WSJ released the findings of its investigation. He reiterated that Facebook had been open about cross-checking when the program was first introduced in 2018.
“Since 2019, when we, ourselves, promoted that the company would take this approach to politicians’ speech, there have been literally hundreds of news stories critical of our approach,” Stone said, clarifying the company’s stance on free speech for politicians. He noted that the WSJ’s investigation cites Facebook’s own analysis of the need to improve its cross-checking program.
The company has been making changes since WSJ’s investigation. The Facebook Oversight Board will release its first transparency report, with an update on the cross-check issue, next month. These efforts to address the problem suggest that Facebook may be more transparent about its practices in the future.
Header image courtesy of Unsplash