Last month, when a transcript of a Facebook internal Q&A was leaked online, it was Mark Zuckerberg’s frank take on what an Elizabeth Warren presidency would mean for big tech that hit the headlines (“It would suck”). However, his remark that “one of the things that I love and appreciate about our country the most is that we have a really solid rule of law” was perhaps the more surprising.
Given Facebook’s legal entanglements over the past couple of years, Zuckerberg was at pains to emphasise that the company is taking recent challenges very seriously. Their recent rebrand (as FACEBOOK) and their CMO’s admission that they “considered a name-change” make it clear that the company is feeling the reputational impact of recent scandals. It is also evident that regulators and courts across jurisdictions are increasingly comfortable taking on the social media giants. Therefore, with or without a President Warren, Facebook’s biggest challenges could still lie ahead of them.
One such challenge has arisen from the European Court of Justice (ECJ), which ruled that Facebook can be held responsible for hate speech or defamatory statements on its platform worldwide, and can be ordered to remove this harmful content or risk being found non-compliant. This is significant because, on similar matters, Facebook and other social media sites have typically tried to distance themselves from responsibility, arguing that they are platforms rather than publishers. The judgment clearly signals a rejection of that longstanding justification for inaction. Its reach is also extensive, even compared to other ECJ rulings such as the Google right to be forgotten judgment, which was restricted to within EU borders. It is a clear call from the ECJ for Facebook and its peers to step up and do more to proactively halt the spread of harmful content.
The ECJ’s decision is not the only indication of the increasing legal burdens being placed on social media companies across multiple jurisdictions. This year’s Online Harms White Paper, published by the UK government, also mapped out a vision of the internet in which social media companies are subject to far greater scrutiny. Like the ECJ’s decision, the White Paper called for social media platforms to tackle online harms within their own services, treating this as an obligatory duty of care. The penalties for non-compliance mooted by the White Paper are severe, including the possibility of platforms being blocked in the UK. Though the White Paper is currently in a consultation phase, the UK government, like the ECJ, has made its intention to hold social media sites to account very clear.
Facebook need to think about the practical measures they can implement to comply with these new rules if and when they are enforced. Due to the sheer volume of content on the platform, the process is likely to be burdensome and costly. The controversy earlier this year over the working conditions of Facebook’s content moderators – work the company outsources according to demand – has shown the problems the platform faces even in a more lenient regulatory environment. Concerns have already been raised that over-reliance on automated moderation could lead to a heavy-handed approach that adversely affects free speech – Facebook’s competitor TikTok was recently lambasted in the press for its zealous approach to moderation. To compound matters, it is likely to be some years before automated moderation becomes indistinguishable from human moderation.
In playing down the prospect of a future legal challenge from a President Warren, Zuckerberg argued that it is the bigger tech companies that have the resources to develop this kind of advanced moderation capability. Aside from living up to that claim, the challenge for Zuckerberg over the coming months and years will be convincing Facebook’s many stakeholders – the President (whoever that may be), regulators, the media and Facebook’s users – that the company is taking seriously the growing demand for them to take ownership of the content on their platform, and to keep it safe in a way that does not encroach upon freedom of speech.