Yesterday at Web Summit, Matthew Garrahan, News Editor of the Financial Times, was joined (virtually) onstage by Sir Nick Clegg. Clegg is, of course, best known for having served as Deputy Prime Minister of the United Kingdom during the coalition government. After taking a battering at the 2015 General Election, Clegg stepped down as leader of the Liberal Democrats and left politics entirely a couple of years later. Since 2018, he has worked as Vice-President for Global Affairs and Communications at Facebook (now Meta), with a particular focus on regulatory issues surrounding data-intensive companies.
Frances Haugen
The obvious starting point for the interview was the recent allegations made by Facebook whistleblower Frances Haugen. Just a couple of weeks ago, this former employee claimed that senior Facebook executives interfered in company processes “to allow prominent American politicians and celebrities to post whatever they wanted on its social network.” Leaked documents – branded “The Facebook Papers” – also suggested that Facebook’s algorithm is deliberately designed in such a way as to provide people with inflammatory material, such as controversial political messages, that will provoke them into engaging with the platform further. Perhaps the most troubling element of Haugen’s claims is that Facebook deliberately compromises users’ safety, wellbeing or access to the most accurate information in order to maximise profits.
Whilst insisting that “whistleblowers are entirely entitled to blow the whistle,” Clegg challenged some of Haugen’s “fundamental assertions,” particularly those relating to the make-up of the platform’s algorithm:
“[Haugen argued that] Meta algorithmically spoon feeds people, deliberately provides people with extreme, hateful, unpleasant content to keep them sort of perpetually riled up and engaged, because that somehow assists in increasing our profits. […] I’m not even fighting anyone to suggest that everybody working at Meta are angels, far from it, but I think that actually genuinely misreads the commercial self-interest of Meta and apps like Facebook, Instagram and so on.
“And then the reason I say that is because the people who pay, who generate those profits, are of course advertisers. They do not want that content next to unpleasant content. I think Frances Haugen has quite rightly pointed that out herself.
“[…] Our own research shows that users will not continue to use our products if they are getting a bad experience. And that kind of makes sense, because if generating revenue is all about having people look at ads, click on ads, and buy and sell things online, I don’t think riling people up into a semi-permanent state of fury is the best way of actually having people have a pleasant experience, and to look at ads and buy things.
“By the way, that’s the reason why we invest all that money: over $13 billion over the last few years, $5 billion in this year alone. 40,000 people work on trying to bear down on that unpleasant content. We publish every twelve weeks how much hate speech there is on, for instance, Facebook and it now stands at 0.05%. So that means for every 10,000 bits of content you see on your newsfeed, only five will be hate speech. I wish we could get it down to zero; we never will. But I do think it illustrates that our incentives are to keep actually reducing, not amplifying, such content.”
Despite this, it does seem that Facebook/Meta has responded seriously to Haugen’s claims. Indeed, the very timing of the rebranding announcement suggests that the company is trying to distance itself from the controversies surrounding the old Facebook brand. Further, as Clegg outlined, a number of practical measures have been brought forward in light of Haugen’s testimony. Work on “Instagram Kids” has been paused. The company is developing more sophisticated parental controls, so parents can see what their children are seeing and doing online. Internal technology and algorithms are being reworked to nudge teenagers “away from dwelling over the same kind of content over and over again, and to encourage them to take a break.”
A Big Tobacco Moment?
The scope and gravity of Haugen’s claims have led many media commentators and regulators to suggest that Facebook has experienced a “Big Tobacco Moment.” That is to say, the platform has been revealed as fundamentally poisonous and malign, despite the claims of powerful lobbyists. In a sign of how seriously some are taking this, Haugen has gone as far as to suggest that Facebook’s algorithms can undermine the entire democratic process. Unsurprisingly, Clegg “[doesn’t] think the analogy is right at all” and pointed to the good he thinks social media can do for the world:
“Billions of people use these tools – Instagram, Facebook, Whatsapp, Messenger – for free because it’s paid for through advertising. To express themselves, to communicate with loved ones, to find friends, to communicate with friends and family.
“[…] When you get into a sort of pretty sustained media cycle, some of the debate becomes one of caricature.
“[We need to remember] the facts and how these apps operate. So for instance, I keep seeing this description of the Facebook app as somehow being something where we, as individuals, just helplessly receive the content menu served up to us by the Facebook algorithm. In fact, I think over 90% of the content that you see on Facebook comes from three simple signals generated by yourself. Firstly, the friends you have. Secondly, the groups you’re part of. Thirdly, the pages that you follow. So less than 10% – which of course is still a lot on platforms as big as Facebook – is what we call “unconnected” content.”
Yet what is clear is that, despite these undoubted virtues, regulators are increasingly upping the ante. A sweeping range of new laws is likely, designed to improve transparency and address the impact social media platforms can have on civil society. The political power held by “Big Tech” and its billionaire CEOs is an issue of growing importance. Clegg insisted that Meta would welcome well-thought-out regulation; it is, after all, still ultimately a young company in a young industry that would perhaps benefit from some direction. It may, of course, have no choice in the matter.
Misinformation overseas
An interesting point raised by the host was the effect that misinformation spread on Facebook has on less developed countries. Much of the debate around this issue has centred on certain elections in the United States and the United Kingdom, whilst ignoring the much greater effect such trends can have in more fragile and unstable countries. Garrahan pointed out that, of the money allocated to fight disinformation, 87% is spent in the US. Budgets for countries in the Arab world, for example, are very low. Often, Facebook/Meta does not have the relevant language capabilities or on-the-ground knowledge to deal with these issues when they arise. This is, Clegg agreed, a problem – but one that the company is getting better at combatting:
“We made a big change in 2019; we set up a new human rights team and a new “at risk countries” team. We obviously learned some pretty searing lessons from what happened in Myanmar and we opened ourselves up to some independent reports to look at what went well and what went wrong there.
“I do think we’ve made significant progress. We now have content moderation in over 70 countries and we’re adding them all the time. So this year alone, for instance, we’ve hired content reviewers in twelve new languages including Haitian Creole, Kirundi, Tswana and Kinyarwanda.”
As Nick Clegg’s appearance at Web Summit made clear, we are at a very interesting crossroads in the evolution of tech giants like Meta. Their huge growth over the past decade or so has propelled them to a position so significant in society that regulators and governments simply must address them in some way. The massive gains the FAANG group made on the stock exchange during the pandemic also lend them a large degree of economic power. Any regulatory oversight therefore has to be well-targeted, balanced and delicate to avoid a shock that could destabilise the wider markets. The challenge for governments and parliaments in drafting such regulation, and for companies like Meta in implementing it effectively, will be considerable.
Author: Harry Clynch
#Facebook #Meta #Regulation #WebSummit #FrancesHaugen #FAANG