In Perspective | Whistle-blower Haugen’s critique of Facebook, and the way out
Frances Haugen, a former employee, testified before the United States (US) Senate on the impact of Facebook’s policies and technologies, citing case studies and company research that substantiate what has long been suspected: Facebook and its products are creating and exacerbating divisions, hate and harm in today’s society.
“I’m here today because I believe Facebook’s products harm children, stoke division, and weaken our democracy,” she said in her opening statement to lawmakers. “These problems are solvable. A safer, free-speech respecting, more enjoyable social media is possible. But there is one thing that I hope everyone takes away from these disclosures, it is that Facebook can change, but is clearly not going to do so on its own.”
There were several takeaways from Haugen’s testimony. Three stand out.
Facebook’s engagement-based algorithms and financial incentives are at the heart of the damage the company is causing and must be regulated; the company’s trove of data needs to be opened to outside researchers; and there is unanimity among American lawmakers that regulation is necessary.
Haugen is right
What makes Haugen’s position compelling is the internal communications and company research she has leaked from Facebook.
Till now, evidence of the harm has been seen in outcomes. For instance, senator Richard Blumenthal said his team created a dummy Instagram account identifying as a 13-year-old girl, and after following “easily findable” accounts on extreme dieting, the social media service began to recommend accounts promoting self-injury.
Now, a document leaked by Haugen shows Facebook itself made similar findings of how its products push people to extreme content. “Carol’s Journey to QAnon”, a case study cited in the documents, showed that a test user created by a Facebook researcher was exposed to polarising content “within one day” after they followed conservative pages (Fox News, Donald Trump, etc). The company’s recommendations algorithm “began to include conspiracy recommendations after only two days” and it took less than a week to get a QAnon recommendation.
The company research weakens Facebook’s defence that divisions that manifest on its products are a reflection of real-world factors.
Haugen adds that the company has dragged its feet on fixing these technological problems because it prioritises growth over safety. The company has denied that characterisation, suggesting again that the threats to safety are borne out of divisions and polarisations that persist offline.
So may be her solution
The former Facebook employee told lawmakers that they must step in to regulate the company. “If we had appropriate oversight, or if we reformed [Section] 230 to make Facebook responsible for the consequences of their intentional ranking decisions, I think they would get rid of engagement-based ranking,” Haugen said. “Because it is causing teenagers to be exposed to more anorexia content, it is pulling families apart, and in places like Ethiopia, it’s literally fanning ethnic violence.”
How does an algorithm do that? Engagement-based ranking means that the part of the computer code deciding which posts you see most prominently on your screen is based on a prediction of what you are most likely to click on (or pause to read, or reshare).
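The mechanism can be sketched in a few lines of illustrative code. This is a hypothetical toy model, not Facebook’s actual algorithm: the weights, field names and probabilities here are all assumptions made for illustration.

```python
# Hypothetical sketch of engagement-based ranking (illustrative only):
# each post carries predicted probabilities of the viewer engaging with it
# (clicking, pausing to read, resharing), and the feed sorts posts by a
# weighted combination of those predictions.

def engagement_score(post):
    """Combine predicted engagement probabilities into one ranking score."""
    return (0.5 * post["p_click"]       # predicted chance of a click
            + 0.2 * post["p_pause"]     # predicted chance of pausing to read
            + 0.3 * post["p_reshare"])  # predicted chance of a reshare

def rank_feed(posts):
    """Order the feed so posts predicted to engage most appear first."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "calm_news",    "p_click": 0.10, "p_pause": 0.30, "p_reshare": 0.02},
    {"id": "outrage_bait", "p_click": 0.60, "p_pause": 0.20, "p_reshare": 0.40},
]
feed = rank_feed(posts)
# The post predicted to provoke more engagement is ranked first,
# regardless of whether that engagement is healthy or harmful.
```

The point of the sketch is that nothing in such a scoring function asks whether content is harmful; it optimises only for predicted engagement, which is the crux of Haugen’s critique.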
In essence, she suggests carving out a distinction in a piece of American law – Section 230 of the Communications Decency Act – so that companies no longer enjoy immunity for the harm that their algorithms cause.
The suggestion is crucial because Section 230, enacted in 1996, has become the subject of much scrutiny since online speech has begun to translate into greater offline harm. Section 230, to simplify it, lays down that companies will not be liable for what their users post, even if they take editorial decisions in good faith.
But the solution isn’t easy
And therein lies the challenge. Both Democrats and Republicans have assailed Section 230 — the Democrats say the law gives companies a free pass when they leave up hate content, while the GOP says companies crack down on free speech by using the section as a shield.
This tussle largely refers to how content — user posts — is moderated. It is significantly distinct from the algorithms that Haugen suggests need to be more controlled.
An attempt to fix a technically complex problem could easily confuse the two, especially if the lawmakers do not rise above their partisan considerations.
As it is, lawmakers have not been adequately aware of the nuances involved. For instance, senator Blumenthal pushed an Instagram representative to commit to “ending finsta” – seemingly unaware of what that meant. Finsta refers to young people setting up anonymous accounts – pseudonymity is a crucial concept for privacy.
Accurate awareness, however, is just one of the challenges that lie ahead in tech regulation.
Facebook, the leaked documents indicate, sits on a trove of data relating to deeper social interactions. This makes the second of the takeaways from Haugen’s testimony cited above important — independent researchers must have access to this data in order to identify the potentially vast array of unforeseen problems that must be kept in mind when designing any regulatory framework. This in itself will be challenging since it involves sensitive user data and proprietary company information.
Another challenge will be to ensure that any regulation adding a compliance burden does not end up giving an advantage to Big Tech. Upstart companies, which could challenge its domination, are likely to struggle with the cost of compliance in a way that some of the world’s richest companies will not.
Over the course of Haugen’s testimony last week, it was clear that lawmakers have a strong appetite to step in. The tide may indeed be turning against Facebook. But it remains to be seen how far that tide will reach, or whether it turns into a tsunami of overreach.
In Perspective takes a deep dive into current issues in the world of tech and science, the visible and invisible factors at play, and their implications for the future
The views expressed are personal