Civil discourse and social media: Can they coexist? Part 2
by Nadia Diakun-Thibault
Frances Haugen, a former Facebook data engineer and product manager, disclosed tens of thousands of Facebook’s internal documents to the Securities and Exchange Commission and The Wall Street Journal in 2021. She has since appeared before the US Congress and the UK Parliament. Both the United States and the United Kingdom are considering legislation that would rein in the negative effects of social media content on children.
On October 5, 2021, U.S. Senator Richard Blumenthal (D-CT), Chair of the Subcommittee on Consumer Protection, Product Safety, and Data Security, noted in his opening remarks that, “The damage to self-interest and self-worth inflicted by Facebook today will haunt a generation. Feelings of inadequacy and insecurity, rejection, and self-hatred, will impact this generation for years to come. Our children are the ones who are victims. Teens today looking at themselves in the mirror, feel doubt and insecurity. Mark Zuckerberg ought to be looking at himself in the mirror today. And yet, rather than taking responsibility and showing leadership, Mr. Zuckerberg is going sailing. His new modus operandi, no apologies, no admission, no action, nothing to see here.”
Ms. Haugen’s testimony confirmed that the social network platform with global reach, along with ancillary platforms like Instagram, was profiting not only from advertisers, but from the social engagement of young adults. Her testimony also made clear that the algorithmic ranking methods Facebook used were dangerous.
“Facebook knows that their engagement-based ranking, the way that they pick the content in Instagram for young users, for all users, amplifies preferences,” she testified.
She added, “And they have done something called a proactive incident response where they take things that they’ve heard, for example, can you be led by the algorithms to anorexia content? And they have literally recreated that experiment themselves and confirmed, yes, this happens to people. So, Facebook know that they are leading young users to anorexia content.”
But children are not the only ones affected by the corporate practices and algorithms of Facebook. The amplification of content, sometimes 600-fold, can lead to breaches of public order, culminating in events like the January 6, 2021, insurrection and attack on the United States Capitol.
There are no geographical boundaries for such effects. In her appearance before the U.K. Parliament, Ms. Haugen stated, “I have no doubt that the events we’re seeing around the world, things like Myanmar, Ethiopia, those are the opening chapters, because engagement-based ranking does two things: one, it prioritizes and amplifies divisive, polarizing, extreme content; and two, it concentrates it. …Facebook comes back and says, ‘Only a tiny sliver of content on our platform is hate’, or, ‘Only a tiny sliver is violence’. One, they can’t detect it very well, so I don’t know if I trust those numbers; but two, it gets hyper-concentrated in 5% of the population. And you only need 3% of the population on the streets to have a revolution, and that’s dangerous,” she said.
Two instances in Ms. Haugen’s testimony illustrate the very real dangers, not of ‘algorithms’ as such, but of their misuse. When profit is the primary motive, a company can turn a blind eye to the harm its practices cause. And when an organization such as Facebook chooses to ignore its own internal evidence of that harm, the result is, as Senator Blumenthal noted, no apologies, no admission, and no action.
Amplification in an Echo Chamber:
In her UK testimony, Frances Haugen highlighted the effects of ‘Groups’: “One of the things that happens in aggregate is the algorithms take people who have very mainstream interests, and they push them towards extreme interests. You can be someone centre-left, and you’ll get pushed to radical left. You can be centre-right, you’ll be pushed to radical right. You can be looking for healthy recipes, you’ll get pushed to anorexia content.” She also underlined that, “There are examples in Facebook’s research of all of this.”
“One of the things that happens with groups, and with networks of groups, is that people see echo chambers that create social norms. … When that context is around hate, now you see a normalization of hate and normalization of dehumanizing others. And that’s what leads to violent incidents,” she concluded.
These extracts from Ms. Haugen’s testimony, both before Congress and the UK Parliament, should be clear evidence that a ‘global’ platform has ‘global’ effects without constraints, restraints, or care.
Canada must follow suit with legislation to protect children and young adults, to ensure that misinformation and disinformation are checked at their origin, and to hold their purveyors accountable.