Opinion: This may be the only way to prevent social media from harming our children | CNN

Editor’s note: Kara Alaimo, Associate Professor of Communication at Fairleigh Dickinson University, writes about issues affecting women and social media. Her book, “This Feed Is on Fire: Why Social Media Is Toxic for Women and Girls — And How We Can Reclaim It,” will be published by Alcove Press in 2024. The views expressed in this commentary are her own. Read more opinions on CNN.


Under a proposed amendment to an online safety bill, tech executives in Britain could face time behind bars if they willfully ignore rules designed to protect children online.

As currently written, the bill would require social media companies to identify and remove content that promotes self-harm, including content that glorifies suicide, and to bar children under 13 from their platforms. In a written statement to parliament, Secretary of State for Digital, Culture, Media and Sport Michelle Donelan said tech leaders acting in “good faith” will not be affected, but those who “consent or collude” in breaking the new rules could face jail time.

Let’s hope this bill passes. For too long, technology leaders have shied away from responsibility for the harmful effects their products can have on the people who use them. And while it’s unlikely that a law similar to this amendment to the UK bill will ever pass in the US, given its fiercely pro-business climate, broad constitutional protection of free speech and regulations that limit the liability of internet platforms for what their users post online, other countries should consider similar sanctions for tech executives.

The tech industry, of course, disagrees. TechUK, an industry trade association, has said the prospect of jail time would not make social media any safer for children but would discourage investment in the country. But I think this law would do just the opposite: It would serve as a wake-up call to tech leaders that they are responsible for what the products they build do to the people who use them.

Part of the reason tech executives have evaded personal responsibility for their impact on society for so long is the way we think about social media. We talk about what happens “in real life” to distinguish it from what happens online. But the effects that social networks have on users, especially children, are often very much felt in “real” life.

For example, in September, a British coroner ruled that the “negative effects of online content” were partly to blame for the suicide of 14-year-old Molly Russell. The Guardian reported that data from Meta showed that, in the six months before she took her own life in 2017, Molly viewed 2,100 pieces of content related to self-harm, depression and suicide on Instagram.

Instagram’s parent company Meta admitted that Molly viewed content that violated its community standards, and in 2019 it added new policies against graphic images depicting self-harm. It also began offering resource links to users viewing depressing content.

But, in 2021, US Senator Richard Blumenthal’s staff created an account purporting to be that of a 13-year-old girl and followed accounts promoting eating disorders. Instagram then promoted eating disorder accounts with names like “forever hungry.” Instagram told CNN that it removed the accounts and that they should not have been allowed in the first place as they violated the platform’s rules against content that promotes eating disorders.

And a terrifying report published last month by the Center for Countering Digital Hate describes what happened when researchers created TikTok accounts posing as 13-year-old boys that briefly paused on and liked mental health and body image content. Within 2.6 minutes, TikTok was recommending suicide-related content.

Within eight minutes, the platform recommended content on eating disorders. When an account used a name that suggested the user was vulnerable to an eating disorder, TikTok offered up even more of this type of gruesome content. TikTok has said the content the researchers viewed does not reflect what other users see, citing the study’s limited sample size and time constraints, and that it removes content that violates its standards and provides resources for those who need them.

And former Facebook employee turned whistleblower Frances Haugen revealed in 2021 that Meta is well aware of the ill effects Instagram has on some younger users. But Haugen said the company chooses to prioritize making money over protecting children. Meta has said it is developing parental monitoring features and controls to help teens regulate their use of Instagram, and CEO Mark Zuckerberg has called Haugen’s characterization of the company false.

In the United States, members of Congress have passed just two laws regulating how companies interact with children online in the last 25 years: one requiring parental consent before sites can collect data on children under the age of 13, and one holding sites accountable for facilitating human trafficking and prostitution.

There is no reason why technology leaders should be exempt from responsibility for what their products can do to users. This amendment in the United Kingdom should also be a wake-up call to parents and other social media users about the dangers we and our children can face online.

If jail sounds draconian, it’s nothing compared to the price Molly Russell and her family have paid. But five years after her suicide, social platforms continue to deliver the same type of toxic content to vulnerable youth. This must stop, even if it requires putting tech executives behind bars.
