Opinion: This may be the only way to stop social media from harming our children

Editor's Note: Kara Alaimo, associate professor of communication at Fairleigh Dickinson University, writes about issues affecting women and social media. Alcove Press will publish her book, “This Feed Is on Fire: Why Social Media Is Toxic to Women and Girls – And How We Can Reclaim It,” in 2024. The views expressed in this commentary are her own. Read more opinion on CNN.



CNN – Tech executives could face the prospect of time behind bars in Britain if they willfully ignore rules designed to protect children online, under a proposed amendment to an online safety bill.

As currently written, the bill would require social media companies to identify and remove content that encourages self-harm, including content that glorifies suicide, and to keep children under the age of 13 off their platforms. In a written statement to Parliament, Secretary of State for Digital, Culture, Media and Sport Michelle Donelan said that tech leaders who act “in good faith” would not be affected, but those who “consent or connive” in ignoring the new rules could face jail time.

I hope this bill passes. For too long, technology leaders have shied away from responsibility for the harmful effects their products can have on the people who use them. And while a law like this amendment to the UK bill is unlikely to ever work in the United States – given the country’s fierce pro-business climate, broad constitutional protections for free speech, and regulations that shield internet platforms from liability for what their users post online – other countries should consider similar penalties for tech executives.

The tech industry disagrees, of course. TechUK, the country’s industry trade association, has said the prospect of jail time would not make social networks safer for children but would discourage investment in the country. But I think this law would do just the opposite: serve as a wake-up call to technology leaders that they are accountable for what their products do.

Part of the reason technology executives have avoided personal responsibility for their impact on society for so long is the way we think about social media. We talk about what happens “in real life” to distinguish it from what happens online. But the effects of social networks on users – especially children – are often all too real.

For example, in September, a British coroner ruled that “the negative effects of online content” were partly to blame for the suicide of 14-year-old Molly Russell. The Guardian reports that in the six months before she took her own life in 2017, data from Meta showed Molly viewed 2,100 pieces of content related to self-harm, depression and suicide on Instagram.

Meta, Instagram’s parent company, acknowledged that Molly saw content that violated its community standards, and in 2019, added new policies against graphic images depicting self-harm. It also began offering resource links to users looking at depression content.

But, in 2021, US Senator Richard Blumenthal’s staff set up an account pretending to be a 13-year-old girl and followed accounts that promoted eating disorders. Instagram then promoted eating disorder accounts with names like “eternal hunger”. Instagram told CNN that it removed the accounts and that they should not have been allowed in the first place since they violated the platform’s rules against content that promotes eating disorders.

And a shocking report released last month by the Center for Countering Digital Hate describes what happened when researchers set up TikTok accounts purporting to belong to 13-year-olds, which briefly viewed and liked content about mental health and body image. Within 2.6 minutes, TikTok was showing them suicide content.

Within eight minutes, the platform was recommending content about eating disorders. When an account used a name suggesting the user was vulnerable to an eating disorder, TikTok served up even more of this kind of horrific content. TikTok said the content the researchers saw doesn’t reflect what other users see because of the study’s limited sample size and time constraints, and said it removes content that violates its standards and provides resources to those who need them.

And former Facebook staffer Frances Haugen pointed out in 2021 that Meta is well aware of the harmful effects Instagram has on some younger users. But Haugen said the company chooses to prioritize making money over protecting children. Meta said it is developing controls and parental supervision features to help teenagers regulate their Instagram use, and CEO Mark Zuckerberg called Haugen’s characterization of the company false.

In the United States, members of Congress have passed only two laws regulating how companies interact with children online in the past 25 years – one requiring parental consent before sites can collect data about children under the age of 13, and one holding sites responsible for facilitating human trafficking and prostitution.

There is no reason technology leaders should be immune from liability for what their products can do to users. This amendment in the UK should also be a wake-up call to parents and other social media users about the dangers we and our children may face online.

If prison sounds harsh, it’s nothing compared to the price Molly Russell and her family paid. But, five years after her death, social platforms are still serving up the same kind of toxic content to vulnerable young people. This has to stop – even if it means putting tech executives behind bars.
