Instagram will alert parents to teens’ repeated suicidal or self-harm searches



The move comes as Instagram’s parent company, Meta, and other social media platforms face ongoing scrutiny over the safety of their products, particularly for young people.

In Los Angeles, a consolidated group of cases with more than 1,600 plaintiffs, including more than 350 families and over 250 school districts, is accusing Instagram, YouTube, TikTok and Snap of deliberately designing their platforms to be addictive to young users. Last week, Meta CEO Mark Zuckerberg said in court that Instagram is meant to build “a community that is sustainable” and is not designed to addict young users. TikTok and Snap settled ahead of the trial.

Meta, and Instagram in particular, has taken some steps to address concerns about teens’ use of its platforms. In 2024, Instagram introduced accounts specifically for teens that are meant to restrict who can contact them. In October, the company said it would overhaul its approach to teen accounts, limiting their access to certain content in an attempt to make the experience closer to watching a PG-13 movie.

Instagram already blocks content related to suicide or self-harm from reaching teens’ accounts. However, families of teens who died by suicide allege in their lawsuits that Instagram is responsible for multiple sextortion scams targeting teens, NBC News previously reported.

Meta spokesperson Sophie Vogel told NBC News that, starting later this year, teens will also be able to talk to Instagram’s existing artificial intelligence tool to seek support, and that parents will likewise be notified of conversations related to suicide or self-harm.

If you or someone you know is in crisis, call or text 988 or go to 988lifeline.org to reach the Suicide & Crisis Lifeline. You can also call the network, previously known as the National Suicide Prevention Lifeline, at 800-273-8255 or visit SpeakingOfSuicide.com/resources.


