Meta Platforms to Alert Parents if Teens Search Self-Harm Content on Instagram
Meta announced Thursday that Instagram will introduce new notifications for parents, aimed at flagging warning signs in teens' search activity. Parents will be alerted if a supervised teen repeatedly tries to access information about suicide or self-harm within a short period of time. The feature builds on Instagram's existing teen safety measures; last year, the platform placed all users under 18 into a strict 13+ account setting that requires parental approval to change.
In its announcement, Meta said Instagram will begin alerting parents if their child repeatedly searches for terms related to suicide or self-harm within a short period of time. The company did not specify how many searches, or over what time frame, would trigger an alert. The feature will roll out next week in Australia, Canada, the UK, and the US, and will expand to other regions later this year.
These alerts will be available only to parents who have enabled Instagram's parental supervision tools. Both parents and teens connected through supervision will be notified in advance, before the feature goes live. If a teen repeatedly searches for phrases that promote suicide or self-harm, indicate an intent to self-harm, or contain similar keywords, a notification will be sent to the parent.