Instagram will alert parents if teens repeatedly search for suicide or self-harm content

Instagram will soon notify parents if their teen repeatedly searches for suicide- or self-harm-related terms within a short period, the company announced Thursday. The feature will roll out in the coming weeks to users enrolled in parental supervision in the U.S., U.K., Australia and Canada, with other regions to follow.

Photo: Collected

According to TechCrunch, the Meta-owned platform already blocks such content from search results; the new alerts are meant to inform parents when their teen repeatedly attempts those searches. Triggers may include phrases encouraging self-harm or indicating potential risk.

Parents will receive notifications via email, text or WhatsApp, as well as an in-app alert that includes guidance on how to approach the conversation with their teen.

The update comes as Meta faces multiple lawsuits in U.S. courts alleging its platforms contribute to teen mental health harm and addiction. During recent testimony in federal court in California, Instagram head Adam Mosseri was questioned about delays in rolling out certain teen safety features. In a separate case in Los Angeles, internal Meta research revealed that parental supervision tools had limited impact on reducing compulsive social media use, particularly among teens experiencing stressful life events.

Instagram said it worked with its Suicide and Self-Harm Advisory Group to set the threshold for alerts, requiring multiple searches within a short timeframe so as to avoid excessive or unnecessary notifications. The company added that it plans to expand the alerts in the future to cover instances where teens attempt to engage the platform's AI in conversations about suicide or self-harm.