Instagram's New Safety Feature: Blurring Nude Images and Hiding Teen Profiles from Potential Predators
Meta, the owner of Instagram, is introducing a new safety tool to prevent children from receiving nude pictures and discourage sending them.
Adults will be restricted from starting private conversations with users who say they are under 18, and the "message" button will be hidden on teenagers' profiles from accounts showing signs of "sextortion" behaviour, even if the two accounts are already connected.
This comes after criticism from police chiefs and children's charities regarding Meta's decision to encrypt chats on its Messenger app by default.
The new nudity protection feature for direct messages will blur images that contain nudity and prompt users to think twice before sending explicit content.
The feature will be enabled by default for users under 18, and adults will be encouraged to turn it on.
Meta relies on users' self-reported ages to determine who is under 18, and adults are already restricted from initiating conversations with underage users.
Later this year, Meta will also hide the "message" button on teenagers' profiles from accounts suspected of "sextortion" attempts, even if the users are already connected.
Debate continues over encryption, which protects users' privacy but, critics argue, makes it harder for companies to detect child abuse on their platforms.
Sextortion is the blackmailing of children with threats to share compromising images with their families or on social media unless money is paid.
In a blog post on Thursday, Meta announced that it is testing new measures to protect teens from accounts engaged in sextortion.
These measures include hiding teen accounts from such users' follower and following lists and from their search results, making it harder for predators to find them.