LONDON – Instagram said it’s deploying new tools to protect young people and combat sexual extortion, including a feature that will automatically blur nudity in direct messages.
The social media platform said in a blog post Thursday that it's testing out the features as part of its campaign to fight sexual scams and other forms of “image abuse,” and to make it tougher for criminals to contact teens.
This follows Gov. DeSantis's signing of bills that enhance existing criminal penalties and create new ones for people who abuse children.
Sexual extortion, or sextortion, involves persuading a person to send explicit photos online and then threatening to make the images public unless the victim pays money or engages in sexual favors. Recent high-profile cases include two Nigerian brothers who pleaded guilty to sexually extorting teen boys and young men in Michigan, including one who took his own life, and a Virginia sheriff’s deputy who sexually extorted and kidnapped a 15-year-old girl.
“On the parental end yes I would love for that to come up as a blurred image,” local parent Stephanie Hansen said.
Instagram and other social media companies have faced growing criticism for not doing enough to protect young people. Mark Zuckerberg, the CEO of Instagram's owner Meta Platforms, apologized to the parents of victims of such abuse during a Senate hearing earlier this year.
Meta, which is based in Menlo Park, California, also owns Facebook and WhatsApp but the nudity blur feature won’t be added to messages sent on those platforms.
Instagram said scammers often use direct messages to ask for “intimate images.” To counter this, it will soon start testing out a nudity-protection feature for direct messages that blurs any images with nudity “and encourages people to think twice before sending nude images.”
“The feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” Instagram said.
The feature will be turned on by default globally for teens under 18. Adult users will get a notification encouraging them to activate it.
Images with nudity will be blurred with a warning, giving users the option to view them. They’ll also get an option to block the sender and report the chat.
People sending direct messages with nudity will get a message reminding them to be cautious when sending "sensitive photos." They'll also be informed that they can unsend the photos if they change their mind, but that there's a chance others may have already seen them.
As with many of Meta’s tools and policies around child safety, critics saw the move as a positive step, but one that does not go far enough.
“I think the tools announced can protect senders, and that is welcome. But what about recipients?" said Arturo Béjar, former engineering director at the social media giant who is known for his expertise in curbing online harassment. He said 1 in 8 teens receives an unwanted advance on Instagram every seven days, citing internal research he compiled while at Meta that he presented in November testimony before Congress. "What tools do they get? What can they do if they get an unwanted nude?”
Hansen agreed. She said she doesn't think this feature will deter people from sending nude images to people under 18.
"They're going to do what they want to do," Hansen said. "As it is right now, when we get graphic images that may be of a violent nature, or a nature you don't want to see, they go ahead and blur it now, so why not do that with a child getting a nude."
Instagram said it's working on technology to help identify accounts that could potentially be engaging in sexual extortion scams, "based on a range of signals that could indicate sextortion behavior."
To stop criminals from connecting with young people, it's also taking measures including not showing the “message” button on a teen’s profile to potential sextortion accounts, even if they already follow each other, and testing new ways to hide teens from these accounts.
The announcement of these changes came a day after DeSantis signed five bills, all aimed at protecting children, including House Bill 1545, which prohibits adults from sending messages to a minor that include explicit images or descriptions of sexual activity.
Dwann Holmes, a social media expert with Brand On Demand Media, said she appreciates lawmakers passing bills and Instagram making changes to help protect children.
“I think it’s great,” Holmes said.
But she said for all of these changes to work, parents have to be involved with their kids.
“We’ve got to educate our children. But we also have to make our children feel comfortable regarding anything that makes them uncomfortable online, we have to have the doors open for communication,” Holmes said.
It's something Hansen and her husband, David, agree with.
“Paying attention to what’s going on in your kids’ lives may help more,” David said.
Instagram also said it will share information about sextortion accounts with other tech companies.
__
AP Technology Writer Barbara Ortutay in Oakland, California, contributed to this report.