Meta announced new tools on Thursday that aim to combat sexual extortion and image-based abuse on its Instagram platform.
A new setting in Instagram DMs will blur photos that it identifies as containing nudity, the company said. The feature will be turned on automatically for users under age 18, while adults will be able to choose whether to enable it.
According to Meta, the tool will prevent users from seeing unwanted nude images and will also protect them from scammers who may send such images.
Users who choose to send nude photos will receive a warning message, which will also remind them that they can unsend the photos.
The company further noted that users who send or receive such images will also receive expert advice on the risks involved.
Meta attempts to combat sexual extortion scams
Meta also said it had developed new technology to combat sexual extortion scams. The company noted that message requests containing such content would automatically go into the user's hidden messages folder, meaning the user would never have to see them or even know they exist.
For users under 18, the company said it would take stricter measures. Accounts deemed to be possible sexual extortion scammers would not be shown the message button on a teen's account.
In cases where users have communicated with scamming accounts, Meta said a pop-up would appear with expert guidance on how to respond.