Meta, Facebook’s parent company, is testing new tools to protect young people from sextortion and other forms of intimate image abuse on its platforms.

The company also said it is making it harder for scammers to find potential targets on its apps and across the internet.

“While people overwhelmingly use DMs (direct messages) to share what they love with their friends, family or favorite creators, sextortion scammers may also use private messages to share or ask for intimate images,” Meta said in a blog post.

“To help address this, we’ll soon start testing our new nudity protection feature in Instagram DMs, which blurs images detected as containing nudity and encourages people to think twice before sending nude images,” it added.


Moreover, the tech giant said that nudity protection will be turned on by default for teens under 18 worldwide, while adults will be encouraged via a notification to turn the feature on.

According to the company, when the feature is turned on, people sending images containing nudity will see a message reminding them to be cautious, and those who try to forward a nude image will receive a message encouraging them to reconsider.

In addition, the tech giant is testing new ways to help people spot possible sextortion scams, “encourage them to report and empower them to say no to anything that makes them feel uncomfortable”.

The company is also adding new child safety helplines globally to its in-app reporting flows.
