TikTok and Bumble have joined an initiative to prevent the sharing of non-consensual intimate images online.

The social media platforms partnered with StopNCII.org (Stop Non-Consensual Intimate Image Abuse), which hosts a tool developed in partnership with Meta.

TikTok, Bumble, Facebook, and Instagram will detect and block any images that are included in StopNCII.org’s bank of hashes, reports Engadget.

The website uses on-device hashing technology that lets people threatened with intimate image abuse create unique identifiers of their images (also known as 'hashes' or digital fingerprints).

This process takes place on their device. To protect users’ privacy, StopNCII.org only uploads a unique string of letters and numbers rather than actual files, according to the report.
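The on-device step described above can be sketched in a few lines. This is an illustrative example only: StopNCII's actual tool uses perceptual hashing (so that re-encoded copies of an image still match), whereas the standard-library SHA-256 below is a stand-in chosen to show the key privacy property, that only a short string of letters and numbers ever leaves the device, never the image itself.

```python
import hashlib

def hash_image(path: str) -> str:
    """Create a unique fingerprint (hash) of an image file.

    Illustrative sketch: SHA-256 stands in for the perceptual hash
    StopNCII.org actually uses. The file is read locally and only the
    resulting hex string would be uploaded, not the image.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so even large photos/videos are hashed
        # without loading the whole file into memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

The same input always produces the same hash, which is what lets participating platforms recognise a flagged image later without ever having seen the original file.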

Moreover, hashes submitted to StopNCII.org are shared with participating partners.

If an image or video uploaded to TikTok, Bumble, Facebook, or Instagram matches a corresponding hash and “satisfies partner policy requirements”, then the file will be forwarded to the platform’s moderation team.

If moderators determine that the image violates their platform's rules, they will remove it, and the other partner platforms will block the image as well, the report said.
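The matching flow described above can be sketched as follows. The function and variable names are hypothetical, not a real platform API; the point is the sequence: a hash match plus a policy check routes the upload to human moderators rather than triggering automatic removal.

```python
def handle_upload(file_hash: str, hash_bank: set[str],
                  meets_policy: bool) -> str:
    """Illustrative sketch of the flow reported for partner platforms.

    hash_bank stands in for the hashes shared by StopNCII.org with
    participating partners; meets_policy stands in for the quoted
    'satisfies partner policy requirements' check.
    """
    if file_hash in hash_bank and meets_policy:
        # A match does not auto-delete the file; it is forwarded
        # to the platform's human moderation team for review.
        return "forward_to_moderation"
    # No match (or policy requirements not met): normal upload flow.
    return "publish"
```

Keeping a human review step between a hash match and removal is what the "satisfies partner policy requirements" clause implies: matching alone is not treated as proof of a violation.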

The tool has been available for a year, and over 12,000 people have used it to prevent intimate videos and images from being shared without permission.

Users have created more than 40,000 hashes to date, the report added.
