The popular women-first dating app Bumble is open-sourcing its AI tool, Private Detector.

The tool works by automatically blurring a potential nude image shared within a chat on Bumble. The recipient is notified and can decide whether to view or block the image.
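The flow described above can be sketched in a few lines. This is only an illustrative outline, not Bumble's implementation: the function names, threshold value, and score input are all hypothetical, standing in for the probability a classifier like Private Detector would produce for an incoming image.

```python
def moderate_image(nsfw_score: float, threshold: float = 0.8) -> dict:
    """Decide whether an incoming chat image should be blurred.

    `nsfw_score` stands in for the output of a lewd-image classifier
    such as Private Detector; the threshold here is illustrative.
    """
    blurred = nsfw_score >= threshold
    return {
        "blurred": blurred,           # image appears blurred in the chat
        "notify_recipient": blurred,  # recipient is told and gets to choose
    }


def recipient_choice(moderation: dict, choice: str) -> str:
    """The recipient of a blurred image chooses to view or block it."""
    if not moderation["blurred"]:
        return "shown"  # image was never blurred in the first place
    return "shown" if choice == "view" else "blocked"
```

For example, an image scored at 0.95 would be delivered blurred, and only revealed if the recipient explicitly chooses to view it; the same decision logic leaves a low-scoring image untouched.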

“Bumble’s Data Science team has written a white paper explaining the technology of Private Detector and has made an open-source version of it available on GitHub,” the company said in a blog post.

“It is our hope that the feature will be adopted by the wider tech community as we work in tandem to make the internet a safer place,” it added.

This version of Private Detector is released under the Apache License, making it available for anyone to adopt as a standard for blurring lewd images, either as is or after fine-tuning it with additional training samples.

To help address the broader issue of cyberflashing, Bumble teamed up with legislators from across the aisle in Texas in 2019 to pass a bill that made sending unsolicited lewd photos a punishable offense.

Since the passage of HB 2789 in Texas in 2019, Bumble has continued to successfully advocate for similar laws across the US and around the globe.

In 2022, Bumble reached another public policy milestone by helping to pass SB 493 in Virginia and, most recently, SB 53 in California, adding another layer of online safety in the most populous state in the US.
