Bumble Launches Dick-Pic-Fighting AI Into The Open-Source Wilderness

Bumble can only protect you from dick pics on its own apps. But now its image-detection AI, known as Private Detector, can give every app the power to stop cyberflashers for you.

First released in 2019 exclusively on Bumble’s apps, Private Detector automatically blurs inappropriate images and warns you that an incoming photo may be obscene, giving you the option to view, block or even report it. On Monday, Bumble released an improved version of Private Detector into the internet wilderness, offering the tool for free to app makers around the world through an open-source repository.
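As a rough illustration of that consent flow (not Bumble’s actual implementation), an app consuming the detector’s output might gate images behind a blur with a simple threshold check. The cutoff value and field names here are purely hypothetical:

```python
BLUR_THRESHOLD = 0.5  # hypothetical cutoff; not a documented Bumble value

def present_incoming_image(lewd_score: float) -> dict:
    """Decide how to show an incoming image given the detector's score."""
    if lewd_score >= BLUR_THRESHOLD:
        # Blur the preview and let the recipient choose what happens next.
        return {"blurred": True, "actions": ["view", "block", "report"]}
    # Low-risk images are shown normally.
    return {"blurred": False, "actions": []}
```

The point of the design is that the model never deletes anything on its own; it only withholds the image until the recipient consents to see it.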

Private Detector achieved over 98% accuracy in both offline and online testing, the company said, and the latest iteration was designed for efficiency and flexibility so a wider range of developers can use it. Private Detector’s open-source package includes not only the source code for developers, but also a ready-to-use model that can be deployed as-is, comprehensive documentation and a whitepaper about the project.
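How you load that ready-to-use model depends on the format Bumble ships it in. As a minimal sketch, assuming it’s a standard TensorFlow SavedModel that maps an image batch to a lewd-image probability, inference could look something like this; the path, input size and preprocessing below are assumptions, not documented specifics:

```python
import tensorflow as tf

# Assumption: the released checkpoint is a TensorFlow SavedModel whose
# callable takes a float image batch and returns a probability per image.
model = tf.saved_model.load("private_detector_model/")  # path is illustrative

def lewd_probability(image_path: str, image_size: int = 480) -> float:
    raw = tf.io.read_file(image_path)
    image = tf.image.decode_image(raw, channels=3, expand_animations=False)
    image = tf.image.resize(image, (image_size, image_size))
    image = tf.cast(image, tf.float32) / 255.0   # scale pixels to [0, 1]
    batch = tf.expand_dims(image, axis=0)        # add a batch dimension
    return float(model(batch)[0][0])
```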

“Safety is at the heart of everything we do and we want to use our product and technology to help make the internet a safer place for women,” said Rachel Hass, Bumble’s VP of member safety, in an email. “Open-sourcing this feature is about standing firm in our belief that everyone deserves healthy and equitable relationships, respectful interactions and kind connections online.”

A chart from Bumble’s presentation, divided into four sections, each representing a layer of the image-detection process used by the open-source Private Detector tool.

Bumble said the company’s decade of work in machine learning allowed it to create a flexible new architecture for its Private Detector neural network that’s both faster and more accurate than the 2019 iteration. New to Private Detector is a binary classifier based on EfficientNetV2 that increases the speed and efficiency of training the detector, while working in tandem with the tool’s other layers for faster overall execution.
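To make that architecture concrete: a binary classifier built on an EfficientNetV2 backbone generally looks like the sketch below. This is a generic reconstruction using the stock Keras EfficientNetV2B0, not Bumble’s released code, and the input size, optimizer and loss are illustrative choices:

```python
import tensorflow as tf

# Generic sketch of a binary image classifier on an EfficientNetV2 backbone.
def build_detector(image_size: int = 480) -> tf.keras.Model:
    backbone = tf.keras.applications.EfficientNetV2B0(
        include_top=False,          # drop the 1,000-class ImageNet head
        weights="imagenet",         # start from pretrained features
        input_shape=(image_size, image_size, 3),
        pooling="avg",              # global average pool to a feature vector
    )
    # A single sigmoid unit outputs the probability that the image is lewd.
    output = tf.keras.layers.Dense(1, activation="sigmoid")(backbone.output)
    model = tf.keras.Model(backbone.input, output)
    model.compile(
        optimizer=tf.keras.optimizers.Adam(1e-4),
        loss="binary_crossentropy",
        metrics=["accuracy"],
    )
    return model
```

Starting from a pretrained backbone is what makes training fast: only the small classification head has to be learned from scratch, which fits the speed and efficiency gains the company describes.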

The Private Detector AI model isn’t Bumble’s only strategy for combating online sexual harassment and cyberflashing. The company has repeatedly pushed for legislation to thwart the world’s least desirable camera users. In a 2018 survey, Bumble found that one in three women on the app had received unsolicited lewd photos, and that 96% of them were unhappy to see those photos. Since then, Bumble has successfully lobbied anti-cyberflashing laws onto the books in Texas and Virginia, and is currently pushing similar measures in four other states.

“Bumble was one of the first apps to address cyberflashing by giving our community the power to decide, via consent, whether they’d like to see certain photos, and by creating a safety standard if they didn’t. We’ve worked to fight cyberflashing and to help create more online accountability for years, but this issue is bigger than just one company,” said Payton Iheme, public policy manager at Bumble.

“We cannot do this alone.”

And the company may not have to. By offering Private Detector for free, Bumble may have just summoned a swarm of support to its cause.