Everyone is concerned about safety when using social media and other online communication tools. The unsolicited sending of nude images is among the most common and damaging forms of online bullying and harassment.
Bumble has employed machine learning to shield its users from obscene images since 2019. The tool analyzes pictures sent by matches to determine whether they contain objectionable material. Although it was built primarily to detect unwanted nude photos, it can also flag naked selfies and pictures of weapons, both of which are prohibited on Bumble.
Sending non-consensual nude photos from your phone is known as cyberflashing. The dating app Bumble is working to stop this abuse on its platform and is also advocating for legislation against it in the United States and the United Kingdom.
In 2019, Bumble launched its first artificial intelligence tool, "Private Detector," which notifies users when they receive offensive photos and automatically blurs the images. The dating app is now releasing a version of the tool to the wider tech community.
Why Is There a Need to Combat Cyberflashing?
According to a March 2022 British government research report, 76% of girls aged 12 to 18 have received unsolicited nude images of boys or men. In addition, Bumble's own research found that nearly half of 18-to-24-year-olds had received a non-consensual sexual photo, and that 95% of people under 44 in England and Wales believe more should be done to combat cyberflashing.
How Does It Work?
Bumble Inc. has applied recent advances in artificial intelligence (AI) to give its community the tools and resources they need for a safe experience on its platforms. Private Detector was launched on both Bumble and Badoo in response to the rise of cyberflashing.
When an obscene image is sent, the AI identifies it and automatically blurs it. The recipient then has the option of viewing, deleting, or reporting the image.
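The detect-blur-decide flow described above can be sketched in a few lines of Python. This is an illustrative mock-up, not Bumble's actual implementation: `classify_nsfw` is a hypothetical stand-in for the real classifier, and the threshold value is an assumption.

```python
# Hypothetical sketch of the Private Detector flow: classify an incoming
# image, blur it automatically if flagged, then let the recipient choose
# to view, delete, or report it. All names and values are illustrative.
from dataclasses import dataclass

BLUR_THRESHOLD = 0.8  # assumed probability cutoff, not Bumble's actual value


@dataclass
class IncomingImage:
    sender: str
    pixels: bytes
    blurred: bool = False


def classify_nsfw(image: IncomingImage) -> float:
    """Stand-in for the real model: return the probability the image is lewd."""
    # A real implementation would run the pixels through a trained classifier.
    return 0.97  # pretend the classifier flagged this image


def receive_image(image: IncomingImage) -> IncomingImage:
    """Blur the image automatically when the classifier flags it."""
    if classify_nsfw(image) >= BLUR_THRESHOLD:
        image.blurred = True  # shown blurred until the recipient decides
    return image


def handle_user_choice(image: IncomingImage, choice: str) -> str:
    """The recipient can view, delete, or report a flagged image."""
    if choice == "view":
        image.blurred = False
        return "image revealed"
    if choice == "delete":
        return "image deleted"
    if choice == "report":
        return "sender reported"
    raise ValueError(f"unknown choice: {choice}")


msg = receive_image(IncomingImage(sender="match_42", pixels=b"..."))
print(msg.blurred)                        # → True
print(handle_user_choice(msg, "report"))  # → sender reported
```

The key design point is that the blur is applied before the recipient ever sees the picture, so the choice to view it always rests with them.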
Bumble has made a version of Private Detector openly available on GitHub so that other digital firms can adapt it and incorporate elements of it into their own products, improving safety and accountability online in the fight against abuse and harassment. Bumble itself is free to download from Google Play and the App Store; it connects individuals for networking, dating, and companionship across 150 countries and has over 72 million users.
For the past few years, Bumble has campaigned against cyberflashing in the UK and the US. The app's founder and CEO, Whitney Wolfe Herd, contributed to the passage of HB 2789, a Texas law that makes sending non-consensual nude photos illegal.
Since then, similar legislation has been passed in Virginia and California with the aid of the dating app. Bumble has also been pushing for the criminalization of cyberflashing in England and Wales, and in March 2022 the government stated that it would do so under proposed laws, with offenders facing up to two years in prison.
This version of Private Detector is released under the Apache License, so anyone can use it to detect and blur obscene images.
With the help of Bumble, you can use social media to meet friends, date, and advance your career all at once. Established in 2014 by Whitney Wolfe Herd, the company is headquartered in Austin, Texas. The location-based social app facilitates communication between interested users, letting people interact and build honest, beneficial connections. In same-sex matches, either person can send the first message; in heterosexual matches, only female users can initiate contact with matched male users. After signing up with a phone number or Facebook profile, users can search for dates or for friends in "BFF mode," while business connections are facilitated through Bumble Bizz.