Popular search engines like Google and Microsoft’s Bing are reportedly showing non-consensual deepfake porn at the top of search results alongside tools that advertise the ability to create such content.

Non-consensual deepfake pornography digitally manipulates images of a person to create the false impression that they are engaged in a sexual act.

An analysis by NBC News found that when searching for various women’s names together with the word “deepfakes,” as well as more general terms like “deepfake porn” or “fake nudes,” Google and other major search engines returned deepfake pornographic images of female celebrities as the top results.

The researchers ran searches on Google and Bing for 36 well-known female celebrities, combining each name with the term “deepfakes.”

Upon reviewing the results, they found that the top Google results for 34 of those queries, and the top Bing results for 35, contained links to deepfake videos and non-consensual deepfake photos.

Over half of the top results were links to a popular deepfake website or a competitor, the report mentioned.


Searching for “fake nudes” on Google surfaced links to numerous applications and programs for creating and viewing non-consensual deepfake pornography.

These links were among the first six results on Google.

The same search on Bing likewise surfaced several non-consensual deepfake websites and tools.

The report found that users could view and create this type of pornographic content without first encountering any news coverage explaining the harms of non-consensual deepfakes.


On both Google and Bing, victims of deepfake porn can request removal of the content by filling out a form.

The report also noted that search engines such as Google and Microsoft’s Bing do not appear to regularly monitor their results for this kind of misuse.

“We understand how distressing this content can be for people affected by it, and we’re actively working to bring more protections to Search,” a Google spokesperson was quoted as saying. “Like any search engine, Google indexes content that exists on the web, but we actively design our ranking systems to avoid shocking people with unexpected harmful or explicit content that they aren’t looking for,” the spokesperson added.

In recent months, deepfake videos of Bollywood stars such as Rashmika Mandanna, Alia Bhatt, Priyanka Chopra, and Katrina Kaif have gone viral.