Microsoft gives deepfake porn victims a tool to scrub images from Bing search


The advancement of generative AI tools has created a new problem for the internet: the proliferation of synthetic nude images resembling real people. On Thursday, Microsoft took a major step to give revenge porn victims a tool to stop its Bing search engine from returning these images.

Microsoft announced a partnership with StopNCII, an organization that allows victims of revenge porn to create a digital fingerprint of these explicit images, real or not, on their device. StopNCII’s partners then use that digital fingerprint, or “hash” as it’s technically known, to scrub the image from their platforms. Microsoft’s Bing joins Facebook, Instagram, Threads, TikTok, Snapchat, Reddit, Pornhub and OnlyFans in partnering with StopNCII, and using its digital fingerprints to stop the spread of revenge porn.
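The key detail is that the fingerprint, not the image, is what gets shared: the hash is computed on the victim's own device, and partner platforms compare it against images they host or index. As a rough illustration of the idea, here is a toy "average hash" in plain Python; StopNCII's actual algorithm (reportedly a robust perceptual hash such as PDQ) is far more sophisticated, and the 2x2 pixel grids below are invented for the example.

```python
# Toy perceptual-hash sketch, NOT StopNCII's real algorithm.
# Idea: reduce an image to a tiny grayscale grid, then record one bit
# per pixel (above/below the mean brightness). Similar images yield
# similar bit patterns, so a near-duplicate still matches.

def average_hash(pixels):
    """Hash a grid of grayscale pixels (lists of ints 0-255) to hex."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return format(bits, f"0{len(flat) // 4}x")

def hamming_distance(h1, h2):
    """Count differing bits between two equal-length hex hashes."""
    return bin(int(h1, 16) ^ int(h2, 16)).count("1")

# A platform would compare fingerprints of uploaded or indexed images
# against the shared hash list; a small distance means a likely match.
original = [[10, 200], [30, 250]]
recompressed_copy = [[12, 198], [28, 252]]  # slightly altered pixels
print(hamming_distance(average_hash(original),
                       average_hash(recompressed_copy)))  # → 0
```

The design matters for privacy: because only the hash leaves the device, a victim never has to upload the explicit image itself to anyone.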

In a blog post, Microsoft said that during a pilot with StopNCII's database running through the end of August, it took action on 268,000 explicit images returned through Bing's image search. Previously, Microsoft offered a direct reporting tool, but the company says that has proven insufficient.

“We have heard concerns from victims, experts, and other stakeholders that user reporting alone may not scale effectively for impact or adequately address the risk that imagery can be accessed via search,” said Microsoft in its blog post on Thursday.

You can imagine how much worse that problem would be on a significantly more popular search engine: Google.

Google Search offers its own tools to report and remove explicit images from its search results, but has faced criticism from former employees and victims for not partnering with StopNCII, according to a Wired investigation. Since 2020, Google users in South Korea have reported 170,000 search and YouTube links for unwanted sexual content, Wired reported.

The AI deepfake nude problem is already widespread. StopNCII's tools only work for people over 18, but "undressing" sites are already creating problems for high schoolers around the country. Unfortunately, the United States doesn't have a federal AI deepfake porn law to hold anyone accountable, so victims must rely on a patchwork of state and local laws to address the issue.

San Francisco prosecutors announced a lawsuit in August to take down 16 of the most popular "undressing" sites. According to a tracker of deepfake porn laws created by Wired, 23 American states have passed laws to address nonconsensual deepfakes, while nine have struck down proposals.


