New report uncovers perpetrator and victim perspectives on sexualized deepfake abuse

AI tools are making it easier to create and disseminate deepfake imagery, and a new study from Monash University offers insight into the experiences of both victims and perpetrators of sexualized deepfake abuse. The research, published in the Journal of Interpersonal Violence, is the first of its kind to include interviews with both perpetrators and victims. The goal was to understand patterns of abuse in Australia and perpetrators' motivations, including how people who engage in these harms rationalize and minimize their actions.