Sexually explicit deepfakes: women are more likely to be exploited

Artificial intelligence tools are being abused to create sexually explicit material of real people, according to a survey commissioned by a cybersecurity firm. The survey collected data from more than 2,000 Britons and found that half were worried about becoming a victim of deepfake pornography, while nearly one in ten (9pc) reported being a victim, knowing a victim, or both.

The anti-virus and internet security company ESET points to the growing problem of deepfake pornography, recently highlighted when explicit deepfakes of the US singer Taylor Swift were viewed millions of times. The firm describes it as a new form of image-based sexual abuse in the UK, noting that at least 60pc of all revenge pornography victims are women, according to the UK Council for Internet Safety. Under the recently passed Online Safety Act, creating or inciting the creation of deepfake pornography became a criminal offence. However, the survey suggests this has done little to alleviate fears around the technology: 61pc of women reported concern about becoming a victim, compared with less than half (45pc) of men.

Nearly two in five (39pc) of those surveyed by Censuswide believe that deepfake pornography makes sending intimate content a significant risk, yet about a third (34pc) of adults have still sent such material. Of those who have, the research suggests that a majority (58pc) regret sharing it, answering either ‘Yes, I would never send an intimate photo or video again’ or ‘Yes, but I would send an intimate photo or video again’.

Among under-18s, the percentage who send intimate images or videos drops to 12pc, perhaps because a majority (57pc) of teenagers surveyed are concerned about becoming a victim of deepfake pornography.

Despite soaring interest in deepfakes, people are still taking risks, the firm suggests: just under a third (31pc) admitted to sharing intimate images with their faces visible. The research also found that the average age at which someone receives their first sexual image is 14.

Jake Moore, Global Cybersecurity Advisor at ESET, said: “These figures are deeply worrying, as they show that people’s online habits haven’t yet adjusted to deepfakes. Digital images…

Source…