Almost one in six UK adults admit to having seen deepfake pornographic images, an annual review has found.

The second International AI Safety report, which reviews possible risks from the development of Artificial Intelligence, said the use of AI to generate pornographic images of real people is of “particular concern”. The review noted that 15 per cent of UK adults have seen such content, while the number of cases involving children ‘nudifying’ other children is “rising”.

Eleven-year-olds

A poll of 4,300 secondary school teachers in England, conducted on behalf of The Guardian, found that around one in ten were aware of students generating “deepfake, sexually explicit videos” during the last academic year.

‘Exploitation’

Last month, the Westminster Government announced that it would criminalise the creation of explicit deepfake images of people without their consent.
Source: The Guardian, February 12, 2026, 13:49 UTC