Hollywood has ‘helped to fan flames of fear about AI’, peers hear
Some peers expressed concerns about generative AI, including the need to ensure artists whose work is used as a prompt are fairly paid, and that the technology should be prevented from drawing on images of child sexual abuse.

Tory peer Lord Ranger told the Lords: “If, like me, you are from a certain generation, these seeds of fear and fascination of the power of artificial intelligence have long been planted by numerous Hollywood movies picking on our hopes and fears of what AI could do to us.”

He cited the “unnerving subservience” of HAL 9000 in 2001: A Space Odyssey, and “the ultimate hellish future of machine intelligence taking over the world in the form of Skynet” from the Terminator movies.

Lord Ranger added: “These and many other futuristic interpretations of AI helped fan the flames in the minds of engineers, computer scientists and super-geeks, many of whom created the biggest tech firms in the world.”

While he said he was supportive of the aims of the Bill and there may be a long-term need for regulatory guidance, Lord Ranger said he did not believe it was possible to regulate AI through a single authority.
The Tory peer said: “This will not … help us work hand-in-hand with industry and trade bodies to build trust and confidence in the technology.”

Other peers gave their backing to the Bill, with crossbench peer Lord Freyberg telling the upper chamber: “It stands to reason that if artists’ IP is being used to train these models, it is only fair that they be compensated, credited and given the option to opt out.”

Fellow crossbencher Baroness Kidron, meanwhile, said she wanted to see “more clarity that material that is an offence, such as creating viruses, CSAM, or inciting violence, are offences whether they are created by AI or not”.

The filmmaker and children’s rights campaigner cited a report by the Stanford Internet Observatory, which identified “hundreds of known images of child sexual abuse material in an open data set used to train popular AI text-to-text models”.