
AI action plans should be slowed until safeguards for children in place – NSPCC

The NSPCC said generative AI is already being used to create sexual abuse images of children, and urged the Government to consider adopting specific safeguards into legislation to regulate AI.

“The NSPCC and the majority of the public want tech companies to do the right thing for children and make sure the development of AI doesn’t race ahead of child safety.

“We have the blueprints needed to ensure this technology has children’s wellbeing at its heart. Now both Government and tech companies must take the urgent action needed to make generative AI safe for children and young people.”

The AI Action Summit, an international conference, is due to take place in Paris next month.

Derek Ray-Hill, interim chief executive at the Internet Watch Foundation, which seeks out and helps remove child sexual abuse imagery from the internet, said existing laws, as well as future AI legislation, must be made robust enough to ensure children are protected from being exploited by the technology.

“AI companies must prioritise the protection of children and the prevention of AI abuse imagery above any thought of profit,” Mr Ray-Hill said.

The Independent
