Microsoft engineer sounds alarm on AI image-generator to US officials and company’s board
Associated Press

A Microsoft engineer is sounding alarms about offensive and harmful imagery he says is too easily made by the company's artificial intelligence image-generator tool, sending letters on Wednesday to U.S. regulators and the tech giant's board of directors urging them to take action.

Jones, a principal software engineering lead whose job involves working on AI products for Microsoft's retail customers, said he has spent three months trying to address his safety concerns about Microsoft's Copilot Designer, a tool that can generate novel images from written prompts.

"For example, when using just the prompt, 'car accident', Copilot Designer has a tendency to randomly include an inappropriate, sexually objectified image of a woman in some of the pictures it creates," he said. Other harmful content involves violence as well as "political bias, underaged drinking and drug use, misuse of corporate trademarks and copyrights, conspiracy theories, and religion to name a few," he told the FTC.

His letter to Microsoft's board asks it to launch an independent investigation into whether Microsoft is marketing unsafe products "without disclosing known risks to consumers, including children."

Microsoft said it is committed to addressing employee concerns about company policies and that it appreciates Jones' "effort in studying and testing our latest technology to further enhance its safety." It said it had recommended he use the company's own "robust internal reporting channels" to investigate and address the problems.

This is not the first time Jones has publicly aired his concerns.