
Microsoft will pay users a handsome fee if they can outsmart Bing AI, make it go rogue

Microsoft has announced that it will pay users a handsome fee if they can outsmart its Bing AI products, make them go rogue, or find any other bug. Users who expose such vulnerabilities will be awarded anywhere from $2,000 to $15,000, depending on how critical the bug is.

In a bold move, Microsoft has unveiled an initiative to enhance the security of its Bing AI products, challenging the tech-savvy community to expose potential vulnerabilities within the AI framework and, remarkably, putting its money where its mouth is. In a recent blog update, the software behemoth introduced a "bug bounty" programme, committing to reward security researchers with bounties ranging from $2,000 to $15,000 for identifying vulnerabilities in its Bing AI suite. To take part in the programme, Bing users must alert Microsoft to previously undisclosed vulnerabilities that, per the company's specified criteria, are categorized as either "important" or "critical" for security.

Firstpost
