Google expands its bug bounty program to target generative AI attacks

With concerns around generative AI ever-present, Google has announced an expansion of its Vulnerability Rewards Program (VRP) focused on AI-specific attacks and opportunities for malice. The company also released updated guidelines detailing which discoveries qualify for rewards and which fall out of scope. For example, a training data extraction attack that leaks private, sensitive information is in scope, but one that only surfaces public, nonsensitive data wouldn't qualify for a reward. Last year, Google paid security researchers $12 million for bug discoveries.

Google explained that AI presents different security issues than its other technology — such as model manipulation and unfair bias — requiring new guidance to reflect this. "We believe expanding the VRP will incentivize research around AI safety and security, and bring potential issues to light that will ultimately make AI safer for everyone," the company said in a statement. "We're also expanding our open source security work to make information about AI supply chain security universally discoverable and verifiable."

AI companies, including Google, gathered at the White House earlier this year, committing to greater discovery and awareness of AI’s vulnerabilities. The company’s VRP expansion also comes ahead of a “sweeping” executive order from President Biden reportedly scheduled for Monday, October 30, which would create strict assessments and requirements for AI models before any use by government agencies.
