
Google will no longer accept low-quality AI-generated submissions to a program it funds to find bugs in open-source software. At the same time, it is contributing to a separate program that uses AI to strengthen security in open-source code.
The Google Open Source Software Vulnerability Reward Program team is increasingly concerned about the low quality of some AI-generated bug submissions, many of which hallucinate how a vulnerability can be triggered or report bugs with little security impact.
“To ensure our triage teams can focus on the most critical threats, we will now require higher-quality proof (like OSS-Fuzz reproduction or a merged patch) for certain tiers to filter out low-quality reports and allow us to focus on real-world impact,” Google wrote in a blog post.

