AI-generated code introduces plenty of risk into the development process. A recent Sonatype report found that AI hallucinated 27% of upgrade recommendations for open source projects, while research from Veracode found that AI introduced security vulnerabilities in 45% of 80 coding tasks across 100+ different LLMs. Now, new research from Black Duck is shedding light on another pressing issue related to AI-generated code: IP and licensing risks.
In the company’s 2026 Open Source Security and Risk Analysis (OSSRA) report, it analyzed 947 commercial codebases and found that two-thirds of them had license conflicts, the highest percentage in the history of the report. This represents a 12% increase from last year, which also marks the largest year-over-year jump in the report’s history.
One of the codebases that Black Duck audited contained 2,675 distinct licensing conflicts, indicating that the complexity of managing IP has grown exponentially.
“This rise is partly driven by ‘license laundering,’ where AI assistants generate code snippets derived from copyleft sources (like GPL) without retaining the original license information,” the company explained in a blog post. For example, the report shows that 17% of open source components are entering codebases outside of traditional package managers, through copy-and-pasted snippets, direct vendor inclusions, or AI generation. This presents a challenge, as code that enters this way may be invisible to traditional manifest-based scanning tools.
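As a rough illustration of why that matters, the sketch below (a hypothetical project layout, assuming requirements.txt as the manifest and a vendor/ directory standing in for pasted or vendored code) contrasts what a manifest-only inventory reports with what actually ships in the source tree:

```python
# Minimal, hypothetical sketch: a manifest-only inventory sees just the
# dependencies declared in requirements.txt, so code that was pasted in
# or vendored under vendor/ never shows up in that inventory.
from pathlib import Path


def declared_components(project: Path) -> set[str]:
    """Components visible to a manifest-based scan (declared deps only)."""
    manifest = project / "requirements.txt"
    if not manifest.exists():
        return set()
    return {
        line.split("==")[0].strip().lower()
        for line in manifest.read_text().splitlines()
        if line.strip() and not line.lstrip().startswith("#")
    }


def undeclared_code(project: Path) -> list[Path]:
    """Source files that ship with the project but have no manifest entry,
    e.g. copy-pasted snippets or a vendored tree under vendor/."""
    return [p for p in project.rglob("*.py") if "vendor" in p.parts]


if __name__ == "__main__":
    project = Path(".")
    print("Manifest scan sees:", sorted(declared_components(project)))
    print("Manifest scan misses:", undeclared_code(project))
```

The gap between those two lists is exactly the 17% of components the report says arrive outside of package managers.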
This year’s OSSRA report also found that the mean number of vulnerabilities in code has nearly doubled since last year. Eighty-seven percent of the codebases had at least one vulnerability, 78% had high-risk vulnerabilities, and 44% had critical-risk vulnerabilities.
The company explained that it discovered a “zombie component” problem when digging into the research. Ninety-three percent of codebases contained components that hadn’t seen active development in two years, 92% contained components that were at least four years out of date, and only 7% of components in use were upgraded to the latest version.
“These abandoned components are a ticking time bomb. When a vulnerability is discovered in a project that hasn’t been touched in years, there’s often no maintainer left to fix it. Organizations are left with difficult choices: fork the project, refactor the application, or accept the risk,” the researchers wrote.
Black Duck concluded that a key takeaway from this year’s report is that there is a growing gap between AI adoption and governance.
“As regulatory pressure mounts from frameworks such as the EU AI Act and Cyber Resilience Act, the ‘ship and forget’ model of software delivery is no longer viable. Organizations must move toward a model of continuous supply chain transparency, where every component, whether human-written, AI-generated, or open source, is accounted for,” Black Duck said.
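In practice, that kind of transparency usually takes the form of an SBOM-style inventory that travels with each release. The sketch below is illustrative only, with hypothetical field names loosely modeled on CycloneDX-style component records, and shows the per-component origin and license information such a model implies:

```python
# Illustrative sketch of an SBOM-style inventory entry: every component,
# however it entered the codebase, gets a record of its origin and license.
# Field names are hypothetical, loosely modeled on CycloneDX components.
from dataclasses import dataclass, asdict
import json


@dataclass
class Component:
    name: str
    version: str
    license: str   # SPDX identifier, e.g. "GPL-3.0-only" or "MIT"
    origin: str    # "package-manager", "vendored", "ai-generated", ...


inventory = [
    Component("left-pad", "1.3.0", "WTFPL", "package-manager"),
    Component("parser-snippet", "unknown", "GPL-3.0-only", "ai-generated"),
]

# Emit the inventory as JSON so it can be shipped alongside the release.
print(json.dumps([asdict(c) for c in inventory], indent=2))
```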
