IT Brief Asia - Technology news for CIOs & IT decision-makers

Nearly half of developers say over 50% of code is AI-generated


Cloudsmith's latest report shows that nearly half of all developers using AI in their workflows now have codebases that are at least 50% AI-generated.

The 2025 Artifact Management Report from Cloudsmith surveyed 307 software professionals in the US and UK, all working with AI as part of their development, DevOps, or CI/CD processes. Among these respondents, 42% reported that at least half of their current codebase is now produced by AI tools.

Despite the large-scale adoption of AI-driven coding, oversight remains inconsistent. Only 67% of developers who use AI review the generated code before every deployment. This means nearly one-third of those working with AI-assisted code are deploying software without always performing a human review, even as new security risks linked to AI-generated code are emerging.

Security concerns

The report points to a gap between the rapid pace of AI integration in software workflows and the implementation of safety checks and controls. Attacks such as 'slopsquatting' illustrate the risk when AI-generated code goes unchecked: attackers register malicious packages under the plausible-sounding but non-existent dependency names that AI code assistants sometimes hallucinate, so a developer who installs the suggested dependency pulls attacker-controlled code instead of hitting an error.
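One common mitigation the attack implies is gating AI-suggested dependencies against a team-curated allowlist before anything is installed. The sketch below is illustrative only, not a tool described in the report; the allowlist contents and function names are hypothetical.

```python
# Pre-install guard against slopsquatting: split AI-suggested dependency
# names into those already vetted by the team and those that may be
# hallucinated (and thus registrable by an attacker on a public index).

# Hypothetical curated allowlist; a real one would live in version control.
APPROVED_PACKAGES = {"requests", "numpy", "flask"}

def check_dependencies(suggested):
    """Return (approved, unvetted) lists for AI-suggested package names.

    Unvetted names are held back for human review rather than installed.
    """
    approved = [name for name in suggested if name.lower() in APPROVED_PACKAGES]
    unvetted = [name for name in suggested if name.lower() not in APPROVED_PACKAGES]
    return approved, unvetted

ok, suspicious = check_dependencies(["requests", "reqeusts-auth-helper"])
print(suspicious)  # the unfamiliar name is flagged, not installed
```

In practice the same check can run in CI, failing the build whenever a pull request adds a dependency outside the curated set.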

Cloudsmith's data shows that while 59% of developers say they apply extra scrutiny to AI-generated packages, far fewer have more systematic approaches in place for risk mitigation. Only 34% use tools that enforce policies specific to AI-generated artifacts, and 17% acknowledge they have no controls in place at all for managing AI-written code or dependencies.
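The "policies specific to AI-generated artifacts" the 34% describe could take many forms; one minimal sketch, assuming a build system that records provenance metadata per artifact, is a deploy gate requiring a human-review sign-off on anything AI-generated. The `Artifact` fields here are hypothetical, not Cloudsmith's schema.

```python
# Minimal policy gate: AI-generated artifacts must carry a human-review
# sign-off before they are considered deployable. Field names are
# illustrative assumptions about what a provenance record might hold.
from dataclasses import dataclass

@dataclass
class Artifact:
    name: str
    ai_generated: bool     # provenance flag set by the build pipeline
    human_reviewed: bool   # sign-off recorded during code review

def deployable(artifact: Artifact) -> bool:
    """Enforce the policy: unreviewed AI-generated artifacts are blocked."""
    if artifact.ai_generated and not artifact.human_reviewed:
        return False
    return True
```

A gate like this turns the "extra scrutiny" that 59% apply ad hoc into an automated control, closing the gap the report identifies.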

"Software development teams are shipping faster, with more AI-generated code and AI agent-led updates," said Glenn Weinstein, CEO at Cloudsmith. "AI tools have had a huge impact on developer productivity, which is great. That said, with potentially less human scrutiny on generated code, it's more important that leaders ensure the right automated controls are in place for the software supply chain."

Developer perceptions

The research reveals a range of attitudes towards AI-generated code among developers. While 59% are cautious and take extra steps to verify the integrity of code created by AI, 20% said they trust AI-generated code "completely." This suggests a marked difference in risk appetite and perception within developer teams, even as the majority acknowledge the need for vigilance.

Across the sample, 86% of developers reported an increase in the use of AI-influenced packages or software dependencies in the past year, and 40% described this increase as "significant." Nonetheless, only 29% of those surveyed felt "very confident" in their ability to detect potential vulnerabilities in open-source libraries, from which AI tools frequently pull suggestions.

"Controlling the software supply chain is the first step towards securing it," added Weinstein. "Automated checks and use of curated artifact repositories can help developers spot issues early in the development lifecycle."

Tooling and controls

The report highlights that adoption of automated tools specifically designed for AI-generated code remains limited, despite the stated importance of security among software development teams. While AI technologies accelerate the pace of software delivery and updating, adoption of stricter controls and policy enforcement is not keeping up with the new risks posed by machine-generated code.

The findings indicate a potential lag in upgrading security processes or artifact management solutions to match the growing use of AI in coding. Developers from a range of industries—including technology, finance, healthcare, and manufacturing—participated in the survey, with roles spanning development, DevOps management, engineering, and security leadership in enterprises with more than 500 employees.

The full Cloudsmith 2025 Artifact Management Report also explores other key issues, including how teams decide which open-source packages to trust, the expanding presence of AI in build pipelines, and the persistent challenges in prioritising tooling upgrades for security benefits.
