IT Brief Asia - Technology news for CIOs & IT decision-makers

Expected AI gains prompt hiring freezes before returns are proven

Fri, 20th Mar 2026

The Return on AI Institute has published a study on how companies measure the economic returns from artificial intelligence. The survey covers 1,006 C-suite executives across 11 countries and 32 industries.

The findings suggest many businesses are making workforce decisions based on expected AI gains before they have clear evidence that those gains are materialising. Only 2% of organisations have made large headcount cuts tied to actual AI implementation, while almost 60% have already reduced or frozen hiring in anticipation of future productivity improvements.

Produced with Scaled Agile, the research examines what the authors describe as economic maturity in AI. It argues that the biggest gap between strong and weak results is not the type of AI a company adopts, but whether it formally measures value and trains both staff and leaders to use and oversee the technology.

Measurement gap

On that measure, the contrast was stark. Among organisations that formally report AI value to boards or investors, 85% achieve high value, compared with 15% of those that neither measure nor report it.

That difference reflects how companies track, combine and communicate the economic impact of AI projects. Firms with clearer oversight appear far more likely to report significant returns.

Training also emerged as a strong dividing line. The report found a 23-percentage-point advantage in achieving high value from AI when both employees and leaders receive AI training.

Even so, many organisations have yet to build those skills. The study found that 58% have not trained employees in basic AI productivity and tool use, while 29% said their leaders lack the understanding needed to drive AI value creation.

Workforce moves

The report raises questions about how companies are making staffing decisions as AI adoption spreads. While businesses have spent heavily on the technology and many report some benefit, the survey suggests the most severe employment actions remain uncommon when linked directly to implemented systems.

That contrasts with the much wider use of hiring freezes and smaller staffing changes made in anticipation of gains that may not yet have been measured. The study's headline finding points to a roughly thirtyfold gap between the share of companies making workforce moves based on expected value (almost 60%) and the share acting on realised value (2%).

One co-author said the key distinction between leading and lagging organisations lies in management discipline rather than software choice.

"The technology works - 90% of organizations say so. What separates the leaders from everyone else isn't the AI itself. It's whether anyone has the discipline to measure what it's worth and the leadership fluency to act on what they find," said Laks Srinivasan, Co-Founder and CEO, Return on AI Institute.

Types of AI

The survey also examined which forms of AI companies currently value most. Only 9% of organisations identified generative AI as their most valuable AI type, compared with 50% for analytical AI and 40% for rule-based automation.

That ranking suggests older forms of automation and analytics still account for much of the measurable return. Generative AI, by contrast, remains harder for many companies to assess in day-to-day workflows and operating models.

Some 44% of executives said generative AI is the hardest type of AI for which to measure return on investment. At the same time, adopters of agentic AI systems were 22% more likely to report achieving a great deal of value from AI.

The international findings also challenge assumptions about market leadership. Despite the prominence of the US in AI investment and product development, only 38% of US organisations reported getting a "great deal" of value from AI.

That figure was lower than in Germany, the UK, Australia, Japan and the UAE, all of which exceeded 50%. Those countries also showed broadly similar levels of employee training and AI experience.

Thomas H. Davenport, another co-author of the report, said the pattern fits a familiar trend in technology adoption, where management systems lag behind technical progress.

"We've studied technology adoption in organizations for decades. The pattern here is consistent: the technical capabilities arrive before the management systems to harness them. What's different with AI is how many consequential decisions - especially on workforce - organizations are making before those systems catch up," said Davenport.