Confluent links AI agents & boosts anomaly detection
Confluent has added new features to Confluent Intelligence aimed at connecting AI agents over real-time data streams and spotting system issues earlier with anomaly detection.
The update adds support for the Agent2Agent protocol in Confluent's Streaming Agents and introduces Multivariate Anomaly Detection within its built-in machine learning functions. Together, the changes are designed to link AI systems across an organisation and improve monitoring of complex, fast-moving data.
Agent connections
Many organisations are experimenting with AI agents across products and teams. IDC forecasts that "By 2026, 40% of all G2000 job roles will involve working with AI agents, redefining long-held traditional entry-, mid-, and senior-level positions." A growing challenge is how those agents share context, coordinate tasks, and leave a clear record of the actions they took.
Streaming Agents now use the Agent2Agent protocol to trigger and coordinate external AI agents using real-time streams. Confluent positions this as a way to connect agents across tools and platforms without relying on point integrations. Streaming Agents can also connect AI agents to real-time data using Anthropic's Model Context Protocol.
Confluent says Streaming Agents can analyse information produced in agent frameworks such as LangChain and work with data platforms including BigQuery, Snowflake, and Databricks. The same flow can then trigger actions in enterprise platforms such as ServiceNow and Salesforce.
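The pattern described above, where events on a stream trigger actions in downstream systems, can be sketched in a few lines. This is an illustrative stand-in only: the class and handler names are hypothetical, not Confluent, Streaming Agents, or Agent2Agent APIs.

```python
# Hypothetical sketch of stream-triggered actions: events flow in,
# and every handler whose predicate matches the event fires.

class StreamDispatcher:
    def __init__(self):
        self._routes = []  # (predicate, handler) pairs

    def register(self, predicate, handler):
        """Route events for which `predicate` is true to `handler`."""
        self._routes.append((predicate, handler))

    def dispatch(self, event):
        """Fan one stream event out to every matching handler."""
        return [handler(event) for predicate, handler in self._routes
                if predicate(event)]

def open_ticket(event):
    # Stand-in for a downstream action, e.g. creating a ServiceNow ticket.
    return f"ticket opened for {event['service']}"

dispatcher = StreamDispatcher()
# Fire the handler when a service's error rate crosses a threshold.
dispatcher.register(lambda e: e.get("error_rate", 0) > 0.05, open_ticket)

results = dispatcher.dispatch({"service": "checkout", "error_rate": 0.12})
```

An event below the threshold matches no route and produces no actions, which is the point of predicate-based routing over hard-wired point integrations.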
Confluent is also emphasising governance and traceability for agent activity. With Agent2Agent support, teams can capture agent actions in an immutable log, replay them for audit and review, and use Apache Kafka to orchestrate communications between agents.
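The "immutable log plus replay" idea can be shown with a minimal sketch. In Confluent's setup that role would be played by a Kafka topic; here a plain append-only list stands in so the pattern is runnable on its own, and all names are illustrative.

```python
# Conceptual sketch: agent actions are appended once, never mutated,
# and can be replayed in order for audit and review.

class AgentActionLog:
    def __init__(self):
        self._events = []  # append-only; existing entries are never changed

    def record(self, agent_id, action, payload):
        """Append one agent action and return its offset (Kafka-style)."""
        offset = len(self._events)
        self._events.append({"offset": offset, "agent": agent_id,
                             "action": action, "payload": payload})
        return offset

    def replay(self, from_offset=0):
        """Yield copies of events in order, so consumers can't rewrite history."""
        for event in self._events[from_offset:]:
            yield dict(event)

log = AgentActionLog()
log.record("triage-agent", "classified", {"ticket": "T-1", "severity": "high"})
log.record("remediation-agent", "escalated", {"ticket": "T-1", "to": "on-call"})
audit_trail = list(log.replay())
```

Replaying from an offset reproduces exactly what each agent did and in what order, which is what makes the log useful for audit rather than just for transport.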
Sean Falconer, Head of AI at Confluent, said the company sees a shift away from AI that mainly analyses past data.
"If you want to be competitive, your AI can't be looking in the rearview mirror," Falconer said. "You need a system of AI agents that work together and constantly learn and share insights in real time. Confluent Intelligence connects teams' AI investments and systems no matter where they're built, so AI can automatically react to live data, take action, coordinate systems, and escalate to team members as needed."
Agent2Agent support in Streaming Agents is available in Open Preview. Confluent did not provide a timeline for general availability.
Data monitoring
The second part of the update focuses on anomaly detection for operational monitoring. Companies commonly track metrics such as latency, throughput, error rates, CPU, and memory. Confluent argues that traditional approaches often assess these metrics in isolation and rely on batch analysis of historical baselines, which can produce false positives when data is noisy.
Multivariate Anomaly Detection is designed to analyse related metrics together and identify unusual patterns in streaming data. Confluent says this reduces false positives and surfaces issues faster than single-metric approaches.
According to Confluent, the feature identifies a system's "healthy state" and adapts as data changes. Teams do not need to build or update a model because it learns as it observes new data. The company also says it can ignore outliers and one-off glitches that would otherwise skew averages.
As an example, Confluent points to analysing CPU, memory, and latency as a group rather than separately. It says this can highlight complex patterns that would be missed with individual metric checks, and flag data points that drift too far from "true normal" as anomalies.
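The advantage of scoring metrics jointly can be demonstrated with a small sketch. This is not Confluent's implementation: it is a conventional squared Mahalanobis distance over two correlated metrics (simplified from the CPU/memory/latency example to CPU and latency), with hypothetical numbers.

```python
import statistics

# Sketch of multivariate anomaly scoring: two correlated metrics
# (CPU %, latency ms) are judged together rather than one at a time.

def fit_baseline(samples):
    """Estimate mean vector and 2x2 covariance from (cpu, latency) pairs."""
    cpu = [c for c, _ in samples]
    lat = [l for _, l in samples]
    mu = (statistics.fmean(cpu), statistics.fmean(lat))
    cov_cl = sum((c - mu[0]) * (l - mu[1]) for c, l in samples) / len(samples)
    return mu, (statistics.pvariance(cpu), statistics.pvariance(lat), cov_cl)

def mahalanobis_sq(point, mu, cov):
    """Squared Mahalanobis distance, inverting the 2x2 covariance analytically."""
    var_c, var_l, c = cov
    det = var_c * var_l - c * c
    dc, dl = point[0] - mu[0], point[1] - mu[1]
    return (var_l * dc * dc - 2 * c * dc * dl + var_c * dl * dl) / det

# Healthy baseline: latency tracks CPU (roughly latency = 2 * cpu + noise).
baseline = [(10, 22), (20, 39), (30, 62), (40, 78), (25, 51), (15, 29)]
mu, cov = fit_baseline(baseline)

# (35, 30): each metric sits inside its usual single-metric range, but the
# combination breaks the learned correlation, so the joint score is large.
anomalous_score = mahalanobis_sq((35, 30), mu, cov)
normal_score = mahalanobis_sq((28, 55), mu, cov)
```

A per-metric check would pass the point (35, 30), since CPU and latency are each within roughly three standard deviations of their own means; only the joint score exposes that the pair is far from the correlated "healthy state".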
Multivariate Anomaly Detection is available through an early access process. Confluent has not said how many customers are using it.
Wider rollout
The Confluent Intelligence update comes alongside other Confluent Cloud announcements, including a migration tool called Kafka Copy Paste and a feature called Queues for Kafka. Confluent did not provide detailed technical specifications for either item.
Confluent continues to expand its portfolio around real-time data movement and analysis, with Apache Kafka central to its platform. The latest update puts more emphasis on how AI systems consume streams and act on them, and on how operations teams can detect issues earlier as data changes rapidly.