Since the rise of generative AI, enterprises have eagerly embedded intelligent capabilities into their applications. However, in doing so, many have unknowingly resurrected an old problem: siloed intelligence.
Each application now boasts its own AI engine—its own chat interface, recommendation system, and document processor. While powerful, these tools often operate in isolation, disconnected from the broader data ecosystem. The result? Fragmented insights, duplicated efforts, and missed opportunities.
At Hitachi Solutions, we’ve observed this challenge firsthand. Companies are investing heavily in AI, yet without a unified strategy, they struggle to scale effectively. Imagine a company fully invested in AI: agents embedded in its CRM, its data warehouse, and its knowledge management system, plus custom-built agents for highly specific tasks. Each agent performs well on its own, but each operates on an island, which becomes a roadblock to collaboration and shared insight.
The key isn’t just implementing more AI—it’s fostering interoperability. I recently had the chance to talk about this topic in a fireside chat with my colleague, Jon Loring. You can watch the full video by clicking here.
To unlock AI’s full potential, organizations must treat data as a strategic asset, which involves curating, securing, and – most importantly – connecting it. When AI engines can share context, learn from each other, and reason across systems, their value multiplies.
This challenge is not merely technical—it’s also a leadership imperative. As new models emerge weekly, the allure of adopting the latest tool is strong. However, true ROI comes from building a foundation that can absorb change, adapt quickly, and deliver insights where they matter most.
We are entering a new era of application development—one where every business process can be enhanced by AI. To achieve this, we must break down silos and construct bridges between our data, tools, and teams.
According to McKinsey, 20% of employee time is lost searching for data. AI agents can dramatically reduce this by surfacing insights instantly.
The Proliferation of AI Agents
AI agents are rapidly spreading across platforms:
- Salesforce: Agentforce
- Google: Agentspace
- Snowflake: Cortex Agents
- Glean: Glean Agents
However, these agents often operate in isolation, leading to fragmented intelligence. This mirrors the SaaS sprawl of the past: then, enterprise data was trapped in disconnected apps; now, AI-generated insights are locked inside isolated agents.
Duplication of effort is a growing issue: without a shared communication layer, enterprises are left to manually bridge the gaps between agents.
The Path to Interoperability
The solution lies not in centralization, but in interoperability:
- Shared, event-driven communication layers (like Apache Kafka or Flink) allow agents to subscribe to and act on real-time data streams. This enables dynamic, cross-platform collaboration without vendor lock-in.
- Agent registry + data streaming platform:
  - Agents register their capabilities and consume/publish events via dedicated topics.
  - Stream processing intelligently routes insights to relevant agents using LLM-based mapping.
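To make the registry-plus-topics idea concrete, here is a minimal in-memory sketch in Python. It is an illustration only: the agent and registry classes are hypothetical names, the routing is a simple capability-to-topic match standing in for LLM-based mapping, and a production system would publish to real Kafka or Flink topics rather than an in-process dictionary.

```python
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Agent:
    """A registered agent: a name, the topics it can act on, and an inbox."""
    name: str
    capabilities: set
    inbox: list = field(default_factory=list)

    def handle(self, event: dict) -> None:
        # A real agent would reason over the event; here we just record it.
        self.inbox.append(event)

class AgentRegistry:
    """Agents register their capabilities; events published to a topic are
    routed to every agent whose capabilities include that topic."""
    def __init__(self) -> None:
        self.subscriptions: dict[str, list[Agent]] = defaultdict(list)

    def register(self, agent: Agent) -> None:
        for topic in agent.capabilities:
            self.subscriptions[topic].append(agent)

    def publish(self, topic: str, event: dict) -> None:
        # In production this would be a Kafka topic; an LLM-based mapper
        # could translate a free-text insight into the right topic name.
        for agent in self.subscriptions.get(topic, []):
            agent.handle(event)

# Usage: two agents share a topic, so one published insight reaches both.
registry = AgentRegistry()
crm_agent = Agent("crm-agent", {"customer-updates"})
kb_agent = Agent("kb-agent", {"customer-updates", "doc-changes"})
registry.register(crm_agent)
registry.register(kb_agent)
registry.publish("customer-updates", {"account": "Acme", "signal": "renewal risk"})
```

The design choice that matters here is that publishers never address agents directly; they only name a topic, so new agents can be added without changing any existing one.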
The Future of AI is Interconnected
Enterprises must begin treating agents as interoperable, event-driven data products—not isolated tools. By doing so, they can maximize the potential of AI, ensuring it serves as a bridge to innovation rather than a barrier.
Please click here to reach out to us to continue this conversation. I’d love to hear from you.