At Snowflake’s BUILD Virtual Media Roundtable 2025, the company detailed its latest product innovations, signalling a strategic shift towards powering enterprise-grade agentic AI at scale.
From data-warehousing pioneer to AI Data Cloud leader, Snowflake is evolving into an AI-driven enterprise engine. Its latest announcements, including the general availability of Snowflake Intelligence and Openflow, major upgrades to Horizon Catalog, and a strategic partnership with SAP, reinforce one message: the future of AI depends on unified, trusted and action-ready data.
Responding to a Dataquest question on global adoption trends, Christian Kleinerman, EVP of Product at Snowflake, said that the need for data unification has become urgent with the rise of AI workloads.
“When organisations build AI agents, they need access to multiple, diverse datasets to provide contextual answers,” Kleinerman told Dataquest. “If some of those datasets aren’t readily available, it triggers a fire drill. Teams rush to unify data assets at the last minute. So the timing for technologies like Openflow and our new Stream capabilities couldn’t be better. The need to make data available in one holistic platform is more urgent than ever.”
He added that Snowflake’s architecture is designed to balance accessibility with sovereignty. “We’ve never attempted to move data automatically across regions because customers value control,” he said. “Instead, we give them the building blocks to anonymise, protect and analyse data securely across regions while fully respecting sovereignty requirements.”
Snowflake Intelligence now generally available
At the centre of Snowflake’s announcement is Snowflake Intelligence, now generally available for its global customer base of more than 12,000 organisations. The enterprise intelligence agent is built to make data insights accessible to every employee through simple, conversational interaction.
Snowflake Intelligence connects structured, semi-structured and unstructured data across systems such as databases, documents and collaboration tools like Teams or Slack. It lets employees query information in natural language, understand why trends are occurring, and even take recommended actions within Snowflake’s governed environment.
Powered by advanced models from Anthropic and Snowflake’s internal research team, the platform delivers faster and more accurate analysis. Its new Agent GPA (Goal–Plan–Action) framework detects up to 95 per cent of query errors, and text-to-SQL conversions now run three times faster than before.
In just three months of public preview, over 1,000 enterprises, including Cisco, Toyota Motor Europe, and TS Imagine, have deployed more than 15,000 AI agents using Snowflake Intelligence.
“Our next evolution is about bringing AI to data, allowing every customer to unlock intelligence that is uniquely their own,” said Kleinerman. “This will democratise AI so every employee can make faster, smarter decisions.”
Building a unified data fabric for AI
Snowflake also announced the general availability of Openflow and significant upgrades to Horizon Catalog, both aimed at breaking down data silos and creating a single interoperable data layer across formats and clouds.
Openflow allows enterprises to automate the integration and ingestion of data from virtually any source, ensuring that information remains up to date and accessible for analytics and AI. Horizon Catalog adds open APIs from Apache Polaris (Incubating) and Apache Iceberg REST Catalog, providing a single framework for governance, metadata and context across all enterprise data.
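Because the catalogue exposes the standard Apache Iceberg REST Catalog API, any compliant client can discover namespaces and tables over plain HTTP. The sketch below, a minimal illustration only, builds the endpoint URLs defined in the Iceberg REST OpenAPI specification; the Snowflake account host and the `polaris/api/catalog` base path are hypothetical placeholders, not documented Snowflake values.

```python
from urllib.parse import quote

# Hypothetical base URL for illustration; the real host and path come
# from your account's Horizon/Polaris configuration, not from this sketch.
BASE = "https://example-account.snowflakecomputing.com/polaris/api/catalog"

def config_url(base: str = BASE) -> str:
    # Capability/config discovery endpoint from the Iceberg REST spec.
    return f"{base}/v1/config"

def namespaces_url(prefix: str, base: str = BASE) -> str:
    # List namespaces under a catalog prefix.
    return f"{base}/v1/{quote(prefix, safe='')}/namespaces"

def tables_url(prefix: str, namespace: str, base: str = BASE) -> str:
    # List Iceberg tables within one namespace.
    return (
        f"{base}/v1/{quote(prefix, safe='')}"
        f"/namespaces/{quote(namespace, safe='')}/tables"
    )

print(tables_url("my_catalog", "analytics"))
```

Any Iceberg-aware engine that speaks this REST dialect can resolve the same metadata, which is the interoperability point Snowflake is making with Horizon Catalog.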
Together, they form what Snowflake calls the enterprise lakehouse, combining governance, security and interoperability without vendor lock-in. Additional previews include Interactive Tables and Streaming Analytics for near real-time insights, Postgres on Snowflake (a managed PostgreSQL service built after the Crunchy Data acquisition), and new business continuity capabilities for managed Iceberg tables.
“Snowflake is effectively turning its data warehouse into a full-fledged, open enterprise lakehouse that is ready for AI workloads,” said Kleinerman during the session.
Empowering developers to build agentic AI faster
To help enterprises build and deploy AI applications more efficiently, Snowflake introduced a set of new developer tools that simplify the entire process from data engineering to AI inference.
- Cortex Code (private preview): An AI assistant inside the Snowflake interface that helps users write, optimise and debug queries in natural language.
- Cortex AISQL (GA): Enables AI inference pipelines to be built directly with SQL and includes AI Redact for detecting and masking sensitive information.
- Workspaces (GA): A unified development environment with Git and VS Code integration for collaborative coding.
- dbt Projects on Snowflake (GA): Lets developers build, test and monitor dbt workflows directly inside Snowflake.
- Snowpark Connect for Apache Spark (GA): Runs Spark workloads securely within the Snowflake engine.
These enhancements reduce development friction and total cost of ownership, while accelerating the rollout of production-ready AI applications that are compliant and secure.
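As a rough illustration of the Cortex AISQL idea of running inference directly in SQL, the sketch below composes a classification query from Python. The `AI_CLASSIFY` call, table, and column names are illustrative assumptions rather than confirmed syntax, and actually executing the statement would require a live Snowflake session (for example via `snowflake-connector-python`), which is not shown.

```python
# Hypothetical sketch: composing an AISQL-style statement in Python.
# AI_CLASSIFY and the table/column names below are assumptions for
# illustration; consult the Cortex AISQL docs for the exact syntax.

def classify_tickets_sql(table: str, text_col: str, labels: list[str]) -> str:
    # Build a SQL statement asking an AISQL function to label each row
    # of `table` with one of the supplied categories.
    label_array = ", ".join(f"'{label}'" for label in labels)
    return (
        f"SELECT id, {text_col}, "
        f"AI_CLASSIFY({text_col}, [{label_array}]) AS label "
        f"FROM {table}"
    )

sql = classify_tickets_sql("support_tickets", "body", ["billing", "outage", "other"])
print(sql)
# With a live connection (not shown):
#   cursor.execute(sql)
```

Keeping the inference step inside SQL means the pipeline runs where the data already lives, under the same governance controls, which is the cost and compliance argument Snowflake is making here.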
SAP and Snowflake unite mission-critical data with AI innovation
A day after the BUILD preview, Snowflake announced a major partnership with SAP to connect SAP Business Data Cloud (BDC) and Snowflake’s AI Data Cloud through bidirectional, zero-copy data sharing.
This new SAP Snowflake solution extension will allow customers to analyse semantically rich SAP data alongside non-SAP data in real time without replication. The goal is to harmonise enterprise data and build intelligent, AI-powered applications grounded in trusted business processes.
The joint offering, expected to be generally available in early 2026, allows customers to unify SAP and non-SAP data and simplify AI governance within a shared framework.
“By tightly integrating SAP and Snowflake, we’re making it simple for enterprises to connect their critical business data with AI innovation at scale,” said Kleinerman.
Irfan Khan, President and Chief Product Officer for SAP Data and Analytics, added, “Together, we combine SAP’s leadership in mission-critical applications with Snowflake’s modern data platform to deliver a unified, enterprise-ready experience that extends the value of business data across the ecosystem.”
Industry leaders such as AstraZeneca are already exploring the integration to accelerate breakthroughs through real-time data access and analysis.
Why this matters
Snowflake’s latest innovations position it as a key player in the global race to build enterprise-ready AI infrastructure. The company’s approach is built around three principles: openness, governance and trust.
By unifying data across ecosystems, maintaining strict sovereignty controls and simplifying development, Snowflake aims to remove two of the biggest barriers to AI adoption: data fragmentation and compliance complexity.
For industries such as banking, manufacturing, healthcare and retail, these capabilities could enable faster, safer and more scalable AI adoption.
“The AI wave has turned data unification from a luxury into an imperative,” Kleinerman told Dataquest. “For enterprises, the goal is no longer just to analyse data but to act on it safely, contextually and globally.”