Event Recap:
What We Heard at Snowflake Summit 2025 and What It Means for Your Data Strategy
Over four days and three keynotes, Snowflake Summit 2025 unveiled a suite of innovations aimed at reshaping how organizations approach data management and artificial intelligence. As a longtime Snowflake partner, the DAS42 team left the conference excited about several breakthrough features that we foresee empowering our team and our clients to work smarter, faster, and more efficiently over the months and years to come.
Continued Tool Centralization
Perhaps the most exciting announcement for our “hands-on-keyboard” team members was support for running dbt projects directly within Snowflake. This integration will eliminate friction for data teams who have long juggled multiple platforms, streamlining the analytics engineering workflow in ways that will accelerate project delivery and reduce complexity, in addition to supporting the full-stack approach that DAS42 has championed for a decade.
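To make that concrete, here is a minimal sketch of what invoking a dbt project natively in Snowflake could look like. The project, database, and schema names are hypothetical, and the EXECUTE DBT PROJECT command shown reflects the preview announcement, so the final syntax in your account may differ.

```sql
-- Hypothetical sketch: run a dbt project that lives inside Snowflake.
-- Object names are invented, and the command surface is based on the
-- preview announcement; verify against current documentation before use.
EXECUTE DBT PROJECT analytics.engineering.customer_360
  ARGS = 'run --select staging+';
```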
Just as game-changing is Openflow, Snowflake’s new managed data movement service powered by Apache NiFi. This connector for structured, unstructured, and streaming data appears to position Snowflake to compete directly with technologies that have previously been valuable partners, such as Fivetran. For those whose data sources are supported by Openflow, this could simplify data architecture while reducing vendor sprawl. Of course, Fivetran’s recent acquisition of Census has brought its total number of built-in connectors to over 500, so we don’t see Fivetran going away anytime soon.
Further encouraging customers to move ingestion workloads into its platform, Snowflake also addressed a critical pain point with simplified Snowpipe pricing, announcing a transition from a file-plus-compute model to volume-based pricing while reducing costs by an average of 50%. This change makes real-time data ingestion more accessible and predictable for organizations of all sizes.
With the introduction of the “FILE” datatype, alongside new Cortex “AISQL” functions and expanded capabilities for parsing audio, image, and multimedia files, Snowflake opens up exciting possibilities for analyzing unstructured data. Image files don’t even need to be in an internal stage to be accessed by these functions, but can instead live in an external stage (e.g., an AWS S3 bucket), which will help to control storage costs.
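As a rough illustration of how these pieces might come together, the sketch below classifies images sitting in an external S3 stage using one of the new AISQL functions. The stage, bucket path, and category names are our own inventions, and the function signatures reflect the announcement, so check the current Cortex AISQL documentation before relying on them.

```sql
-- Hypothetical sketch: run an AISQL function over image files that stay in S3.
-- Stage name, bucket path, and categories are invented for illustration.
CREATE OR REPLACE STAGE raw.product_images
  URL = 's3://my-bucket/product-images/'   -- external stage; files are not copied into Snowflake
  DIRECTORY = (ENABLE = TRUE);             -- directory table so we can enumerate the files
  -- (storage integration / credentials omitted for brevity)

SELECT
  relative_path,
  AI_CLASSIFY(
    TO_FILE('@raw.product_images', relative_path),  -- FILE value pointing at the S3 object
    ['apparel', 'footwear', 'accessories']
  ) AS predicted_category
FROM DIRECTORY(@raw.product_images);
```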
The AI Integration Imperative
OpenAI CEO Sam Altman’s presence at the opening keynote underscored the importance of AI adoption. His advice was characteristically direct: when asked about adopting AI, Altman simply said, “Just do it.” He noted that “sometime over the last year we hit a real inflection point for the usability of these models,” and warned that organizations that start using AI now will pull ahead of those who wait to see how things turn out or hold out for a better future model.
Altman’s vision is ambitious: he described how employees can already manage AI agents much as managers interact with junior staff. He predicted that models in the next 2-3 years will be “breathtaking” and spoke of the potential for OpenAI to contribute to scientific advancements.
Governance and Security First
Snowflake’s commitment to responsible AI deployment was evident in their Horizon Catalog enhancements, including lineage visualization, auto-classification, and sensitive data monitoring. The company is also deprecating username-and-password-only authentication—a security move that drew lukewarm applause but represents crucial progress. (Check out our recent post from DAS42 leader Cassidy Stearns to learn more about this change.)
Snowflake cofounder and president of product Benoît Dageville spoke to Snowflake’s approach: “We want to give you flexibility but not overwhelm you,” he shared. “We are committed to keeping Snowflake open and interoperable.”
The Promise of Snowflake Intelligence
The most ambitious announcement was the upcoming public preview of Snowflake Intelligence, an agentic, AI-guided interface. Snowflake’s demo showed it using managed connections to internal and external resources (e.g., databases, stages, your Gmail account, and “up-to-date, relevant and trusted external sources such as Stack Overflow, The Associated Press, USA TODAY Network and more”) to answer natural language questions. Working from data for a fictional festival, Snowflake Intelligence performed an investigative-style analysis, pursuing multiple avenues of explanation for a spike in European ticket sales in March. It appeared to develop and test hypotheses, visualize outputs, and even take agentic action, writing and sending an email about what it had uncovered.
This capability utilizes semantic views, which can be thought of as similar to a semantic layer, such as those found in tools like Looker or AtScale. These schema-level objects provide context for the data model, including appropriate joins, translations from business terminology to data, and approved queries for identifying the source of truth. Every example shown, however, used semantic views to underpin AI functionality querying the data, rather than as a semantic layer facilitating smarter connections to BI tools; we’re excited to see what other uses these semantic views will have, if any.
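For readers who want a feel for what defining one of these objects might involve, here is a minimal sketch built around the festival example from the demo. All table, column, and metric names are hypothetical, and the CREATE SEMANTIC VIEW and SEMANTIC_VIEW() syntax shown reflects the preview documentation, which may evolve before general availability.

```sql
-- Hypothetical sketch: a semantic view over invented ticket-sales tables.
-- Clause structure follows the preview documentation and may change.
CREATE OR REPLACE SEMANTIC VIEW analytics.festival_sales_sv
  TABLES (
    orders    AS analytics.fct_ticket_orders PRIMARY KEY (order_id),
    customers AS analytics.dim_customers     PRIMARY KEY (customer_id)
  )
  RELATIONSHIPS (
    orders_to_customers AS orders (customer_id) REFERENCES customers
  )
  FACTS (
    orders.ticket_count AS quantity
  )
  DIMENSIONS (
    customers.region AS region,
    orders.order_date AS order_date
  )
  METRICS (
    orders.total_tickets AS SUM(ticket_count)
  )
  COMMENT = 'Approved joins, dimensions, and metrics for festival ticket analysis';

-- Querying through the semantic view rather than the raw tables:
SELECT * FROM SEMANTIC_VIEW(
  analytics.festival_sales_sv
  METRICS orders.total_tickets
  DIMENSIONS customers.region
);
```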
However, we want to set realistic expectations for our clients. While Snowflake Intelligence is likely to enter public preview soon, based on past release timelines, such as those of DocumentAI and Snowflake Arctic, it may be as much as 18-24 months before general availability across all clouds and regions. We also expect the initial pricing to reflect the computational complexity of the demonstrated capabilities. Organizations that wish to maximize readiness for adoption should plan to invest resources in data cataloging and curation of semantic views.
Looking Ahead
Snowflake CEO Sridhar Ramaswamy’s observation that “we all still have too many silos out there, and when data is siloed, it’s more difficult to make better decisions” captures why these innovations matter. The convergence of simplified pricing, enhanced developer tools, native AI capabilities, and robust governance creates unprecedented opportunities for data-driven organizations.
As we help our clients navigate these exciting developments, we’re reminded that success lies not just in adopting new technology, but in ensuring “the right people […] using the right data for the right purpose,” as Ramaswamy noted. And as he concluded, “AI takes it one step further”—but only when built on a foundation of trusted, well-governed data.
Stay tuned for more blog posts showing how we foresee Snowflake customers using some of the new and exciting features and functions, with a blend of tips and tricks for developers and big ideas for data leaders.



