Data First. It sounds so logical that data should be the foundation and driver for most organizations. But with eighty percent of the world’s data sitting behind firewalls, in disparate systems and organizational silos, breaking down the proverbial wall so that data can be accessed, queried and visualized anytime, anywhere requires a new approach – a disruption (in a good way).
That shift – recognizing the importance of turning data into a usable, sustainable driver of business value – was at the forefront of the Snowflake World Summit in Chicago, Illinois. Emphasized often was Snowflake’s mission to change the way organizations interact with and manage data in a data warehouse built for the cloud. The key phrase is “built for the cloud,” which differentiates Snowflake from other players in the space who have found ways to offer their on-prem data warehouses in a cloud environment – that is, not truly cloud native.
In his keynote presentation, Snowflake VP of Product, Christian Kleinerman shared, “We've evolved with our partners and customers into a data platform for the cloud. I want to emphasize the data part of being a data platform. Everything is about turning data into value.”
Whether migrating from an existing enterprise data warehouse or building a data lake or analytical sandbox, Snowflake’s core architecture supports modern data applications that work with large volumes of data and scale compute when needed – all in a database delivered as a managed service. Simply put, organizations can focus on accessing and using data in a shared data system, with a cloud-agnostic layer providing object storage, multi-cluster compute, and scale-out services.
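To make the elasticity point concrete, here is a minimal sketch of the kind of DDL that provisions multi-cluster, auto-suspending compute in Snowflake. The warehouse name, sizing values and the helper function itself are illustrative assumptions, not something from the article; real deployments would tune these to their workloads.

```python
def create_warehouse_ddl(name: str, size: str = "XSMALL",
                         min_clusters: int = 1, max_clusters: int = 3,
                         auto_suspend_secs: int = 60) -> str:
    """Render a Snowflake CREATE WAREHOUSE statement for a multi-cluster
    warehouse that scales out under load and suspends itself when idle."""
    return (
        f"CREATE WAREHOUSE IF NOT EXISTS {name} "
        f"WAREHOUSE_SIZE = '{size}' "
        f"MIN_CLUSTER_COUNT = {min_clusters} "
        f"MAX_CLUSTER_COUNT = {max_clusters} "
        f"AUTO_SUSPEND = {auto_suspend_secs} "
        f"AUTO_RESUME = TRUE;"
    )

# Hypothetical warehouse for BI queries: up to 4 clusters, suspends after 60s idle.
print(create_warehouse_ddl("ANALYTICS_WH", max_clusters=4))
```

Because compute is decoupled from storage, statements like this can add or resize warehouses without moving any data – the inverse of an on-prem system, where capacity is fixed at purchase time.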
“Slow performance and agility, high complexity and a lack of centralized management are some of the top reasons why traditional analytical solutions fail,” said Ali Sajanlal, Head of Clarity Insight’s Snowflake COE. “Leveraging Snowflake, we’ve helped companies transition from the limitation, hassle and expense of an on-premise solution to the flexibility and ease of use of a cloud-built data warehouse which can evolve with their data needs of the future but enables them to start deriving real-time insights today.”
Shifting focus from infrastructure to data
Data warehouses aren’t new – they’ve just historically been confined to the hardware, compute and storage limitations of an on-prem system. Overseen by infrastructure administrators and maintained by DBAs skilled in accessing and managing the data inside while ensuring workloads didn’t overtax computing resources, this model worked… but to what end?
Today, the focus isn’t on managing the DW itself, but on what it enables. It’s no longer a question of “when do we have the resources to run an ETL job?” (likely 2:00am) but instead, “when do I need it to run based on my needs?” The emphasis is back on leveraging the data housed in the DW for the needs of the business and the value derived from it.
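The “run it when the business needs it” idea can be sketched with a Snowflake TASK, which attaches a cron schedule to a SQL statement. The task name, warehouse, stored procedure and helper function below are all hypothetical stand-ins for illustration:

```python
def create_task_ddl(task: str, warehouse: str, cron: str, sql: str) -> str:
    """Render a Snowflake CREATE TASK statement that runs `sql`
    on the given cron schedule using the given warehouse."""
    return (
        f"CREATE TASK IF NOT EXISTS {task} "
        f"WAREHOUSE = {warehouse} "
        f"SCHEDULE = 'USING CRON {cron} UTC' "
        f"AS {sql};"
    )

# Run the daily load at 7:00am – when the business wants it – rather than
# in a forced 2:00am maintenance window.
print(create_task_ddl("DAILY_LOAD", "ETL_WH", "0 7 * * *",
                      "CALL load_daily_sales()"))
```

Because each task draws on its own elastic warehouse, scheduling is driven by business timing rather than by contention for a fixed pool of compute.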
Traditional data warehouses left organizations struggling to deliver data in real time or at scale, even as the expectations for organizations to leverage data for business value were, and still are, increasing. The evolution of business intelligence solutions and modern data warehouses allows users to deliver insights to data analysts and business users in real time, and creates a potentially self-service platform where any data professional can access the data and insights they need.
As enablers of data-driven insights for Fortune 500 organizations, the momentum we’re seeing at Clarity around Snowflake and their partner ecosystem for data quality governance and business analytics is real and building, and we’re expecting that to continue.