Not long ago ZoomInfo hosted its first-ever Datafest — a 2.5-day event dedicated to the planning, development, iteration, and creation of analytic products. Think hackathon, but with the objective of delivering a working application or microservice that could be productionized and deployed across the organization. We had 25 people split into teams, competing for the grand prize of “Datafest Champion.”
It was amazing to see the creativity possible in less than a week’s worth of dedicated effort. Equally fascinating was the focus on actionable insights inherent in each group’s product. I’ve worked on thousands of projects in my career, very few of which have lasted only 2.5 days. Even fewer (I hate to admit), despite the weeks or months of time I’ve put into them, have addressed an action or informed a decision in a direct way. How can this be?
I’ve worked at companies that love data, that preach data-driven decision making, and that invest heavily in business intelligence and analytics teams. I’ve spent my entire career buying data, cleansing data, combining data, merging data, moving data from one spot to another, presenting data, and modeling data. So how is it that in the 10+ years I’ve spent in and around data I haven’t produced as useful a product as the teams at ZoomInfo were able to produce in 2.5 days?
For most of my career, I lacked appreciation for a simple equation: Data ≠ Information ≠ Insight. Data does not equal information; information does not equal insight. And because anyone reading this (hopefully) appreciates the joke: by the transitive property, data does not equal insight.
Every single team participating in Datafest had one objective: create an analytic product that is instantly actionable. Not a report. Not a static dashboard. Every team needed to create an analytic product that one of our business functions (Sales, Finance, Marketing, Customer Success, Human Resources) could either run a campaign against or instantly make a decision on. With the simple objective of action, all teams were able to accomplish in 2.5 days what I have spent 10+ years trying to achieve: actionable, insightful, meaningful analytic products that our customers (stakeholders) actually want to adopt.
The BI, Analytics, and Data Engineering teams at ZoomInfo were built with the goal of creating meaningful analytic products that get adopted by business partners. Because of that objective, the key learnings from Datafest mirror the recommendations I would give to anyone building out a similar group:
A successful analytics team can’t just be good at coding, or data science, or visualization, or quantitative thinking. These are all necessary conditions, but they are not sufficient for an impactful organization. A successful analytics team must also be good at gathering requirements, defining scope, managing expectations, marketing and roll-out, training end-users, and ultimately driving adoption of what is being built. In short, a successful analytics team must be good at defining, developing, and going to market with their analytic product(s). To build meaningful things that get adopted, we must all become (analytic) product managers.
I feel so strongly about this that I call it a theorem: Data ≠ Information ≠ Insights. Its corollary is also true: More Data ≠ Better Product. More data for specific use cases, such as training a machine learning model, is good. However, no matter how long you stare at a report, insights will not magically appear. For data to be meaningful, it needs to be turned into information: aggregated, grouped, or summed. For information to be meaningful, it needs to be turned into insight: a specific recommendation or action to take. To create meaningful analytic products that actually get adopted, we must appreciate that action needs more than just data.
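The progression above can be made concrete in a few lines of code. This is a minimal sketch, with invented account names and call records — raw rows (data) are aggregated into renewal rates per call topic (information), which then yield a specific recommendation (insight):

```python
# A minimal sketch of Data -> Information -> Insight.
# All records and names below are hypothetical, for illustration only.

from collections import defaultdict

# Data: raw event rows, one per customer call.
calls = [
    {"account": "Acme",    "topic": "pricing",    "renewed": False},
    {"account": "Acme",    "topic": "onboarding", "renewed": False},
    {"account": "Globex",  "topic": "onboarding", "renewed": True},
    {"account": "Globex",  "topic": "roadmap",    "renewed": True},
    {"account": "Initech", "topic": "onboarding", "renewed": True},
]

# Information: aggregate the raw rows into a renewal rate per call topic.
totals, renewals = defaultdict(int), defaultdict(int)
for call in calls:
    totals[call["topic"]] += 1
    renewals[call["topic"]] += call["renewed"]
renewal_rate = {t: renewals[t] / totals[t] for t in totals}

# Insight: a specific, actionable recommendation derived from the information.
best_topic = max(renewal_rate, key=renewal_rate.get)
print(f"Lead upcoming renewal calls with '{best_topic}' "
      f"(renewal rate {renewal_rate[best_topic]:.0%})")
```

Staring at the raw `calls` table tells you nothing; only the aggregated rates, paired with a recommended action, are something a business partner can act on.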
There is no simpler way to say this — the tools and technologies you use as an analytics team matter, and they matter a lot. In addition to purposefully building out a team where we all consider ourselves product managers and appreciate that action needs more than data, the reason we were able to build in 2.5 days what I’ve spent 10+ years trying to achieve is that, for the first time in my career, we are using the right tools to do so. Each tool we use was chosen carefully and with intention. We ran proofs of concept against competitive options to solve specific use cases and enable the team and the capabilities we build to scale. Our analytic tech stack consists of the following:
Fivetran is a fully managed ELT service that complements our data engineering team with native integrations and connectors to many of our execution applications. Fivetran manages our data pipelines for Salesforce, Marketo, JIRA, and Eloqua, among others. This allows our data engineering team to allocate more resources to higher-impact data modeling, denormalization, and custom engineering jobs, and fewer to data pipeline maintenance, QC, and debugging.
We use Snowflake as our cloud data platform. Snowflake is a cloud-native data warehouse that is the foundation of our tech stack. With the ability to instantaneously scale compute resources, manage multiple virtual warehouses to eliminate concurrency issues and, most importantly for us, natively query semi-structured data, Snowflake allows our BI team to spend more time on actionable insights versus waiting for queries to complete. We would not be an enterprise-wide analytics function without Snowflake as our computational core.
Finally, we leverage Looker as our visualization and actionable-insight application. Looker is the tool through which we govern and define KPIs, expose curated data and insight experiences to our business partners, and enable data-informed decision making. While very few people at ZoomInfo interact directly with Snowflake, 95% of our business partners engage with data — whether in a report, dashboard, or model — through a defined Looker experience.
At the conclusion of Datafest, two of the most promising products resulting from the event were Looker integrations:
Recommend sales call tactics
Using a Natural Language Processing (NLP) model built on our sales call and activity data, we visualize word and phrase patterns that correlate with previously churned and renewed customer accounts. To drive action, we recommend conversation topics for upcoming customer renewal conversations based on account metadata and the historical success of call topics. NLP modeling is done in Python on Databricks; input data is extracted, loaded into, and transformed in Snowflake; and both the word/phrase visualizations and the triggered alerts/recommendations are developed and exposed to our sales team in Looker dashboards, embedded directly in Salesforce account objects.
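The core correlation idea can be sketched without the full pipeline. The toy example below is not ZoomInfo’s actual model — the transcripts are invented and the scoring is a simple smoothed count ratio rather than a trained NLP model — but it shows how word patterns get tied to churned versus renewed outcomes:

```python
# Hedged sketch of the phrase-correlation idea (not the production model):
# count how often each word appears in calls from churned vs. renewed
# accounts, then surface the words most skewed toward renewal.

from collections import Counter

# Hypothetical call transcripts labeled by account outcome.
transcripts = [
    ("we need to discuss pricing and the contract terms", "churned"),
    ("the integration keeps failing and support is slow", "churned"),
    ("the roadmap looks great and onboarding went smoothly", "renewed"),
    ("great onboarding experience and the roadmap is exciting", "renewed"),
]

churn_counts, renew_counts = Counter(), Counter()
for text, outcome in transcripts:
    counts = churn_counts if outcome == "churned" else renew_counts
    counts.update(text.split())

# Smoothed ratio: words with high scores correlate with renewal,
# words with low scores correlate with churn.
vocab = set(churn_counts) | set(renew_counts)
score = {w: (renew_counts[w] + 1) / (churn_counts[w] + 1) for w in vocab}
top = sorted(score, key=score.get, reverse=True)[:3]
print("Words most associated with renewal:", top)
```

In production this step would be replaced by a real NLP model over phrases rather than single words, but the output shape is the same: a ranked list of topics to recommend (or avoid) on upcoming renewal calls.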
Looker + Slack integration (our Datafest winner)
This is now native functionality in Looker, but it was not at the time of our Datafest. We developed a Slack integration for Looker whereby an end user can call up a Looker dashboard directly through a command prompt in Slack. This gives our business partners instant access to useful insights and informs decision making with low latency.
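The shape of such an integration can be sketched as a single handler: Slack posts the slash-command text to your service, and the service replies with a payload containing a dashboard link. Everything below — the host, the dashboard names and IDs, and the keyword mapping — is invented for illustration; only the `response_type`/`text` response fields follow Slack’s slash-command payload format:

```python
# Hypothetical sketch of the Slack -> Looker flow: a handler that maps a
# slash-command's text (e.g. "/looker renewals") to a dashboard link,
# formatted as a Slack slash-command response payload.

LOOKER_HOST = "https://yourcompany.looker.com"  # assumed instance URL

# Curated mapping from command keywords to Looker dashboard IDs (invented).
DASHBOARDS = {
    "pipeline": 42,
    "renewals": 17,
    "churn": 8,
}

def handle_slash_command(text: str) -> dict:
    """Build a Slack slash-command response for `/looker <dashboard>`."""
    keyword = text.strip().lower()
    dashboard_id = DASHBOARDS.get(keyword)
    if dashboard_id is None:
        options = ", ".join(sorted(DASHBOARDS))
        # "ephemeral" responses are visible only to the requesting user.
        return {"response_type": "ephemeral",
                "text": f"Unknown dashboard. Try one of: {options}"}
    # "in_channel" responses are posted for the whole channel to see.
    return {"response_type": "in_channel",
            "text": f"{LOOKER_HOST}/dashboards/{dashboard_id}"}

print(handle_slash_command("renewals")["text"])
```

A real deployment would sit behind an HTTPS endpoint registered as the slash command’s request URL and would verify Slack’s request signature before responding.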
Building analytic products that will be adopted by business partners should be a goal of any analytics team. It can be done, and done quickly — as our Datafest examples prove — if approached with purpose, clarity, and vision. Recognizing we are all (analytic) product managers ensures that proper requirements are gathered, project scope is defined, and the solution being built serves an actual need. Focusing on action protects against data devoid of value or insight, and having the right tools removes technology as a roadblock. With our modern tech stack, we are no longer limited by technological capabilities. We are now resource- and capacity-constrained by the demand for our products — an exciting problem to have for a growing analytics team.