Six principles of effective analytics

Analytics is an incredibly easy function to get wrong. When analytics teams fail, the collapse is usually so spectacular that the debris zone impacts the entire company, including the product teams and executive management. Very often, an analytics team’s singular focus on tools and reporting — the artifacts of their work that serve as receipts for time worked — cheats it of the opportunity to build a scalable, efficient, and productive workflow.

Having led and worked closely with analytics teams, I’ve arrived at a core set of principles that consistently show up in the workflows of effective analytics teams. Some of these are tractable operational attributes of a team that a lead can influence directly; others are conditions that must exist within the broader organization before they can manifest in an analytics team’s daily work.

My experience has been that life is difficult for an analytics team when these principles can’t be enacted; it’s as important for an analytics lead to build these standards into the foundation of their work as it is for a prospective analytics lead to gauge whether a company will actually allow them to do so.

Note that my definition of “analytics team” here is expansive: I include data scientists, business analysts, and data engineers as part of a broadly defined “analytics team,” and these people might be embedded within product teams or organized into a central analytics unit that serves the entire organization. The specific configuration of a team or company doesn’t change the applicability of the principles.

Principle 1: The Word of God

The purpose of an analytics team is twofold: 1) to deliver insights to the organization, and 2) to align the entire organization around the veracity of those insights. Simply delivering dashboards and access to data doesn’t satisfy the minimum obligation of an analytics team: metrics don’t mean anything unless everyone — the entire company — is convinced that they are correct. Analytics teams can’t just deliver metrics; they must deliver the Word of God.

I consider any given metric to be the Word of God if, given complete access to the underlying data set that produced it, no other value could be derived from it using the same business logic. Note that this is different from saying that the analytics team owns all business logic and makes unilateral decisions around important business metrics like defining LTV or churn. Those decisions need to be made by management or the product team in conjunction with the analytics team.

The Word of God means that there is one unified data set from which to derive metrics, and that data set could never produce any other value. Where I see teams struggle to produce Word of God metrics is in the use of multiple external tools that serve different purposes but overlap in the data they track. Especially when “analytics” is a patchwork of third-party tools that isn’t consolidated into a central data warehouse (this usually happens in early-stage companies), internal teams can collide over very basic metric values because they’re drawing from different data sets.

Arguing over the definition of a metric is a productive conversation; arguing over which underlying data is actually correct to use in calculating a metric is wholly unproductive. If the entire organization isn’t aligned around the same data set, then teams can’t even get to a point where they can agree to disagree.
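To illustrate, here is a minimal sketch of what a Word of God metric can look like in practice: one canonical definition, computed from a single warehouse table, that every report and dashboard calls so the same inputs and logic always produce the same value. The schema and the churn logic below are assumptions for the sake of example, not a prescription.

```python
# A minimal sketch (hypothetical schema and business logic, not a prescription):
# one canonical churn definition, computed from a single warehouse export,
# that every report and dashboard calls instead of re-deriving it per tool.
import pandas as pd

def monthly_churn(subscriptions: pd.DataFrame, month: str) -> float:
    """Share of subscribers active at the start of `month` who cancelled during it.

    Assumes one row per subscriber with `started_at` and `cancelled_at`
    timestamp columns (`cancelled_at` is NaT for active subscribers).
    """
    start = pd.Timestamp(month)  # e.g. "2024-03-01"
    end = start + pd.offsets.MonthBegin(1)
    active_at_start = subscriptions[
        (subscriptions["started_at"] < start)
        & (subscriptions["cancelled_at"].isna() | (subscriptions["cancelled_at"] >= start))
    ]
    churned = active_at_start["cancelled_at"].between(start, end, inclusive="left").sum()
    return churned / len(active_at_start)
```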

Principle 2: “What?” and “Is?” questions

I consider an effective report to be the clearest possible representation of data that answers one “What?” or “Is?” question. The answers to “What?” or “Is?” questions should describe some provable and observable state of the product, e.g., What is DAU? Is revenue going up?

I call the “What?” or “Is?” question that a report is designed to answer its directive, and any given report should only address one directive. A “report” here might be a simple chart, or it might be a combination of visualizations, but it’s important that a report only speak to one directive: combining directives and representations of data makes a report difficult to interpret.
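To make this concrete, the sketch below shows what a single-directive report might look like: it answers “What is DAU?” from an assumed events table (the schema is an illustration, not a prescription) and nothing more.

```python
# A minimal sketch (hypothetical events schema): a report with exactly one
# directive, "What is DAU?", and nothing else bolted on.
import pandas as pd

def daily_active_users(events: pd.DataFrame) -> pd.Series:
    """Distinct users per calendar day, from an events table assumed to
    have `user_id` and `event_at` (timestamp) columns."""
    return (
        events
        .assign(day=events["event_at"].dt.date)
        .groupby("day")["user_id"]
        .nunique()
        .rename("dau")
    )

# dau = daily_active_users(events)
# dau.plot(title="What is DAU?")  # one chart answering one directive
```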

What’s more, the creation of a report should always be motivated by a question. Many analytics teams default to generating reports simply because they want to prove that they are working, or because people ask for a broad set of data points to “see what they can see.” Data for the sake of data is almost never constructive: at best, the visualizations that are presented without prompting get ignored, and at worst, they distract teams from their core product metrics.

Principle 3: “Why?” and “How?” questions

Analysis is the collective body of work that most comprehensively answers one “Why?” or “How?” question. The answers to “Why?” and “How?” questions should present theories, supported by data, about why the state of a product has changed, e.g., Why is DAU decreasing?

Analysis shouldn’t be delivered through a report — primarily because it almost never can be. Analysis can be partially templated and re-used, but the underlying data used to produce it should always be vetted and curated on a one-off basis. The delivery mechanism for an analysis might be a set of charts, a PowerPoint presentation, or an Excel file — whatever the case, it should address one “Why?” or “How?” question as comprehensively as possible, with care taken to explain nuance and call out assumptions.

The key here is that an analysis is complete: it responds to a broad, potentially nebulous question with the requisite background and commentary to either answer it or prompt more analysis. This is impossible to do with a report; a report should be clear and unequivocal, whereas an analysis needs to be convincing. The questions that analyses answer typically don’t have right or wrong answers, or at least not ones that are knowable.

Principle 4: Do No Harm

One of the most pernicious things an analyst can do is present unsolicited analysis, absent context or need, to a product team in an attempt to help them better understand some previously unexplored aspect of user behavior. I call these IEDs (Irrelevant Explanatory Data), and they usually take the form of an overwrought predictive model that attempts to draw together a variety of disparate data points into a fanciful explanation of why certain things are happening in the product (“The weather in Chicago was snowy last week, which led to higher than average bus ridership, which led to a +5% increase in mobile sales!”).

Analytics teams must Do No Harm: they should not lead product teams down rabbit holes because they are bored or want to showcase their SciPy skills. Product teams in a state of distress can get desperate and might appeal to the false idols of 1) unwarranted data science and 2) exogenous factors to explain some problem away (to counter: 1) KISS prevails in analysis; never default to a fancy model when Excel will suffice and 2) it’s always your fault).

Not only do these IEDs take the analyst away from more worthwhile tasks, they can divert the product team’s (or any other functional team’s) attention away from pressing matters with “solutions” that can’t be implemented. This isn’t to say that analysts can’t answer important questions independent of the relevant functional team; it’s that they shouldn’t. It’s always faster and more productive for analysts and analytics teams to coordinate on analysis with business owners so that the full context of a situation, and the actionable insights desired from the analysis, can be understood.

Principle 5: The Customer is Sometimes Right

The inverse of Do No Harm is The Customer is Sometimes Right. Product teams don’t always know what they want, and in such cases they’ll generally default to asking for everything: every possible metric stuffed into a dashboard so that they can eyeball trends, or, more commonly, an open-ended report template that they can use to “slice-and-dice” data until they stumble upon a critical epiphany about the business.

But “kitchen sink dashboards” and open-ended reporting templates are not helpful: they get used once or twice and then forgotten about because it’s incredibly rare to derive insight from data absent a hypothesis. This is why reports need to answer discrete questions: if people don’t know what they’re looking for, they tend to want to see everything, and it’s actually not very common for someone to be able to parse meaning out of a mishmash of visualizations like Neo scanning the Matrix.

Even worse than requests for reporting that are of dubious value are endless sequences of analysis requests that are clearly undirected. Aimless analysis almost always begets more aimless analysis: the worst type of rabbit hole for an analyst to get sucked into is when a team lead sees some behavior that confounds them and, rather than conducting a cursory survey of existing reports using their own intuition as a starting point, enlists an analyst to explain the situation. This is like someone forgetting where they left their keys and, rather than taking two minutes to retrace their steps, asking a blind friend to find them.

If analytics teams can’t refuse these types of ad hoc, frivolous requests, they’ll get bogged down serving them, which does no favors to anyone. An analytics team can’t be effective unless they can resist appeals for analysis that aren’t at least seeded with some sort of intuition.

Principle 6: Eyes Forward

Almost as a corollary to The Customer is Sometimes Right, an analytics team must have the authority to help the broader organization understand which metrics matter in driving revenue and growth. Many KPIs are useless at helping companies understand where their business is going: vanity metrics and simple descriptive data can’t inform the demanding directional decisions a business has to make.

Analytics teams should have input into the metrics that are used daily, as well as those used in significant, regular business reviews to determine any changes that need to be made to strategy. The analytics team should be tasked with deriving forward-looking metrics and not just retrospective trivia: how is what is happening right now going to impact the business next month, next quarter, and beyond? Part of doing this involves understanding growth, e.g., are new cohorts as retentive as past cohorts, meaning DAU will increase over time? Is the cumulative monetization curve flattening over time? Is the effectiveness of advertising spend decreasing over time?
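As an illustration of that first question, the sketch below compares the weekly retention curves of two signup cohorts; the schemas, column names, and cohort definitions are assumptions for the sake of example rather than a fixed method.

```python
# A minimal sketch (hypothetical schemas and definitions): compare the weekly
# retention curves of two signup cohorts to gauge whether DAU will compound.
import pandas as pd

def retention_curve(users: pd.DataFrame, events: pd.DataFrame,
                    cohort_month: str, max_week: int = 8) -> pd.Series:
    """Share of a monthly signup cohort active in each week since signup.

    Assumes `users` has `user_id` and `signed_up_at`, and `events` has
    `user_id` and `event_at` timestamp columns.
    """
    start = pd.Timestamp(cohort_month)
    end = start + pd.offsets.MonthBegin(1)
    cohort = users[(users["signed_up_at"] >= start) & (users["signed_up_at"] < end)]
    joined = events.merge(cohort[["user_id", "signed_up_at"]], on="user_id")
    weeks = ((joined["event_at"] - joined["signed_up_at"]).dt.days // 7).clip(lower=0)
    active = joined.assign(week=weeks).groupby("week")["user_id"].nunique()
    return (active / len(cohort)).reindex(range(max_week + 1), fill_value=0.0)

# If the newer curve sits at or above the older one, new cohorts are at least
# as retentive, and DAU should grow as cohorts stack:
# retention_curve(users, events, "2024-01-01")
# retention_curve(users, events, "2024-06-01")
```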

Dashboards shouldn’t be archaeological exhibits: it’s important that any report help a functional team make a forward-looking decision. Understanding the relevant KPIs that communicate growth is best done in coordination with the analytics team, which generally has the clearest and most comprehensive field of vision around advancement towards the company’s goals.
