Business Analytics That Actually Change Decisions

June 20, 2024

Early in my analytics career, I built what I thought was a great analysis. I had clean data, proper statistical methods, well-formatted visualizations, and a clear narrative. I presented it to the team lead. She nodded, said "interesting," and nothing changed.

That happened enough times that I started asking the right question: not "is the analysis correct?" but "is this analysis connected to a decision someone is actually trying to make?"

The answer was usually no.

Start with the decision, not the data

An analyst's instinct is to look at what data is available and find interesting things in it. That produces interesting things — but not necessarily useful ones.

The right starting point is: what decision does someone need to make, and by when?

Before I start any analysis now, I have a conversation with the stakeholder that covers:

  • What action will you take based on this analysis?
  • What would change your mind from Option A to Option B?
  • When do you need this, and what format is most useful?
  • Who else needs to see it, and what do they care about?

A 30-minute conversation at the start eliminates 80% of "that's interesting but not what I needed" feedback at the end.

Frame findings as recommendations, not observations

There's a difference between these two outputs:

Observation: "Campaign click-through rates dropped 18% in Q3 compared to Q2."

Recommendation: "Campaign click-through rates dropped 18% in Q3. The drop is concentrated in mobile users aged 25-34 and correlates with a creative refresh in Week 7. I recommend reverting the mobile creative for that segment and A/B testing the new creative before full rollout — estimated recovery of 12-15% CTR based on historical patterns."

The first one is accurate. The second one is useful. Most stakeholders will read the second and act. Most will file the first and forget it.

I force myself to answer "so what?" for every finding before I present it. If I can't answer it, the finding isn't ready.

Know your audience's native language

A VP of Marketing thinks in campaigns, reach, and revenue. A VP of Operations thinks in utilization rates, cycle time, and cost per unit. A CFO thinks in margin, EBITDA, and cash flow.

If you present a technically correct but context-wrong analysis, you've failed regardless of the math.

At Genesis Motor America, I learned early that our VP-level stakeholders needed to see findings in the context of campaign budget impact — not in terms of statistical significance or model accuracy. "The attribution model shows paid search is 2.3x more efficient per conversion than display" is useful. "Our XGBoost model has a 0.87 AUC" is not, to that audience.

Translate the technical output into business language before it leaves your desk.

Build the simplest thing that answers the question

There's a strong temptation in analytics to build the most sophisticated model for every problem. Resist it.

If a stakeholder needs to know whether Region A outperforms Region B, a two-column comparison with a significance test is the right tool. A clustering model with 12 segments is not.
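As a sketch of what that minimum-complexity tool can look like: a plain two-sample comparison using Welch's t-test, implemented with only the standard library. The region figures below are invented for illustration.

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t-statistic and approximate degrees of freedom for
    two independent samples with possibly unequal variances."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical weekly sales for two regions
region_a = [102, 98, 110, 105, 99, 107]
region_b = [91, 96, 88, 94, 90, 93]

t, df = welch_t(region_a, region_b)
print(f"t = {t:.2f}, df = {df:.1f}")  # a |t| well above ~2 suggests a real difference
```

Twenty lines, no model training, and the stakeholder's question is answered.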

I've shipped two-line SQL answers that drove six-figure budget decisions. I've also seen three-month machine learning projects shelved because nobody understood the output. Sophistication has a cost: it takes longer to build, longer to explain, and longer to earn trust.

Use the minimum complexity that answers the actual question. Add complexity only when simpler approaches genuinely can't get there.

Make dashboards decision-ready, not data dumps

Most dashboards I inherit are data graveyards: every metric the system tracks, visualized in a grid, with no hierarchy, no narrative, and no clear call to action.

The dashboards I build for decision-makers follow a different structure:

  1. Executive summary row: 3-4 KPIs with trend direction and RAG (red/amber/green) status
  2. Anomaly callouts: What changed significantly since the last period?
  3. Drill-down layers: For stakeholders who want to investigate
  4. Data freshness indicator: When was this last updated?
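The RAG status in the summary row can be a simple rule over period-over-period change. A minimal sketch — the thresholds and KPI figures here are illustrative assumptions, not values from any real dashboard:

```python
def rag_status(current, prior, amber_pct=5.0, red_pct=10.0):
    """Classify a KPI as green/amber/red from its period-over-period change.
    A drop beyond red_pct is red, beyond amber_pct is amber, else green."""
    change_pct = (current - prior) / prior * 100
    if change_pct <= -red_pct:
        return "red", change_pct
    if change_pct <= -amber_pct:
        return "amber", change_pct
    return "green", change_pct

# Hypothetical KPIs: (current period, prior period)
kpis = {
    "revenue": (940_000, 1_000_000),   # down 6%
    "ctr":     (0.021, 0.025),         # down 16%
    "signups": (5_300, 5_200),         # up ~2%
}

for name, (cur, prior) in kpis.items():
    status, pct = rag_status(cur, prior)
    print(f"{name:8s} {pct:+6.1f}%  {status}")
```

The point isn't the thresholds themselves — those should come from the stakeholder conversation — it's that the status logic is explicit and reviewable rather than buried in a BI tool's conditional formatting.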

The goal is that a stakeholder who opens the dashboard on Monday morning can answer "should I be worried about anything?" in 30 seconds. If it takes 10 minutes to form that judgment, the dashboard isn't doing its job.

Analytics earns trust through accuracy on small things

The fastest way to lose stakeholder trust is to have one number wrong in a deck that goes to a VP. It doesn't matter if the other 47 numbers were right. That one wrong number becomes the story.

I build in at least one sanity check for every output: find a number the stakeholder already knows (last quarter's total revenue, the headcount they remember approving) and make sure it matches. If it does, every other number inherits that credibility.

If it doesn't match — dig in before you present, not after.
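That reconciliation step can be a scripted assertion rather than an eyeball check. A sketch, assuming a computed total and a figure the stakeholder already trusts (both numbers invented here):

```python
def sanity_check(computed, known, tolerance_pct=0.5):
    """Compare a computed total against a figure the stakeholder
    already knows; flag the output if they diverge beyond tolerance."""
    diff_pct = abs(computed - known) / known * 100
    return diff_pct <= tolerance_pct, diff_pct

# Hypothetical: Q2 revenue from our query vs. the finance-reported number
computed_q2_revenue = 4_187_500
known_q2_revenue = 4_190_000

ok, diff = sanity_check(computed_q2_revenue, known_q2_revenue)
print(f"match within tolerance: {ok} (off by {diff:.2f}%)")
```

Run it before every refresh and the "one wrong number in the VP deck" failure mode gets caught at your desk instead of in the meeting.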

The goal is a better decision, not a better chart

Analytics is a means, not an end. The best analysis I've ever done wasn't technically impressive. It was a two-page memo that reframed a budget decision the team had been stuck on for weeks, backed by three well-chosen data points that everyone immediately recognized as true.

It changed the decision. That's the whole job.
