The Dashboard You Built That Nobody Opens

There’s a hard truth hiding in your analytics platform. Let me show you how to find it.


Open your BI tool. Look at the list of dashboards. Find the one that took you — or someone on your team — two weeks to build. The one with the carefully color-coded KPI tiles, the year-over-year comparisons, the trend lines going back 18 months.

Now look at the last time it was opened.

If it was more than three weeks ago, you’re not alone. You’re in the majority.


The Pattern Nobody Talks About

The analytics industry is built on a quiet fiction: that dashboards get used.

Vendors show you demos of executives making snap decisions in front of real-time data walls. Conference talks describe “data-driven cultures” where everyone from the CEO to the customer support rep checks their metrics every morning. Hiring decks promise that this analyst hire will “transform the way we use data.”

And then, in the real world, the dashboard you spent two weeks on gets opened once — at the presentation where you launched it — and then sits there, slowly aging, like milk nobody noticed was left on the counter.

This isn’t a you problem. It’s a structural one.


Why Dashboards Die

1. They were built to show what exists, not to answer what matters

Most dashboards are shaped by what data is available, not by what a decision-maker actually needs. You have a users table, a transactions table, an events table — so you build a dashboard that shows everything in those tables. The metrics feel important because they’re measurable.

But measurability is not the same as relevance.

Benn Stancil, one of the most incisive writers on analytics culture, puts it directly: most users don’t want complicated analysis. They want to know what is happening. A Gong customer success team wants to know what their customers are doing this week — not an AI-generated “health score” that blends fifteen signals into a composite number nobody can explain.

The more abstracted a dashboard is from a specific question, the faster it gets abandoned.

The fix: Before you open your BI tool, write one sentence: “This dashboard will help [person] decide [thing] by showing them [specific metric] in context of [baseline or target].” If you can’t write that sentence, don’t build the dashboard.

2. They’re used for theater, not decisions

Here’s a harder truth: many dashboards were never meant to be used regularly. They were built to exist — to signal that a team is data-driven, to satisfy a stakeholder who asked for “a dashboard on this,” or to provide political cover for a decision that was already made.

When executives need data to justify a choice they’ve already committed to emotionally, they use the dashboard once (at the meeting where they present the decision) and never again. The dashboard served its purpose. That purpose was theater.

This is not cynical — it’s human. Leaders under pressure need ways to make difficult calls feel objective. Data analysis provides that cover. The problem is that teams spend real time building dashboards for this use case, when what was actually needed was a one-page summary for a single meeting.

The fix: When someone asks for a “dashboard,” ask what decision it will support and when. If the answer is “the executive presentation next Thursday,” build a focused one-pager, not a permanent dashboard.

3. They’re too noisy to read quickly

Geckoboard's research on dashboard design identifies the core UX problem: every piece of non-data visual noise — decorative elements, unnecessary gridlines, redundant labels, too many metrics — degrades the signal.

The statistician Edward Tufte called this the data-ink ratio: the proportion of a graphic's ink that actually communicates data, as opposed to ink that merely decorates. Dashboards with poor data-ink ratios force cognitive work on every glance. After a few sessions of that effort, most people give up.

There’s also the problem of missing context. A number without a comparison is nearly useless. “42 leads today” tells you nothing. “42 leads today, versus a 7-day average of 38 and a monthly target of 45” tells you something. Most dashboards show the number. They skip the context.

The fix: Audit your dashboard. Remove every element that doesn’t directly communicate data. For every metric, add a comparison (yesterday, last week, target, average). If a number can’t justify its context, reconsider whether it belongs.
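As a sketch of what "a comparison" means in practice, here is a minimal example that turns a bare number into a contextualized one. The lead counts, target, and function name are hypothetical, chosen to match the "42 leads today" example above:

```python
from statistics import mean

def metric_with_context(daily_counts, target):
    """Frame today's value against a 7-day average and a target.

    daily_counts: daily values, oldest first; the last entry is today.
    All numbers here are illustrative, not from any real pipeline.
    """
    today = daily_counts[-1]
    trailing = daily_counts[-8:-1]  # the seven days before today
    avg7 = mean(trailing)
    return (
        f"{today} leads today, versus a 7-day average of "
        f"{avg7:.0f} and a target of {target}"
    )

# 42 leads today against a recent average of 38
counts = [35, 37, 40, 36, 39, 38, 41, 42]
print(metric_with_context(counts, target=45))
# → 42 leads today, versus a 7-day average of 38 and a target of 45
```

The point isn't the code; it's that the comparison is computed alongside the metric, so the reader never has to supply the baseline from memory.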

4. The builders and the users are different people

Analytics teams build dashboards. Business teams use them. These two groups have fundamentally different mental models of what a dashboard should do.

Analysts are trained to value rigor, completeness, and nuance. They build dashboards that reflect this training: comprehensive, carefully labeled, with drill-down capability and filters for every dimension.

Business users want answers in seconds. They want to glance at a screen and know if things are okay or not. They don’t want to apply three filters before they can see the number they need.

Katie Bauer’s framing of analysts as explorers is useful here. Most dashboard work is scouting — routine maintenance, answering basic operational questions. But scouts too often design their reports as if they’re presenting a grand discovery. The mismatch creates dashboards that feel overwhelming to navigate.

The fix: Get a non-analyst to use your dashboard without instructions. Watch where they get stuck. Ask what question they came in with. Redesign around what they actually did, not what you hoped they’d do.


The Contradiction at the Heart of Analytics

Here’s where it gets uncomfortable.

Every practitioner who has been in this industry long enough arrives at a version of the same observation: the dashboards don’t get used, the insights are rare, and the decisions are mostly emotional anyway. Data provides the post-hoc rationalization, not the input.

Stancil asked his audience: In your entire career, how often did you find a truly meaningful insight in your data? The average answer was once every two years.

And yet — companies keep hiring data teams. The BI industry keeps growing. More dashboards get built.

The explanation is uncomfortable but probably correct: the value of a data team isn’t always in the outputs they produce. It’s in the belief that data-driven decisions are being made. As long as that belief is maintained, the system is funded. The dashboards may never be opened — but they have to exist.

This doesn’t mean your work doesn’t matter. It means you should be clear-eyed about which dashboards you’re building for genuine use versus which ones are institutional theater. Build the former well. Build the latter efficiently.


What Actually Works

After synthesizing what practitioners across the industry have learned, here’s what produces dashboards that actually get used:

Monitoring, not investigation, as the design goal. A dashboard should answer “is everything okay?” in under 10 seconds. Investigation (why is something not okay?) requires a different tool: ad-hoc analysis, a notebook, a conversation.
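One way to make "is everything okay?" concrete is to treat the dashboard as a status function: each metric is checked against an explicit acceptable range, and anything outside that range is flagged for investigation elsewhere. A minimal sketch, with hypothetical metric names and bands:

```python
def dashboard_status(metrics, bands):
    """Reduce a set of metrics to an okay/not-okay answer.

    metrics: {name: current value}
    bands:   {name: (low, high)} acceptable range per metric
    Returns (name, value, band) tuples for anything out of range;
    an empty list means "everything is okay".
    """
    alerts = []
    for name, value in metrics.items():
        low, high = bands[name]
        if not (low <= value <= high):
            alerts.append((name, value, (low, high)))
    return alerts

# Hypothetical daily check: one metric is out of band
metrics = {"leads": 42, "signup_rate": 0.031, "churned": 9}
bands = {"leads": (30, 60), "signup_rate": (0.025, 0.05), "churned": (0, 5)}
print(dashboard_status(metrics, bands))
# → [('churned', 9, (0, 5))]
```

Notice what this forces: someone has to write down the expected range for every metric. If nobody can say what range counts as "okay," the metric probably doesn't belong on a monitoring dashboard.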

One question, answered well. A dashboard with five metrics you’ve chosen carefully and provided context for is worth more than a dashboard with fifty metrics. Resist the pressure to include everything.

Feedback loops built in. Ask the people you built the dashboard for what they look at, what they never look at, and whether the dashboard has changed how they work. Build this conversation into your process.

Rewards for boring, reliable work. The most valuable thing an analytics team does is maintain accurate, consistent, trusted reporting — the kind where everyone agrees on what the numbers mean and nobody questions whether the pipeline is broken. This work is low-glamour and high-value. Make it visible.

Natural language as a complement, not a replacement. The new generation of conversational analytics tools (ask-your-data interfaces built on LLMs) reduces the friction of getting an answer from data. They won't replace dashboards — the ambiguity of human language and the complexity of business logic mean that static, trusted views still have a role. But they can handle the "I just want to check one thing" use case that clutters most dashboards with filters and drill-downs.


The Honest Ending

The dashboard you built that nobody opens isn’t a failure of craft. It might be a perfectly designed dashboard. The problem is almost certainly upstream: a misalignment between what was built and what was actually needed, a context where the data was needed for theater rather than decisions, or an organization that hasn’t yet built the culture of trust that makes dashboards worth opening.

The most important thing you can do isn’t redesign the dashboard. It’s get closer to the decisions.

Find out what questions people actually have before they go into important meetings. Find out what numbers make them nervous. Find out what they’re checking in Excel because they don’t trust the BI tool. Build toward those needs.

The dashboards that get used every day aren’t the impressive ones. They’re the ones that answer exactly one question, reliably, in under 10 seconds.

Start there.


Sources and Further Reading

  1. Geckoboard — Effective Dashboard Design: A Step-by-Step Guide (2023) — geckoboard.com
  2. Benn Stancil — The Insight Industrial Complex (Feb 2023) — benn.substack.com
  3. Benn Stancil — Disband the Analytics Team (Mar 2024) — benn.substack.com
  4. Benn Stancil — Searching for Insight (Nov 2024) — benn.substack.com
  5. Benn Stancil — Does Data Make Us Cowards? (Nov 2021) — benn.substack.com
  6. Benn Stancil — Go Crazy, Folks, Go Crazy (Feb 2026) — benn.substack.com
  7. Katie Bauer — Analysts Are Explorers (Jul 2022) — wrongbutuseful.substack.com
  8. Michal Szudejko — Natural Language Visualization and the Future of Data Analysis (Nov 2025) — towardsdatascience.com