Ask HN: Lessons learned from implementing user-facing analytics / dashboards?
171 points by hjkm on Sept 28, 2023 | 105 comments
We're currently writing an article about this.

If you'd be up for sharing some lessons / takeaways / challenges here, or even better, having a chat (I'll reach out) that would be amazing.

We'll of course fully attribute learnings / quotes etc.




I've worked on many analytics projects across a number of companies as a consultant. I'm a big believer in "decision support systems". Find out what decisions your customers need to make, repeatedly, to do their job. Quantify the heuristics and visualize that information (and that information only) in an easy-to-consume manner. More often than not that's an email or PDF. Another advantage is that by supporting the business users, they feel less threatened by the changes or technology.

I think "self-serve" analytics is silly, the idea that you put all of the data in front of people and they'll derive "insights". That's not how normal people or data work. We just had a discussion on HN the other day about Facebook's Prophet, and its pitfalls. Meanwhile we expect Joe in sales to be able to identify useful trends on a chart he made. Every company needs to forecast, regardless of their sophistication. That stuff needs to be defined by the right people and given to the users.


I helped build the analytics group at a PE fund, and this really fits with my experience.

Good decision support is where most of the value is, and it’s about building things that draw conclusions, not just throwing the data over the fence with 50 filters and expecting the end consumer to do the actual analysis.

I now work on an open source, code-based BI tool called Evidence, which incorporates a lot of these ideas, and might be of interest to people in this thread.

https://github.com/evidence-dev/evidence

Previous discussions on HN:

https://news.ycombinator.com/item?id=28304781 - 91 comments

https://news.ycombinator.com/item?id=35645464 - 97 comments


Agree with both of you, and would add that knowing who is using the system, and what they need to get out of it, is really the key to making these systems shine.

Too many systems have too much data for too many customer categories and end up being useless to everybody.


This is it, really. I remember back during a previous section of my career when I was running BI for a manufacturing company. We were asked to web-ify some legacy reports that either ran on desktops using Access & Excel or were on older BI products (Cognos). It was shocking -- at the time (I was naive) -- how many business requirements were essentially "replicate Excel in a browser", and completely divorced from the actual business processes and decisions that needed to be made.

Also, it might surprise a lot of less experienced developers just how many reporting tools are actually pieces of a workflow, not just reports. If you sniff this out during the requirements phase, do your best to convert these reports into features of an actual workflow app/system rather than allow them to persist as standalone reports.


>either ran on desktops using Access & Excel or were on older BI products (Cognos). It was shocking -- at the time (I was naive)

I think some people have a skewed view if they do most of their work with VC funded/SV companies. The average person at these companies is way more data savvy than average.

But there are so many companies out there that make a ton of money and have data-unsophisticated-but-domain-wise users, and old systems. Low hanging fruit.


I agree that the terms "self-service analytics" (especially the 'analytics' part) and "insights" just convey the wrong image of the real needs of business users out there. They mix 'strategic insights' with 'operational needs'. And I think self-service needs to be about operationalizing data. Sales managers are not necessarily looking to 'analyze' data or 'get an insight'. They need answers from data to manage their team. They need to track well-defined KPIs. See how their salespeople are doing and be able to have a productive meeting to tell them what they are neglecting. Customer success people need to "pull some data real quick" on the usage of the product by a certain client before a meeting.

These things happen all the time. And yet most companies out there think that the solution is to just build a bunch of dashboards, foreseeing what everyone will ask in the future. And then nobody checks the dashboards. Or finds the right one. And then they have a team of SQL translators pulling data for ad-hoc questions. That's silly IMO.

I'm obviously biased as a founder of a self-service analytics company based on AI (https://www.veezoo.com). But this is just my 2 cents on a topic I really care about.


> I think "self-serve" analytics is silly, the idea that you put all of the data in front of people and they'll derive "insights".

In my experience, what "self-serve" really means is "non-developer". The end user won't build it, they'll have a BA do it. But it does mean they don't need IT to help.


Have you ever had a decision maker who struggles to articulate what business decisions they want to improve? How do you handle that?

I’ve heard pretty high-level managers respond to that question with things like “we were hoping your data would tell us” and I’m not sure what to make of it.


>Have you ever had a decision maker who struggles to articulate what business decisions they want to improve? How do you handle that?

Hah, 90% of the time. I think a big part of being good at this job is being able to coerce that information from people.

You need a process of drilling down, kind of like the 5 Whys[0]. You want to make more profits, right? That means we need to either increase revenues or decrease costs. Are we measuring all these things (you'd be surprised at the number of seemingly successful companies who can't)? Okay, how do we affect revenue? By increasing the number of users or increasing the revenue per user. Are we measuring those things? And on and on. It's a perfect way to iterate, and as the company matures it can become more and more sophisticated. For lower level people, sometimes it means sitting there and watching them do their job.

[0]https://en.wikipedia.org/wiki/Five_whys


This is the right mindset for sure. Most of the time the initial question is very loosely defined, but actually having these conversations with the people who "want data", and helping them structure their thinking is also a hugely rewarding part of working in data and analytics, and will help you advance in your career.

It can be easy to have a cynical view of what people are asking for, but in my experience there is often real value you can uncover.

One thing which helped me a lot is having a decent understanding of accounting and finance. A fun, and fairly quick, way to develop that is by taking a course on financial modelling (in Excel). Modelling a business in a spreadsheet is a lot of fun, and it helps you build good intuition on the underlying "physics" of how a business makes money.


Can you recommend a good online Excel modeling course?

I see tons of courses that teach “Excel skills”, but can’t find any that teach modeling using Excel.


Wall Street Prep and Marquee Group are the ones most of the banks and financial institutions use, as far as I recall.

Here's a self study package from wall street prep: https://www.wallstreetprep.com/self-study-programs/premium-p...

https://marqueegroup.ca

Get your employer to pay for it :)


My most successful projects were sending scheduled emails for things found in a cluttered dashboard (or tucked behind a few filters). Happiest customers, most repeat business and biggest impact.


> More often than not that's an email or PDF

> I think "self-serve" analytics is silly, the idea that you put all of the data in front of people and they'll derive "insights". That's not how normal people or data work

So well said. It doesn't shock me anymore when someone asks for a succinct summary or a PDF version rather than digging through dashboards on their own. In my company, we have a user-facing analytics product, and we added the option to take a PDF snapshot on a recurring basis and send it via email!


The grander your title, the more likely you are to want a succinct summary of need-to-know data delivered to you in a convenient format. For most executives, something in their email is best; they are there anyway, no logging in to anything.


I'll enjoy watching this thread evolve. Some thoughts from my experience:

- Everyone asks to translate simpler spreadsheets and Excel charts/graphs into dashboards in your BI tool of choice. As soon as it's there, they'll ask you why they can't export to manage the data themselves. This vicious cycle can sometimes be stopped but is a slow-motion drag on productivity in lots of orgs.

- Build in validations, and/or work on ways to check the dashboard (a sketch of one such check is just after this list). Dashboards sometimes put their builders and consumers on auto-pilot. The dashboard "must be right" but could easily have a bug or inaccuracy for weeks/months/etc. that isn't obvious without some external validation.

- The dashboard never has the "right" metrics - users will continue asking for changes. Be your best advocate and say no as a way of understanding the importance of the ask.

- Related: always ask why about everything you're building into or modifying in dashboards. Business users often ask for things without an ounce of rationale.

- Related: taking away is harder than not doing at all!
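
On the validation point, a minimal sketch of what an external check might look like - the metric, tolerance, and data are all invented for illustration, but the idea is to recompute the figure straight from source rows and compare it to what the dashboard shows:

    # Minimal sketch of an external check on a dashboard metric: recompute it
    # from source rows and compare it to the number the dashboard displays.
    # The metric, tolerance and data are all invented for illustration.
    import pandas as pd

    def check_metric(source_rows: pd.DataFrame, dashboard_value: float,
                     tolerance: float = 0.005) -> bool:
        recomputed = source_rows["revenue"].sum()
        drift = abs(recomputed - dashboard_value) / max(abs(recomputed), 1e-9)
        if drift > tolerance:
            print(f"MISMATCH: recomputed={recomputed:.2f}, dashboard={dashboard_value:.2f}")
            return False
        return True

    orders = pd.DataFrame({"revenue": [120.0, 80.0, 45.5]})
    check_metric(orders, dashboard_value=245.5)  # passes
    check_metric(orders, dashboard_value=260.0)  # prints a mismatch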

Finally, I think most dashboards miss one fundamental point. Imagine you're the CEO/COO and you've got this beautiful 3 or 4-chart dashboard in front of you. What should you know about what you're seeing? What's the succinct summary?

I like building in spots to write 2-3 sentence executive summaries.


I work closely on BI projects but from a finance perspective. The concept I like to explain to the BI teams is that the dashboard is always just a snapshot of “what” is happening. But the underlying base level data is always needed to understand “why” it’s happening. And without the why, there’s no actual intelligence gain.

Take a metric like Average Order Value (AOV). It may be: total sales / order quantity. But as that metric is used it’s often being compared to something like last year, last month, or a plan, and anyone interested in that number is really interested in understanding the “why” it has changed from some other point in time/scenario.

For that, you actually need to bring in line item details behind orders as each order has multiple products/skus and they likely sold at different prices from a year ago or what was expected in a plan. An analysis of this has a name: price-volume-mix analysis, or PVM.
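
For anyone who hasn't built one, a rough sketch of a PVM bridge in pandas - the SKUs, quantities, and prices are invented, and the split between the volume and mix effects here is just one common convention:

    # Sketch of a price-volume-mix (PVM) bridge between two periods, per SKU.
    # Convention used here: price effect at current volumes, volume effect at the
    # prior period's average price, mix effect as the remainder. Data is invented.
    import pandas as pd

    prior = pd.DataFrame({"sku": ["A", "B"], "qty": [100, 50], "price": [10.0, 20.0]})
    current = pd.DataFrame({"sku": ["A", "B"], "qty": [90, 80], "price": [11.0, 19.0]})

    df = prior.merge(current, on="sku", suffixes=("_0", "_1"))
    avg_price_0 = (df["qty_0"] * df["price_0"]).sum() / df["qty_0"].sum()

    df["price_effect"] = df["qty_1"] * (df["price_1"] - df["price_0"])
    df["volume_effect"] = (df["qty_1"] - df["qty_0"]) * avg_price_0
    df["mix_effect"] = (df["qty_1"] - df["qty_0"]) * (df["price_0"] - avg_price_0)

    total_change = (df["qty_1"] * df["price_1"]).sum() - (df["qty_0"] * df["price_0"]).sum()
    bridge = df[["price_effect", "volume_effect", "mix_effect"]].sum()
    assert abs(bridge.sum() - total_change) < 1e-6  # the three effects tie out to the total
    print(bridge)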

I always seem to have to explain this to BI teams when I join a new company and am seeking data. I’m currently going through it with a BI team where, apparently, the BI tool won’t store this information. It’s like it only stores aggregate values, so it’s not even possible to get base-level data for analysis (without major architectural changes). I don’t know if that’s normal in BI or was an implementation decision at some point, but I’ve come across this same thing at a handful of companies, and as I said I really have to drive this concept for those teams. When I ask for it I’m usually met with a “why would you need that info / give us a use case”. Which means they don’t even understand how un-intelligent their BI tool is, or why the execs likely aren’t feeling like investing in BI has been worthwhile (e.g. ever build a dashboard that then goes unused? It probably wasn’t perceived as useful for some reason like this).

This could be put more concisely as: understand your end users’ needs. Understand that what people ask for is often different from what they need. If they ask for AOV metrics, they’re really saying “I need to understand AOV”, and that’s done via PVM analysis.


Similarly titled towards finance. I specialize in what I'll call decision analytics for insurance underwriters.

> Which means they don’t even understand how un-intelligent their BI tool is, or why the execs likely aren’t feeling like investing in BI has been worthwhile

And this relates to what I was thinking about in my first comment. I was once conversing with the COO of my company (my last job), a 1000+ person company, and asked him if he thought more concise requests for things would drive productivity. He, point blank, said: "sometimes I don't even know what I'm asking for"

I've remembered that moment for years. In so many situations, the actual BI/dashboard is the least important part of the puzzle. Instead it's all of the conversation and discovery to understand the real need(s)


> He, point blank, said: "sometimes I don't even know what I'm asking for"

Totally relate to that AND I'm often on the receiving end of those questions in a live setting (eg. board/exec meetings). Funny to stumble on this because just last week I told someone on our BI team, there is not any one "use case" I can lay out. The use case is this: assume I need to answer any random question that comes up. I need analytical enablement, not a fancy dashboard in most instances. It's not to say dashboards don't have their place, but they're just the easily digestible summary of underlying data that's meant to highlight areas and raise those questions about "why..."


Oh thank you guys! I came to this post thinking "dashboards are hard to build and when they exist they're hard to extend". My boss asks me from time to time to add yet another dashboard to Grafana, and all I can do is add another specific internal value because of a lack of traces - the "what" part. Up until now it was as if a car driver were asking for the average air volume intake when really he probably needs the "why" behind the air volume changing. The "probably" part is important here: if the driver is the test driver, he needs the actual value, the what; if the driver is the track performance guy, he needs the why; and sometimes they need both values because they feel more secure with more information. You guys just opened my eyes, thank you!


That’s a fantastic point. The unstated underlying request isn’t to see the same chart, the same way, every week/month/quarter. It’s actually: “Find any abnormalities in the data that could be threats or opportunities, and show me THAT — in a chart, table, email, or whatever medium makes sense.”


Yes and to say it another way it’s often “tell me the story of something I don’t already know”


"Business users often ask for things without an ounce of rationale."

Deserves extra upvotes just for this statement.

This has always been painful to me working in the data analysis and reporting space. When I get many requests for dashboards or reports that lack an answer to the question of "how will this be used?", I seem to find the requesting groups are cost-centers in the larger organization and are somewhat obsessed with processes and procedures.

This is rarely a good group to build a career with . . .


As a business user myself, often times we are simply responding to an edict from Leadership that amounts to: “You guys should have some dashboards!”

Other times it’s because a supervisor or internal customer likes to see certain things on a chart, and putting the chart in Power BI/Tableau/other tool will make them prettier than Excel charts.

Very few people, starting from the top on down, have a good understanding that dashboards mean very little in and of themselves.


Brilliant summary - mirrors my thoughts and experience quite closely.

Validation/testing has always been a challenge, especially given that dashboards are by definition quite “full stack” implementations where testing just the front end or back end is not sufficient and testing both in isolation can also often be challenging due to the huge possible variations in input data.

Mocking data is also hard because dashboards may also lean a lot on database-side calculations/filtering.

All of this has led me to take quite a full-fat approach to testing dashboards, by using a real DB populated with test data, and testing the complete application stack (driven by something like Playwright or Cypress) as well as more granular unit tests where a mocked data layer may be used.
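
For what it's worth, one of those full-stack checks can be quite small. A rough sketch with Playwright's Python API - the URL, selector, and expected value are hypothetical, and a seed script is assumed to have populated the test database with known rows:

    # Sketch of one end-to-end check: the dashboard tile must show the figure we
    # can predict from seeded test data. URL, selector and expected value are
    # hypothetical; a seed script is assumed to have populated the test database.
    from playwright.sync_api import sync_playwright

    EXPECTED_TOTAL = "1,234"  # derived from the rows the seed script inserted

    def test_revenue_tile_matches_seed_data():
        with sync_playwright() as p:
            browser = p.chromium.launch()
            page = browser.new_page()
            page.goto("http://localhost:3000/dashboards/revenue")  # test instance
            tile = page.locator("[data-testid='revenue-total']")
            tile.wait_for()  # wait for the tile to finish loading
            assert tile.inner_text().strip() == EXPECTED_TOTAL
            browser.close()

    if __name__ == "__main__":
        test_revenue_tile_matches_seed_data()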

I’m also looking at introducing visual regression tests next time I work on this kind of thing. The visual aspects of dashboards can easily drift over time even if the data is correct. You’re often theming charting libraries for example and the compliance of the theme can drift slightly if you update the library without really checking every detail of the visual appearance/layout every time. Or you may not even notice the “visual drift”…


Interesting - great points

> Validation/testing has always been a challenge, especially given that dashboards are by definition quite “full stack” implementations where testing just the front end or back end is not sufficient and testing both in isolation can also often be challenging due to the huge possible variations in input data.

Constantly evolving but I've always tried hard to keep calculations away from the display tools. So, I put lots of things in SQL SPs, or in Python, or more broadly in tooling that allows me to recreate the summary data without the front-end. My nightmare is having to check a PowerBI calc that itself is based on an underlying SQL calc. Which one is wrong? Now spend twice as long figuring it out!

> The visual aspects of dashboards can easily drift over time even if the data is correct. You’re often theming charting libraries for example and the compliance of the theme can drift slightly if you update the library without really checking every detail of the visual appearance/layout every time. Or you may not even notice the “visual drift”..

Love it, very smart. Why I prefer tables for many things too - one less thing to maintain and check.


PowerBI is a WHOLE other can of fish - I haven't spent long enough with it to figure out how you build a test suite around that - but it sounds tricky!!


I would like to use the Men In Black eraser pen on my PBI experience. Can’t stand it!


You have my condolences :)


> Finally, I think most dashboards miss one fundamental point. Imagine you're the CEO/COO and you've got this beautiful 3 or 4-chart dashboard in front of you. What should you know about what you're seeing? What's the succinct summary?

Having been on both sides of this, I think the challenge is that the CEO/COO's job is to figure out "what should we do about this?", which is the right approach to coming up with that summary (it's not just "here's a text version of the chart"). And the corollary challenge is that, in most cases, non-technical people with domain knowledge are the ones who need to produce the analysis: so any feature-incomplete dashboard is going to stymie them, and any general framework that requires a technical person to step in for code or configuration is going to slow the process to a crawl.

It's the rule (not the exception) that (especially if things are going poorly) the next step is asking more questions, which involves investigating something else in more detail. A dashboard, however pretty, is as useless as a doorknob if it doesn't have the needed information.

I have found that dashboards per se are always great as the high-level KPI trackers, like the things you would consider hanging on a wall in an office (e.g. "revenue growth this month" or "new customers acquired"). You'll always want to know that information, and many people in unrelated departments need to have that information shared with them.

The other helpful area is deep-dive, domain-specific analytics programs, like for example Google Analytics, which has a very full feature set for non-technical marketing people to go in and drill down to answer questions. The UI/UX designers of that product have spent years honing and A/B testing which types of graphs to show where, and mapping out how to have people click around to find what they're looking for, to the point it is pretty easy for non-technical people. They even have courses and certifications on how to use the system.

Organizations that try to internally build a feature-complete system like google analytics for a specific domain need to consider it like building an entire software product (even if there's a general low/no-code BI SaaS to assist) because you'll need collaboration between general technical experts and non-technical stakeholders with changing and vague requirements. It can be done, but likely only with years of investment and UI/UX research, just like any other software product that solves a domain problem well. In practice: millions of dollars.

Technologists often forget that Excel *is* a Turing-complete programming language (and it's a functional programming paradigm too!). If an org is not committed to spend years and millions of dollars on deep dive analytics for a specific domain, the right choice is almost always using a commercial analytics system for that domain that costs less than the internal build, or embracing the trusty spreadsheet.


>I think the challenge is that the CEO/COO's job is to figure out "what should we do about this?

Totally agree. I'd even go a little further and say the business is in trouble if the CEO doesn't know "what we should do about this". It's the CEO's job to know those things, and it's the data team's job to provide the tools to make those decisions easier, faster and better.


> Totally agree. I'd even go a little further and say the business is in trouble if the CEO doesn't know "what we should do about this". It's the CEO's job to know those things, and it's the data team's job to provide the tools to make those decisions easier, faster and better.

I agree with one modification. It's also the CEO's job to empower folks to pitch what they think the CEO should know. I've worked in plenty of successful shops where the CEO's answer to "what should we do" is "I'm not sure yet - what do you think?" - and that is a golden opportunity to show your chops if you get the chance.


People frequently overestimate the role of data in decision-making. Metrics, numbers and other quantitative information don't tell the full story. For a CEO to make decisions, the full picture must include qualitative information - risks, opportunities, market events, competitors' actions, etc. Metrics are just part of the full picture. Far less significant than many BI developers and "data teams" tend to think.


It's my experience that you get easier, faster or better. Choose any 1, rarely 2, never 3.


Lots of great suggestions here, but one I haven't seen is providing deep links. Let users share the exact state of their dashboard with others, ideally without requiring some convoluted system of logging in and sharing things. We implemented it by allowing a json config in the url, then providing a button to copy a shortened URL containing the whole config.

Original creator of (the now woefully dated-looking) GBD Compare [https://vizhub.healthdata.org/gbd-compare/] here, where we found this super useful since we had so many controls that it could take a lot of clicking (and knowledge of the UI) to recreate a graph someone else was looking at. It really helped with reach, as folks could email/tweet their specific view then others could use that as a starting point to dive in without starting from scratch or having to create an account.
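
The mechanics don't have to be fancy, either. A rough sketch of round-tripping dashboard state through the URL - the state fields are invented, and a URL shortener plus some validation of decoded values would sit on top of this:

    # Sketch: serialize dashboard state into a URL-safe token so any view can be
    # shared as a plain link. The state fields are invented; a URL shortener and
    # validation of decoded values would sit on top of this.
    import base64
    import json

    def encode_state(state: dict) -> str:
        raw = json.dumps(state, sort_keys=True).encode("utf-8")
        return base64.urlsafe_b64encode(raw).decode("ascii")

    def decode_state(token: str) -> dict:
        return json.loads(base64.urlsafe_b64decode(token.encode("ascii")))

    state = {"metric": "revenue", "group_by": "channel", "date_range": "last_90d",
             "filters": {"country": "US"}}
    token = encode_state(state)
    share_url = f"https://example.com/dashboard?view={token}"
    assert decode_state(token) == state
    print(share_url)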


+10, yes, don't make people recreate all the filtering for something you're putting in their hands.


To add to this, there are two kinds of sharing.

Sharing the parameters, filters, etc.

Sharing the results.

They can both be very important.


Yeah, deep links are essential. Datadog's really good at this. They also have a nice preview generator so that when you drop that link into Slack it generates a preview image of the dashboard with all of the filters applied. It's great.


* Design matters a lot - if it looks bad, people won't look at it.

* Layout for dashboards is almost completely formulaic. A panel for selected high-level stats (user growth % increase from last year, user % increase from last month, # new users added), a panel for breakdowns (user growth by marketing channel, user growth by registration cohort), a panel at the top for filters ("let's filter the entire dashboard by just this marketing channel, or just this registration cohort") identical to all breakdowns provided, and finally a row-level drill-down ("show me the users in this particular aggregation"). It took me a very long time to learn that this design is entirely cookie-cutter for good reason. Users always want the same things: stats, breakdowns, filters and drill-downs.

* Padding matters, font matters, color palette matters, absence of typos matters, visual hierarchy matters (i.e. big bold numbers versus smaller grey numbers).

* Always define the key metrics first (based on fact tables). All dimensions and drill-downs in the dashboard will derive from these front-and-center stats.

* Reconcile to existing metrics before broadcasting widely - almost always, people have the same stats in extant technologies (i.e. Excel, Mixpanel, Salesforce) and will instantly find inconsistencies between your figures and the extant ones.

* The vast majority of users will be passive viewers. Very few users will be "power" EDA (exploratory data analysis) users. EDA views will look different from the view that passive viewers want - keep them separate

* Obviously, the more things done in code, which promotes modularity and composability, the fewer data integrity issues you will have


> * Layout for dashboards is almost completely formulaic. A panel for selected high-level stats (user growth % increase from last year, user % increase from last month, # new users added), a panel for breakdowns (user growth by marketing channel, user growth by registration cohort), a panel at the top for filters ("let's filter the entire dashboard by just this marketing channel, or just this registration cohort") identical to all breakdowns provided, and finally a row-level drill-down ("show me the users in this particular aggregation"). It took me a very long time to learn that this design is entirely cookie-cutter for good reason. Users always want the same things: stats, breakdowns, filters and drill-downs.

Is there any chance you could link an image of what a good version of this looks like?


Sure, I could almost choose any one on Google as they all follow the same template - here is one: https://coderthemes.com/hyper/saas/index.html

1. Six top-level stats jump out at you: customers, orders, revenue, growth %, current week revenue, previous week revenue. All of these stats are adorned with a few substats (smaller text), almost always a % up/down from last period

2. A few large panels with breakdowns: revenue over time, revenue vs projections, revenue by referral source, revenue by location

3. The top right has your filter buttons, and generally it includes every breakdown dimension on the page. For example, "let's look at this dashboard by just the Google referral source" or "let's look at our stats from the U.S. geography only" or "let's filter this for last 2 years only"

4. Drill-down is "top selling products." This isn't truly a drill-down, as it is still an aggregation, so you really want to drill-down to the record-level. If you filter the dashboard for "U.S. sales by the Google referral source for the last 2 years only", people invariably want to see what the actual row-by-row sales were, and that is the drill-down. They can easily export this and reconcile to source systems. As an example, for some of the work I do, sales reps don't just want aggregations about their sales leads, they want the actual names of actual sales leads (as rows) so they can contact them.

So again, four major parts to a dashboard, which really derive from two simpler concepts (likely familiar to most data analysts): metrics and dimensions.


Thanks :)


In my experience, if your plan is to make a “dashboard”, you’re already on the wrong path. It’s too generic and says nothing about what problems you are there to solve. Think about it yourself: in how many of the products that are important in your life is there any meaningful value produced by a dashboard?

Dashboards seem alluring because we imagine that users will sit there and somehow have insights delivered to them automatically. It’s often less clear what those insights will be or what is needed to produce them, we somehow hope they will materialize by just displaying some data. Often the focus is on making pretty-looking charts (which only ever look good when you demo with picturesque fake data), because you want the product to feel colorful, welcoming and visual.

A better approach is to either make a focused tool for solving a specific problem you know users have - you won’t think of what you end up with as a “dashboard” but it might occasionally end up looking a little like one - or to make general tools that allow users to dig through data interactively to find the things they care about.


I think that's a really useful thought exercise. FYI I posted this thread - could we interview you about this for our article? Just 10 mins. I'm at harry[at]embeddable.com.


I've seen so many of these projects over the years, and they are almost always used for success theater, promotions or just plain ego.

- What do you hope to learn from this tool?

- Is there a less expensive way to get this information?

- The data will move in one of three directions: up, down, or stay the same. Ahead of time, what will you do in each case? Asking me to change the direction of the line is not an acceptable answer. Do we still need to make the chart? Or were all three answers the same?

- This is not a one-and-done project. The moment some visibility emerges in the fog, you will be desperate for more answers. We must set up a process for the never ending litany of questions that will emerge from this work.

- Smaller is better; incremental delivery, fast iteration, and the ability to change are all far more important in dashboard work than being stable, long-term, and deeply reliable.

- This is the conversation I even have with myself as I work on data for my own company.


> The data will move in one of three directions: up, down, or stay the same. Ahead of time, what will you do in each case? Asking me to change the direction of the line is not an acceptable answer. Do we still need to make the chart? Or were all three answers the same?

It's a feedback system. Feedback is only useful if it can trigger behavior change. How can this measurement change the company's behavior?

Anything else is a vanity metric.


Biggest lesson: all metrics _must_ be defined in code, not manager-speak.

For instance, if a marketing head wants to plot CAC (cost of acquiring customers) over time, saying CAC is marketing spend divided by the number of customers acquired is manager-speak. Spends are budgeted higher early in the month and adjusted with actuals. Customers ask for refunds and cancel accounts. Some campaigns have volume incentives which are known later... and so on. The solution is to write well-commented SQL which laymen can audit and improve.
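
The SQL suggestion is the right one; just to illustrate the same idea in another form, here's a sketch in pandas where each caveat becomes an explicit, auditable step - the column names and specific adjustments are invented:

    # Sketch of CAC "defined in code": every caveat is an explicit, commented,
    # auditable step. Column names and the specific adjustments are invented.
    import pandas as pd

    def monthly_cac(spend: pd.DataFrame, customers: pd.DataFrame) -> pd.Series:
        spend = spend.copy()
        # Use actual spend where known, falling back to budgeted spend.
        spend["amount"] = spend["actual"].fillna(spend["budgeted"])
        spend_by_month = spend.groupby("month")["amount"].sum()

        # Count only customers who did not refund or cancel.
        kept = customers[~customers["refunded"]]
        customers_by_month = kept.groupby("month")["customer_id"].nunique()

        # CAC = marketing spend / customers acquired (net of refunds), per month.
        return (spend_by_month / customers_by_month).rename("cac")

    spend = pd.DataFrame({"month": ["2023-09", "2023-09"],
                          "budgeted": [5000.0, 3000.0], "actual": [4200.0, None]})
    customers = pd.DataFrame({"month": ["2023-09"] * 3, "customer_id": [1, 2, 3],
                              "refunded": [False, False, True]})
    print(monthly_cac(spend, customers))  # 2023-09: (4200 + 3000) / 2 = 3600.0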


And make sure your project plan includes milestones for explicitly aligning all stakeholders on definitions otherwise you'll have a big hot potato game the first time the outputs show something unfavorable.


Or hire a data analyst to spend 3 months standardizing on just 3 definitions of retention!


The biggest help we got was meeting directly with our customers and asking them “What would it take for you to log in every day to view this dashboard?” They clearly laid out the metrics and trends they care about but have a hard time getting access to the data for. Also, don’t get fancy with the visuals. Lots of big metric KPI visuals, tabular visuals, line charts & bar charts. Users should be able to glance at the visuals and immediately know what’s going on and get a sense of what each visual is conveying.

Another thing customers love is the dynamic ability we give them to switch how certain visuals are grouped or what value is being displayed. We can’t foresee all the different ways users will want to slice and dice the data, so giving them that ability was huge.


This is super interesting. I posted this thread - could we interview you about this for our article? Just 10 mins. I'm at harry[at]embeddable.com.


I'm a software developer.

There is a chicken and the egg problem when it comes to designing these things.

I can ask "What do you want the dashboard to look like" and they'll answer "I don't know before I see the data".

Then I'll ask what data they want to see, and they'll respond "What will it look like?", or we'll spend significant time on data collection only to find they never actually want it in a dashboard after all.

By far and away the most time consuming aspect of this entire domain is to find out what users actually want to see, as they almost never have something specific enough when they approach me.


I guess what you need is researchers, not dashboards.


Biggest finding for us has been that no matter how many charts / filters / options / etc. we give to our users, they always want something more.

Answers don't just lead to Eureka moments, they lead to follow-up questions and more follow-up questions.

Not a complaint - it's actually great. Just an observation (and a challenge)


About a year ago my (new-ish founder) boss came to me and asked me to build him a custom dashboard. "I have all the data in a spreadsheet but I want it in a dashboard" he said. I was a specialized systems dev, only occasionally doing a bit of webdev if necessary, and really didn't have time for those kinds of errands.

I showed him this tutorial I had recently seen - just a few minutes of it and the thumbnail - about how to build a "dashboard" in Excel. https://youtu.be/z26zbiGJnd4?si=HWn8qTbozD8vmXiF

"Oh wow, I didn't know excel could look so beautiful!". He asked for the link, never did anything with it of course but was totally satisfied. I am pretty sure he just wanted a shiny toy and also felt inadequate about "just using excel" to do his important founder work. Showing him that excel can look beautiful and is a powerful tool was enough. No more feeling inadequate, no need for an actual (or even excel) dashboard.


I spent 5 years leading a data team which produced reports for hundreds of users.

In our team’s experience, the most important factor in getting engagement from users is including the right context directly within the report - definitions, caveats, annotations, narrative. This pre-empts a lot of questions about the report, but more importantly builds trust in what the data is showing (vs having a user self-serve, nervous that they’re making a decision with bad data - ultimately they’ll reach out to an analyst to get them to do the analysis for them).

The second most important factor was loading speed - we noticed that after around 8 seconds of waiting, business users would disengage with a report, or lose trust in the system presenting the information (“I think it’s broken”). Most often this resulted in people not logging in to look at reports - they were busy with tons of other things, so once they expected reports to take a while to load, they stopped coming back.

The third big finding was giving people data where they already are, in a format they understand. A complicated filter interface would drive our users nuts and turned into many hours of training and technical support. For this reason, we always wanted a simple UI with great mobile support for reports - our users were on the go and could already do most other things on their phones.

We couldn’t achieve these things in BI tools, so for important decisions, we had to move the work to tools that could offer text support, instant report loading, and a familiar and accessible format: PowerPoint, PDF, and email. Of course this is a difficult workflow to automate and maintain, but for us it was crucial to get engagement on the work we were producing, and it worked.

This experience inspired my colleague and me to start an open source BI tool which could achieve these things with a more maintainable, version controlled workflow. The tool is called Evidence (https://evidence.dev) if anyone is interested.


Ironically one of the major uses of analytics has been to highlight the impact of slow response time on user retention for a wide class of applications.

I also feel that speed builds trust, although I don't know specifically why. Perhaps people envision more errors or error-prone processes when a system is slow. It certainly shows more understanding of the data to present it quickly.


Use colours and graphical elements (generated graphs), but:

Obey rules of spacing more carefully than other rules, to avoid overwhelming the user.

Do not use colours unless signalling information, so users can be alert and relaxed when needed.

As soon as you have more than 2 types of information, have expanding panels, which remember whether the user expanded/collapsed them.

Lastly, remember that speed of loading data is much more important for dashboards in general than for a random page. Cache data, or initially load only summary data, or only load the latest day by default and then fetch the week's data. Remember clients may make purchasing decisions based on how fast the stats page of your SaaS loads when they are showcasing it to their C-suite, and a 15 second wait can cost you your enterprise sale.
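
As one illustration of the caching point, even a cheap time-based cache in front of the expensive summary query goes a long way - a rough sketch, with the query itself stubbed out:

    # Sketch: a cheap time-based cache in front of the expensive summary query so
    # repeat dashboard loads are instant. The query itself is stubbed out.
    import time
    from functools import wraps

    def ttl_cache(seconds: int):
        def decorator(fn):
            cache = {}
            @wraps(fn)
            def wrapper(*args):
                now = time.time()
                if args in cache and now - cache[args][0] < seconds:
                    return cache[args][1]
                value = fn(*args)
                cache[args] = (now, value)
                return value
            return wrapper
        return decorator

    @ttl_cache(seconds=300)
    def summary_stats(day: str) -> dict:
        # Placeholder for the expensive warehouse query: return only the latest
        # day's summary here and fetch the full week lazily when asked for.
        return {"day": day, "orders": 42, "revenue": 1234.5}

    print(summary_stats("2023-09-28"))  # first call runs the "query"
    print(summary_stats("2023-09-28"))  # second call is served from the cache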


Ah, I saw a great tweet that captured a lot of my feelings about this the other day: https://twitter.com/InsightsMachine/status/17018601232984842...

>“Data is the new oil.” Clive Humby, 2006

>“Most of my career to date seems to involve redesigning legacy reports to make it easier for existing users (if any) to see that they contain absolutely no actionable insight with a lot less effort.” Jeff Weir, 2023

For my perspective:

In general, I find most users can't actually say whether they need any given number/visual on an ongoing basis. So large amounts of work go into building dashboards that are used for a very short amount of time and then discarded. Probably we should do a better job on one-off analyses and only dashboard after the fact.

Many users don't actually want a dashboard; what they actually want is a live data dump into Excel where they can pivot-table it. Maybe, maybe a bar or line chart.

In general, I find people always ask for more filters, more slicers, just endless options to reconfigure the data as they please. But they quickly become trapped in a swamp of their own making, now nobody knows how this should be sorted or sliced, does it even make sense to do it this way? People think what they want is a 'data democracy' with hundreds of dashboards with hundreds of options with hundreds of users and so they ask for and usually receive it. But they usually just end up coming back to the data team and asking - 'so what's the answer?' What many orgs need is actually a data dictator.

On the other hand, dashboards do allow you to establish really good feedback loops within the business so when you can identify an ongoing constraint, figure out how to track it and then force people to receive it on a regular cadence and be accountable to it, you can make a lot of headway. But that's a more niche use-case than how they're frequently used and the skills involved are different - less visualization skills, more business analysis - and you need to be positioned to make sure someone is held accountable.


For external dashboards, not internal:

- You can output the most elegant metrics, but you will never know if they were the right ones until you talk to actual customers. Most of the time, they don't even understand what is presented.

- Use libraries and UI kits made for this; it will save a huge amount of time.

- Whatever you do, it will never be enough, will be wrongly interpreted, and will be used in the wrong context.

- Try to tie graphs and metrics to use cases or questions. E.g. titling a chart "How many users were active* in the last 30 days?" instead of "Active users" (* define active in the description) can make a huge difference in terms of comprehension.


Any recommended libraries for making external dashboards?


This one is really good for user facing analytics dashboard https://www.tremor.so/


80% of the time people should display a table, 15% a time-series or line chart. The other 5% is probably wrong. Anyone that asks for pie charts, 3d charts,... isn't a real data user ;)


I once added a speedometer for production rate compared to avg over previous X weeks, as a widget demo. It ended up on every exec’s dashboard and on a big screen.

I’m not sure what they were trying to manage, but it was purdy and looked dashboardy.


Pie charts and 3D charts are used by real data users to lie to customers. (Customers love that shit.)


What would you recommend for demographics analysis if not a pie chart?


There are plenty of options. Treemap, waffle chart, a bar plot.

There's nothing special about demographic data.


Unless you're displaying amount of a pie or pizza that's been consumed so far, don't use a pie chart


Worked at a place providing financial research data and models to investors. We spent a lot of effort creating infinitely flexible and customizable reporting and dashboards. Turns out no one used that. Everyone just wanted a general high level report emailed to them.


This sounds accurate. But of course the dashboards would be one of the biggest selling points in a sales pitch.


Tons of great advice in the comments. At risk of repeating others, here's what I've learned working on business intelligence tools for an engineering group:

- What users ask for and what users really want are often extremely different.

- Engineering executives like to place their "thumbprint" on every business analytics dashboard. They want evidence that the "intelligence" being reported has been customized by them. It's their way of imparting branding on the organization.

- UI/UX is far more important to users than how you handle the technical details. When discussing implementation with them, start with the UI so that they have a mental model to build from.

- Leave space to create cool things that you/your team want to make. The developers of BI dashboards often have excellent ideas for visualizing data that an end-user would not immediately think of. Leave room to "delight".

- Never assume the data is clean or accurate (even when there are regulatory reasons for it to be either of those things)

- Not everyone's opinion is equally valuable.

- Beware of corporate politics. I once had an analytics project completely shut down because it would expose certain weaknesses in the business that were not acceptable to discuss publicly.

Bonus: Read "Envisioning Information" by Edward Tufte.


They always seem easy at first. They're never easy. Anyone can toss up a visualization, in fact you don't even need to know how to code, just load up a CSV in Google Sheets and drag it into Google Data Studio.

The hard part is knowing what information to surface, and how to drive the user towards those insights in an intuitive way. You need a strong team that intersects product, data science and UX. Engineering is the least important aspect of it.


The cost of hosting low-latency realtime dashboards for everyone is phenomenal. Tons of memory is required if you want them to open quickly for everyone. I wish they could be served more dynamically - like if you saw a user logging in, you could populate the query before they got to the page or something. As it was, it seemed like we had to serve a zillion dashboards no one was actively reading.


Lesson learned: start with fewer metrics and observe how they are used and interpreted. It is much easier to expand correctly from there. Collecting requirements in a single pass and building a monolith is rarely as productive as it seems - because the barrier to adding things and shifting responsibility to the dashboard is so low in the beginning, that it can easily become a dumping ground.


imo there are three core pillars you have to get right here:

1. Relevant: Don't just build a dashboard for the sake of building a dashboard. First, understand what the goal of the user is, and what metrics they'll want to look at to understand their progress towards that goal

2. Reliable: You only have one shot to get this one right. As soon as you present incorrect data to your users, you've lost their trust forever, so make sure you have solid tooling in place across your data stack that ensures data quality, from collection, through transformations to query time

3. Accessible: The data the user will be looking at needs to be either self explanatory, or the user has to have access to documentation that describes the data they're looking at in detail.

For point 1/, here's a framework to help you identify which metrics to focus on: https://www.avo.app/blog/tracking-the-right-product-metrics


As a developer who works on monitoring tools for a database management system, user-facing monitoring dashboards have been my bane for a while. I don't know much about the situation in other companies and products, but here are the main pain points I've encountered:

1. Nobody knows what to monitor exactly, every new dashboard is based on a guess.

2. Not much user feedback to base the decisions on if you don't have many users to begin with.

3. Often, the metrics exposed by the app being monitored prove grossly inadequate, or suitable metrics do not exist.

4. You can't just add new metrics. Users have to update the whole distributed app for the new metric to become available. This has to be accounted for at the UI design stage.

5. Somebody has to spend a significant amount of time gathering all the information from random people in the company, because see 1.


I'm currently in the middle of building an overly complicated analytics platform where there is an "easy mode" and an "advanced mode". Users pick the devices and metrics they want on the graph, and if they toggle advanced mode, it shows the SQL used to create the graph. They can then edit the SQL or do whatever they want.

Giving customers "secure" sql access was a must have feature from upper management, and it was very tricky/a nightmare to get right.

Customers actually liked it though; SQL is king.

Advice I would give: make sure your analytics API's data models and queries are well thought out and extensible. Otherwise it becomes very hard to change them and rework the UX.


This is completely an aside, but whenever I see "dashboard" I think of those colorful plastic toy dashboards that are given to children sitting in the back seat of the car, so they can pretend that they're actually driving.


Perhaps cynical, but in big corp tech, this is exactly how building dashboards can feel sometimes.


A common saying in statistical consulting is that the entire job is just asking "what question are you trying to answer?" over and over again.

Building dashboards that will actually be useful requires the same approach.


From my experience:

1. You want to build the UI to be config driven (a rough sketch of config-as-data is after this list). At some point, adding a new chart in code will not scale. Writing good config is hard and requires a lot of careful thinking.

2. Product owners want special snowflakes; try to push back on any customization, as it increases complexity and makes the config harder. It is better to implement usable search, navigation, and sitemaps, or to focus on developer experience (CI/CD, feature flags etc.)

3. GraphQL is overrated; for complex charts with filters and multiple options it makes caching hard to use in practice. I would like to try tRPC or a similar RPC-based approach next time.

4. The performance impact of large bundles is minimal in practice. You can be shipping 20MB of JS to users, but inefficient re-renders/re-fetches will have way more impact than the amount of code. For charting, I would try ECharts or any commercial WebGL-based charting library. For tables, I would try something that mimics Excel as closely as possible.

5. Centralize the state of the application via redux/signals/jotai. You want to have a clear separation between the config and the state of components. You want to build this as early as possible. I guarantee that product will want URL sharing, and adding this later is very difficult.

6. Designers love whitespace; you should fight for information density as much as possible. A design system sounds like a great idea, but it costs millions in practice.
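
As a sketch of point 1, "config driven" can be as simple as charts being data that gets validated, rather than code - the field names and the implied renderer are invented:

    # Sketch of a chart as config rather than code: adding a chart means adding
    # data that gets validated. Field names and the renderer are invented.
    from dataclasses import dataclass, field

    ALLOWED_TYPES = {"line", "bar", "table"}

    @dataclass
    class ChartConfig:
        title: str
        metric: str                # e.g. "revenue", "new_users"
        dimension: str             # e.g. "channel", "week"
        chart_type: str = "line"   # one of ALLOWED_TYPES
        filters: dict = field(default_factory=dict)

    def validate(cfg: ChartConfig) -> ChartConfig:
        if cfg.chart_type not in ALLOWED_TYPES:
            raise ValueError(f"unsupported chart_type: {cfg.chart_type}")
        return cfg

    # Adding a chart to the dashboard is now just adding config:
    dashboard = [
        validate(ChartConfig("Revenue by channel", "revenue", "channel", "bar")),
        validate(ChartConfig("Signups over time", "new_users", "week",
                             filters={"country": "US"})),
    ]
    print([c.title for c in dashboard])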


Users often catch what they see as conflicts, and you need to answer for this.

Often it's something like a different interpretation of data in multiple places (revenue in one place, profit in another) or differing date logic (one query includes a date in the range, others are "up to" that date, etc). Caching is another issue, especially if you selectively cache only slow queries.

To minimize this, always have an explanation on the chart/card (even if it's hidden but clickable to show)


Some thoughts:

- a clean data pipeline is critical. Is your data pipeline manageable? Is it observable? Is it monitorable? Can you make changes quickly at different stages? How do overrides work? Does your data pipeline have entitlements? (Can private data points be provisioned to specific users?)

- Should you implement your own dashboard? Or are you reinventing the wheel? Can you reuse/recycle existing BI tools? What are the licenses involved? Power BI is proprietary to Microsoft and will have per-user economics. Grafana is AGPL; be very careful with anything AGPL in your tech stack because it may force you to open source your code. Apache Superset is pretty cool. I've seen big startup valuations with off-the-shelf BI tools. If it's an MVP, definitely consider using one of these as opposed to rolling your own.

- Making assumptions for your users is bad because users will always ask for more. So building a flexible framework where users can add/remove visuals and build their own analytics may be necessary. The flipside is this adds complexity and can confuse the user. It's a delicate balance to cater to all types of users: the basic user vs the power user.

- How do users send you feedback? Bad data point? How do you find out? Can the user address it themselves?


No matter what you do, someone will use your dashboard to post-hoc justify a pre-made decision. When it all goes wrong you'll be blamed for making a bad dashboard.


One of the hardest challenges is ensuring alignment with the end user from ideation to delivery. It can be tough to figure out what the end user needs in the first place, let alone the details of how to define individual metrics or slice the data. This is a huge pain point for both externally and internally facing deliverables, but it's especially tough for external clients because you're likely a lot more limited in your ability to communicate ad-hoc to clarify things down the line. And once you've delivered something that's either irrelevant or inaccurate, then it can end up being game over for the engagement (if you're working externally) or your counterpart's trust in your output (if you're working internally).

So it's super important to get on the same page RE: goals and expectations and keep that alignment going to the end - so that there aren't any unpleasant surprises at the delivery stage. Some more on who to get involved and how here: https://www.avo.app/blog/who-should-be-involved-in-tracking-...


The discussion nicely points out all the things I also ran into over the last 5 years. This motivated me to summarize my experience:

https://jorin.me/gettting-started-building-a-data-driven-bus...

Thought it might be relevant to others here.


As someone who has made a ton of grafana dashboards over the years, be prepared for users to hold it wrong. Data visualisations should fail/degrade in clear and expected ways. Users are often surprised when dashboards/charts hit some limit (eg they write a non performant query). The big query design (async first, fair queueing) is best if you’re letting users write their own queries on their own datasets.


When I used to work with D3 I found object constancy to be quite an important principle. Transitions between state are often neglected (a full state refresh is easier).


Make everything exportable to csv / excel.

The ones who actually use it, you won't cover all their edge cases.


Similar to supervised and unsupervised learning, one can see dual paths on this journey. One path answers the questions which have been on the user's mind. The other explores unasked ones to find new insights.


No matter what the client says, ensure your prototypes load fast. I had a project turn sour because the C level test end users couldn't be bothered to wait 20 seconds, despite us telling them it was normal.


You will often have to polish the users' half-baked metrics. Even large orgs with teams of business analysts will leave gaps that aren't uncovered until partway through the build.


This sounds like a super interesting insight, but I'm not sure I've fully understood - any chance you could provide an example?


Balm for my heart.

I'm looking after a decision support system at the moment, and am encountering all the challenges raised here. Glad to see my experience is not unique.


#1 mistake people first dabbling with dashboards make is to show absolutely everything.

Don't do that. Show only the things users need to act on what's on the screen. Minimize the information, make it "glanceable".

If you have a troubleshooting dashboard, and you're showing 999 items with nothing going wrong, that one item that's actually wrong is not going to pop.


#1 lesson is "don't be distracted by cool figures." The actual important stats are just numbers, or a list of strings, or an iso timestamp, etc.

I regularly tell customers "Open up this JSON, hit control-F and search for the stats that you need." And they're like "Thank you, you just saved me 50,000 hours of work."


Every time I see these self serve systems implemented I do wonder if they are too complicated for the normal user to figure out. I do think that users can get useful insights from their company data, but the tools are just too hard to figure out. Qlik comes to mind!


james-revisionai captured most of the main ideas.

A few things not emphasized as much:

1. Make it accessible. At some point, virtually all of us will have some form of accessibility issues. 508 compliance is a solid standard for this, though can be a pain to manage without starting with it from the get-go.

2. Make it tabbable (similar to accessible).

3. On the development side, make it able to render client-side OR server-side -- not every dashboard will have or need a rendering server. In Python, Altair is the only client-side, interactive rendering library that I'm aware of. It's important for payload considerations.

4. Related to 3 - keep payload size in mind. Make it transparent, either in debug logs or similar, how large the elements passing across the wire are.


I had a read through some of the comments here and it seems to me that the predominant stance is that reports are only useful for understanding the what and the why is addressed through more focused, thorough analysis. Also, most comments mention self-serve analytics are silly because stakeholders often don't know what they're looking for or they rather have a summary.

I understand this, but I disagree with some of this or have trouble understanding how this can be applied in practice:

- Reports can absolutely be built in a way that is flexible enough to enable knowledge discovery. Instead of creating a chart that plots Conversion Rate over time, create a chart that plots a Primary Metric against a Primary Dimension and use parameters to allow users to choose what the Primary Metric and Primary Dimension are (a small sketch of this is at the end of this comment). This drastically reduces the maintenance costs of reports because you don't need to create more charts, rather you just need to make new data available.

- This design strategy can be expanded to Secondary Metrics, Secondary Segments and Splits to enable comparison between segments. This is a big step towards finding out the why

- If you're a big business with both a team of BI developers and a team of Data Analysts I can imagine you'll have plenty of resources to conduct more thorough analysis whenever they are needed. But if you're a startup, you probably have a few Analytics Engineers doing both BI development and analysis. How do you enable them to do both if stakeholders most often don't know what they need? You have to be efficient and I don't think that means having these few Analytics Engineers holding stakeholders hands through a series of discussions to figure out what the hell do they even need...

- Why would you not want everyone in the business to be able to discover new things in the data? Why only allow data analysts to do that? If you provide a platform that enables data exploration in a guided way to avoid wrong use/interpretation of data, isn't it best to open it up to everyone? More people looking into data = more hypotheses = higher probability that at least one of them will be proven and very impactful.

- I think there are different types of data work: setting up data architecture to collect and transform data into a format that enables easy analysis; building solutions for monitoring KPIs (the what); building solutions for understanding the drivers of KPI fluctuations (the why); advanced analytics to support decision making (the actions). My opinion is that the real value is in the last point. Whatever we can do to serve the other needs with minimal effort, we should do
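
To make the parameterized-report idea above concrete, here's a rough sketch in pandas - the metric, dimension, and column names are invented, and a real implementation would typically build the equivalent SQL instead:

    # Sketch of a parameterized report: the user picks a Primary Metric and a
    # Primary Dimension from whitelists; the chart definition itself stays fixed.
    # Metric, dimension and column names are invented.
    import pandas as pd

    METRICS = {"orders": ("order_id", "nunique"), "revenue": ("amount", "sum")}
    DIMENSIONS = {"channel", "country", "month"}

    def primary_view(df: pd.DataFrame, metric: str, dimension: str) -> pd.Series:
        if metric not in METRICS or dimension not in DIMENSIONS:
            raise ValueError("unknown metric or dimension")
        column, how = METRICS[metric]
        return df.groupby(dimension)[column].agg(how).rename(metric)

    data = pd.DataFrame({
        "order_id": [1, 2, 3, 4], "amount": [10.0, 25.0, 5.0, 40.0],
        "channel": ["ads", "ads", "organic", "organic"],
        "country": ["US", "DE", "US", "US"], "month": ["2023-09"] * 4,
    })
    print(primary_view(data, "revenue", "channel"))  # ads 35.0, organic 45.0
    print(primary_view(data, "orders", "country"))   # DE 1, US 3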


Are you talking about internal or external users?


Good point, title's ambiguous. I meant external.


I started a company about this. We did OK. You can learn more in my profile. Happy to help if you think it's useful.


From what I've seen, people just want the data dumped into an Excel document so they can do their own analytics.


Make as many metrics as you can configurable. What I mean is that a chart's data source should be configurable, as should its form and its colors. Also allow users to filter the data coming into the charts; users love messing with the data before exporting it to their pointless and boring PowerPoint presentations.


My advice is mainly about taste.

I always assumed I have good taste, and that my designs are good looking and should appeal to others.

Different people have completely different ideas of what is usable and what good taste is, so just be open-minded enough to listen and accommodate the tastes of others.



