Don’t show me the evidence. Show me how you weighed the evidence.

Sometimes we fool ourselves into thinking that if people just had access to all the relevant data, then the right decision – and better outcomes – would surely follow.

Of course we know that’s not the case. A number of things block a clear path from evidence to decision to outcome. Evidence can’t speak for itself (and even if it could, human beings aren’t very good listeners).

It’s complicated. Big decisions require synthesizing lots of evidence arriving in myriad forms, much of it opaque, from diverse sources, with varying agendas. Not only do decision makers need to resolve conflicting evidence, they must also balance competing values and priorities. (Which is why “evidence-based management” is a useful concept, but as a tangible process is simply wishful thinking.)

If you’re providing evidence to influence a decision, what can you do? Transparency can move the ball forward substantially. Ideally, it’s a two-way street: Transparency in the presentation of evidence, rewarded with transparency into the decision process. However, decision makers often avoid exposing their rationale for difficult decisions. It’s not always a good idea to publicly articulate preferences about values, risk assessments, and priorities when addressing a complex problem: You may get burned. And it’s even less of a good idea to reveal proprietary methods for weighing evidence. Mission statements or checklists, yes, but not processes with strategic value.

The human touch. If decision-making were simply a matter of following the evidence, then we could automate it, right? In banking and insurance, firms have created impressive technology to automate approvals for routine decisions: But doing so first requires a very explicit weighing of the evidence and design of business rules.
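To make the point concrete, here’s a minimal sketch of what that “explicit weighing” looks like once it’s turned into business rules. The rule names and thresholds are hypothetical, not drawn from any real underwriting system:

```python
# A toy rules engine for a routine approval decision. Every threshold here
# represents a judgment that a human made explicit *before* automation.
# Thresholds are illustrative only.

def approve_loan(applicant: dict) -> str:
    """Apply explicit, pre-weighted business rules to a routine decision."""
    if applicant["credit_score"] < 620:
        return "decline"            # hard rule: minimum creditworthiness
    if applicant["debt_to_income"] > 0.43:
        return "refer to human"     # ambiguous cases escape automation
    return "approve"                # all explicit rules passed

print(approve_loan({"credit_score": 700, "debt_to_income": 0.30}))  # approve
```

Notice that the automation isn’t doing the weighing; it’s replaying a weighing that was done, transparently, up front, and routing the ambiguous cases back to a person.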

Where automation isn’t an option, decision makers use a combination of informal methods and highly sophisticated models. Things like Delphi, efficient frontier, or multiple criteria decision analysis (MCDA) – but let’s face it, there are still a lot of high-stakes beauty contests going on out there.
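For readers unfamiliar with MCDA, the core idea is simple: score each option against agreed criteria, then combine the scores using explicit weights. Here’s a minimal weighted-sum sketch; the criteria, weights, and scores are illustrative, not from any particular method or dataset:

```python
# Multiple criteria decision analysis (MCDA), weighted-sum form.
# The weights are where values and priorities get made explicit.

def mcda_score(scores: dict, weights: dict) -> float:
    """Weighted sum: each criterion's score times its agreed-upon weight."""
    return sum(scores[c] * weights[c] for c in weights)

weights = {"cost": 0.3, "risk": 0.3, "impact": 0.4}   # must sum to 1.0
options = {
    "option_a": {"cost": 0.8, "risk": 0.6, "impact": 0.9},
    "option_b": {"cost": 0.9, "risk": 0.4, "impact": 0.7},
}
best = max(options, key=lambda name: mcda_score(options[name], weights))
print(best)  # option_a
```

The arithmetic is trivial; the hard, and strategically sensitive, part is agreeing on the weights, which is exactly why decision makers hesitate to publish them.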

What should define transparency? Participants can make their evidence transparent in several ways:

Level 1: Make the evidence accessible. Examples: Providing access to a report, along with the supporting data. Publishing a study in a conventional academic/science journal style.

Level 2: Show, don’t tell: Supplement lengthy narrative with visual cues. Provide data visualization and synopsis. Demonstrate the dependencies and interactivity of the information. Example: Provide links to comprehensive analysis, but first show the highlights in easily digestible form – including details of the analytical methods being applied.

Level 3: Make it actionable: Apply the “So what?” test. Demonstrate value. Show why the evidence matters. Example: Show how actions influence important outcomes. Action → Outcome

On the flip side, decision makers can add transparency by explaining how they view the evidence: What qualifies as evidence? Which evidence carries the most weight? Which actions are expected to influence desired outcomes? A structured decision framework requires this and more.

Posted by Tracy Allison Altman on 30-Apr-2017.

Photo credit: iStock.
