A data-driven enterprise, or data-informed enterprise, is one in which decision makers seek out, and then consider, relevant data when making their decisions. The idea is that better information will lead to better decisions. But are your decision makers actually capable of making data-driven decisions? This blog explores several criteria, all of which need to be met, for data-driven decision making to become a viable option within your organization. It also includes an assessment tool that you may download free of charge.
Can decision makers access the data that they need?
You can’t make a data-driven decision if you don’t have access to the right data. There are several potential issues to address:
- Does the data exist? Do you have the required data, or at least most of it, to make the decision? If not, do you have the ability to obtain it in a timely and cost-effective manner?
- Is the data easily accessible by decision makers? Although your organization may have the data, it may not be in a format or location that is easy for your decision makers to process. This is a fundamental reason why your organization needs to have a coherent data warehouse (DW)/business intelligence (BI) strategy in place.
- Do your decision makers have the tools to access the data? Decision makers will need appropriate tools to retrieve and then manipulate the data.
- Do your decision makers have the skills to access the data? The old saying “A fool with a tool is still a fool” is applicable in the data space. In addition to providing decision makers with appropriate data analysis tools, you also need to provide training, coaching, and support to ensure that they’re able to use them effectively.
Do decision makers have the skills to work with data appropriately?
You can’t make data-driven decisions if you are unable to explore and massage the data appropriately. There are several potential issues to address:
- Do decision makers have access to effective tools, and do they know how to use them? Decision makers will need tools to analyze the data. Novice users will likely be best served by low-code/no-code data analytics tools, and intermediate users will likely be satisfied with self-service BI tools. Power users will want sophisticated data analysis tools, perhaps even a subset of what you would deploy to your data scientists.
- Is the data of sufficient quality to make decisions from? In the data community there’s an old saying: “garbage in, garbage out.” The implication is that the data-driven decisions you make are only as good as the quality of the data that goes into them. Unfortunately, many organizations suffer from data technical debt (DTD), that is, poor-quality data.
- Do decision makers understand the data that they’re working with? I once worked at an organization whose data scientists invested, on average, four to six weeks to perform the required data analysis for a single report. This was because the source data that they were using was complex and often of low quality. So, if data scientists required this much effort to understand their source data, is it reasonable to expect your decision makers to be capable of doing the same? Granted, this example is extreme, but it is common for data scientists to spend days exploring legacy data just to understand it.
- Do decision makers have a sufficient understanding of statistics? One of the great things about modern data analysis tools is that you can explore and parse your data to a fine level of detail. The challenge is that your analysis may not be statistically significant, and if that is the case then making decisions based on that data becomes questionable at best. An insightful and valuable read is Calling Bullshit: The Art of Skepticism in a Data-Driven World.
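To make the statistical-significance concern concrete, here is a minimal sketch in Python. The conversion numbers are invented for illustration and the helper function is hypothetical, not part of any tool mentioned above; the point is simply that the same apparent lift can be convincing evidence on a large slice of data and little more than noise on a small one.

```python
# Hypothetical illustration: the same 12% vs 9% "lift" is significant on a
# large slice of data and indistinguishable from noise on a small one.
from statistics import NormalDist


def two_proportion_p_value(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for the difference between two proportions."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    std_err = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / std_err
    return 2 * (1 - NormalDist().cdf(abs(z)))


print(two_proportion_p_value(120, 1000, 90, 1000))  # ~0.03: likely a real difference
print(two_proportion_p_value(12, 100, 9, 100))      # ~0.49: could easily be chance
```

Slice your data finely enough and most of what you are looking at is the second case.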
Do decision makers understand the data that they’re being presented?
Many organizations are implementing reporting dashboards, a technology that I have promoted for years, to get information into the hands of decision makers. The fundamental weakness of such strategies isn’t the technology but the people using it (an all too common challenge). The assumption is that the people to whom the data is being presented understand what they’re seeing, and that is often not the case. There are several factors that exacerbate this problem:
- Do decision makers have a sufficient understanding of statistics? See the previous discussion about statistics.
- Are decision makers willing to seek out corroborating information? A hard lesson learned in the project management space is that you shouldn’t manage by the numbers. With data-driven decision making, data is meant to inform your decisions, not make them for you. The data may tell you one thing, but you still need to do a reality check as to whether that makes sense or not. For example, in late 2020 a Canadian retailer was trying to determine the levels of garden furniture to pre-order so that it would be ready for sale in the spring of 2021. During the summer of 2020 they had sold out of pretty much anything pertaining to gardens and backyards. The sales data clearly indicated that there would be demand for garden stuff again in the coming spring. But the purchasing manager decided to reduce their order from the previous year, not increase it, because they knew that the increase in sales in 2020 was due to people staying home during the pandemic. Their knowledge of the business environment told them the trend in buying garden furniture was over (and it was), which contradicted what the data was showing.
- Do decision makers understand the implications of what a combination of widgets/reports is telling them? The great thing about automated dashboards, self-service BI tools, and any other data reporting technology is that they put data in the hands of decision makers. As the number of reports and data widgets increases, so does the number of potential relationships between them. Have you ever looked inside the cockpit of a commercial airplane? There are many dozens of display widgets presented to the pilots, each of which displays potentially important information in real time. Understanding the implications of what each widget communicates, and more importantly what combinations of widgets indicate, is an important part of being a pilot and is one of the reasons why it requires months of training. Regardless of what Hollywood likes to imply, flying a plane isn’t simply about keeping an eye on the altimeter (although that is important). Similarly, what is being communicated by each of the widgets on your reporting dashboard? Do you really understand each of these metrics? Do you understand what combinations of them indicate? I’ve worked in many organizations where, with just a bit of investigation, we discovered that the decision makers didn’t understand what their dashboards displayed, even when they had been using them for years. You can’t make data-driven decisions if you don’t understand what the data is telling you.
Do decision makers know what questions to ask?
An interesting requirement for decision makers is that they understand what decisions need to be made and then what questions they need to ask to be able to make them. In short, do they sufficiently understand the domain? My experience is that this isn’t always true. For example, I worked with one organization whose marketing team was trying to develop a strategy for a new product they were bringing to market. Although they had done this successfully in the past, this time it was different because they were entering a market that was completely new to them. Being a data-driven enterprise, they decided to gather data about the people in this new market space. The mistake that the marketing team made was that they explored questions that were appropriate for their existing customer base but not for these potential new customers, a group of people very different from their existing customers. Although it wasn’t quite this bad, imagine asking vegetarians how they like their steaks prepared, what steak sauces they like, what cuts they prefer, and so on. They didn’t realize that they didn’t understand this new market well enough to formulate appropriate questions, and as a result they gathered the wrong data on which to base their decisions. The first version of their go-to-market strategy was a disaster, wasting a lot of time and money and requiring them to rework their approach.
Do decision makers know what questions not to ask?
Interestingly, this is an issue that doesn’t seem to get enough attention in my experience. There are several potential factors to consider:
- Do decision makers understand, and will they adhere to, privacy and fairness considerations? On the surface you should be able to address this challenge through classic security access control (SAC) techniques – only give people access to the data that they should have. While this is a start, it rarely proves sufficient because it is quite straightforward to combine non-identifying information about someone and thereby invade their privacy (see the sketch after this list). Furthermore, existing bias in your data can be exacerbated, often unknowingly, through data analytics. The solution is to provide decision makers with training on this topic (a great eye-opener is the book Weapons of Math Destruction by Cathy O’Neil).
- What organizational politics do decision makers need to navigate? Are some decisions already implicitly made? Are there pet projects that are going to happen, regardless of whether they make sense or not? Are some topics simply unpalatable for your organization to consider, let alone make decisions about?
- Are decision makers comfortable with speaking truth to power? A challenge with taking a data-driven approach to decisions is that sometimes that data doesn’t support the answer that your leadership wants to hear. I wish I could say that all leaders want to hear the truth, regardless of what it is, but that would be a lie. Sometimes the messenger does in fact get shot, so think twice before you bring forth an uncomfortable message.
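To see how quickly “non-identifying” fields become identifying, a simple k-anonymity-style check is enough: count how many records share each combination of seemingly harmless attributes. The records and field names below are made up purely for illustration.

```python
# Hypothetical records: no names, no customer IDs, just "harmless" attributes.
from collections import Counter

records = [
    {"postal": "M5V", "birth_year": 1984, "gender": "F", "spend": 1200},
    {"postal": "M5V", "birth_year": 1984, "gender": "M", "spend": 300},
    {"postal": "M5V", "birth_year": 1991, "gender": "F", "spend": 950},
    {"postal": "K1A", "birth_year": 1984, "gender": "F", "spend": 410},
]

# How many records share each (postal, birth_year, gender) combination?
combos = Counter((r["postal"], r["birth_year"], r["gender"]) for r in records)
unique = [r for r in records if combos[(r["postal"], r["birth_year"], r["gender"])] == 1]

print(f"{len(unique)} of {len(records)} records are unique on three 'harmless' fields")
# All four records are unique, so anyone who knows a person's postal code,
# birth year, and gender can recover their spending without ever seeing a name.
```

In real datasets a surprisingly large share of records tend to be unique on just a handful of such fields, which is why access control on its own rarely protects privacy.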
Do decision makers know how much data they need?
When making decisions, you need to make a trade-off between time and accuracy. An imperfect answer delivered while there is still time to act is far better than a perfect answer once the opportunity has passed you by. Conversely, a seat-of-the-pants decision can often prove devastating. The implication is that you need to gather the right amount of data to make a good enough decision at the right time, something that depends on the context of your situation and the experience of the decision maker.
There’s an interesting saying in the metrics community regarding metrics dashboards: if you have too little information you’re flying blind; if you have too much, you’re flying blinded. The implication is that you need just the right amount of data to make the decisions that are required of you.
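One way to reason about “enough” is to look at how precision scales with the amount of data. The sketch below uses assumed numbers (a simple 95% confidence interval for a proportion, with sample sizes picked for illustration) to show the diminishing returns: quadrupling the data only halves the uncertainty, so past some point waiting for more data costs more than the extra precision is worth.

```python
# Illustrative only: the margin of error on an estimated proportion shrinks
# with the square root of the sample size, so each extra data point buys less.

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of an approximate 95% confidence interval for a proportion."""
    return z * (p * (1 - p) / n) ** 0.5

for n in (25, 100, 400, 1600, 6400):
    print(f"n={n:>5}: ±{margin_of_error(n):.1%}")
# n=   25: ±19.6%
# n=  100: ±9.8%
# n=  400: ±4.9%
# n= 1600: ±2.5%
# n= 6400: ±1.2%
```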
Thanks to Kiron Bondale for pointing out that I missed this point in the original posting.
Do decision makers know that it’s about more than just data?
With a data-driven decision making approach, data informs your decisions, it doesn’t make them for you. Otherwise, it would be possible to simply replace the decision makers with an algorithm or an artificial intelligence (AI)-based solution.
As an aside, the move towards becoming a data-driven enterprise is an important step along the way towards enabling the adoption of a wider range of AI technologies. More on this in future blog postings.
Conclusion
Each of the criteria that I explored above is a potential showstopper. The good news is that they can all be overcome, albeit sometimes requiring a fair bit of investment to do so. The point is that becoming a data-driven enterprise may prove more difficult than you believe. My hope is that this blog has provided you with insights into how you can succeed.
Free Download: Data-Driven Decision Readiness Assessment Tool
I developed this assessment spreadsheet to help organizations assess their readiness for data-driven decision making.
Click here to download the Data Informed Enterprise Assessment tool.
2 Comments
Werner Schulz
Nice article with a valuable set of questions.
Some remarks
– data are always from the past (as one example shows) – beware of extrapolating into the future
– data-driven rolls easily off the tongue but I agree that informed-by-data would be a more honest phrase
– right amount of data: agree – but there is still the widely held assumption that you need a lot of data. Sometimes very few data points (5) can be enough. It depends on the uncertainty (reduction) you can afford. See also Martin Lindstrom’s book Small Data – The Tiny Clues That Uncover Huge Trends.
Chris Gerrard
One of the problems with the popular conception of “data-driven” is that it leads to the fallacy that there’s a canonical dashboard, or set of dashboards, that satisfy the information needs of the people who need the information the data contains.
This is almost never the case; there are almost always questions, some arising from contemplating the dashboards’ content, that the dashboards do not address.
In my experience one of the major failings of analytics efforts is the attempt to elicit from those who need data-based information the discrete set of dashboards that will satisfy their needs. This effort becomes an exercise in mirage-chasing; leading to an ever-expanding set of requirements pursuing a receding horizon — a never-ending Big Requirements Up Front exercise.