Co-authored by Jyoti Kanal
Over the last few years, we have witnessed an increased focus on, and reliance on, data to drive decisions. More recently, low-cost storage options, growing computing power, and easy-to-use data analysis and visualization tools have fueled investment in harnessing data for decision-making.
Here are five tips to ensure success in your data to decisions journey.
1. Define the problem statement
Popular belief holds that gathering all the data is the first step to decision-making. But this approach is fraught with danger: without a clearly articulated problem statement or objective, it leads to an expensive disappointment in the long run, sending you down a rabbit hole of unproductive outcomes.
Some examples of well-defined problem statements can be:
- I want to increase sales by 50% from Segment X on Channel Y
- I want to investigate sales made to customers in Segment Z for profitability over time
However, these by themselves are not sufficient; there can be many ways to interpret and achieve them. A good problem statement must therefore be segmented into sub-problems through a structured approach to problem-solving. Once these sub-problems are identified, it is essential to pick the ones with the greatest impact, the ones that go furthest towards achieving your objective. This is where an appreciation of the business helps, by identifying the sub-problems that matter most.
On the other hand, if one takes the approach of ‘here is the data, let us get some insights from it’, you don’t even know whether any insight you derive has business value. Therefore, work out what matters most, articulate the problem statement clearly, and then divide it into sub-problems to determine what will drive the most significant impact. Any other approach is a recipe for failure.
Top tip: Define the problem to a sufficient level of detail. Avoid premature ‘solutioning’ with data.
2. Define the KPIs without being biased by data availability
Once the problem or objective is sufficiently defined, define metrics to address the issue. These could be metrics to quantify the problem, measure success, or test a hypothesis’s validity. This step will be a consultative and iterative process with a functional SME until the definition is correct.
Go with a rule of thumb of five KPIs per functional area. Having more metrics will not help you solve the problem at hand any better, but it can easily leave you lost trying to tick boxes that don’t need ticking.
Don’t be limited by data availability or the technical feasibility of building a KPI out. There is no need to retrofit existing datasets to arrive at a KPI. Metrics can be computed from proxy sources, as long as they are comparable and suitable for the context, without arduously sourcing and wrangling data from ‘the source’.
Remember, it is not essential to have the right data upfront, but it is very important to define the right metrics to aid your decision-making process.
Top tip: Do not let data availability or data quality limitations influence the KPI defined.
3. Define the KPI by linking it to the problem being solved
A fundamental point to keep in mind while defining a KPI is the “problem” being solved. We often see scenarios where one KPI has multiple definitions because multiple stakeholders use it. That may well be legitimate, but then each stakeholder is solving a different problem. Such metrics are better suited to analyzing a situation than to solving it.
If we look solely at such a metric, we lose sight of allied issues and effectively fail to see the wood for the trees. A classic example is the cycle time or fill rate for orders. Suppose the definition is set separately by each leg managing order fulfillment and delivery to the customer. In that case, it will take far longer to uncover the root cause of an issue in the value chain, as problems at the hand-offs are often missed by these siloed metrics.
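To make the hand-off problem concrete, here is a toy sketch in plain Python with hypothetical timestamps for a single order. Each leg records only its own start and end, so the siloed per-leg cycle times all look healthy, while the end-to-end view exposes the time lost between legs:

```python
from datetime import datetime

# Hypothetical timestamps (start, end) for one order moving through three legs.
order = {
    "warehouse": (datetime(2023, 1, 2, 9, 0), datetime(2023, 1, 2, 17, 0)),
    "transport": (datetime(2023, 1, 4, 8, 0), datetime(2023, 1, 4, 20, 0)),
    "last_mile": (datetime(2023, 1, 6, 10, 0), datetime(2023, 1, 6, 14, 0)),
}

# Siloed view: each leg's own cycle time, in hours. Every leg looks fast.
per_leg_hours = {
    leg: (end - start).total_seconds() / 3600
    for leg, (start, end) in order.items()
}

# End-to-end view: first touch to final delivery, including hand-off delays.
first_start = min(start for start, _ in order.values())
last_end = max(end for _, end in order.values())
end_to_end_hours = (last_end - first_start).total_seconds() / 3600

# The gap between the two views is the time lost at hand-offs,
# which no leg's siloed KPI ever reports.
handoff_hours = end_to_end_hours - sum(per_leg_hours.values())

print(per_leg_hours)     # {'warehouse': 8.0, 'transport': 12.0, 'last_mile': 4.0}
print(end_to_end_hours)  # 101.0
print(handoff_hours)     # 77.0
```

In this made-up example the legs account for only 24 of the 101 hours the customer waits; the remaining 77 hours sit invisibly between the siloed metrics.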
Top tip: Link the KPI to the problem, not the functional area. A KPI defined within a limited functional context cannot address the issue in its entirety and cannot be reliably reused by others in the organization.
4. Identify the frequency and grain at which data is required and model it
Once the KPIs and wireframes are defined, identify the most reliable data source to bring them to life. Weigh the effort of sourcing the data; sometimes a proxy is available that is far easier to use and takes less effort to source.
Also, more varied data, or data in real time, does not mean more accurate outputs unless the problem warrants it. Marrying up data of different frequencies and types requires modeling. Focus on the formulae used to summarize and drill down.
For example, while drilling down into profitability by product, how do you allocate overhead costs?
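One common convention, sketched below with hypothetical figures, is to allocate shared overhead in proportion to each product's revenue; activity-based costing is an alternative when usage drivers are known. The products and amounts are illustrative, not from the article:

```python
# Hypothetical revenue per product and a shared overhead pool.
revenue = {"Product A": 600.0, "Product B": 300.0, "Product C": 100.0}
overhead = 200.0

# Allocate overhead pro rata to revenue share.
total_revenue = sum(revenue.values())
allocated = {p: overhead * r / total_revenue for p, r in revenue.items()}

print(allocated)  # {'Product A': 120.0, 'Product B': 60.0, 'Product C': 20.0}
```

The point is not which allocation rule is right, but that the rule must be decided during modeling, before profitability drill-downs are built.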
Another classic example is end-of-day stock positions and sales. What should you do with stock values when reporting monthly numbers? While data for the stock value at month-end dates will be available, using an average for the month may make more sense.
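A small illustration of why the choice matters, using made-up daily stock figures: a delivery landing on the last day inflates the closing snapshot, while the period average better reflects the stock actually held.

```python
# Hypothetical end-of-day stock values (units) for a 5-day reporting period;
# a large delivery arrives on the final day.
daily_stock = [100, 90, 80, 70, 500]

# Snapshot at period end: distorted by the last-day delivery.
period_end_snapshot = daily_stock[-1]

# Average over the period: closer to the stock actually held day to day.
period_average = sum(daily_stock) / len(daily_stock)

print(period_end_snapshot)  # 500
print(period_average)       # 168.0
```

Neither figure is wrong in itself; the right one depends on the problem the KPI is meant to solve, which is exactly why the grain and summarization rule must be modeled upfront.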
Top tip: Data modeling is essential. Formulae to summarize or drill down metrics need to be ironed out early on, not after data is bubbled up for reporting.
5. Keep governance simple
The KPIs, the wireframed visualizations, and the data that is identified, sourced, modeled, built, and eventually visualized will all evolve and change as we move on to subsequent or allied problems. It is critical to have a process to govern the requirements, the metrics and their definitions, the data, the architecture, the metadata definitions, and the quality. This process is essential to ensuring we use data to drive decisions. Identify owners and hold them accountable for each of these aspects.
Top tip: Don’t make the governance so onerous that it impedes improvements and corrections.
Finally, it is important to remain objective. Decisions are not based entirely on data; human intuition and emotions play a significant role in decision-making. Therefore, insights from data must be objective, not mere outputs of data wrangled to justify a decision.
If you are already on your data to decisions journey and have made significant investments, do reach out to us to chat about ways to reap the most from those investments.
Nithya Rajaram is a Senior Principal in the AI&A practice at Infosys Consulting. She is passionate about helping clients navigate the data to decisions journey and brings with her a wealth of experience in delivering data-related programs in the telecommunications, banking, and retail sectors.
Jyoti Kanal is a Principal in the AI&A practice at Infosys Consulting with over 15 years of experience in the data and analytics space. She has spearheaded the delivery of actionable insights from data using analytics, business intelligence and data warehousing projects across telecommunications, pharmaceuticals, health care, and energy sectors.