The importance of outcome-driven analytics

Data is a fundamental asset that all businesses possess, but few understand how to exploit it effectively. The internet is abuzz with advice about applying advanced analytical methods to unlock the “pots of data gold” at the heart of our organisations.

Unfortunately, we are usually shown a handful of extreme examples from large corporations that have thrown significant amounts of money and time at identifying new opportunities and products locked up in their data. This leads to some core misunderstandings about how to approach these projects and what results we can expect.

Common pitfalls in data analytics projects

Good analytics projects are all about changing or supporting behaviours, so instead of focusing on the trend, the ratio, and the dashboard, we need to focus on what happens as a result of seeing those data points. There are two key issues we repeatedly see in analytics projects.

Implementation masquerading as a requirement

We often hear a requirement along the lines of

“We want to see some dashboards and reports showing us trends and ratios”

As a requirement, this doesn’t seem unreasonable, but it’s actually describing an implementation. It says nothing about what we intend to do with those reports and dashboards, and nothing about what we expect to do as a result of seeing a trend or ratio.

Starting with the data or the process

In a significant number of cases, we see IT teams taking responsibility for getting analytics projects off the ground, and they inevitably start by looking at the technology and the existing data warehouses and operational data stores. As such, the driving vision for the project starts with

“Let’s have a look at all of our data and see if we can find some attributes that predict something useful”

It’s usually followed quite quickly with

“Why don’t we just push all of the data into a big Machine Learning/Artificial Intelligence algorithm and it will tell us what’s important”

Without a scope or tangible outcome in mind, the project team will endeavour to find something useful but will often fail; or they will find something genuinely useful, but so far from what the sponsor had in mind that it is ignored.
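To make the pitfall concrete, here’s a minimal, hypothetical sketch of the “push everything into an algorithm” approach, written in Python with scikit-learn. The attribute names and data are invented for illustration; the point is that the model happily ranks every attribute by importance, yet the ranking alone says nothing about which decision or behaviour should change as a result.

    # A hypothetical sketch of the "throw all the data at a model" approach.
    # Column names and data are invented; nothing here comes from a real project.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Pretend this is "all of our data", pulled straight from the warehouse:
    # dozens of attributes, with no question attached to any of them.
    data = pd.DataFrame(
        rng.normal(size=(1000, 50)),
        columns=[f"attribute_{i}" for i in range(50)],
    )
    target = rng.integers(0, 2, size=1000)  # some outcome we never actually defined

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(data, target)

    # The algorithm dutifully ranks every attribute by "importance"...
    ranking = sorted(
        zip(data.columns, model.feature_importances_),
        key=lambda pair: pair[1],
        reverse=True,
    )
    for name, importance in ranking[:5]:
        print(f"{name}: {importance:.3f}")

    # ...but the output is just a list of numbers. Without a scoped outcome,
    # nothing tells us which decision to make or which behaviour to change.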

All of this happens before we’ve even considered the implications of addressing data quality, architecture, and format, which often account for 90% of the effort in a project like this.

How do we avoid these pitfalls?

As with many projects, we need to stay focused on the outcomes; from there we can establish the requirements. At Outlier Technology, we start at the top by focusing on what we’re trying to achieve and what behaviours we’re looking to affect.

We recently engaged with a small content provider who wanted to understand the impact of adding new features to their website on new customer signups and existing customer renewals. The initial requirement we were presented with seemed reasonable at first glance:

“Provide dashboards that allow us to track user signups and retention over time, and allow us to see when new features are added to the system”

However, it doesn’t describe the behaviours we want to change or support in response to seeing these dashboards and reports, and it doesn’t detail the real metrics and targets used to decide whether a feature is successful. We reshaped the scope and focus of the project by asking a few simple but targeted questions:

  • How do we decide if a feature is successful or not?
  • What do we do if a feature is successful?
  • What do we do if a feature is not successful?
  • Who is involved in the decision to add or remove features?

Answering these questions allowed us to clearly identify the outcomes and, importantly, who needed what information at what stage to support their decisions. With clear, outcome-driven requirements, we can accurately assess where we need to allocate resources and what type of implementation support the organisation needs, whether that’s the design of advanced machine learning algorithms or addressing shortcomings in data architecture. This is very different from the all too common approach of starting with the data and looking for something “interesting”.
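As an illustration of where this kind of requirement can land, here is a minimal sketch of a feature-success check in Python. The metric (weekly signups after a launch compared with the weeks before, against an agreed uplift target), the 10% threshold, and every name in the code are hypothetical assumptions for illustration; the real metrics and targets came out of the questions above.

    # A hypothetical, outcome-driven success check for a website feature.
    # The metric, threshold, and data layout are illustrative assumptions,
    # not the client's actual definitions.
    from dataclasses import dataclass
    from datetime import date, timedelta

    @dataclass
    class FeatureLaunch:
        name: str
        launch_date: date

    def weekly_signups(signup_dates, start, end):
        """Average signups per week between start and end (inclusive)."""
        days = (end - start).days + 1
        count = sum(1 for d in signup_dates if start <= d <= end)
        return count / (days / 7)

    def feature_successful(feature, signup_dates, window_days=28, target_uplift=0.10):
        """Success = weekly signups rose by at least the agreed target
        (here an assumed 10%) in the window after launch versus the
        same-length window before it."""
        window = timedelta(days=window_days)
        before = weekly_signups(signup_dates,
                                feature.launch_date - window,
                                feature.launch_date - timedelta(days=1))
        after = weekly_signups(signup_dates,
                               feature.launch_date,
                               feature.launch_date + window - timedelta(days=1))
        if before == 0:
            return after > 0, before, after
        uplift = (after - before) / before
        return uplift >= target_uplift, before, after

    # Made-up launch and made-up signup dates, purely for demonstration.
    launch = FeatureLaunch("saved-searches", date(2019, 6, 1))
    signups = [date(2019, 5, 1) + timedelta(days=i) for i in range(0, 56, 2)]
    ok, before, after = feature_successful(launch, signups)
    print(f"before={before:.1f}/wk, after={after:.1f}/wk, successful={ok}")

Because the check is written in terms of the outcome (the uplift target the decision-makers agreed on) rather than a dashboard, it points directly at the next action: keep, iterate on, or remove the feature.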

To discuss further how we can advise on your approach to data analytics, see our contact page.