Overcoming analytical program hurdles


Over the last few years, we have noticed a sharp increase in the number of organisations interested in implementing an analytics program, with the hope and intention of gaining business benefit from it. However, as with everything ‘new’, there are challenges that need to be overcome before the real benefit is realised. So, in this post, I explore some of these challenges and how they can be addressed.

Data Quality and Data Integrity

Sometimes fields contain incompatible data, or the fields that do have data are skewed by outliers, missing or incorrect values. Another, quite different, problem we often encounter is where different departments have received different numbers for the same metrics. This can be quite a dilemma: the data for the different departments does not match and may even have missing values – so how can it be used to provide clear results?

The important first step in such a case is to define your data quality tolerances. Don’t be alarmed when these tolerances change, or have to be changed, over the course of the project. The second step is to report data quality problems and get the business to address them. Analytical models run over bad quality data may produce equally bad outcomes. Surfacing the data quality measures will also enable you to phrase the caveats for the models you are developing, especially if the data quality affects the model outcomes.
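As a minimal sketch of that first step – with purely illustrative field names and thresholds – a quality check might compare missing-value rates against the agreed tolerances:

```python
# Minimal sketch of a data quality check against agreed tolerances.
# Field names, sample rows and thresholds are illustrative assumptions.

def check_quality(rows, tolerances):
    """Report each field's missing-value rate against its agreed tolerance."""
    total = len(rows)
    report = {}
    for field, max_missing_pct in tolerances.items():
        missing = sum(1 for r in rows if r.get(field) in (None, ""))
        missing_pct = 100.0 * missing / total if total else 0.0
        report[field] = {
            "missing_pct": round(missing_pct, 1),
            "within_tolerance": missing_pct <= max_missing_pct,
        }
    return report

rows = [
    {"customer_id": 1, "revenue": 120.0},
    {"customer_id": 2, "revenue": None},   # a missing value
    {"customer_id": 3, "revenue": 95.5},
]
tolerances = {"revenue": 10.0}  # agreed tolerance: at most 10% missing

report = check_quality(rows, tolerances)
print(report["revenue"]["within_tolerance"])  # False: 33.3% missing
```

A report like this makes the caveats concrete: any field outside tolerance is flagged back to the business before modelling begins.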

To maintain trust over time, analytic predictions must also be monitored for continued accuracy, revised to address changing conditions, improved to use newly available data, retired when no longer applicable, and replaced with more accurate approaches where possible. The decision makers’ trust in analytics is directly related to how effectively the analytic results are monitored for continued accuracy. Unless this monitoring process can be automated, analytic value will be constrained by the data science team’s capacity to manually manage analytic models in production. Include plans for automated monitoring in your approach and implementation.
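Such an automated check can be very simple. As a sketch – the accuracy figures and tolerance below are illustrative, not prescriptions – a scheduled job might compare a deployed model’s recent accuracy against its baseline and flag it for review:

```python
# Minimal sketch of automated accuracy monitoring: flag a deployed model
# for revision or retirement when its accuracy drops beyond an agreed
# tolerance. All figures here are illustrative assumptions.

def needs_review(baseline_accuracy, recent_accuracy, max_drop=0.05):
    """Return True when the accuracy drop exceeds the agreed tolerance."""
    return (baseline_accuracy - recent_accuracy) > max_drop

print(needs_review(0.90, 0.88))  # False: still within tolerance
print(needs_review(0.90, 0.80))  # True: accuracy has drifted too far
```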

Data Definitions

You sometimes find that the same metric has a different meaning for different people. In some organisations this problem is hardly ever surfaced, even though it exists. It is important to make sure that everyone is on the ‘same page’ within the context of a particular analytical program, because things become very complicated when the same measure means different things to different people. Inconsistencies and misinterpretations of measures can have a disastrous effect on decision-making.

The approach here is to create a “central” dictionary of definitions that is agreed on and signed off by representatives from all the business areas. This becomes the central glossary and reference for all the definitions of measures and other KPIs used in analytics. In effect, this step starts off your data governance program (in the small) – a very important step in any organisation’s information maturity journey.
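In its simplest form, such a glossary can even live in code, so that every report and model resolves a metric name the same way. A minimal sketch, with purely illustrative terms and definitions:

```python
# Minimal sketch of a central glossary: one agreed, signed-off source of
# metric definitions. The terms and wording below are illustrative.

GLOSSARY = {
    "active_customer": "A customer with at least one transaction in the last 90 days.",
    "churn_rate": "Customers lost in a period divided by customers at the start of it.",
}

def define(term):
    """Look up the agreed definition; fail loudly for undefined terms."""
    if term not in GLOSSARY:
        raise KeyError(f"'{term}' has no agreed definition - add it to the glossary first")
    return GLOSSARY[term]

print(define("active_customer"))
```

Failing loudly on an unknown term is deliberate: it forces a new measure through the sign-off process rather than letting a private definition slip in.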

Priorities and roadmap

When an organisation starts to implement an analytics program, everyone is excited and determined to use analytics and see how it “makes a difference” in their part of the organisation. It is therefore of the utmost importance to make sure that the priorities are set correctly. When you start with an analytics program, the organisation should define a set of activities and priorities for the full journey, so that the business knows the road ahead. What often works best is when the business leaders actively define what they want out of the program, which ultimately makes it easier to implement. This may change over time, as new insights are discovered or as the business naturally changes course, but especially in the beginning there must be a target roadmap to work towards.

One needs to lay down a roadmap, even if it might change. The roadmap doesn’t have to be fancy, but it is important to know where you are going and how you aim to get there. A clear roadmap also helps to secure funding – sponsors are more willing to invest in an initiative if the objectives, desired outcomes, method, role players, expected costs and timelines are clearly laid out. The best roadmap provides incremental value, which supports an agile approach and builds a growing understanding of the business problems you are trying to solve. It makes sense to put a roadmap together with incremental rollouts, where you can measure specific outcomes that relate to the identified business challenges. The roadmap can even start with a smaller proof of concept, which can be useful as a small win to illustrate and justify value.

People Skills

Implementing analytics requires process, people, software, infrastructure and data, but of these, people are the most important. Although people are often the largest cost component, a purely financial view actually undervalues them. People do the discovery, using their knowledge of the organisation and the data, along with their analytics capabilities, to develop valuable insight.

For me the first question is not how many people to hire. While this is a reasonable question, what is more important to consider is the combination of skill sets and the productivity of the people in the analytics team. You need data processing skills for data access and preparation. You obviously need analytical modelling skills, as well as data visualisation and presentation skills. Likewise, business and communication skills are required to manage the alignment between the business and the data science team. Skills are needed on the infrastructure side to operationalise analytics and drive them into business applications. But more than all of that, you require insight into and experience with the business strategy, its operations and the organisation’s culture. If you miss these latter aspects, the analytical outcomes may be impossible or unfit to implement.

Training is one way to address the skills shortages. Organisations that are data driven and action oriented typically believe in training. This is very applicable and necessary when establishing an analytics program. Industry assessments indicate that there is a correlation between training users to perform more advanced analytics and organisations taking action. As you move forward in analytics maturity, changes occur in the program, and hopefully in the business. The changes to the business and the impacts of applying analytical insights are the hardest aspects to find training for.

You also need to be careful to avoid depending on a single individual. To scale, an analytics team needs to be able to share its work, back up other team members, replace anyone who leaves, and grow its capacity over time.

Productivity

Another big hurdle in implementing an analytics program is overcoming productivity problems. Several issues can affect productivity, such as:

  • Manual processing of data to prepare it for analytics.
  • Working around shortcomings of the toolset.
  • Working with non-integrated tools – and connecting the dots between them.
  • Spending time answering questions that can be answered by a user provided with self-service capabilities.
  • Spending time writing and formatting reports that could be produced by the BI team or empowered users.
  • Waiting on data or waiting for jobs to run.
  • Slow processing of multiple iterations of analytical models during refinement.
  • Manually running and evaluating multiple models.
  • Manually converting models into deployable code, and testing the code.
  • Manually monitoring and validating results.
  • Maintaining and refreshing previously deployed analytical models.

Every moment wasted is a missed opportunity to find new insight, deploy it more quickly, and create organisational value from it. Productivity wasters are also a great source of frustration and discontent among the members of the data science team. Automation is a big enabler of productivity: every step in the analytic process that can be automated will save a portion of your biggest investment in analytics – your people.
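To make the idea concrete, a minimal sketch of such automation – with a toy “model” and illustrative step names – chains the repetitive steps into one scripted pipeline, so every refinement iteration runs end-to-end without manual work:

```python
# Minimal sketch of an automated analytic pipeline. The steps and the
# toy mean-predicting "model" are illustrative assumptions.

def prepare(raw):
    """Data preparation: drop missing values instead of fixing them by hand."""
    return [x for x in raw if x is not None]

def train(data):
    """A toy 'model': predict the mean of the training data."""
    return {"prediction": sum(data) / len(data)}

def evaluate(model, data):
    """Evaluation: worst-case absolute error of the model on the data."""
    return {"max_error": max(abs(x - model["prediction"]) for x in data)}

def run_pipeline(raw):
    """One automated pass: prepare, train, evaluate."""
    data = prepare(raw)
    model = train(data)
    return evaluate(model, data)

print(run_pipeline([10, None, 12, 14]))  # {'max_error': 2.0}
```

Once each step is scripted like this, iterating on a model becomes a single command rather than a chain of manual tasks.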

Infrastructure Growth

When your organisation plans its analytics strategy, infrastructure is often viewed as a commodity component. It may be less important than people and data, but the cost of growing infrastructure will often delay projects and hamper their ability to deliver value.

Make sure your infrastructure is scalable and adaptable, and, if required, that it can include “big data” technologies like Hadoop. Manage infrastructure costs and acquisition lead times so that they do not delay value creation over time.

Deployment Challenges

In order to create substantial value from analytics, decision makers must understand and believe that the insights were developed through a structured analytics process – in short, they must trust an insight to be willing to use it. Furthermore, the predictive insight needs to be available at the right time, in the right format and to the right person. For a large, one-time decision, this may be relatively simple – the analytical result can be communicated directly by the analyst to executive decision makers. The executives can ask questions to personally validate the prediction and decide whether they trust it enough to base their decisions on it. However, this approach will not support a large number of ongoing decisions or a large number of decision makers.

In fact, implementing any corporate strategic direction typically requires day-to-day decisions throughout the organisation to achieve the desired goal. Therefore, it is important to envision and plan for automated deployment of analytic insights to the decision makers requiring that information. The most convenient channel for distributing analytical insights is the organisation’s business intelligence capability, provided it is trusted, up to scratch and functioning as it should.

Before the proliferation of strategic analytics, executives often thought of analytics as a set of discrete capabilities, without much thought for the entire analytic process. Today, organisations that approach analytics strategically have recognised that the steps in the analytic process are interconnected, and that a lack of integration can significantly increase the amount of manual work – and thus reduce the overall organisational value. They have recognised the need to implement their analytics environment as an integrated platform that can interact with both their operational and productivity platforms – i.e., an analytics platform. An analytics platform, like operational and productivity platforms, provides an integrated environment that supports the entire analytic process, from managing data and discovering insight to monitoring the results. The platform must maximise the productivity of your analytics team, grow easily and be supportable by your IT team. It must also interoperate easily with the operational and productivity platforms throughout the analytic process.

Deploying analytics often requires interaction with operational systems to deliver predictive insight into business or clinical workflows. An analytics platform should provide many operational deployment options, ranging from scoring an analytic model in an operational database, to interacting with business rules, to real-time analytic scoring. These capabilities include the ability to automatically turn an analytic model into software code that can run in the operational environment, saving time otherwise lost to manual coding, debugging and testing. Without the ability to interoperate with operational systems, the full organisational value of analytics will never be realised.
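As a toy illustration of that last capability – not a real code generator, and with purely hypothetical column and table names – a simple linear scoring model can be translated into SQL that runs inside an operational database:

```python
# Toy sketch: turning a simple linear scoring model into SQL that can
# run in an operational database, instead of re-coding it by hand.
# The intercept, coefficients and table/column names are hypothetical.

def model_to_sql(intercept, coefficients, table="customers"):
    """Render a linear model as a SQL scoring query."""
    terms = " + ".join(f"{w} * {col}" for col, w in coefficients.items())
    return f"SELECT customer_id, {intercept} + {terms} AS score FROM {table};"

sql = model_to_sql(0.5, {"recency": -0.02, "frequency": 0.3})
print(sql)
# SELECT customer_id, 0.5 + -0.02 * recency + 0.3 * frequency AS score FROM customers;
```

Real analytics platforms do this at much greater sophistication (standards such as PMML exist for exactly this purpose), but the principle is the same: the model is translated, not manually re-implemented.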

Concluding remarks

The right team, equipped with a highly productive analytics platform, will support the strategic application of analytics within your organisation. With each successful project, your analytics team will help your organisation become more agile while continuously learning from your data. With each successful project, more data will become available, so users throughout your organisation can independently answer simpler questions while collaborating with your analytics team on more complex and valuable questions. Throughout, pay serious attention to data quality and data integrity.

Over time, your organisation will come to place more and more trust in analytic insights. It will trust that multiple analytic approaches have been considered and that the best prediction or recommendation has been found. It will trust the operational analytic insight to support day-to-day processes and create better outcomes and results. Organisational trust is what makes analytics valuable, but it takes more than just an algorithm to create trust.

Always bear in mind that it is essential to be precise about what you want out of a program, as this will lead to better results.

What most people forget, or maybe don’t even realise, is that implementing analytics can be an extremely complex and difficult task. Working through these issues is a good reminder that it takes time and considerable effort to get it right. Sometimes I get frustrated when it feels like we are not making progress fast enough, but I remind myself that each step taken is one step closer to the desired end result.
