Let your data tell a meaningful story


The old adage that “a picture is worth a thousand words” is nowhere more apt, or more evident, than in data visualisation. And with research showing that the human brain can process an image the eye sees for as little as 13 milliseconds, data storytellers are playing an increasingly critical role in conveying key insights, visually, as effectively as possible.

Data storytelling is a methodology for communicating information, tailored to a specific audience, with a compelling narrative. It merges data science, visualisations, and the concept of a narrative to provide one of the most effective ways of sharing business information and driving outcomes.

When it comes to data, the reality is that data preparation on its own holds little value beyond those working closely with it. To unlock its full potential, a company requires a data storyteller who not only understands the data itself (science) but can also pull it together visually (visualisation) to tell an important story (narrative) and help guide organisational strategy and decisions.

Data evolution

Anyone who follows my blog will know that the data visualisation topic is one I am rather passionate about. And so, when I came across this insightful article on the evolution of a data analyst to a data storyteller in three steps, I immediately wanted to share the insights and my take.

The author of this industry piece goes on to explain how to improve data visualisations using a simple procedure in Matplotlib. And while you can work through the mechanics for yourself, a few fascinating insights emerge from the process involved.

The critical point is that visualisations are the key mechanisms to translate complex data outcomes into understandable business stories. For its part, visualisation becomes the fundamental tool required to enable data-related storytelling. According to the article, if a data analyst or scientist cannot visualise the results, then they do not know the results. The author writes that it takes a passionate and skilled data scientist to transform basic visualisations (which just about anyone can create) into a story that managers and customers will understand and get excited about.

This all comes down to three things noted in the article – adding information, reducing information and emphasising information. Much of this revolves around improving the signal-to-noise ratio: the amount of valuable information compared to the amount of unnecessary information. The article goes on to explain how to highlight all the important information and remove everything that does not add any real value.
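As a small illustration of those three moves in Matplotlib – the library the article uses – here is a hedged sketch (the data and product names are invented): it reduces information by dropping the chart frame, emphasises the one series the story is about, and adds information by annotating the takeaway directly on the plot.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

# Hypothetical monthly sales for three product lines
months = list(range(1, 13))
series = {
    "Product A": [3, 4, 4, 5, 6, 7, 8, 8, 9, 10, 11, 12],
    "Product B": [5, 5, 6, 6, 6, 7, 7, 7, 8, 8, 8, 9],
    "Product C": [4, 4, 5, 5, 5, 5, 6, 6, 6, 6, 7, 7],
}

fig, ax = plt.subplots()

# Reduce information: remove box edges that carry no data
for side in ("top", "right"):
    ax.spines[side].set_visible(False)

# Emphasise information: foreground the series the story is about,
# push the context series into the background
for name, values in series.items():
    if name == "Product A":
        ax.plot(months, values, color="tab:blue", linewidth=2.5, label=name)
    else:
        ax.plot(months, values, color="lightgrey", linewidth=1, label=name)

# Add information: label the takeaway directly instead of forcing
# the reader to decode a legend
ax.annotate("Product A quadrupled", xy=(12, 12), xytext=(7, 11))
ax.set_title("Product A drives growth")
```

The same few lines of decluttering and emphasis can turn a default chart that anyone can produce into one that foregrounds the story.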

At its core, this is what storytelling does. Nobody wants to read a novel that is poorly written or has weak characters and plot holes. Similarly, data storytellers must create compelling narratives where the ‘readers’ become passionate about what they are seeing.

Insights beyond

In his 1983 book, ‘The Visual Display of Quantitative Information’, Edward Tufte introduced the concept of the data-ink ratio. Tufte defines data-ink as the non-erasable ink used for the presentation of data: if it were removed, the graphic would lose its content. Non-data-ink, by contrast, carries no information and is used for scales, labels and edges. The data-ink ratio is the proportion of ink used to present actual data compared to the total amount of ink (or pixels) used in the entire display. In the book, he explains how to get more data (story) onto the graph with fewer graphical distractions.

Furthermore, Stephen Few has written extensively on visual business intelligence, or rather, data visualisation as we know and love it today. I have in the past attended one of his excellent courses and have worked my way through some of his material – which is fantastic. In his most recent blog, he addresses the attempts by some data storytellers to compare the effects of the COVID-19 pandemic around the world. While it provides a fascinating read on how to make data tell a more insightful story, the examples he creates highlight that it is not always about making things more complicated, but about adopting a simpler approach that provides more valid and useful ways to represent data.

Data visualisation and storytelling are fast becoming critical components for any business that wants a better understanding of the data at its disposal. In fact, I believe this will be a vital resource for any organisation seeking differentiation in this digitally driven world.

Data this and data that – tips to avoid data chaos


At the core of almost every business conversation centred around growth and competitive gain today lies the topic of data. It is the single most valuable asset to many businesses – no matter the industry or size. Data has taken the world by storm and continues to shape how businesses operate and transform to stay relevant today and in the years to come.

Fuelled by technological development, data is coming into organisations from every angle. Naturally, business users within the company want to be able to leverage this data to help them fulfil their roles and support overall business growth and transformation. While the business shouldn’t necessarily complain about this, if not managed properly, this ‘data this, data that’ focus can result in organisational data chaos.

In an environment where data is uncontrolled and dispersed across the organisation, business users are burdened with sorting through distributed data assets and trying to make sense of them – determining, for example, which is the real master copy of the data, the real version of the truth – before the data can be effective. The bottom line is that this is not the role of the business user, and time spent determining whether the data is up to date, validated or complete is a waste of a scarce resource.

Instead, an organisation that is steering a tight data ship forward knows that the correct version of the data must be available to the business user for them to be able to carry out their actual job function/role – which is interpreting the data and using it in business-oriented decision-making.
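As a minimal sketch of that ‘single version of the truth’ idea (the record layout, sources and dates here are all hypothetical), one simple rule is to prefer the most recently updated copy that has passed validation:

```python
from datetime import date

# Hypothetical copies of the same customer record scattered across systems
copies = [
    {"source": "crm",  "email": "a@old.example", "validated": True,  "updated": date(2023, 1, 10)},
    {"source": "erp",  "email": "a@new.example", "validated": True,  "updated": date(2023, 6, 2)},
    {"source": "xlsx", "email": "a@bad.example", "validated": False, "updated": date(2023, 7, 1)},
]

def master_copy(records):
    """Pick the master copy: the most recently updated record
    that has passed validation (a newer but unvalidated copy loses)."""
    validated = [r for r in records if r["validated"]]
    if not validated:
        raise ValueError("no validated copy exists for this record")
    return max(validated, key=lambda r: r["updated"])

print(master_copy(copies)["source"])  # the validated ERP copy wins
```

The point is not the code itself but who runs it: this arbitration should happen inside the data function, not on the business user’s desk.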

While achieving this is no easy feat, it is most definitely a manageable one, when the right processes and technical solutions that support a data driven approach are in place. 

A dedicated data team

For a business to reap the rewards of a data focused approach, the organisation needs a team of data stewards. The sole responsibility of this team is to serve the business users by providing them with the accurate data they need, when they need it and in the format and level of quality they require, to be able to carry out their business functions and responsibilities. Setting the business users up with the wrong data or data of poor quality is only setting them up to fail.
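A data steward’s release checks can be sketched in a few lines. This is an illustrative completeness check only, not a full quality framework, and the field names are invented:

```python
def quality_report(rows, required_fields):
    """Completeness check a data steward might run before
    releasing a dataset to business users."""
    issues = []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            issues.append((i, missing))
    completeness = 1 - len(issues) / len(rows)
    return completeness, issues

rows = [
    {"id": 1, "name": "Acme", "region": "EU"},
    {"id": 2, "name": "", "region": "US"},   # missing name
]
score, issues = quality_report(rows, ["id", "name", "region"])
print(score, issues)
```

In practice the steward’s checklist would cover validity, freshness and consistency too, but the principle is the same: the data is checked before the business user ever sees it.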

A strategy for data management with the right tools

To produce quality data at the level business users need, the data team must work with data running through a dedicated, managed data system. Technology-based data management solutions play a critical role in guiding an effective data-driven business forward. Aspects like data inventory, data cataloguing and data dictionaries are central to this. Without this focus, the job of the data team is near impossible, with a knock-on effect that impacts the entire organisation.
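At its simplest, a data dictionary entry is just structured metadata about a field. A minimal sketch – every name, owner and source system here is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class DictionaryEntry:
    name: str
    description: str
    owner: str          # the data steward accountable for this field
    source_system: str  # where the authoritative copy lives

# A tiny, illustrative data dictionary
data_dictionary = {
    "customer_ltv": DictionaryEntry(
        name="customer_ltv",
        description="Projected lifetime value in EUR, refreshed nightly",
        owner="finance-data-team",
        source_system="warehouse.sales.customer_metrics",
    ),
}

def describe(field):
    """What a business user sees when they ask what a field means."""
    entry = data_dictionary.get(field)
    return entry.description if entry else "undocumented: contact the data team"
```

Real catalogue tooling adds lineage, search and access control on top, but even this much removes the guesswork from the business user’s day.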

Data governance the golden thread

Of course, none of this can succeed unless data governance sits at the centre of everything data-related an organisation does. Policies, procedures and ways of working linked to data must be defined around data governance and compliance. And this must be driven from the top down, at C-suite level, for any data strategy to succeed.

While every business wants to be data-driven to benefit from an asset that is only growing in relevance, it is critically important not to let data cause chaos in an organisational structure. To avoid this, I believe a strong focus is needed on the points identified above – putting in place the technical solutions required to provide high-quality, correct and suitable data to the business user, while remaining enterprise-led and keeping governance top of mind. It is only when such points are visibly actioned within an organisation that it can really reap the benefits data has to offer.

To AI or not to AI – understanding the business case to see the benefits


I have not addressed the topic of AI in some time, but while recently doing some research around it, I came across a very relevant and interesting article. Naturally, my mind started working overtime wading through my thoughts and opinions, and so I decided to put them to paper and share some of my views.

There is no denying that AI is a very exciting topic, and one that is fast taking over the technology priority areas of any business on a structured digital transformation journey that puts the customer front and centre. In fact, the article I refer to above shares research highlighting that nearly all of the mid-to-high-level executives surveyed (99%) reported that at least one area of their company is currently utilising AI technology. Looking forward, 89% expect the use of AI technology across their company to increase over the next two to three years.

While this certainly shows a keen appetite for AI, which I agree rightly exists, the article also notes that some remain a little sceptical about making the move. Having experienced this myself, and having reviewed the great insights shared in this research piece, I wanted to give my take on some areas of AI that should be clearly defined, to allow a business to determine whether or not it should invest in AI.

Decision-making around the investment in AI

Although AI is a technology, and the technology division of the business therefore naturally tends to drive decisions about AI need and implementation, this should not necessarily be the case. A technology view alone could lead a business down a path that does not show a good return on investment (ROI), and AI can end up being perceived as a costly exercise.

Although the technology department of the business is the user of AI, the insights delivered by AI (and how these are utilised in business processes and business strategy) are managed by the business decision makers. And this is where the real business value lies.

As such, I believe a two-pronged approach should be taken to AI implementation in business – one that sees IT and the business working closely together to determine the need and use case for AI. The business decision makers alone cannot decide which technology to use (and how), and neither should IT decide which business problems, challenges or opportunities to tackle with AI. These matters should be discussed holistically, with decisions made on business objectives and purpose. Such an approach will yield better AI decisions, results and ROI for a business.

The critical role of data to AI success

Some businesses are still slightly nervous about introducing AI into their systems and processes, for the fear of AI failure and the possible lasting impact this can have on customer experience and subsequently brand reputation. The research noted above highlights that 89% of respondents agree that if an AI solution doesn’t work well, it could hurt customer experience.

A business that wants to avoid this must take into account the fundamental role business data plays in the AI process, its outcome and its overall success. Data and AI are closely linked: the quality of the data directly affects the usability and effectiveness of the AI insights. An appetite for AI should never surpass the readiness of a business to implement it, and if a business does not have its data house in order, that readiness is simply not there. In my experience, a bad AI experience within the business creates deep mistrust in information and is a very hard hurdle to overcome.
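That readiness test can be made concrete. A hedged sketch – the metric names and thresholds are assumptions, not any standard – gates an AI initiative on measured data quality:

```python
def ai_readiness(metrics, thresholds):
    """Return (ready, failing_dimensions): the business is ready for
    AI only when every measured quality dimension clears its bar."""
    failing = [dim for dim, bar in thresholds.items()
               if metrics.get(dim, 0.0) < bar]
    return (not failing), failing

# Hypothetical quality scores for the dataset feeding the AI project
metrics = {"completeness": 0.97, "freshness": 0.80, "validity": 0.99}
# Illustrative minimum bars – each business would set its own
bars = {"completeness": 0.95, "freshness": 0.90, "validity": 0.95}

ready, failing = ai_readiness(metrics, bars)
print(ready, failing)  # the stale data blocks the project, by design
```

The gate is deliberately blunt: it is cheaper to delay an AI project than to recover from the mistrust a failed one creates.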

AI can be a highly beneficial and critical tool for the modern digital business, helping it gain competitive advantage and build on strategic growth. However, it is not a plug-and-play technology, and the data component of its success must not be underestimated. Careful thought must be given to ensuring there is a clear need for AI, that it meets a business purpose and that it will produce a result the business requires. It is only through this understanding that a business can decide whether to AI or not.

Focus on the right trends for data success


In this post I want to share my personal views on some of the key areas identified by actual BI practitioners in response to the trends forecast for 2020. Hopefully it will help others break through the clutter that “trends jargon” can sometimes create, and get a better feel for where the focus on BI – and with that on data – should be placed.

A New Year has dawned, and what’s exciting about 2020 is that it rings in a new decade – a reminder of just how much the world has changed and evolved over the last 10 years, especially from a technology perspective.

Of course, for the first quarter of any New Year, most business decision makers are buried deep in deciphering the big trends that are set to influence the months ahead, as they try to capitalise on the latest technologies which can be used to shape up business processes and improve profit margins. The reality is that technology is changing at such a pace that if businesses don’t keep up with the trends and make sure they are understood, they could be left behind – fighting in a game of catchup or worse, risking becoming redundant.

It is only natural to see several reports linked to technology trends and predictions when catching up on industry reading. But one particular piece I recently came across, speaking specifically to trends in the Business Intelligence (BI) space, really fascinated me, as it homes in on the views of BI experts, consultants and vendors on some of the trending topics put forward to them. The piece also shows how trends in the BI space are changing, which is interesting to see.

While we continue to hear the likes of cloud and the impact of cloud BI being thrown around as an example, the industry piece shares critical views on what the BI practitioners actually think – and it is this insight that business professionals really need to know and can benefit from, to support their planning around successful technology investments in 2020 and beyond.

Data Quality

An avid follower of my blog will know how often I punt the importance of data quality to the success of any data-driven or BI strategy. Not at all to my surprise, data quality/management, data discovery/visualisation and data-driven culture are the top three areas that BI practitioners identify as the most important trends in their work. Not analytics, not agile BI, not real-time analytics and, interestingly, not big data analytics.

Any technology or capability that can improve the process of driving quality data forward should be a key focus area for businesses to leverage, ensuring they consistently develop forward-looking data strategies that produce real results. I am a firm believer that the key to data success lies in data quality, and it is reassuring to see that it remains a key focus area for BI professionals in 2020 – I myself hope to support more businesses in getting this right as the year progresses.

Data-driven culture

Establishing a data-driven culture has been ranked third in importance. It is easy to understand why, when a business is trying to reach a point where its data and the resulting outputs drive business success. What becomes important to grasp, however, is what it takes to truly become a data-driven business and how to build that culture properly. It goes beyond technology and enablement, and rather comes down to a focus on people – and this requires several considerations.

Getting this right is a strategy all on its own and a business must realise that this takes time and effort. For instance, it will not happen overnight or even over a few weeks, but rather over a period of time where the right and correctly thought-out measures to achieving a data driven culture have been initiated and substantiated in the business.

Self-service BI not a priority

While there is certainly merit in some of the self-service technologies, and the concept has rightfully gained traction over the years, I am also not surprised to see that the professionals do not rate self-service BI as a high-priority trend going forward.

Although businesses want every employee to be able to benefit from data, and to get to this state quickly (the perception often being that this supports a data-driven culture), my experience is that the self-service opportunity has ended up creating much confusion, which can lead to substandard results. The self-service idea needs to be clearly understood for a business to see true value. And if a business is only at the beginning of its data journey, or is still trying to establish a data-driven culture and quality-first data strategies, then self-service should not feature until later down the line, if at all.
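One common guardrail, sketched here with invented dataset names, is to expose only certified, quality-assured datasets to self-service users, routing everything else back through the data team:

```python
# Datasets the data team has quality-assured and signed off for
# self-service use (names are purely illustrative)
CERTIFIED = {"sales_mart", "customer_360"}

def resolve_dataset(name):
    """Resolve a self-service request to a certified location,
    refusing anything the data team has not signed off."""
    if name in CERTIFIED:
        return f"warehouse/certified/{name}"
    raise PermissionError(
        f"'{name}' is not certified for self-service; contact the data team"
    )

print(resolve_dataset("sales_mart"))
```

The mechanism is trivial; the discipline it encodes – self-service only on top of governed, quality-first data – is the hard part.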

2020 – and the next decade – will see more technology progression, and with that more trends coming to the fore. Data sits at the centre of this, so it is an exciting time for anyone involved with data or specialising in the data field. But it is also an exciting time for business. There is so much opportunity out there and so much to be gained – but the right focus is needed to see the real value.

Data is core to EPM


Something my avid readers may not yet know about me is that I have always had a particular interest in the world of Enterprise Performance Management (EPM) – in particular, the role of quality data within this space. In fact, I present a detailed lecture on ‘the role of Business Intelligence and Analytics in EPM’ as part of the Master of Business Analytics programme at a local university in Australia.

It is with this in mind that an article addressing the results of an Enterprise Performance Management Market Study recently caught my attention. It provides a solid outlook for the space and raises a few points of interest that sparked my thinking and, to my mind, further emphasise the very valuable role the traditional Business Intelligence (BI) process plays in achieving successful EPM. And so, I simply couldn’t resist sharing my views.

The piece notes that, according to the study, 38% of organisations already use EPM software, while a further 30% are either currently evaluating EPM software or may use it in the future.

Point of interest #1: Basing EPM off the back of solid BI

While valuable EPM software packages exist, in my experience, many organisations tend to run EPM using typical strategic performance management frameworks, such as Balanced Scorecard, implemented using ‘conventional’ BI dashboards, which obtain data from the enterprise Data Warehouse.

This way of implementing or running EPM is something I strongly advocate, as it offers a win-win situation – for the organisation’s management, as well as for its BI competency.

You see, if an organisation can successfully run its EPM off its data warehouse, using its BI tools, it means the data warehousing and BI teams are working together to give management that business-critical element – trust in the data and information resource. Likewise, doing so demonstrates a very good return on the investment made in BI.
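A Balanced-Scorecard-style dashboard built on the BI layer often reduces to a traffic-light rule per KPI, fed from the data warehouse. A minimal sketch – the 10% amber band and the KPI values are arbitrary assumptions:

```python
def scorecard_status(actual, target, amber_band=0.10):
    """Traffic-light a KPI the way a Balanced Scorecard dashboard
    typically would: green at or above target, amber within the
    band below target, red otherwise."""
    ratio = actual / target
    if ratio >= 1.0:
        return "green"
    if ratio >= 1.0 - amber_band:
        return "amber"
    return "red"

# e.g. a quarterly revenue KPI pulled from the enterprise data warehouse
print(scorecard_status(9.2, 10.0))   # within 10% of target
```

The value here is not the rule itself but its provenance: when the `actual` figure comes from the governed warehouse via the BI stack, management can trust the colour on the dashboard.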

Within my lecture, I emphasise these points by stating that:

  • Properly implemented EPM considers the critical role of quality data and can lead to a larger return on investment from the organisation’s BI technologies.
  • Synergies between EPM and BI can lead to the improved strategic performance of the organisation, to aid the organisation in achieving its set KPIs and objectives.

Point of interest #2: Technologies of the future

Operating in a digitally driven world that creates more data every day, it is only natural to wonder what impact emerging and disruptive technologies will have on the EPM process. And when we talk about the latest technology innovations, just as the article points out, Artificial Intelligence (AI) and Machine Learning (ML) come to mind.

When asked about the potential impact of these technologies to EPM, the article highlights that 29% of respondents see significant potential in AI and ML, while 21% feel that users will resist its adoption and 50% remain undecided.

But I see this stretching a little further, and in bringing it back to data (as a core component to successful EPM), also wonder how advancements in analytics will impact EPM – especially if it is being implemented off the back of the organisation’s BI solution.

The reality is that as analytics advance, many organisations have started using advanced forms of analytics, especially in planning, budgeting and forecasting. Doing so is allowing the organisation to achieve the following key criteria:

  • Create measures and assess soundness
  • Spot trends, identify relationships, calculate ratios and quantify variances
  • Model new operational metrics and raise alerts when conditions diverge
  • Communicate analytical results and impacts (e.g. through visualisation)
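Two of those criteria – quantifying variances and raising alerts when conditions diverge – can be sketched as follows (the line items and the 5% tolerance are invented for illustration):

```python
def variance_alerts(actuals, budget, tolerance=0.05):
    """Quantify each line item's variance against budget and
    raise an alert when it diverges beyond tolerance."""
    alerts = []
    for item, planned in budget.items():
        actual = actuals.get(item, 0.0)
        variance = (actual - planned) / planned
        if abs(variance) > tolerance:
            alerts.append((item, round(variance, 3)))
    return alerts

actuals = {"travel": 120.0, "salaries": 1010.0}
budget  = {"travel": 100.0, "salaries": 1000.0}
print(variance_alerts(actuals, budget))  # only travel breaches tolerance
```

In a real planning cycle the same logic would run across hundreds of line items, with the alerts surfacing in the EPM dashboard rather than a print statement.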

These outputs are all valuable to the broader EPM process and can add significant value to the business bottom line. But it doesn’t end here. In taking this even further, we are starting to see a discipline of analytics, called Performance Analytics, or Analytical Performance Management, emerge. This is focused on using sophisticated mathematical, statistical, or econometrics methods for understanding and exploring the dynamics of performance factors – and is simply ‘life changing’ to the standard EPM process.

In applying Analytical Performance Management, an organisation can:

  • Discover new or often hidden business dynamics at both strategic and/or executional levels
  • Deliver crucial information to drive decisions and actions within performance management
  • Effectively support the understanding, exploring and exploiting of business dynamics and opportunities
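The statistical end of this can start very simply. A hedged sketch of fitting a single performance driver with ordinary least squares – the humblest member of the toolbox Analytical Performance Management draws on; the spend and revenue numbers are made up:

```python
def fit_driver(x, y):
    """Ordinary least squares for one performance driver:
    returns (slope, intercept) of the best-fit line y = slope*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Hypothetical driver analysis: marketing spend vs. quarterly revenue
spend   = [10, 20, 30, 40]
revenue = [25, 45, 65, 85]
slope, intercept = fit_driver(spend, revenue)
print(slope, intercept)  # each unit of spend lifts revenue by the slope
```

Real performance analytics would use multiple drivers, proper econometric methods and significance testing, but even a single fitted slope turns “we think spend drives revenue” into a quantified, challengeable claim.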

Traditional EPM tools will always have a place in business, but as the world of technology innovates, so EPM should evolve to support an organisation in planning for today and well into the future. Data is the glue that holds successful EPM together – but it has to be managed and rolled out in the appropriate manner to achieve results that can move the organisation beyond great, towards exceptional.

Non-technical skills required by a data scientist


If we had to name one technology-related area, over and above Artificial Intelligence (AI) of course, that has seen the industry abuzz with excitement and progression, I am sure that, like me, for many of you the term Data Scientist comes to mind. However, in addition to all the analytical and statistical skills, a well-rounded data scientist needs some essential non-technical skills too.


Ethics are at the core of the CDO’s responsibilities


Operating in a digital era where data has become a tradeable commodity and quantifying the value of data a business-critical focus, the role of the Chief Data Officer (CDO) has come into the limelight sharply over the last few years.


It all comes back to Data Quality – part 2


In picking up the discussion from where I left off last month, before I introduce the fifth to seventh Cs of the 7 Cs of Data Quality by Melissa, I felt it important to emphasise a very pertinent point to the overall success of data to a business’s growth strategy.


It all comes back to Data Quality – part I


Data quality is a topic I have touched on a few times in the past, and so it will come as no surprise to you that I am focusing on this critical component of data, once again. While it was not initially my intention to discuss data quality for this monthly post, I simply could not resist it, having come across a whitepaper that speaks to The 7 Cs of Data Quality, by Melissa, that sparked my interest.


