
Data quality a priority for 2024

Despite the hype surrounding generative Artificial Intelligence (GenAI), much of my industry reading suggests that analysts are predicting data quality (one of my favourite topics) will remain a key priority this year, especially when it comes to data management and governance.

In a recent TechTarget article, Nick Kramer, leader of applied solutions at global consulting firm SSA & Company, predicts that AI adoption will be slower than the mainstream market now expects. If anything, he says, CIOs will focus on data management and governance. With unstructured data growing rapidly, curating high-quality data and establishing data literacy will be vital for organisations looking to transform into data-driven businesses.

For Kramer, an ‘innovative CIO’ is one who addresses poor-quality data before going the GenAI route. If not, the bad-quality insights and outcomes will simply arrive faster and in a more automated fashion, which will negatively impact governance.

On data quality

In a TDWI piece, Rex Ahlstrom (CTO and EVP of innovation and growth at Syniti) predicts that data quality will finally become an executive-level discussion point. All I can say is ‘finally!’

Ahlstrom writes that ‘even if you have found the data sets that will be appropriate for training an AI model or digging for insights, your results will be poor if the quality of your data is poor.’ He believes that the ownership and quality of data are still too often ignored.

I also find in the organisations I work with that data ownership is often very ill-defined. Nobody wants to take responsibility for owning the data, especially its quality. I often hear: “That’s IT’s problem”, which remains a challenging misconception, even despite advancements in business-wide data adoption. In truth, it is the business’ data, and the onus falls on the business owner at the head of the boardroom table to ensure that business processes and systems are in place to address data quality. Where those systems fall short of ensuring quality at the time of entry, the business should partner closely with IT or the relevant vendors to ensure the systems enforce a lens of quality on the business’ data.
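
As a minimal, purely illustrative sketch of what “quality at the time of entry” can mean in practice, the Python snippet below validates a record before it is accepted; the field names and rules are hypothetical, not any specific vendor's tooling.

```python
import re

# Hypothetical entry-time validation: flag or reject records that fail
# basic quality rules before they land in the system of record.
REQUIRED_FIELDS = {"customer_id", "email", "country_code"}
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_record(record: dict) -> list[str]:
    """Return a list of data quality issues found in a single record."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing required fields: {sorted(missing)}")
    email = record.get("email", "")
    if email and not EMAIL_PATTERN.match(email):
        issues.append(f"malformed email: {email!r}")
    country = record.get("country_code", "")
    if country and len(country) != 2:
        issues.append(f"country_code should be two letters, got {country!r}")
    return issues

# Usage: block the record, or route it to a data steward's queue.
record = {"customer_id": "C-1042", "email": "jane.doe@example", "country_code": "ZAF"}
problems = validate_record(record)
if problems:
    print("Rejected at entry:", problems)
```

The point is less the specific checks than where they sit: the business defines the rules, and the system enforces them before bad data can spread.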

Ahlstrom goes on to write that over the coming months, businesses must collect and document data, metadata, processes, and business rules in pursuit of data quality. Fortunately, many tools, systems, and frameworks can help in this regard; the challenge is that business owners shun that responsibility. But without these basic components in place, even AI models will be unable to produce accurate and meaningful insights.
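
One hypothetical way to keep documented business rules from gathering dust is to pair each rule's description with an executable check, as in the sketch below; the rule names and fields are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical catalogue of documented business rules, each paired with an
# executable check so the documentation and the enforcement cannot drift apart.
@dataclass
class BusinessRule:
    name: str
    description: str  # human-readable documentation of the rule
    check: Callable[[dict], bool]

RULES = [
    BusinessRule(
        name="order_total_non_negative",
        description="An order's total amount must be zero or positive.",
        check=lambda row: row.get("total", 0) >= 0,
    ),
    BusinessRule(
        name="ship_after_order",
        description="Shipping date must not precede the order date.",
        check=lambda row: row.get("ship_date", "") >= row.get("order_date", ""),
    ),
]

def audit(rows: list[dict]) -> dict[str, int]:
    """Count violations per rule across a batch of rows."""
    return {rule.name: sum(not rule.check(r) for r in rows) for rule in RULES}

print(audit([{"total": -5, "order_date": "2024-01-10", "ship_date": "2024-01-08"}]))
```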

What’s ahead

In another TDWI article, David Stodder (a longtime TDWI analyst) reckons that ‘the whole data integration field is under tremendous pressure from applications designed to run 24/7, in addition to trying to support AI, machine learning, and generative AI applications.’

He believes that even though many companies have embraced the flexibility of the cloud, aspects such as data governance, data quality, and metadata management have suffered. AI and automation are being incorporated into data integration tools and platforms, but this alone does not address the underlying data quality concerns. We can never neglect data quality, as it feeds directly into the quality of the insights derived; we should use that same technology to improve data quality as well.

GenAI, data engineering, and architecture

In the same TechTarget article mentioned earlier, Tilak Doddapaneni, executive vice president and global head of engineering at Publicis Sapient, predicts that 60% to 70% of the upcoming effort around GenAI will revolve around data engineering. According to Doddapaneni, CIOs need to put a core data architecture in place and develop processes for formatting data appropriately.
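
Much of that data engineering effort is unglamorous formatting work. As a hypothetical sketch (the field names and formats are invented), normalising raw source records into one canonical shape before they reach a model or pipeline might look like this:

```python
from datetime import datetime

# Hypothetical normalisation step: different source systems deliver the same
# entity with different field names, casings, and date formats; a core data
# architecture settles on one canonical shape before anything downstream runs.
CANONICAL_DATE = "%Y-%m-%d"

def normalise(raw: dict) -> dict:
    """Map a raw source record into a canonical customer shape."""
    name = (raw.get("customer_name") or raw.get("CUST_NM") or "").strip().title()
    raw_date = raw.get("signup_date") or raw.get("SIGNUP_DT") or ""
    signup = None
    for fmt in ("%Y-%m-%d", "%d/%m/%Y", "%m-%d-%Y"):
        try:
            signup = datetime.strptime(raw_date, fmt).strftime(CANONICAL_DATE)
            break
        except ValueError:
            continue
    return {"name": name, "signup_date": signup}

print(normalise({"CUST_NM": "  jane DOE ", "SIGNUP_DT": "15/02/2024"}))
```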

Ultimately, any successful data project requires the buy-in of the executive leadership. According to Stodder, ‘Technology is changing so fast, it’s critical to think about the leadership issues that might arise. Given that, it’s essential to always be thinking about what the outcomes are that you’re trying to achieve and how they relate to the business.’

This should hardly come as a surprise given that the data industry has been preaching this for years. Now is the time for executives to wake up and smell the proverbial data.
