IIoT Power

How Utilities Can Better Manage and Maintain the Quality of Their Data Assets

Utilities are becoming increasingly skilled at adapting to changes brought on by the digital age: pressure from automation, disruption from new technology, and challenges with how to ingest, manage, and utilize mountains of data. Viewing “data as an asset” is becoming the new industry norm, and utilities are beginning to invest heavily in digital tools and technologies that help them leverage their data to generate valuable business insights. Unfortunately, the value generated from that data is often limited by poor data quality, greatly reducing the return on investment to the business. With the growing need to integrate data across different systems—both internal and external—and the expanding enterprise value chain, the need for trusted, high-quality data is greater than ever before.

How can utilities manage the data quality of key enterprise data assets, and put proper controls in place to monitor data quality and deliver meaningful value to the business? They must set up a rigorous, end-to-end data quality management solution that is business-driven, sustainable, and focused on moving from insight to action to maximize value. Business and IT stakeholders must collaborate in developing this solution: one that is quantitative, focuses on the business impacts of data quality issues, works to determine the root causes behind those issues, and uses that insight to drive action through a well-defined remediation plan. Data quality efforts should prioritize high-value data assets, those where issue remediation will drive fundamental change to critical datasets and generate value for the business units that own them. Examples include datasets that feed multiple downstream systems and assets subject to compliance and regulatory requirements.

Adopt an Adaptive, Quantitative Approach to Data Quality

A quantitative approach will not only help establish a baseline for current data quality, but will also help showcase how it improves over time. In addition, utilities can set enterprise-wide data quality targets based on the criticality of data, and data owners can make meaningful investments to meet those targets. We have developed a unique data quality scoring system called the Data Quality Index (DQI)—a score determined by how well the data performs across seven data quality dimensions: accuracy, completeness, consistency, uniqueness, validity, integrity, and timeliness, collectively referred to as ACCUVIT.

To assess data across these dimensions, a series of business rules is developed that specifies the exact criteria against which each data record is tested. Different weightings can be applied to the business rules based on their criticality when calculating the DQI for a dataset. With this system, utilities can incorporate DQI scores for their asset and work order datasets into their enterprise key performance indicator (KPI) metrics.
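To make the idea concrete, the minimal Python sketch below scores a toy asset dataset as a weighted share of passing rule checks. The rule definitions, weights, record fields, and the dqi() function are illustrative assumptions for this article, not the authors' actual DQI formula.

```python
from dataclasses import dataclass
from typing import Callable

# Illustrative sketch only: the rules, weights, and aggregation below are
# assumptions, not the actual DQI calculation described in the article.

@dataclass
class BusinessRule:
    name: str                       # e.g., "manufacturer is populated"
    dimension: str                  # one of the seven ACCUVIT dimensions
    weight: float                   # heavier weight for more critical rules
    check: Callable[[dict], bool]   # returns True when a record passes

def dqi(records: list[dict], rules: list[BusinessRule]) -> float:
    """Weighted share of rule checks that pass, expressed on a 0-100 scale."""
    total = sum(r.weight for r in rules) * len(records)
    passed = sum(r.weight for rec in records for r in rules if r.check(rec))
    return 100.0 * passed / total if total else 0.0

# Two hypothetical rules applied to a toy asset register extract.
rules = [
    BusinessRule("manufacturer populated", "completeness", 2.0,
                 lambda rec: bool(rec.get("manufacturer"))),
    BusinessRule("install year is plausible", "validity", 1.0,
                 lambda rec: 1950 <= rec.get("install_year", 0) <= 2024),
]
assets = [
    {"manufacturer": "ACME", "install_year": 2019},
    {"manufacturer": "", "install_year": 2030},
]
print(f"DQI = {dqi(assets, rules):.1f}")  # 50.0 for this toy data
```

In practice, such a score could also be rolled up by ACCUVIT dimension or by site, with the weights agreed upon with the business owners of each dataset.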

Focus on Moving from Insight to Action

Key to delivering business value from data quality initiatives is the focus on turning insight into action. Understanding the real business impact of identified data quality issues will not only help prioritize the resolution of those issues, but also ensure that the time and effort invested deliver the most value. Similarly, identifying the real root cause or causes (such as training needs, process gaps, or technology design issues) and taking appropriate measures beyond just fixing data errors are vital to ensuring a sustainable, long-term resolution.

Enable a Sustainable, End-to-End Data Quality Management Solution

Data quality management must be end-to-end: covering all critical datasets across the enterprise, spanning everything from initial assessment to ongoing remediation, and involving diverse stakeholders from business and IT. Only with a holistic approach to managing “data as an asset” can an enterprise sustain its data quality. To help utilities with this journey, we have outlined a four-step process for developing a robust, end-to-end data quality monitoring framework. The steps are:

  • Assess. Gain an initial understanding of the dataset environment (people, process, and technology) and profile the dataset itself to understand the data and the factors that affect it. This is followed by deep-dive sessions with business and IT stakeholders to define specific business rules across the seven data quality dimensions, assess overall data quality, and determine the baseline DQI score.
  • Visualize. Analyze the data quality performance of any dataset, drill down into the quality metrics, and identify the real problem areas. For example, with the data quality visualizer tool, utility asset managers can view their asset data and compare the DQI scores for their site against those of all other sites across the enterprise, diagnosing specific areas for improvement.
  • Remediate. As data quality issues are quantified, those failing data records must be fixed and the underlying issues rectified. In the example above, asset managers can see the outstanding data quality issues with their assets and track the remediation progress over time.
  • Continuous Monitoring. Data quality management is not a one-off exercise of assessing the data and remediating the identified issues. Instead, it is an ongoing process that must be monitored over time to improve and sustain the quality of critical data assets. As remediation cycles are repeated, the quality of the data improves. In fact, continuous monitoring and a sustained emphasis on the importance of data often lead to fewer data entry errors, and therefore better data quality (and fewer issues) to begin with. A minimal sketch of such a periodic check appears after this list.
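As a hedged illustration of the monitoring step, the short sketch below compares a site's scheduled DQI runs against an assumed enterprise target and reports the trend. The 95.0 target, the site name, and the scores are made-up numbers, and this check is only one possible way to operationalize continuous monitoring.

```python
# Illustrative continuous-monitoring check; the target, site name, and
# scores are assumptions, not real utility figures.

DQI_TARGET = 95.0  # assumed enterprise-wide target for a critical dataset

def review_run(site: str, history: list[float]) -> None:
    """Compare the latest scheduled DQI score against the target and the trend."""
    latest = history[-1]
    trend = latest - history[0] if len(history) > 1 else 0.0
    status = ("meets target" if latest >= DQI_TARGET
              else "below target; keep remediating")
    print(f"{site}: DQI {latest:.1f} ({trend:+.1f} since first run), {status}")

# Monthly scores for one site as remediation progresses (illustrative numbers).
review_run("Site A asset data", [82.4, 88.1, 93.5, 96.2])
```

Rescoring the same dataset on a fixed schedule is what makes the trend meaningful: a rising DQI shows remediation is working, while a drop flags new issues before they spread downstream.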

Utilities should treat their data much like they treat their physical (electrical/gas/water) assets, with a proper set of procedures and processes in place to manage and maintain its quality. Often, the opportunities for improving poor data quality at a utility go unrealized because a meaningful assessment has never been performed. While a variety of software offerings can help assess data maturity and quality, to be successful, utilities must manage data holistically, leveraging a range of tools and methods to best address data quality. Finally, communicating the value of high data quality across the business, and then taking steps to improve it, is essential to driving utilities toward being more digital and enabling a data-centric culture.

Ajay Jawahar, Spencer Borison, and Siddhant Pasari are energy and utilities experts at PA Consulting.
