*As seen in Inspectioneering Journal’s January/February 2023 issue.*
Back in November, Inspectioneering and Pinnacle had the privilege of co-hosting our 10th “Meeting of the Minds” (MOTM) roundtable discussion, this time in one of my favorite cities in the world: New Orleans. This biannual meeting has consistently brought together a select group of leading mechanical integrity (MI) experts to discuss pertinent topics related to fixed equipment reliability and to share their personal experiences and opinions. As with previous meetings, participants came from various sectors of the industry, including oil refining, petrochemicals, offshore production, and chemical processing.
Over the years, we’ve always tried to share key takeaways from these meetings with our readers because we believe the insights shared could greatly benefit the industry at large. Previous recap articles have summarized discussions on corrosion under insulation (CUI) programs, emerging inspection technologies, integrity operating windows (IOWs), corrosion control documents (CCDs), risk-based inspection (RBI), mechanical integrity project hit lists, and most recently, data collection and analysis.
This discussion focused on data validation and was prompted by the following observation and question: at no point in history have we had access to more data about our assets than we do right now, but is it really helping us, or are we simply “data rich, information poor”? For over an hour, the participants openly discussed the importance of having clear and consistent definitions of what constitutes “good data,” as well as effective processes for identifying and addressing data that does not meet those standards. They also discussed the benefits of using automated systems for data validation and the challenges of working with large volumes of data. Finally, the participants emphasized the importance of collaboration and knowledge sharing among industry professionals to drive continuous improvement and advancement in a rapidly changing industry.
Clear and Consistent Definitions of Good Data
Data is a general term used to describe the myriad types of information being gathered, organized, analyzed, and used to make critical decisions in your facilities. But not all data is “good data.” During the discussion, one participant emphasized the importance of having clear and consistent definitions of what “good data” looks like. He noted that without a solid understanding of what constitutes valid data, organizations can struggle to use the data they have effectively. This can lead to confusion and inconsistency in how the data is leveraged, and can ultimately result in incorrect or unreliable decisions.
Clear and consistent definitions of valid data are critical to ensuring that the data is accurate and reliable. By defining specific criteria that data must meet in order to be considered valid, organizations can better ensure that the data they are using is of high quality. Those criteria can include acceptable tolerances for measurement error, acceptable sources of data, and protocols for verifying the data’s accuracy. Clear definitions also help ensure that bad data is properly screened out: by establishing clear guidelines for how to handle data that does not meet the criteria, organizations can improve the accuracy and reliability of the data they use, and ultimately make better, data-driven reliability decisions.
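To make the idea concrete, here is a minimal sketch in Python of what codified validity criteria might look like for a single thickness reading. The record fields, approved sources, and tolerance are entirely hypothetical illustrations, not drawn from the discussion or from any standard.

```python
from dataclasses import dataclass

# Hypothetical list of acceptable measurement sources.
APPROVED_SOURCES = {"UT-handheld", "UT-automated", "RT-profile"}

@dataclass
class ThicknessReading:
    equipment_id: str
    location: str      # condition monitoring location (CML) identifier
    value_in: float    # measured thickness, inches
    source: str        # how the measurement was taken
    verified: bool     # has a second party confirmed the entry?

def is_valid(r: ThicknessReading, nominal_in: float,
             tolerance: float = 0.25) -> bool:
    """A reading is valid only if it meets every defined criterion."""
    return (
        r.source in APPROVED_SOURCES                         # acceptable source
        and 0.0 < r.value_in <= nominal_in * (1 + tolerance)  # plausible magnitude
        and r.verified                                        # accuracy confirmed
    )
```

The value of writing criteria down this way is that “valid” stops being a judgment call: every reading is screened against the same source list, tolerance, and verification requirement.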
Automated Systems for Data Validation
The participants also discussed the benefits of using automated systems for data validation. Automated systems help ensure consistency and reduce the potential for human error because they apply the same criteria and rules to all data, regardless of who collected it or how it was entered. They can also be programmed to check the data for completeness, consistency, and accuracy, and to alert users to potential issues that need to be addressed. This helps identify and correct errors before they impact decision-making, and ultimately improves the quality of the data.
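As a rough illustration of how such automated checks work, the sketch below applies the same completeness, accuracy, and consistency rules to every record in a batch and returns alerts for a user to review. The field names, plausibility range, and repeat-reading tolerance are assumptions made for the example, not any particular vendor’s rules.

```python
def validate_records(records: list[dict]) -> list[tuple[int, str]]:
    """Apply identical checks to every record; return (index, issue) alerts."""
    required = {"equipment_id", "cml", "thickness_in", "date"}
    issues = []
    seen = {}  # (equipment_id, cml, date) -> thickness, for consistency checks
    for i, rec in enumerate(records):
        missing = required - rec.keys()
        if missing:  # completeness: every required field present
            issues.append((i, f"missing fields: {sorted(missing)}"))
            continue
        t = rec["thickness_in"]
        if not (0.0 < t < 10.0):  # accuracy: physically plausible value
            issues.append((i, f"implausible thickness {t}"))
        key = (rec["equipment_id"], rec["cml"], rec["date"])
        if key in seen and abs(seen[key] - t) > 0.010:  # consistency: repeats agree
            issues.append((i, f"conflicts with earlier reading {seen[key]}"))
        seen[key] = t
    return issues
```

Because the rules live in code rather than in individual inspectors’ heads, every record is screened identically no matter who entered it, which is exactly the consistency benefit the participants described.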
While today’s software and automated systems can do a lot of the heavy lifting, there is still a need for qualified personnel who can digest incoming data and make decisions based on it, especially now that facilities are having to marry existing conventional data with emerging technologies like photoimagery, drones, and all sorts of other new, unconventional data sources. One participant admitted that his inspection and reliability team has found itself “woefully underprepared” to review, and make decisions based on, these new and emerging datasets.
Challenges of Working with Large Volumes of Data
The participants also discussed the challenges of working with large volumes of data, noting that managing and organizing large volumes of data can be complex and time-consuming, and that organizations need to have effective systems in place to ensure that the data is used effectively. Many facilities are collecting billions of data points, but their management systems are often not mature enough to fully leverage the data they have or identify the ROI they’re getting. “An important consideration is forethought,” said one participant. “For a lot of the data being collected, there isn’t enough forethought for how it is going to be used. How are we using it? How are we going to be using it in the future? And is it even needed?”
One of the key challenges of working with large volumes of data is the need for robust data management systems. These systems need to be able to handle the complex relationships between different types of data, and need to provide users with the ability to quickly and easily find the information they need. This can require sophisticated systems that can support the needs of different users across the organization, and can provide the right level of access and control to ensure that the data is used effectively.
Another challenge of working with large volumes of data is the need for clear protocols for data entry and verification. To ensure that quality data can be leveraged by all interested parties at your site, it is important to have clear procedures in place for all aspects of the data collection process, including the training and supervision of personnel, the use of standardized measurement instruments, and the procedures for verifying the accuracy of the data. This can help to ensure that the data is properly collected and logged into your management system, and can ultimately improve the quality of the data long term.
Data overload is also causing a great deal of quality data to be lost or left behind, but it’s important to remember that “old data can be used to ensure good decisions are still being made,” said one participant. “It’s not a one-time thing. We’re constantly trying to find clever ways to slice and dice our data using statistical analysis to see if we can wring any more value out of it,” another added.
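One simple example of wringing more value out of archived data is fitting a corrosion rate to historical thickness readings with ordinary least squares. The sketch below is illustrative only; the data points are invented, and the participants did not describe any specific method.

```python
def corrosion_rate(years: list[float], thickness_in: list[float]) -> float:
    """Return metal-loss rate (inches/year) from a least-squares line fit."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(thickness_in) / n
    slope = (sum((x - mean_x) * (y - mean_y)
                 for x, y in zip(years, thickness_in))
             / sum((x - mean_x) ** 2 for x in years))
    return -slope  # thickness decreases over time, so negate the slope

# Hypothetical readings from a decade of inspection history:
rate = corrosion_rate([0, 3, 6, 10], [0.500, 0.488, 0.479, 0.460])  # ~0.0039 in/yr
```

A fitted rate like this can then be projected forward to estimate remaining life, turning data that was “sitting on the shelf” back into a reliability input.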
Speaking on data “sitting on the shelf” with untapped potential, one participant said he thinks “the data we have in our RBI systems is underutilized. Our organizations tend to look at it as just the inputs to get the output of the inspection plan, not to better understand the condition of the individual pieces of equipment, and what’s driving those risks, and what can be done about it from a reliability standpoint, a project standpoint, and a budgeting standpoint. You could even use that same information to benchmark your facilities against each other. There’s just so much we can do with that information, but right now it seems like it’s just being primarily used to drive compliance tasks.”
Collaboration and Knowledge Sharing
During the discussion, the participants emphasized the importance of collaboration and sharing knowledge among industry professionals to drive continuous improvement and advancement in a rapidly changing industry. They discussed the benefits of sharing data and best practices within the organization and of networking with other industry professionals to learn from their experiences and share insights.
One of the key benefits of collaboration and sharing knowledge among industry professionals is that it can help organizations to learn from each other and improve their processes and systems. By sharing data and best practices, organizations can learn from the experiences of others, and can identify opportunities for improvement in their own operations. This can help to drive continuous improvement and innovation, and can ultimately help organizations continue to advance and grow.
Networking with industry peers can also provide valuable opportunities to learn from others and share knowledge and experiences. By attending conferences and workshops and participating in industry forums and online communities, organizations can learn about the latest trends and developments in the industry, and can gain (and give) valuable insight into the challenges and opportunities facing the industry. This can help organizations not only stay up to date with new strategies and technologies, but can also help influence the development of codes and standards that are fair and reasonable for all stakeholders, especially as it relates to governance on data collection and transmission practices.
This MOTM discussion emphasized the importance of data validation in the context of mechanical integrity in the oil and gas industry. The participants discussed the need for clear and consistent definitions of good data, the benefits of using automated systems for data validation, and the challenges of working with large volumes of data. They also emphasized the importance of collaboration and sharing knowledge among industry professionals to drive continuous improvement and advance the industry.
Everyone acknowledged that data validation is a critical component of ensuring the accuracy and reliability of the data that organizations use to drive decision-making. By implementing effective data validation processes and sharing knowledge and best practices, organizations and the industry at large can continue to improve and advance.
Inspectioneering and Pinnacle would like to thank all of the participants for sharing their insights and experiences. We sincerely appreciate your participation in these discussions and your dedication to educating and advancing the Inspectioneering community.