The 4M Approach to a Connected Plant

Technology is being incorporated into operations and maintenance schemes at an ever-increasing pace. New digital tools are helping optimize assets, improve performance, and reduce overall costs.

Big Data Best Practices

The Connected Plant Conference was held in Charlotte, N.C., in late February. The event featured presentations on some of the latest digital tools being utilized in both the power generation and chemical processing industries. Case studies were presented by several end-users explaining how technology has been implemented by their companies, the challenges they faced, and how problems were overcome. They detailed the benefits realized through technological initiatives and offered tips for others interested in going down a similar path.

The “4M” approach was a recurring theme during the event. If you haven’t heard of 4M, it stands for “Make Me More Money.” Craig Harclerode, business development executive with OSIsoft, was the first person I heard use the term during the conference. He suggested that a lot of companies try to implement digitization initiatives without having clear objectives. An edict may come down from above that the company wants to “Do Big Data … Do the cloud,” but there is no business driver identified.

“The bottom line: It’s about business,” Harclerode said during a presentation. “It should be led with a business problem, and it should be focused on making more money. Improving EBITDA [earnings before interest, taxes, depreciation, and amortization]. Improving yields. Improving safety.”

Once a company identifies a business problem and a corresponding digital solution, executing the best implementation strategy isn’t automatic. The right people need to be in the right roles to maximize benefits. Harclerode said the companies he sees getting the most value from digitization “evolve an OT [operational technology] chart of accounts, leveraging configurable, smart, digital twin templates, owned and managed by the subject matter experts, not IT [information technology].” He said IT plays a role, but the subject matter experts are really the people creating worth by configuring and managing the digital twins.

Data Lake or Data Swamp?

Once data starts rolling in, some new problems arise. Many leading organizations have shifted data storage to a “data lake” architecture. A data lake is a storage repository that keeps raw data in its native format. It uses a flat architecture, keeping data readily available to be categorized, processed, analyzed, and consumed by diverse groups within an organization.

That’s all fine and dandy, but Harclerode noted, “Operational data—time-series data—is not clean. There’s massive amounts of it; terabytes generated every day. You’ve got anomalies. You’ve got data quality issues. You’ve got missing sensors. You’ve got different scan frequencies. [There are] a whole host of anomalies that are very difficult to solve in a traditional data lake environment.”

This means companies end up with something more like a “data swamp,” Harclerode said. The problem is not insurmountable, but it can require fit-for-design technology to essentially create a data lake hybrid. Leveraging the technology properly, however, can pay dividends. It allows users to take linear-time events and perform layers of analytics—prescriptive, empirical, physics-based, and streaming—in an infrastructure augmented by higher-level analytics such as advanced pattern recognition, machine learning, geospatial dashboards, and multidimensional assessments. Finally, the tools provide a method of visualizing the data appropriately, so action can be taken.
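To make the data-quality problems Harclerode describes concrete, here is a minimal sketch (not from his presentation) of what “different scan frequencies, missing sensors, and anomalies” look like in practice, using pandas with two hypothetical sensor streams:

```python
# Illustrative only: two made-up sensor streams showing the raw
# time-series issues described above — mismatched scan rates, a
# missing reading, and an anomalous spike.
import numpy as np
import pandas as pd

# A fast sensor scanned every 10 seconds; 55.0 is an obvious spike.
fast = pd.Series(
    [10.0, 10.2, 9.9, 55.0, 10.1, 10.0],
    index=pd.date_range("2018-03-01", periods=6, freq="10s"),
    name="temp_fast",
)

# A slower sensor scanned every 30 seconds, with one missing reading.
slow = pd.Series(
    [100.0, np.nan, 102.0],
    index=pd.date_range("2018-03-01", periods=3, freq="30s"),
    name="pressure_slow",
)

# Align both streams on a common 10-second grid, then forward-fill
# the slower sensor so each row carries its last known value.
grid = pd.concat([fast, slow], axis=1).resample("10s").mean()
grid["pressure_slow"] = grid["pressure_slow"].ffill()

# Flag readings far from the median (using the median absolute
# deviation, which a single spike cannot inflate) instead of
# silently trusting them.
dev = (grid["temp_fast"] - grid["temp_fast"].median()).abs()
grid["temp_anomaly"] = dev > 5 * dev.median()
```

Even this toy example shows why raw dumps into a data lake fall short: without alignment, filling, and anomaly flagging, downstream analytics would average a spike into a trend and treat a dead sensor as a zero.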

Harclerode offered MOL Group, a Hungary-based oil and gas conglomerate, as “a poster child for layers of analytics.” He cited data presented by Tibor Komroczki, MOL’s head of process information and automation, during a 2016 OSIsoft Users Conference in Berlin, Germany. Komroczki showed how MOL increased EBITDA by $1 billion through digital initiatives enacted over a four-year period. “They truly moved to an event-based operational mode, and they have layers of analytics,” Harclerode said.

“[MOL] first and foremost focuses on human analytics, enabled by—and in—the OT infrastructure. Data-based decisions, real-time situational awareness, management by exception, driving a cultural change across their organization,” Harclerode continued. “They have over 27 tactical machine learning applications running today—probably have even more—and they’re not just running for a day or two, they’re actually institutionalized. And they’re looking at now going into the higher-level analytics.”

Protecting Your Data

Eitan Goldstein, industrial cyber and digital security director with Siemens, also spoke during the event. He said the digitalization process is not revolutionary, it’s evolutionary.

“I think it’s about small incremental steps, to get more out of what you have, to make smarter, faster decisions,” Goldstein said.

But 4M wasn’t lost on him either. Goldstein suggested digital technologies can help optimize assets, improve fleet performance, reduce maintenance costs, and minimize downtime. “It’s not just an act, but it is rather concrete cost savings,” Goldstein said.

Yet, Goldstein’s primary focus is on cybersecurity. Like Harclerode, Goldstein said most companies today are “drowning in data.” They’re dealing with thousands of cyber alerts, but what they don’t have is context. “Understanding what that data means is a huge challenge,” he said.

To get context, Goldstein said companies should “insist on dedicated, purpose-built, industrial cybersecurity solutions.” He also said clear ownership must be established for OT cybersecurity. Other best practices include using connectivity to gain insight, securing the edge, increasing transparency to improve visibility, and leveraging analytics.

“We can’t be having people walk around power plants with an Excel spreadsheet trying to keep track of everything you have,” Goldstein said. “The bad guys are too sophisticated.”

“Data is a strategic asset, and companies need to treat their data—especially their OT data—as a strategic asset with the proper governance, proper environment, and proper leverage,” Harclerode said. ■

Aaron Larson is POWER’s executive editor.