Why is Operational Technology (OT) data continuing to make major inroads into the Information Technology (IT) world? What factors are causing OT data to leave its comfort zone and step into a new frontier? And why is this step imperative for the next generation of industrial automation?
According to Moxa, the reasons OT data is leaving its safe nest are closely tied to today's operational needs and to how we're using OT data. To better understand, let's trace the evolution of OT data, starting with Industry 2.0.
This era saw the introduction of electricity and assembly-line production, which prompted the invention of controllers to better manage devices on the factory floor as mass production gained traction. Using simple on-and-off switches, these controllers helped complete simple routine tasks. Back then, data was scarce and stayed within the confines of the devices that generated it. In its infancy, OT data of the Industry 2.0 era was limited to activities within its own devices.
The third Industrial Revolution introduced system automation as the new standard. Advancements in computing and communication technologies enabled automated processes to connect and "talk to each other." This was achieved by adding sensors to the original controllers, creating a closed loop in which sensors gathered data and sent it back to the controller for preliminary calculations. The resulting feedback helped complete the command. This signified OT data's first step from existing solely within devices to circulating among a few devices in a closed loop.
Along with further technological advancements, the evolution of OT data was inevitable. With the advent of Industry 4.0, or Industrial Digital Transformation, managers are no longer satisfied with only preprogrammed automation. They now demand a smart, self-thinking system, pushing mandates for OT data to the next level. So, the big question now is how to maximize OT data’s true value.
To achieve this feat, OT data has had to extend beyond its original dwellings and step into data centers or the cloud for further analysis, a move that can improve production efficiency, enhance quality, reduce costs, and even enable new business services. This kick-started a phenomenon that turned protected, localized OT data into usable data by transmitting it to remote IT systems and feeding the results back to the OT side for real-time optimization. This phenomenon is what Moxa refers to as the Everlasting Data Stream, in which data continuously circulates between OT and IT, forming a lasting loop.
As an example, let's look at an auto parts manufacturer Moxa has worked with. This factory had "bottleneck equipment," meaning every item had to pass through that specific equipment during the manufacturing process. Therefore, if the equipment stopped, so did the entire production line, making maintenance of this equipment imperative. Considering the high stakes, the manufacturer wanted to perform predictive maintenance on it to anticipate which parts might wear out and stock up in advance, avoiding production stalls due to shortages.
In the end, the manufacturer installed additional sensors on the bottleneck equipment. The necessary OT data could then be gathered in real time and sent to the cloud for analysis. This Industry 4.0 solution gave management insight into the status of every part, complete with the in-depth predictive analysis needed to take further action.
Is OT Data Ready to Swim in the IT Ocean?
While moving data from an OT domain to an IT domain seems to paint a rose-colored world, a few considerations are required before diving headfirst into the deep end. Moxa proposes these three things to consider before taking this leap:
1. Data Interoperability
The first and most common challenge encountered when OT data moves beyond its original confines is the interoperability of OT/IT data. Since the communication protocols used in the OT field are often vastly different from those used in IT, systems often run into communication errors.
For example, when the OT data output shows a single "5" with no context, IT will never know that the number 5 represents "machine speed," thus stalling the lines of communication. For the OT/IT data cycle to materialize, and for data to move freely between the two domains, OT data must be preprocessed.
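The kind of preprocessing described above can be sketched in a few lines: wrapping a bare reading with the metadata (tag name, unit, timestamp) an IT system needs to interpret it. The register addresses, tag names, and units below are hypothetical examples, not part of any real deployment.

```python
# Minimal sketch: turn a context-free OT value (e.g. a bare "5") into a
# self-describing record that an IT system can interpret.
# The TAG_MAP entries are illustrative assumptions.
import json
from datetime import datetime, timezone

# Hypothetical mapping from a raw register address to its meaning.
TAG_MAP = {
    40001: {"name": "machine_speed", "unit": "rpm"},
    40002: {"name": "spindle_temperature", "unit": "degC"},
}

def contextualize(register: int, raw_value: float) -> str:
    """Wrap a bare OT value with the metadata IT needs to interpret it."""
    tag = TAG_MAP[register]
    record = {
        "tag": tag["name"],
        "value": raw_value,
        "unit": tag["unit"],
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(record)

# A bare "5" becomes a self-describing machine_speed reading.
payload = contextualize(40001, 5)
```

In practice this translation layer often lives in an edge gateway that speaks the OT protocol on one side and emits structured messages (JSON, MQTT payloads, and so on) on the other.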
2. Data Integrity
The second challenge is whether OT data can be transmitted in its entirety via connectivity technologies. This is especially challenging because of the many disturbances found in OT environments, such as electromagnetic interference generated when devices start up, extreme temperatures, harsh environments, and even mutual interference between the control network and the OT data network, all of which may cause communication disconnections or instability.
Disconnections may cause incomplete or dropped transmissions, resulting in erroneous analysis and, by extension, erroneous decision-making. In cases like this, it is important to strengthen the "resilience of data transmission" to ensure that data is promptly transmitted in its entirety.
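One common way to build that resilience is a store-and-forward buffer: readings are queued locally and flushed upstream only when the link is up, so an outage delays data rather than losing it. The sketch below is a minimal illustration; the `send` callback is a stand-in for whatever real transport (MQTT, HTTPS, etc.) a deployment uses.

```python
# Minimal store-and-forward sketch: buffer readings locally and retry
# oldest-first, so a dropped connection never loses data.
from collections import deque

class ResilientSender:
    def __init__(self, send):
        self.send = send          # callable; returns True on success
        self.buffer = deque()     # unsent readings survive outages

    def publish(self, reading):
        self.buffer.append(reading)
        self.flush()

    def flush(self):
        # Retry oldest-first; stop at the first failure, keep the rest.
        while self.buffer:
            if not self.send(self.buffer[0]):
                return False      # link still down; data stays buffered
            self.buffer.popleft()
        return True

# Simulate an outage: the first two send attempts fail, then the link
# recovers and the backlog drains in order.
attempts = {"count": 0}
def flaky_send(reading):
    attempts["count"] += 1
    return attempts["count"] > 2  # fails twice, then succeeds

sender = ResilientSender(flaky_send)
sender.publish("reading-1")   # link down: stays buffered
sender.publish("reading-2")   # link down: stays buffered
drained = sender.flush()      # link back up: backlog drains
```

Real implementations typically persist the buffer to disk as well, so data also survives a power cycle of the edge device.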
3. Data Security
Finally, the more valuable OT data becomes, the more important it is to secure its transmission. Cybersecurity was not much of an issue in the past, since the physical barriers of a factory were most likely enough to protect the data contained within its walls. However, now that we connect OT data to IT systems or to the cloud, its old protective shells no longer work. Therefore, strengthening "OT network security" will become a required focal point for enterprises.
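One small piece of that security picture is making sure transmitted OT data cannot be tampered with undetected. The sketch below, using only Python's standard library, signs each payload with an HMAC so the receiving side can verify its authenticity. The shared key and payload are illustrative; in a real deployment this would complement, not replace, transport encryption such as TLS.

```python
# Minimal sketch: authenticate OT payloads in transit with an HMAC.
# The pre-shared key here is a placeholder for illustration only.
import hmac
import hashlib

SHARED_KEY = b"factory-demo-key"  # hypothetical pre-shared key

def sign(payload: bytes) -> str:
    """Compute a SHA-256 HMAC tag for a payload."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    """Check a payload against its tag; compare_digest resists timing attacks."""
    return hmac.compare_digest(sign(payload), signature)

message = b'{"tag": "machine_speed", "value": 5}'
signature = sign(message)
```

A receiver that verifies the signature before acting on the data will reject any payload modified in transit, which matters once OT data leaves the factory's physical perimeter.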