How Data Sets From Diverse Geographical Locations are Centrally Collated – Automation & Instrumentation Update

This article will discuss:

  • The importance of capturing and processing interrelated data from multiple facilities and processes
  • The Industrie 4.0 problems the industrial sector faces with data capture across multiple facilities and processing the information centrally
  • The solutions that can be applied to get data sets from diverse geographical locations to drive digital transformation initiatives

Industrial automation and the interconnected smart factory that defines Industrie 4.0 are concepts that apply to every area of a production line or industrial endeavor. Industrie 4.0 does not start and stop at individual facilities. It also takes into consideration the performance of multiple facilities with interrelated goals, the supply chain, and tiered service providers charged with meeting customer demand. The interrelated systems and processes required to successfully accomplish industrial tasks are part of the industrial ecosystem that Industrie 4.0 business models seek to optimize.

Pursuing digital transformation initiatives across multiple facilities in diverse locations

Enterprises with facilities in multiple locations generally approach the digital transformation process and the implementation of Industrie 4.0 models in phases. The first phase starts with implementing either an IoT or edge computing framework to capture and analyze data in the main facility to work out the kinks before deploying these solutions across other facilities. Once the deployment has been extended to multiple facilities, the questions begin to roll in.

Industrial enterprises routinely ask the following:

  1. How do we ensure we have one authoritative source of data that provides the insight required to make decisions for our multiple facilities?
  2. How do we ensure the big data sets captured at source retain their integrity before they are integrated into a centralized location?
  3. What do we do with the large data sets we generate hourly, and is the cost of managing a centralized repository justifiable?

Digital transformation initiatives that account for multiple facilities and processes produce terabytes of data hourly, which means the industrial sector must be prepared for an onslaught of data. With this onslaught comes multiple Industrie 4.0 problems which include:

  • How to accurately and securely bring these data sets to a centralized location?
  • How to create a single source of truth that supports the ongoing optimization tasks across multiple facilities?
  • How to analyze trade-offs such as the need to capture all data sets from a plant floor and the cost of storing, processing, and analyzing these data sets?

Understanding the Industrie 4.0 problems that industrial enterprises face across multiple facilities

The first of the Industrie 4.0 problems that need to be addressed is bringing distributed data from diverse locations together and analyzing them to gain insight into different but interrelated processes. The challenges industrial enterprises face can be grouped under two major categories:

  1. Capturing data from facilities with varying equipment and technological profiles and
  2. Transferring the captured data to a centralized location.

Varying equipment and technological profiles refer to the diverse shop-floor equipment in use across multiple facilities. For example, older facilities are more likely to rely on legacy equipment that, unlike modern Wi-Fi-enabled machines, cannot connect directly to interconnected networks or the cloud, so a different process for capturing and transferring their data is required.

The second of the Industrie 4.0 problems focuses on the need to have a singular authoritative source that transfers the insight obtained from analyzed data back to individual facilities for implementation. Although a centralized computing location provides a solution, the cost of analyzing every data set captured from multiple facilities leads to the third of the Industrie 4.0 problems that industrial enterprises face.

According to SnapLogic, the Industrie 4.0 problems with managing data from multiple facilities and processes mean enterprises are missing out on approximately $140 billion in economic value. The challenges also lead to resource wastage and duplication of effort across multiple facilities.

The solutions to managing Industrie 4.0 implementations across multiple facilities

In scenarios in which diverse equipment and technology profiles hamper the capture and transfer of data, OPC UA offers a pathway to capturing data from both legacy and modern equipment with diverse communication protocols. Unifying the architecture within greenfield facilities requires the deployment of smart HMI hardware such as the JSmart Series of products to capture machine data and facilitate data transfer.

OPC UA standards ensure the data capturing process within individual facilities can be harmonized before transferring captured data to a centralized location. Once a unified process has been implemented, industrial enterprises can then take on the challenges of developing a single source of authority and reducing operational cost.
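As a rough illustration of what such harmonization involves, the sketch below maps facility-local tag names onto a shared, OPC UA-style namespace with a common engineering unit before the values are forwarded. The facility names, tag names, scaling factors, and node identifiers are all invented for the example; a real deployment would derive them from the OPC UA address space of each server.

```python
# Hypothetical sketch: harmonizing readings from legacy and modern equipment
# into one common record format before transfer. Tag names, units, and the
# mapping table are illustrative, not taken from any specific OPC UA server.

from dataclasses import dataclass

@dataclass
class Reading:
    facility: str    # which plant produced the value
    node_id: str     # harmonized, OPC UA-style node identifier
    value: float     # measurement converted to a common unit
    unit: str        # shared engineering unit

# Per-facility mapping from local tag names to a shared namespace.
# Legacy equipment might expose raw register names; modern equipment
# might already publish descriptive tags.
TAG_MAP = {
    "plant_a": {"REG_4001": ("ns=2;s=Line1.Temperature", "degC", 0.1)},
    "plant_b": {"line1/temp_f": ("ns=2;s=Line1.Temperature", "degC", None)},
}

def harmonize(facility: str, tag: str, raw: float) -> Reading:
    """Translate a facility-local tag and raw value into the shared schema."""
    node_id, unit, scale = TAG_MAP[facility][tag]
    if scale is not None:
        value = raw * scale                 # legacy register scaling (tenths)
    else:
        value = (raw - 32.0) * 5.0 / 9.0    # plant B reports Fahrenheit
    return Reading(facility, node_id, round(value, 2), unit)

# A legacy register value of 753 (tenths of a degree) and a modern
# Fahrenheit reading both land in the same node and the same unit.
a = harmonize("plant_a", "REG_4001", 753)
b = harmonize("plant_b", "line1/temp_f", 167.54)
print(a)  # Reading(facility='plant_a', node_id='ns=2;s=Line1.Temperature', value=75.3, unit='degC')
print(b)
```

The point of the mapping table is that downstream consumers only ever see the shared node identifiers and units, regardless of how heterogeneous the source equipment is.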

For enterprises with multiple facilities, pushing 100% of the data captured from equipment, IoT devices, and edge devices to a centralized repository, whether cloud-based or on-premises, is not a viable pathway to optimizing industrial processes. Setting up intelligent architecture that processes data at the edge of individual data sources across multiple facilities while transferring only relevant data to a centralized repository is required.

The symbiotic relationship between the cloud and edge computing provides the intelligent framework required to ensure only important data sets are sent to a centralized repository. Edge computing solutions can deliver decentralized data analysis while sending specific data sets to the cloud. The flexibility of edge computing, combined with the deterministic networking that OPC UA pub/sub over TSN provides, enables the exchange of data between a centralized repository and individual pieces of equipment across multiple facilities in different geographical locations.
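A minimal sketch of this edge-side reduction is shown below: raw samples are analyzed locally, and only compact batch summaries plus threshold-breaching readings are forwarded upstream. The window size, alarm threshold, and message shapes are assumptions made for the example, not a prescribed protocol.

```python
# Illustrative sketch of edge-side filtering: analyze raw samples locally
# and forward only anomalies and periodic summaries to the central
# repository. Thresholds and window size are invented for the example.

from statistics import mean

WINDOW = 5          # samples per summary batch
LIMIT = 80.0        # alarm threshold (e.g., degrees Celsius)

def edge_filter(samples: list[float]) -> list[dict]:
    """Return only the messages worth sending upstream."""
    outbound = []
    for i in range(0, len(samples), WINDOW):
        batch = samples[i:i + WINDOW]
        # Always forward a compact summary instead of every raw sample.
        outbound.append({"type": "summary", "mean": round(mean(batch), 2),
                         "max": max(batch), "count": len(batch)})
        # Forward individual readings only when they breach the limit.
        for s in batch:
            if s > LIMIT:
                outbound.append({"type": "alarm", "value": s})
    return outbound

raw = [71.2, 70.8, 72.1, 85.4, 71.5, 70.9, 71.1, 70.7, 71.3, 71.0]
msgs = edge_filter(raw)
print(len(raw), "samples reduced to", len(msgs), "messages")
```

Even in this toy case, ten raw samples collapse to three outbound messages; at plant scale the same pattern is what keeps terabyte-per-hour data volumes from being shipped to the central repository wholesale.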

Processing data at the edge also reduces the onslaught of data that keeps industrial enterprises unsure of how to proceed with Industrie 4.0 implementations. It is for this reason that edge computing is an integral part of capturing and processing data from multiple facilities. The advent of 5G networks, which deliver low-latency and high-bandwidth data transfers at relatively affordable rates, provides strong support for industrial enterprises implementing digital transformation strategies across multiple facilities. The speed and reliability 5G promises will ensure large data packets can be transferred from multiple locations without the constraints of wired networks or the unreliability of 3G and 4G networks.

Conclusion

Although the Industrie 4.0 problems of capturing and transferring data sets from multiple facilities are considerable, solutions to mitigate these challenges exist and continue to mature. The unification strides of the OPC Foundation, the extensive work of 5G service providers, and the decentralized nature of edge computing will ensure enterprises can bring the benefits of an interconnected environment to multiple facilities.