Big Data Analytics Stepping into Process Automation Systems – Who Makes Critical Decisions?

09.08.2017 | Article

In the digitalized process automation environment and the IoT world, subsystems and components such as sensors, measuring instruments, valves, and machines are becoming more and more intelligent and independent, and a huge amount of data is becoming available: so-called Big Data. Factory-level computers have almost unlimited computing power and data-handling capabilities. Could we actually move all the intelligence to the top level, or should we make even more use of the intelligence in the smart process equipment? What is the role of the human being in the middle of this digitalization? And are big data and analytics software tools a nightmare, or concrete support for improving production quality?

Some lessons learned in the past

In the 1970s, so-called minicomputers ran production processes from a centralized factory-level system; even a small computer failure shut the whole process down for a maintenance break and caused several hours of lost production. Consequently, distributed control systems (DCS) were developed, and some of the basic hardware and software moved away from the factory-level computers to improve overall reliability and uptime. At the same time, hardware units, communication channels, and methods improved to tolerate demanding field conditions. Digital communication buses were another great step toward eliminating disturbances.

Recently, the huge computing power of upper-level systems has led to solutions where a major part of the intelligence sits in one single computer. So, back to the 1970s? Not really: virtual servers are replacing multiple hardware units, instead of each critical part of the production process having a dedicated process control computer, typically with backup hardware included as part of the system. The savings in CAPEX and maintenance are obvious, but the question of availability and reliability remains if something fails. Since bug-free software does not exist, failure analysis and re-testing after a bug fix may take a long time, increasing commissioning and maintenance costs. And compared with hardware costs, software development costs are in a different league altogether.
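As a rough illustration of why that backup hardware matters, consider a simple availability calculation. The 99% figure below is an assumption chosen for the example, not a measured or vendor-specified value:

```python
# Availability of a single server vs. an independent hot-standby pair.
# The single-server availability is an illustrative assumption, not a real spec.

HOURS_PER_YEAR = 24 * 365

single = 0.99                       # assumed availability of one server
redundant = 1 - (1 - single) ** 2   # pair fails only if both units fail

for name, a in [("single server", single), ("redundant pair", redundant)]:
    downtime = (1 - a) * HOURS_PER_YEAR
    print(f"{name}: availability {a:.4%}, about {downtime:.1f} h downtime per year")
```

With these assumed figures, a single server is down roughly 88 hours a year, while the redundant pair is down under an hour, which is why each critical process traditionally got its own backup.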

Many things have changed since the 1970s

A lot has changed: more reliable and robust hardware and software, an increased focus on testing and programming practices, better testing tools, and so on. Naturally, there is no right or wrong answer to the question of where and how the intelligence and decision making should be located and organized; as always, it depends. The size of the application and system, the environment, and the reliability and quality requirements are all factors. The basic principle, in my opinion, is that the distributed process automation system is there to implement the “job orders” from the upper-level supervisory system, and also to take care of the local control loops, the local safety of the equipment, first-level alarm handling, and communication between the various systems and equipment. Each critical process requires a computer hardware and software package of its own, with a backup server. In case of a failure and shutdown of the upper-level system, operators can still run the process manually with pre-defined, limited functionality; a minimal sketch of this fallback logic follows below.
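As a sketch only: the hypothetical DCSNode below keeps its local control loop and first-level alarm handling running, and falls back to a pre-defined manual mode when the supervisory link times out. All names, limits, and the simple proportional loop are illustrative assumptions, not taken from any real product:

```python
from enum import Enum

class Mode(Enum):
    SUPERVISED = "supervised"    # executing job orders from the upper level
    LOCAL_MANUAL = "manual"      # pre-defined, limited local functionality

class DCSNode:
    """Hypothetical DCS node: local control survives an upper-level failure."""

    def __init__(self, setpoint: float):
        self.mode = Mode.SUPERVISED
        self.setpoint = setpoint              # last job order received

    def on_supervisory_update(self, new_setpoint: float) -> None:
        self.mode = Mode.SUPERVISED
        self.setpoint = new_setpoint

    def on_supervisory_timeout(self) -> None:
        # Upper-level system is down: clamp to a pre-defined safe setpoint.
        self.mode = Mode.LOCAL_MANUAL
        self.setpoint = min(self.setpoint, 50.0)   # illustrative safe limit

    def control_step(self, measurement: float) -> float:
        # First-level alarm handling stays local in both modes.
        if measurement > 90.0:                     # illustrative alarm limit
            print(f"ALARM: measurement high: {measurement}")
        # Simple proportional control as a stand-in for the real loop.
        return 0.5 * (self.setpoint - measurement)
```

The point of the sketch is the division of labor: setpoints come from above, but the loop, the safety clamp, and the alarms live on the node itself.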

The upper-level computer system views and assesses the status of the whole process or factory. Optimized production rates, storage area utilization, transportation logistics, and process performance reports are its fundamental tasks. In addition, quality control functions such as statistical process control (SPC) may be included in the software. On top of everything, as more historical and real-time data becomes available, Big Data Analytics is playing a growing role in predicting variations in process performance. Since such software products are available as cloud-based services, they complement a factory-level supervisory control system and are flexible and easier for the factory ICT personnel to maintain. A small SPC example is sketched below.
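To make the SPC reference concrete, here is a minimal sketch of Shewhart-style 3-sigma control limits computed from historical measurements. The data values are invented for the illustration:

```python
from statistics import mean, stdev

# Invented historical quality measurements, e.g. a product dimension in mm.
history = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.7, 10.3, 10.0]

center = mean(history)
sigma = stdev(history)
ucl = center + 3 * sigma    # upper control limit
lcl = center - 3 * sigma    # lower control limit

def in_control(x: float) -> bool:
    """True if a new measurement falls inside the 3-sigma limits."""
    return lcl <= x <= ucl

print(f"center {center:.2f}, limits [{lcl:.2f}, {ucl:.2f}]")
print("Is 10.9 in control?", in_control(10.9))
```

Plotting these limits on the factory dashboard is the classic way to separate normal variation from a signal that needs attention.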

The question remains: at which stage should a human being be involved?

How much decision-making power can we delegate to the computer, and further down to the DCS and the intelligent process equipment? In an ideal world, the computers at all levels would analyze each case and take corrective control actions independently. That ideal world does not exist, but big data and analytics tools make it possible to predict coming problems and gradually bring various situations under control before anything happens. A clear, indicative dashboard and user interface is the key to recognizing and deciding when human interaction and corrective actions are needed; later on, some of those actions could be “automated” as well. A toy version of such a predictive indication is sketched below.
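As a toy illustration of turning analytics output into a dashboard indication, the sketch below fits a linear trend to recent readings and raises a flag when the value is projected to cross a limit within a given horizon. The readings, limit, and horizon are all assumptions made for the example:

```python
def predicts_crossing(readings: list[float], limit: float, horizon: int) -> bool:
    """Fit a least-squares line to the readings and project `horizon` steps ahead."""
    n = len(readings)
    x_mean = (n - 1) / 2
    y_mean = sum(readings) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(readings)) \
            / sum((x - x_mean) ** 2 for x in range(n))
    return readings[-1] + slope * horizon > limit

temps = [71.0, 71.4, 72.1, 72.8, 73.5]   # invented sensor readings
if predicts_crossing(temps, limit=80.0, horizon=12):
    print("Dashboard: value trending toward its limit, operator attention needed")
```

A real system would of course use far richer models, but the principle is the same: the analytics predicts, the dashboard indicates, and the human (for now) decides.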

To summarize: DCS-based process-level equipment and machine automation is a reliable, independent solution for keeping the process up and running. The supervisory control system, the factory-level computer with its related software services, optimizes the performance of the production unit by creating job orders and guidance for the DCS, produces reports for the operators, and predicts possible breaks and quality issues based on the collected historical big data and statistical analysis. Human interaction remains necessary, but it becomes much easier with a guiding, indicative dashboard built from big data analytics.



Jorma Tirkkonen

Chairman of the Board
