When time-critical information is unavailable, challenges such as low operational efficiency and safety risks hurt employee productivity and the manufacturing industry as a whole. With their ability to organise massive amounts of data, AI based cognitive manufacturing systems are highly valuable. But how far can they solve these problems? Would cognitive manufacturing be truly beneficial?
Big data and artificial intelligence (AI) are giving a huge boost to Industry 4.0. Intelligent software solutions based on AI models can process high volumes of data generated by an industry to identify trends and patterns that can be used to make manufacturing processes more efficient and reduce their energy consumption. More and more factories are being instrumented with IoT, resulting in ever-growing data volumes.
Conventional computing struggles to process this large influx of data and to handle the complexity of the analytics. To pave the way to Industry 4.0 and beyond, manufacturing has to evolve towards cognitive manufacturing.
Cognitive manufacturing infrastructure is built on the foundation of IoT and analytics. It uses data across systems, equipment, and processes to derive actionable insight across the entire value chain, from design through manufacturing to support.
First, let’s have a look at the challenges that current manufacturing processes face, and how these challenges affect industry profits.
Challenge of continuous process industry
The biggest challenge in this market is low operational efficiency, which reduces earnings before interest, taxes, depreciation, and amortisation (EBITDA) by 4% to 7%. The next big challenge is productivity, which suffers because of complex and disconnected processes.
Various subsystems of the manufacturing plant do not bring their data to a central system for processing. Be it a predictive maintenance system run by original equipment manufacturers (OEMs) or production processes controlled separately through a distributed control system (DCS) or historians, everything runs on separate systems.
Over 90% of the data collected and stored in plants is not being utilised properly for improving plant performance, which may lead to an opportunity loss of $167 billion. To put this in perspective, the petrochemical market alone is close to $4,000 billion in size. This revenue is being left on the table because decision makers in manufacturing plants are reluctant to take chances optimising critical process parameters on a real-time basis.
Another challenge is the loss caused by data silos. Employee productivity is lost in processing information, such as performing root cause analysis post facto. This alone leads to a revenue loss of $12.5 billion. In all, the industry loses around $200 billion, which is what impacts EBITDA by 4% to 7% directly.
In the current scenario, manufacturing plants have a basic ecosystem in place with PLCs, integrated machines, and sensors. There are also certain plants with automated systems for maintenance, production, quality, and inventory. However, all of these are separate and not integrated. There are also advanced ERP systems that handle logistics, product development, sales, and marketing. With very little data being shared with adjacent systems, all of these systems operate separately.
The lack of reliability-oriented maintenance for uptime maximisation, poor predictability of process variability, and overall equipment effectiveness (OEE) losses affect the whole value chain. These aspects are vital because real-time process deviations (process disturbances) are very difficult to capture, and none of the systems designed on first principles or on mathematical formulae can capture them accurately. Low accuracy leads to lower throughput and energy losses. Solving these problems can reduce OEE losses across the value chain and improve the overall yield of the plant.
These are the problems that need to be solved, and cognitive manufacturing promises to improve EBITDA by an average of 4-5%. Data from the various systems has to be brought together to connect those systems, processed, and operated on with AI-led business processes, demonstrating the potential of AI in industry.
Addressing existing problems
The whole complex ecosystem of industries requires integration of various data systems. It is not just the sensor layer that needs retrofitting. As many systems are analogue, multiple interfaces exist because of various proprietary automation systems such as DCS, SCADA, historians, and PLCs. Despite the multiple protocols, this ecosystem can be simplified by customisation: bringing data from all the heterogeneous processes to a big data platform, understanding the business processes and their gaps, and applying predictive and prescriptive analytics.
The way to implement this approach is to first use the data that is already present in control systems. If no such systems are present, sensors need to be retrofitted. The data should be brought to a big data platform, and real-time AI and ML models, combining both thermodynamics and chemical engineering, applied on top of it. The data can be visualised via built-in or custom widgets and flexible dashboards.
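The first step of this approach can be sketched in a few lines of Python. The record layout, field names, and source labels below are illustrative assumptions, not any vendor's actual DCS, SCADA, or historian schema; the point is only that heterogeneous payloads get normalised into one common store before analytics or dashboards touch them.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical unified record for a process tag reading (illustrative schema).
@dataclass
class TagReading:
    source: str       # e.g. "DCS", "Historian", "retrofit-sensor"
    tag: str          # process tag name
    timestamp: float  # epoch seconds
    value: float

def normalise(raw: dict, source: str) -> TagReading:
    """Map a source-specific payload onto the common schema."""
    return TagReading(
        source=source,
        tag=raw["name"],
        timestamp=float(raw["ts"]),
        value=float(raw["val"]),
    )

def rolling_mean(readings: list[TagReading], tag: str, window: int = 3) -> float:
    """A dashboard-style aggregate over the last `window` readings of one tag."""
    vals = [r.value for r in readings if r.tag == tag][-window:]
    return mean(vals)

# Example: payloads from three different sources funnelled into one store
store = [
    normalise({"name": "furnace_temp", "ts": 1, "val": 851.0}, "DCS"),
    normalise({"name": "furnace_temp", "ts": 2, "val": 853.5}, "Historian"),
    normalise({"name": "furnace_temp", "ts": 3, "val": 852.0}, "retrofit-sensor"),
]
print(rolling_mean(store, "furnace_temp"))  # average of the three readings
```

Once every subsystem feeds the same store, the predictive and prescriptive models described below can be trained and served against a single, consistent view of the plant.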
The key is to reimagine entire industries, be it petrochemical, chemical, or pharmaceutical industries. Cognitive manufacturing can optimise process units such as industrial furnaces that are used in metallurgy, mechanical engineering, chemicals, ceramic, and building materials. Furnaces are critical components as these are responsible for around 60% of the energy consumed in the entire plant.
AI and ML hybrid models can take care of combustion optimisation, which is key to reducing carbon formation and increasing the run length of furnaces. Cognitive equipment performance is equally important for controlling utility-side equipment such as charge gas compressors and boilers efficiently. Their heat recovery needs to be managed well, as that lowers energy losses and helps reduce carbon emissions, which is the primary focus of most industries.
The next step is to bring in real-time process optimisation. Critical process parameters can be optimised on a real-time basis to improve the overall yield of the plant and directly impact its bottom line. Moreover, there can be vision based solutions for maintaining safety and quality. Many tasks performed today through human intervention can be replaced by vision based solutions.
All these features can come together as a single packaged solution to take care of multiple business use cases, including converting an entire plant into a smarter plant. This approach helps reduce costs and improve yield, uptime, and quality.
In a nutshell, the solution involves real-time data driving process optimisation and asset performance optimisation. Real-time data is combined with historical data to optimise the processes as far as possible, which helps reduce OEE losses.
The solution also involves AI, ML, and hybrid models, which help with production process optimisation through real-time intelligence, and real-time warning systems that forecast downtime with predictive analytics. With predictive analysis, process parameters can be kept within a target range so that carbon emissions are reduced.
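In its simplest form, an early warning system of this kind fits a trend to recent readings of a critical parameter and estimates how long until a safety or quality limit is breached. The sketch below is a minimal illustration using ordinary least squares over equally spaced readings; the parameter name, limit, and values are invented, and a production system would use far richer models.

```python
def time_to_breach(history, limit):
    """Estimate how many sampling steps until `limit` is crossed,
    assuming a roughly linear drift in the recent readings.
    Returns None if the trend is flat or moving away from the limit."""
    n = len(history)
    xm = (n - 1) / 2                      # mean of the indices 0..n-1
    ym = sum(history) / n                 # mean of the readings
    denom = sum((i - xm) ** 2 for i in range(n))
    slope = sum((i - xm) * (y - ym) for i, y in enumerate(history)) / denom
    if slope <= 0:
        return None                       # no drift towards the limit
    remaining = limit - history[-1]
    if remaining <= 0:
        return 0.0                        # already breached
    return remaining / slope

# Example: a (hypothetical) bearing temperature creeping up ~2 degrees per reading
readings = [70.0, 72.1, 73.9, 76.0, 78.1]
print(time_to_breach(readings, limit=90.0))  # roughly 6 steps until the 90-degree limit
```

An alert raised several steps before the breach is what turns a post-facto root cause analysis into a prevented downtime event.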
The aspects discussed so far have a direct impact: operating efficiency improves by 10-15%, yield by 2-4%, and OEE by 8-10%. Greenhouse gas emissions are reduced by 5%, and based on the potential, carbon emissions can be reduced even further, by up to 15%.
Cognitive manufacturing can be applied across various types of furnaces, compressors, heat exchangers, boilers, and rotating equipment where failures are very common. If the solution is applied from an end-to-end perspective, the impact can be somewhere between $20 million and $100 million a year.
Cognitive manufacturing can be applied in a variety of use cases. Let us take an industrial furnace as a use case and understand what goes into optimising such a component of the plant. In furnaces, combustion accounts for 60-70% of energy losses.
When the quality of raw materials is ignored, the run length of the furnace reduces significantly. Therefore, making raw material quality data available from the laboratory on a continuous basis is the primary aspect. Since industries are unable to provide this data on a real-time basis, a change in management practices is needed.
Next, the performance data is analysed with respect to the outcome, and the critical process parameters that impact the overall performance of the furnaces or the plant are identified. Based on that, AI and ML algorithms are designed. Various data points coming from sensors are also analysed. It is important to consider utilities and components like boilers as well.
It is also important to look at dynamic operations, which hold a lot of value at certain times. Let us take an example of temperature measurement. Most of the time in the metal or any such industry, dipstick based manual measurement methods are used. These manual measurements can lead to human error. None of the data analytics can work accurately if there’s an error introduced. Therefore, it is necessary to ensure that the data being captured is good. AI-led soft sensors can be employed to make sure that the errors introduced by humans can be reduced to a certain extent.
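The soft-sensor idea above amounts to calibrating a model on the occasional manual measurements and then using it to infer the quantity continuously. The sketch below is a deliberately minimal version: a straight-line fit of a hypothetical melt temperature against burner power. All names and numbers are invented for illustration; a real soft sensor would use many signals and a richer model validated against laboratory data.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    xm, ym = sum(xs) / n, sum(ys) / n
    a = sum((x - xm) * (y - ym) for x, y in zip(xs, ys)) / \
        sum((x - xm) ** 2 for x in xs)
    return a, ym - a * xm

# Calibration pairs (invented): burner power in kW vs. manually
# measured (dipstick) melt temperature in degrees C.
power = [100.0, 120.0, 140.0, 160.0]
temp = [805.0, 845.0, 885.0, 925.0]
a, b = fit_linear(power, temp)

def soft_temperature(kw: float) -> float:
    """Continuous estimate, standing in for intermittent dipstick readings."""
    return a * kw + b

print(round(soft_temperature(130.0), 1))  # 865.0 on this exactly linear toy data
```

Because the soft sensor reads the same process signals every cycle, it removes the sampling gaps and transcription errors that manual dipstick measurements introduce, which is exactly what the downstream analytics need.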
Finally, since process disturbances can happen at any time, these deviations need to be compensated for in real time for a better outcome. If such optimisation is achieved, it has the potential to give better outcomes than some of the tools currently deployed in the plant.
To achieve all that has been discussed above, it is necessary to bring all the data onto a single platform. One should not consider only new technologies for implementing these systems; the solution should combine what is already implemented today with the application of AI models.
Most startups and firms are trying to move ahead with AI, ML, and deep learning models only. But it is important to keep using the mathematical models of the chemical industry, physical models, and thermodynamic models, and to apply them in some scenarios as well.
Avnish Kumar, CEO, LivNSense advises on how startups should form a successful proof of concept (PoC): “Simple solution is to try to solve business problems. The idea is to try to solve problems already existing in industries. In the early days, it is helpful to start visiting plants, try to understand problems they are facing, and see how the problems can be solved by your technology and your expertise.”
The physical and statistical models become the base of the analytics. AI and ML models are then applied in three stages. The first stage involves simple machine learning algorithms and observing the accuracy that can be achieved. Deep learning or smoothing filters are then applied on top to create a closed-loop or reinforcement learning ecosystem. In implementing these algorithms, the quality of the implementation matters greatly: accuracy can vary significantly between algorithms, and it affects plant yield substantially.
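The staged structure described above can be illustrated with a toy example: a first-principles-style baseline model, a smoothing filter over its residuals, and the smoothed correction fed back into the estimate. The model form, constants, and measurement values are all invented for illustration and stand in for real thermodynamic models and plant data.

```python
def baseline_model(fuel_rate: float) -> float:
    """Stage 1: crude physical-style estimate of furnace outlet temperature
    (coefficients are illustrative, not from any real furnace)."""
    return 600.0 + 1.8 * fuel_rate

def ewma(values, alpha=0.3):
    """Stage 2: exponentially weighted moving average to smooth noisy residuals."""
    s = values[0]
    out = [s]
    for v in values[1:]:
        s = alpha * v + (1 - alpha) * s
        out.append(s)
    return out

# Invented plant data: measured temperatures and the fuel rates at the time
measured = [783.0, 786.5, 781.0, 788.0, 785.5]
fuel = [100.0, 102.0, 99.0, 103.0, 101.0]

# Residuals show what the physical model systematically misses
residuals = [m - baseline_model(f) for m, f in zip(measured, fuel)]
corrections = ewma(residuals)

# Stage 3 (closed loop): feed the smoothed correction back into the estimate
corrected = baseline_model(fuel[-1]) + corrections[-1]
print(round(corrected, 2))
```

The hybrid estimate stays anchored to physics (the baseline) while the data-driven layer absorbs the systematic error, which is the essence of combining first-principles and ML models rather than choosing one over the other.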
Hence, with all these in place, a furnace can be made more intelligent and optimised, and its efficiency improved.
Another use case is safety related applications. Deep learning based algorithms with AI and vision based cameras can be leveraged for the safety of workers. Safety does not end with predictive analysis; it requires human detection as well. Therefore, complete real-time monitoring of man-machine interaction with proximity detection is critical.
The image in Fig. 3 depicts an incident captured at a metro station construction site, where a person was only partially visible; hardly more than a hand of the worker was captured. No vehicle driver can detect a person in such a situation. To avoid such fatalities, the goal is to make machines identify these kinds of events and autonomously take action according to safety regulations.
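The alerting logic that sits behind such a vision system can be sketched independently of the detector. Assume some person detector (not shown here) emits bounding boxes; the sketch below then checks whether a detected person, even a partial detection like a lone hand, comes within a safety margin of a machine's exclusion zone. The box format, coordinates, and margin are assumptions for illustration.

```python
# (x_min, y_min, x_max, y_max) in pixels - an assumed detection format
Box = tuple[float, float, float, float]

def gap(a: Box, b: Box) -> float:
    """Shortest axis-aligned distance between two boxes (0 if they overlap)."""
    dx = max(b[0] - a[2], a[0] - b[2], 0.0)
    dy = max(b[1] - a[3], a[1] - b[3], 0.0)
    return (dx * dx + dy * dy) ** 0.5

def proximity_alert(person: Box, machine_zone: Box,
                    margin_px: float = 50.0) -> bool:
    """True when the person is closer to the machine than the safety margin.
    Fires even for partial detections (e.g. only a hand in frame)."""
    return gap(person, machine_zone) < margin_px

# A partially visible worker (small box) near a vehicle's exclusion zone
hand = (300.0, 200.0, 330.0, 230.0)
vehicle_zone = (350.0, 180.0, 600.0, 400.0)
print(proximity_alert(hand, vehicle_zone))  # True: 20 px gap, under the 50 px margin
```

The hard problem remains the detector itself (spotting a partially occluded person); but keeping the geometric alert rule this simple means it can run in real time on every frame and trigger an autonomous stop.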
Cognitive manufacturing systems have various applications and can be applied across various kinds of equipment. By leveraging multi-source OT data for real-time early warning systems, annual savings in operational expenses of up to $2 million can be expected. By combining real-time data with critical process parameter data from DCS systems or historians, and building predictive quality or predictive process optimisation on top, the solution can start delivering a couple of hundred thousand dollars of savings within a few months.
Other applications of cognitive manufacturing include furnaces, the components of plants where optimisation is most needed to reduce energy losses and carbon emissions. If the learnings from thermodynamic principles are combined with best practices, operational costs can be reduced by 20% and the run length of the furnace improved.
Finally, on the operational side, instrumentation is becoming a challenge, and soft sensors can become an alternative to it. Taking such areas into consideration, cognitive manufacturing can maximise profits.
The article is based on a presentation by Avnish Kumar, CEO, LivNSense, at the IoT Startups Show 2021. Darshil Patel, who has prepared this article, is a technology journalist at EFY.