During the past few years, the industrial sector has been in the midst of one of its largest transformations: the use of data will initiate a new wave of value generation. Consulting firm Roland Berger estimates that digital transformation could add €1.25 trillion of value for Europe by 2025 [1]. In the future outlined by analysts, machines (hardware and software) will be able to produce all types of goods not only in a highly customized manner but also in perfect quality, while not exceeding marginal costs. Looking at the current situation, enabler technologies like the Internet of Things (IoT), big data and predictive analytics have found their way into the manufacturing domain. While the majority of manufacturing executives acknowledge the importance of this transformation, only 5 percent of them are satisfied with their current digital strategies [2]. Existing Operational Technology (OT) and Information Technology (IT) systems are not designed to cope with the masses of data generated by fully connected shop-floor applications. Cloud computing is transforming IT: it is centered on high-volume, volatile streams of data and on massive compute power to run analytics and ML/AI applications on top of that data. OT, in contrast, remains a highly proprietary and locally optimized technology.
Cloud computing will be necessary to handle the large volume of data for connected manufacturing equipment and the digital twin. (Image courtesy of Amazon Web Services.)
Large industrial software vendors like GE, IBM and Siemens, amongst others, have pushed the concept of the digital twin: a full integration of the physical with the virtual world. In essence, this means that all product design data is available at the time of production. As an example, the full design data of a car body is compared in real time with the as-built data in a car body shop. In addition, production-relevant data is fed back into the design process, also referred to as closed-loop engineering. This frontloading of information will allow better decisions at a very early point of the product lifecycle and generate additional value. With asset data available in real time from production through data aggregation and IoT, predictive and prescriptive applications will generate insight to increase overall equipment effectiveness (OEE) and lift value from manufacturing assets in operation.
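To make the nominal-versus-as-built comparison concrete, here is a minimal sketch of the idea. All feature names, values and tolerances are illustrative assumptions, not any vendor's data model or API:

```python
# Minimal sketch: compare as-built measurements against nominal design data.
# Feature names, values and the tolerance are illustrative assumptions.

NOMINAL = {  # design data, e.g. exported from a PLM system (hypothetical, in mm)
    "door_gap_front": 4.0,
    "roof_seam_offset": 1.5,
}
TOLERANCE_MM = 0.3  # assumed acceptable deviation

def check_as_built(measured: dict) -> list:
    """Return human-readable deviations that exceed the tolerance."""
    deviations = []
    for feature, nominal in NOMINAL.items():
        actual = measured.get(feature)
        if actual is None:
            deviations.append(f"{feature}: no measurement received")
        elif abs(actual - nominal) > TOLERANCE_MM:
            deviations.append(f"{feature}: {actual:.2f} mm vs. nominal {nominal:.2f} mm")
    return deviations

# Example: in-line metrology data from the body shop (hypothetical values)
print(check_as_built({"door_gap_front": 4.45, "roof_seam_offset": 1.48}))
```

In a closed-loop setup, deviations like these would flow back to engineering, while the nominal data would flow forward to the line in real time.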
But what happens as soon as insight is generated and the physical process needs to be moved to a better state? In manufacturing for discrete and process industries, the process is defined by fixed code routines and programmable parameters. It has its own world of control code languages and standards to define the behavior of controllers, robot arms, sensors and actuators of all kinds. This world has remained remarkably stable over the past 40-plus years. Control code resides on a controller, and special tools as well as highly skilled automation engineers are required to define the behavior of a specific production system. Changing the state of an existing, running production system means changing these programs and parameters, which requires physical access to the automation equipment: OT equipment needs to be re-programmed, often on every single component locally. To give a concrete example, let's assume we can determine from field data, using applied machine learning (also referenced as Industrial IoT), that the behavior of a robotic handling process needs to be adapted. In the existing world, production needs to stop. A skilled engineer needs to physically re-teach or flash the robot controller. The new movement needs to be tested individually and in the context of the adjacent production components. Only then can production start again. This process can take minutes to hours, depending on the complexity of the production system.
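As an illustration of how such a need for adaptation might be detected from field data, here is a minimal sketch using a simple rolling-window drift check on robot cycle times. The signal, window sizes and threshold are assumptions for illustration, not a production-grade detector:

```python
from collections import deque
from statistics import mean

# Minimal drift detector on robot cycle times (seconds).
# Window sizes and threshold are illustrative assumptions.
BASELINE_WINDOW = 50    # long-term reference behavior
RECENT_WINDOW = 10      # short-term behavior
DRIFT_THRESHOLD = 0.10  # flag if recent mean deviates >10% from baseline

baseline = deque(maxlen=BASELINE_WINDOW)
recent = deque(maxlen=RECENT_WINDOW)

def observe_cycle(cycle_time_s: float) -> bool:
    """Feed one cycle time; return True if the process should be adapted."""
    baseline.append(cycle_time_s)
    recent.append(cycle_time_s)
    if len(baseline) < BASELINE_WINDOW:
        return False  # not enough history yet
    drift = abs(mean(recent) - mean(baseline)) / mean(baseline)
    return drift > DRIFT_THRESHOLD
```

Today, a flag raised by a detector like this still triggers the manual stop-and-re-teach loop described above.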
Current production systems are tuned for high stability and low variability. For example, automotive production has a plethora of product variants, but still runs on an inflexible production system. With a car model lifetime lasting several years, production managers have learned to live with this inflexibility and value stable processes. However, customer- and technology-driven trends will increase the need for fully flexible industrial control systems. First, the speed of innovation is increasing due to improved design systems and customer demand. For example, the product lifecycle of Volkswagen's Golf I was 10 years; it has been shortened to three years for the Golf VI [3]. Second, requirements for customization increase steadily. Third, intelligent algorithms will produce a steady stream of proposals for process improvement. If we assume that Industrial IoT will fulfill its promises, the majority of manufacturers will be able to gain constant insight. Companies that can execute on these insights faster will have a competitive advantage. Additionally, every new state of a production and supply chain system can be treated as a new experiment and fed back into ML systems, ultimately generating a virtuous cycle of a self-improving system.
The billion-dollar question will be: How can we design a system that immediately implements new insights? Just as cloud computing provides access to IT resources in a fully virtualized, software-defined manner, a concept I named Software Defined Automation can unchain industrial automation from its physical limitations. To realize this, all control code has a digital twin in the virtual world, and the local instance is constantly updated from the virtual master code. A virtual model of the whole production system provides context for the control code; such manufacturing planning systems are already heavily used, e.g., by line builders in automotive, and are provided by the major PLM software vendors.
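A minimal sketch of the master/local synchronization idea follows, assuming a hypothetical HTTPS endpoint that serves the virtual master copy of the control code. The endpoint, polling approach and deployment hook are all assumptions, not a specific product's mechanism:

```python
import hashlib
import time
import urllib.request

# Hypothetical endpoint serving the virtual master copy of the control code.
MASTER_URL = "https://example.com/control-code/robot-cell-7"  # assumption
POLL_INTERVAL_S = 5

local_digest = None  # digest of the control code currently running locally

def deploy_locally(code: bytes) -> None:
    """Placeholder: hand the new control code to the local runtime/controller."""
    print(f"deploying {len(code)} bytes of updated control code")

while True:
    with urllib.request.urlopen(MASTER_URL) as resp:
        master_code = resp.read()
    digest = hashlib.sha256(master_code).hexdigest()
    if digest != local_digest:  # master has changed -> update the local twin
        deploy_locally(master_code)
        local_digest = digest
    time.sleep(POLL_INTERVAL_S)
```

In practice, a push-based mechanism would replace the polling loop, but the principle is the same: the local instance always converges to the virtual master.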
With a full digital twin of a production system, including its control code, insights can immediately be pushed down to change the system's state. In the case of Amazon Web Services' edge technology, AWS Greengrass, we are talking about milliseconds for locations in mainland Europe. Compared with hours of lost optimal production time, the potential of Software Defined Automation becomes evident.
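As a sketch of pushing such an insight down from the cloud, the following uses boto3 to publish a parameter update over AWS IoT Core to a topic that an edge component (e.g., one running on Greengrass) is assumed to subscribe to. The topic name, asset identifier and payload schema are illustrative assumptions:

```python
import json
import boto3

# Publish a parameter update via AWS IoT Core. The topic and payload
# schema are illustrative assumptions; an edge component (e.g., on
# Greengrass) would subscribe to this topic and apply the change locally.
iot_data = boto3.client("iot-data", region_name="eu-central-1")

update = {
    "asset": "robot-cell-7",             # hypothetical asset identifier
    "parameter": "approach_speed_mm_s",  # hypothetical parameter
    "value": 180,
}

iot_data.publish(
    topic="plants/munich/robot-cell-7/parameter-updates",  # assumed topic
    qos=1,
    payload=json.dumps(update).encode("utf-8"),
)
```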
Still, human interaction will be required. Production systems will optimize themselves based on simulated and real experiments, and improvements will rapidly be propagated around the globe. Human labor will optimize the learning, not the system. The optimization objective itself can differ over time or with external influences: in times when renewable energy is cheap, output could be the core driver for optimization, while the minimization of input factors could be paramount in other circumstances.
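One way to picture such a shifting objective is a weighted cost function whose weights operators re-tune over time. The terms and weights below are purely illustrative:

```python
# Illustrative, assumed objective: trade off produced units against
# energy consumed, with weights that operators can re-tune over time.
def production_objective(units_out: float, kwh_in: float,
                         w_output: float, w_energy: float) -> float:
    """Higher is better: reward output, penalize energy input."""
    return w_output * units_out - w_energy * kwh_in

# Cheap renewable energy available: prioritize output.
print(production_objective(units_out=950, kwh_in=1200, w_output=1.0, w_energy=0.1))

# Expensive energy: minimizing input factors dominates.
print(production_objective(units_out=950, kwh_in=1200, w_output=1.0, w_energy=0.8))
```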
With the release of edge platforms, the technology to minimize the time from insight to reaction is here today. The power of the cloud is, and will remain, the core enabler for realizing Software Defined Automation quickly and at an affordable cost.
[1] https://bdi.eu/media/user_upload/Digitale_Transformation.pdf
[2] https://www.forbes.com/sites/danielnewman/2017/08/08/top-5-digital-transformation-trends-in-manufacturing/#6ff9b733249f
[3] https://www.sjf.tuke.sk/transferinovacii/pages/archiv/transfer/29-2014/pdf/251-253.pdf
About the Author
Dr. Josef Waltl leads the global partner ecosystem for Industrial Software at Amazon Web Services (AWS). Prior to AWS, he worked at Siemens on software strategy and M&A for PLM, Smart Grid and Mobility. He holds a PhD and an MBA from the Technical University of Munich, as well as an MSc in Computer Science from the University of Salzburg.