The goal of the research project »Reinforcement Learning for Complex Automation Technology Applications (ReLKat)« is to develop an AI process that helps save energy in energy supply, building technology and industry. To bridge the gap between research and the world of industrial plants, Berlin-based experts in production science (Fraunhofer IPK), mathematics (Weierstrass Institute for Applied Analysis and Stochastics WIAS Berlin) and artificial intelligence (Signal Cruncher) are working together to make the conventional control technology in existing plants ready to run artificial intelligence and, in particular, reinforcement learning (RL). Two Berlin industry giants are involved in the project as application partners: the Mercedes-Benz Plant Berlin and PSI Software AG. ReLKat demonstrates how an interdisciplinary team from the Berlin-Brandenburg metropolitan region can overcome industry- and domain-specific challenges – by using artificial intelligence!
Energy efficiency is far more important today than it was just a few years ago, especially from an economic perspective. However, increasing energy efficiency in production through ongoing adjustments involves manual effort and can rarely be implemented in an economically viable way. This means we have to rely on automation – yet processing the data for real-time control is so demanding that conventional automation methods leave us with limited options. This is where machine learning comes into play: it can map complex relationships and make them available for use.
We are able to analyze and evaluate data locally. Instead of transmitting the data to a server or to the cloud for centralized evaluation, we perform the evaluation locally, for example in a household gateway or directly on the machine. This eliminates certain data protection problems. It also increases reliability, because the system continues to function even if the connection fails. This is in line with the general trend of new technologies first being introduced in a centralized fashion and then becoming decentralized over the course of further development – take means of transportation as an example: first came the railroad, then the car. Or computers: first the mainframe, then the personal computer.
Yes, because the data is no longer transferred to the program that analyzes it. Instead, the program moves to where the data is generated. This means the data never leaves its original territory, which makes data protection far easier to guarantee.
The use of operational data has opened a new discussion between production science and legal scholarship. Machine learning can be used to extract knowledge and experience from observing plant operations and make them usable for one’s own purposes. If operational data from one client is used to train a procedure that is then applied for a second client, the operating data would effectively have been transferred indirectly. However, we only train our control algorithms on the system for which they are intended, and we discuss transferability between similar plants of the same company with the users. At the same time, such projects underscore that the data generated during operations has a particular value of its own. By making it usable, we see that data is a resource with potential added value – and therefore an inherent value of its own.
Previous projects were still using conventional control technology. At the time, we used empirical data to generate statistical models to map the behavior of the plant at static operating points, but did not yet take into account the plant dynamics themselves. If we wish to map the plant behavior not only at static points, but as dependent on time, this new complexity forces us to use machine learning. By doing so, we tap into a novel solution set and take things up a notch in terms of performance.
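The jump from static operating points to time-dependent behavior can be illustrated with a minimal sketch. All numbers and the plant model below are invented for illustration, not taken from the project: a regression on the current setpoint alone misses the plant’s inertia, while adding lagged inputs and outputs – a simple dynamic model – captures it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical plant: the output responds to the setpoint with a one-step
# lag plus inertia in the previous output (simple first-order dynamics).
T = 200
u = rng.uniform(10, 14, size=T)   # setpoint trajectory
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.7 * y[t - 1] + 0.9 * u[t - 1] + rng.normal(0, 0.05)

# Static model: output as a function of the current setpoint only.
A_static = np.column_stack([u, np.ones(T)])
coef_static, *_ = np.linalg.lstsq(A_static, y, rcond=None)
res_static = np.mean((A_static @ coef_static - y) ** 2)

# Dynamic model: add the lagged output and lagged input as regressors.
A_dyn = np.column_stack([y[:-1], u[:-1], np.ones(T - 1)])
coef_dyn, *_ = np.linalg.lstsq(A_dyn, y[1:], rcond=None)
res_dyn = np.mean((A_dyn @ coef_dyn - y[1:]) ** 2)

print(f"static model MSE:  {res_static:.4f}")
print(f"dynamic model MSE: {res_dyn:.4f}")
```

The static fit cannot explain variation driven by past inputs, while the dynamic fit reduces the residual to roughly the noise level – the gap the interviewee describes between modeling static operating points and modeling plant dynamics.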
We have been working on reinforcement learning for many years, and our aim is to make the core more stable and leaner. To achieve this, a special procedure based on hierarchical tensor networks is being developed over the course of ReLKat. This method will make it possible to keep the technology extremely lean, unlike computationally intensive neural networks, for example. This concerns the algorithmic side of things, the other side is what Mr. Thiele already mentioned: We would like to tackle industrial projects that we have not focused on so far. Fraunhofer IPK possesses a great deal of experience with energy consumption in the industrial sector. From this, we hope to see corresponding potential for sales and marketing in the future.
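A rough illustration of why tensor formats stay lean – a sketch with assumed dimensions, not the ReLKat method itself: a function of d discretized state variables stored as a full tensor needs n**d values, while a tensor train (a simple instance of a hierarchical tensor format) stores d small cores whose total size grows only linearly in d.

```python
import numpy as np

# Assumed dimensions: d state variables, n grid points each, rank r.
d, n, r = 8, 10, 4

full_entries = n ** d   # exponential in d
tt_entries = sum(
    (1 if k == 0 else r) * n * (1 if k == d - 1 else r) for k in range(d)
)                       # linear in d

rng = np.random.default_rng(1)
cores = [
    rng.normal(size=(1 if k == 0 else r, n, 1 if k == d - 1 else r))
    for k in range(d)
]

def tt_eval(cores, idx):
    """Evaluate the represented tensor at a multi-index by chaining cores."""
    v = cores[0][:, idx[0], :]            # shape (1, r)
    for core, i in zip(cores[1:], idx[1:]):
        v = v @ core[:, i, :]             # contract the shared rank index
    return float(v[0, 0])

value = tt_eval(cores, [0] * d)
print(f"full tensor entries:  {full_entries}")   # 100,000,000
print(f"tensor-train entries: {tt_entries}")     # 1,040
```

Evaluating any entry costs only a short chain of small matrix products, which is the sense in which such representations can stay far leaner than a large neural network.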
We at Fraunhofer IPK act as a practical bridge between the experts on the subject matter from manufacturing companies and the machine learning expertise of AI specialists such as Signal Cruncher. We conceptualize industrial manufacturing problems and identify system interrelationships and data in such a way that they can be formulated as a mathematical problem in the first place. It is this interdisciplinary approach between on-site machine experts, AI experts from a dedicated AI company, and Fraunhofer IPK scientists acting as a link between the two that gives a project like ReLKat the opportunity to develop practical solutions within just a few years.
The connection to WIAS came about through Prof. Reinhold Schneider from TU Berlin, a recognized scientist in the field of tensors. He was able to provide us with the contact to WIAS, which is important for the implementation of tensor networks for reinforcement learning.
In our search for suitable application partners, we decided to address both discrete component manufacturing and continuous process technology. Hence, our consortium includes not only application partners such as Mercedes-Benz with its Berlin Plant, which has already shown a high level of commitment to energy efficiency optimization for many years, but also PSI Software AG with its pipeline operations division. Doing so allows us to gain project experience in the field of continuous process technology as well as discrete component manufacturing.
The generic nature lies in the approach itself: By using machine learning, we observe the physical world and ultimately build statistical models. This has the advantage that we do not require much domain knowledge. The traditional way would be via physical modeling using differential equations. This method is costly and also comes with certain limitations. The principle of AI, on the other hand, is to learn about relationships simply by observing the interplay of action and analysis. This black-box character makes the solution inherently highly generic, because it does not really understand what it is actually doing – to put it in »human« terms.
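The »observe actions and responses, build a statistical model« principle can be sketched in a few lines. The plant function below is a hypothetical stand-in – the learner never sees it, only its responses, and no physics or differential equations enter the fit:

```python
import numpy as np

rng = np.random.default_rng(2)

# Unknown "physical" system: hidden from the learner, which only
# observes responses to its actions. (Invented for illustration.)
def plant_response(action):
    return np.sin(action) + 0.5 * action

# Observe the interplay of action and response.
actions = rng.uniform(-3, 3, size=300)
responses = plant_response(actions) + rng.normal(0, 0.02, size=300)

# Black-box model: polynomial features fitted purely from observations.
degree = 7
X = np.vander(actions, degree + 1)
coef, *_ = np.linalg.lstsq(X, responses, rcond=None)

# The learned model predicts responses to actions it has never tried.
test_actions = np.array([-2.0, 0.5, 2.5])
pred = np.vander(test_actions, degree + 1) @ coef
true = plant_response(test_actions)
max_err = float(np.max(np.abs(pred - true)))
print(f"max prediction error: {max_err:.4f}")
```

Nothing in the fitting step knows whether the data comes from a heating circuit or a machine tool – which is exactly the black-box genericity described above, along with its limitation: the model does not »understand« the system it approximates.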
Naturally, this does not mean that specific customizations are not still necessary. The idea of making a solution completely generic seems illusory to me. Of course, we also have to perform a selection of certain parameters and specify trajectories. Nevertheless, the effort required is much less than with manual methods or with a classical physical approach. In this sense, it is true: It is a fairly generic solution for minimizing energy requirements.
In addition, there is another good argument that makes our solution attractive to industry: we tap into novel savings potential by adjusting parameters that were previously fixed. A flow temperature that had been set at twelve degrees for years – never adjusted manually or by conventional controls – can now be set to ten or fourteen degrees based on energy efficiency considerations. This means we are not using AI to replace a functionality that was previously realized in a different way; instead, we are using the capabilities of AI to add a completely novel functionality – the adjustment of target variables based on energy efficiency considerations – to the existing automation system. That is why we and our work are perceived as helping and assisting operators, rather than competing with manual labor.
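A minimal sketch of this idea, with an entirely invented energy model and numbers: instead of leaving the flow temperature fixed at its historical value, we search the admissible setpoint range for the value the model predicts to be cheapest.

```python
import numpy as np

# Hypothetical learned energy model (values assumed, not from the project):
# cost of chilling against the outdoor temperature plus a reheat penalty.
def predicted_energy_kwh(flow_temp_c, outdoor_temp_c):
    chilling = 1.8 * max(outdoor_temp_c - flow_temp_c, 0.0)
    reheat = 0.6 * max(flow_temp_c - 12.0, 0.0) ** 2
    return chilling + reheat

historical_setpoint = 12.0                       # fixed for years
candidates = np.arange(10.0, 14.5, 0.5)          # admissible range 10–14 °C

outdoor = 18.0
costs = [predicted_energy_kwh(t, outdoor) for t in candidates]
best = candidates[int(np.argmin(costs))]
saving = predicted_energy_kwh(historical_setpoint, outdoor) - min(costs)
print(f"best setpoint: {best:.1f} °C, predicted saving: {saving:.2f} kWh")
```

The existing automation keeps doing the actual control; the added layer merely proposes a better target value – the »novel functionality on top of the existing automation system« described above.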
is deputy head of the Process Automation and Robotics department at Fraunhofer IPK. In several R&D projects, he and his team developed an intelligent, universally applicable framework that automatically increases the energy efficiency of plant operations.
is the founder and managing director of the Berlin-based startup Signal Cruncher. The company offers expertise in embedded machine learning for IoT and has made smart energy one of its core topics. With its XONBOT software, Signal Cruncher provides support for B2C and B2B customers in this field.