Alejandro Alija, Managing Director of Galeo, speaker at the Advanced Factories 2024 congress.

Last Thursday, 11 April, the Advanced Factories Congress took place in Barcelona. It has established itself as one of the few generalist events to have survived the Industry 4.0 hype of a few years ago, and it seems some people have still not discovered that tacking consecutive numbers onto it (5.0 and so on) will not make their product or service deliver more value.

In its eighth edition, Advanced Factories gathered more than 27,000 managers and professionals eager to explore the latest innovations in automation, industrial robotics, artificial intelligence and more, presented by more than 500 exhibitors.

During the congress, our Managing Director and founding partner, Alejandro Alija, took part in the Industry 4.0 Congress as a Beckhoff Solution Provider and one of the main speakers at an interesting round table organised by Automática e Instrumentación. He was joined by other industry leaders such as Sergio Hernandez from AG SOLUTION GROUP, Albert Forgas from Becolve Digital and Diego Jiménez from IGUS Spain, with David Marco Freire from BeDisruptive as moderator. Many thanks to Roberto Iraola and Xavi Martos, Branch Manager and General Director of Beckhoff Spain respectively, for the invitation.

In the discussion, Alejandro pointed out that although companies have clear business objectives, they often lack the understanding needed to approach technology and data projects. He stressed the importance of establishing a medium-term strategic vision, led by the company’s management, to ensure the successful implementation of digital solutions.

The conversation also revolved around recommended architectures and services for the use of advanced analytics models in the manufacturing sector. Here, Alejandro advocated a hybrid approach that combines the power of cloud processing with edge execution capabilities. He stressed the importance of adopting technologies that enable seamless integration between the physical world of production and the digital environment.

Here is a summary of the main highlights of his speech:

Moderator: Are manufacturing companies ready to extract value from production data?

Alejandro: “No. Companies know what they want from a business perspective, but given the complexity of the ecosystem, they don’t know how to approach the projects. In a data project, as in almost any other, you have to be clear about a few starting points: What do you have in the field? What is your inventory of equipment, systems and data repositories?

“The difficulty also lies in designing and executing the enabler projects that come before the use cases. Enablers are always a hard sell, but they are necessary. Hence, the medium-term strategic vision set by the company’s management is vital”.

What are the most common use cases?

Alejandro, in an incisive and challenging tone, departed from the general trend, commenting that talking about generic use cases serves no purpose: “Talking about energy efficiency or predictive maintenance in a general way achieves little or nothing. These are broad domains. What adds value are descriptions of use cases with a first name and a last name”.

How are the analytical models put into production? Is the operator informed? Are setpoints or parameters modified in the control system itself?

In Alejandro’s words: “Indeed, building models and operationalising models are two different disciplines, closely related, but different. And I think many companies have made this mistake: thinking that hiring and building a team of data scientists is all you have to do to implement advanced analytics in the organisation. We call this ‘the experiment syndrome’ or, a little more macabrely, ‘the PoC virus’. There is no concept to prove for technologies and practices that are already mature in the market. What you have to validate is, firstly, that you are able to implement them in production in your organisation and, secondly, that they deliver, return or optimise what you expected.”

On the second part of the question, he answered: “The moment you are mature enough to have the model in production, hooked into the production data and services workflow, you have a lot of flexibility. The sensible approach, and we have seen it with some of our customers, is to start in critical processes by generating an alert or notification to the operator and, once the process is refined and the operator is trained, to move on to automatically manoeuvring the control system, the DCS, or whatever it is depending on the industry.

“It’s not a very different process from what we’ve experienced with cars. Cruise control, for example, was in its early days an on/off automatism that the operator, the driver, activated and deactivated; nowadays it has become a system integrated with the ADAS and the rest of the car’s systems, able to manoeuvre fully based on the state of the road.”
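The alert-first, automate-later progression Alejandro describes can be pictured as a simple operating-mode switch around the model’s output. The following is a minimal sketch, not taken from any specific deployment: the function names, tag name and heuristic are hypothetical, and in a real plant the write-back would go through the control system with its interlocks rather than a print statement.

```python
from enum import Enum

class Mode(Enum):
    ADVISORY = "advisory"        # model output only notifies the operator
    CLOSED_LOOP = "closed_loop"  # model output is written back to the control system

def recommend_setpoint(sample: dict) -> float:
    """Stand-in for the analytics model; here just a trivial heuristic."""
    return 0.95 * sample["temperature"]

def send_operator_alert(message: str) -> None:
    # In practice: an HMI/SCADA notification, a ticket, an e-mail...
    print(f"[ALERT to operator] {message}")

def write_setpoint(tag: str, value: float) -> None:
    # In practice: a write to the controller (OPC UA, ADS, ...), bounded and interlocked
    print(f"[WRITE] {tag} <- {value:.2f}")

def on_new_sample(sample: dict, mode: Mode) -> None:
    proposal = recommend_setpoint(sample)
    if mode is Mode.ADVISORY:
        send_operator_alert(
            f"Model suggests setpoint {proposal:.2f} (current {sample['setpoint']:.2f})"
        )
    else:
        write_setpoint("Furnace1.TempSetpoint", proposal)

# Start in advisory mode; switch to closed loop once the process and the operators are ready.
on_new_sample({"temperature": 412.0, "setpoint": 400.0}, Mode.ADVISORY)
on_new_sample({"temperature": 412.0, "setpoint": 400.0}, Mode.CLOSED_LOOP)
```

The point of the switch is that the model, its inputs and its monitoring stay identical in both modes; only the last step changes, which mirrors the cruise-control analogy above.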

What are the most recommended architectures and services for using advanced analytical models in the manufacturing sector? Solutions at the Edge, in the Cloud, hybrid?

“Of course, we are convinced that this world is hybrid and complex by definition. It is undeniable that manufacturing processes occur in close proximity to the assets, and that in itself defines a playing field, which is what we call the Edge. At the same time, the concept of the Edge only makes sense if there is a Cloud (the hero and the villain, or the particle and the antiparticle). The Cloud has given us wonderful, super-efficient lessons and practices for building software, and I think the right approach is to take those lessons, practices and Cloud technologies and bring them to the Edge. Let me give you an example: in the Cloud, thanks to the progressive homogenisation of technologies, we know perfectly well how to do software releases in an integrated, automatic, audited way. In the world of manufacturing, glued to the assets in the environment of PLCs and local controllers… who knows how to do modern software release management? What is a CI/CD pipeline in a real-time environment?”
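As an illustration of the kind of Cloud practice he is referring to, here is a minimal sketch of a release pipeline for an edge workload, assuming automated tests, a container runtime on the device and a registry it can pull from. The image name, registry, device hostnames and compose file path are hypothetical; a real plant would add staged environments, image signing and rollback.

```python
"""Sketch of a cloud-style release step targeting edge devices."""
import subprocess
import sys

IMAGE = "registry.example.com/plant/anomaly-detector:1.4.2"  # hypothetical image
EDGE_DEVICES = ["edge-press-01", "edge-press-02"]            # hypothetical hostnames

def run(cmd: list[str]) -> None:
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)  # abort the pipeline on any non-zero exit code

def pipeline() -> None:
    run(["pytest", "-q"])                        # 1. automated tests gate the release
    run(["docker", "build", "-t", IMAGE, "."])   # 2. build a versioned, reproducible artefact
    run(["docker", "push", IMAGE])               # 3. publish to a registry the edge can reach
    for device in EDGE_DEVICES:                  # 4. staged rollout, one device at a time
        run(["ssh", device, f"docker pull {IMAGE}"])
        # Hypothetical compose file on the device referencing the image
        run(["ssh", device, "docker compose -f /opt/edge/compose.yml up -d"])

if __name__ == "__main__":
    try:
        pipeline()
    except subprocess.CalledProcessError as exc:
        sys.exit(f"Release aborted: {exc}")
```

Note that the deterministic control logic itself is not redeployed by this script; only the non-real-time, containerised part of the stack is, which is what makes cloud-style release management viable next to a running process.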

“This is precisely why we chose Beckhoff as one of our reference manufacturers right from the start. Why? Because Beckhoff builds real-time control equipment on a PC-based architecture, which makes it possible, for example, to run containerised software (microservices) alongside deterministic real-time loads.

“Ultimately, it is very important to keep in mind that our Edge devices are going to have to support software running at different levels: from the firmware level, which is the most critical and least extensible, to modern software based on a microservices architecture that runs at the Edge but is, for all intents and purposes, governed from the Cloud.”
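To make that last idea a bit more tangible, here is a minimal sketch of a non-real-time edge microservice of the kind that could run in a container next to the deterministic PLC task: it reads a process value from a Beckhoff controller over ADS using the open-source pyads library and periodically pulls its desired configuration from a cloud endpoint. The AMS Net ID, variable name and URL are hypothetical, and error handling is reduced to the minimum.

```python
"""Sketch of a containerised edge microservice governed from the cloud."""
import json
import time
import urllib.request

import pyads  # pip install pyads; assumes an ADS route to the controller exists

AMS_NET_ID = "5.12.82.20.1.1"                                           # hypothetical target
CONFIG_URL = "https://cloud.example.com/edge/press-01/desired-config"   # hypothetical endpoint

def fetch_desired_config() -> dict:
    """Pull the desired state declared in the cloud (sampling period, limits, ...)."""
    with urllib.request.urlopen(CONFIG_URL, timeout=5) as resp:
        return json.load(resp)

def main() -> None:
    config = {"sample_period_s": 1.0, "temp_limit": 80.0}    # safe local defaults
    plc = pyads.Connection(AMS_NET_ID, pyads.PORT_TC3PLC1)   # TwinCAT 3 PLC runtime (port 851)
    plc.open()
    try:
        while True:
            try:
                config.update(fetch_desired_config())        # the cloud governs behaviour...
            except OSError:
                pass                                         # ...but the edge keeps running offline
            temp = plc.read_by_name("GVL.rTemperature", pyads.PLCTYPE_REAL)
            if temp > config["temp_limit"]:
                print(f"Temperature {temp:.1f} above limit {config['temp_limit']:.1f}")
            time.sleep(config["sample_period_s"])
    finally:
        plc.close()

if __name__ == "__main__":
    main()
```

The deterministic control loop stays in the real-time runtime; the container only observes, reports and adjusts its own behaviour to the state declared in the cloud, which is one concrete reading of “governed from the cloud”.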