
Industrial Data Intelligence

If there is one thing that the new technologies of the digital era (IoT, robotics, digital twins, additive manufacturing, augmented reality, etc.) have in common when developing new services within a software-ecosystem approach, it is data: data, data, and data analytics in all of them.

Data Intelligence is the core of Industry 4.0 and the lifeblood of the Smart Industry, where the use of data drives new business models.

The key aspects of the RAMI4.0 architecture of Industry 4.0 and the IIRA architecture of the Industrial Internet Consortium both emphasise the information cycle, its value, and its quality: key elements for Artificial Intelligence to be effective as an essential technological enabler.

Data Intelligence is accompanied by data technologies such as Big Data Streaming, Data Mining, Visual Analytics, Data Analytics, Artificial Intelligence, and Time Series analysis.

The data revolution in the context of Industry 4.0 has changed some of these fields: Big Data is now much bigger, and essential in the smart factory. A single factory can transmit terabytes (TB) of data per day, amounts so large that processing them for a useful purpose becomes difficult. On the other hand, there is the granularity of the data origin: different industrial processes follow different time cycles, many of which may even be non-deterministic, which entails the redefinition of time series for I4.0 contexts.
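One way to picture this redefinition is to turn an irregularly sampled signal into a regular one. The sketch below (pure Python, with invented sensor readings) buckets samples from a non-deterministic cycle into fixed time windows and averages each bucket; real pipelines would use a streaming framework, but the idea is the same.

```python
from collections import defaultdict

def resample_to_windows(samples, window_s=1.0):
    """Bucket irregularly spaced (timestamp, value) samples into
    fixed windows and average each bucket, turning a non-deterministic
    sampling cycle into a regular time series."""
    buckets = defaultdict(list)
    for t, v in samples:
        buckets[int(t // window_s)].append(v)
    return {w * window_s: sum(vs) / len(vs) for w, vs in sorted(buckets.items())}

# Hypothetical irregular sensor readings: (seconds, value)
readings = [(0.1, 10.0), (0.4, 12.0), (1.7, 9.0), (2.2, 11.0), (2.9, 13.0)]
print(resample_to_windows(readings))  # one averaged value per 1 s window
```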

I4.0 works with data at different levels that must be analysed separately and designed together:

Cloud computing, where large amounts of data extracted from different sources are processed on a dedicated server. These operations generally concern the company and production as a whole.

Fog Computing is data processing at plant level, or at least at production-line level. The different phases of a manufacturing process are connected, thus optimizing the industrial manufacturing process.

Edge Computing is data processing executed at device level, interacting with the physical process at sensor and actuator level. It goes beyond the IoT and requires specific developments, which in some cases may require a cloud connection.
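The three levels can be sketched as a data pipeline: the edge filters raw readings on the device, the fog aggregates per production line, and the cloud builds the plant-wide view. The function names, deadband filter, and sensor values below are illustrative assumptions, not a real Lortek deployment.

```python
def edge_filter(samples, deadband=0.5):
    """Edge level: drop readings that change less than the deadband,
    so only meaningful changes leave the device."""
    kept, last = [], None
    for v in samples:
        if last is None or abs(v - last) >= deadband:
            kept.append(v)
            last = v
    return kept

def fog_aggregate(per_device):
    """Fog level: combine the filtered streams of one production line
    into a per-device summary (here, a mean)."""
    return {dev: sum(vs) / len(vs) for dev, vs in per_device.items() if vs}

def cloud_report(per_device_means):
    """Cloud level: plant-wide view built only from line summaries."""
    return {"devices": per_device_means,
            "plant_mean": sum(per_device_means.values()) / len(per_device_means)}

line = {"sensor_a": edge_filter([20.0, 20.1, 20.9, 21.0]),
        "sensor_b": edge_filter([5.0, 5.0, 6.0])}
print(cloud_report(fog_aggregate(line)))
```

Note that each tier only sees the reduced output of the tier below, which is what keeps cloud traffic manageable.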

Inherent to mass data processing is the question of WHAT FOR. In many cases we search the data for information that serves very diverse purposes: manufacturing better, selling better, optimizing quality, and improving profit. In these cases we turn to data science to help improve the industrial process, applying knowledge discovery techniques to find correlations, information, and characteristics of the data in terms of events of the industrial process.
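A first knowledge-discovery step is often to look for correlated process variables. The minimal sketch below computes a Pearson correlation between two hypothetical signals (the variable names and values are invented for illustration); production work would use a statistics library on real process logs.

```python
def pearson(xs, ys):
    """Pearson correlation between two process variables: a basic
    knowledge-discovery check for whether signals move together."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical process data: laser power vs. melt-pool temperature
power = [200, 210, 220, 230, 240]
temp = [1500, 1525, 1540, 1570, 1590]
print(round(pearson(power, temp), 3))  # close to 1: strongly correlated
```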

Finally, let’s add a reference to Deep Learning algorithms. One of their characteristics is that they require huge amounts of data to carry out effective learning. They are therefore worth mentioning in the Data Intelligence area, where they have a promising future not only in image processing but also in learning from heterogeneous data.

What we are currently working on

In joining and additive technologies, a large amount of data is generated every second. In the case of SLM (Selective Laser Melting), we have a RenAM 500Q machine that works with four lasers simultaneously. Monitoring its data in real time is already an exercise in Big Data Streaming, Data Analytics, and Artificial Intelligence, with efforts geared towards zero-defect manufacturing.
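To give a flavour of what real-time monitoring involves, here is a minimal streaming sketch: a rolling window flags samples that deviate strongly from recent history. This is a stand-in illustration with invented numbers, not the actual monitoring used on the RenAM 500Q.

```python
from collections import deque

def stream_monitor(stream, window=5, z_thresh=3.0):
    """Flag samples whose deviation from a rolling window exceeds
    z_thresh standard deviations: a toy zero-defect alarm."""
    buf, alerts = deque(maxlen=window), []
    for i, v in enumerate(stream):
        if len(buf) == buf.maxlen:
            mean = sum(buf) / len(buf)
            std = (sum((x - mean) ** 2 for x in buf) / len(buf)) ** 0.5
            if std > 0 and abs(v - mean) / std > z_thresh:
                alerts.append((i, v))
        buf.append(v)
    return alerts

# A steady signal with one outlier at index 6
signal = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 5.0, 1.0]
print(stream_monitor(signal))  # → [(6, 5.0)]
```

A real pipeline would run this per laser and per window on a stream processor, but the detection idea is the same.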

In addition to monitoring, at Lortek we develop knowledge discovery algorithms that learn from scratch. These are able to detect stress in the process, predict the probability of faults occurring under certain conditions, and advise on how to avoid them. We are currently working on several European projects in which we develop these technologies, such as HyProCell and DIGIQUAM.

Besides welding and additive manufacturing technologies, at Lortek we work on industrial digitalization processes (I4.0). Industrial processes monitor thousands of variables at different levels (Edge, Fog, or Cloud).

In the HyperCog project, our goal is to develop an industrial cyber-physical system (ICPS) that joins the Edge, Fog and Cloud levels in one hyper-connected network with real-time communications. This ICPS represents a real challenge for Big Data Streaming, where data processing (prediction, monitoring, optimization, etc.) can be either local or distributed.

Specific Equipment

Cutting-edge Equipment.

Lortek has several dedicated servers for mass data processing.

Publications and downloads

R. Moreno, Juan Carlos Pereira, Alex López et al.
Time Series Display for Knowledge Discovery on Selective Laser Melting Machines
20th International Conference on Intelligent Data Engineering and Automated Learning


Challenges to be faced in the coming years:

Redefinition of time series for industrial contexts. Data updates managed by events and connection with smart contracts. This is already a reality in Industry 4.0, but it still has to be sped up and adapted to industrial needs.


Cybersecurity. Handling large amounts of data means moving them, probably outside the facilities where they are produced. Nowadays, any development that entails networked communications must be analysed and maintained against cyberattacks.

IoT as the data connector channel. The redefinition of industrial processes based on event-driven approaches in ubiquitous systems is a present need, in order to face increasingly complex production requirements.
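The event-driven idea can be sketched with a tiny publish/subscribe bus: process steps subscribe to event topics and react when events arrive, instead of polling on a fixed cycle. The topic name and payload below are invented for illustration.

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe bus: handlers react to named events
    rather than polling the process on a fixed cycle."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self.handlers[topic]:
            handler(payload)

bus = EventBus()
log = []
# Hypothetical reaction: halt a line when a quality defect event fires
bus.subscribe("quality.defect", lambda p: log.append(f"halt line {p['line']}"))
bus.publish("quality.defect", {"line": 3})
print(log)  # → ['halt line 3']
```

Industrial deployments would use a broker such as MQTT for this role; the bus above only shows the pattern.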

Distributed systems. Computation tends to be executed across different networked devices; Spark and Hadoop are some examples.
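The core pattern those frameworks distribute is map-reduce. The single-machine sketch below shows the pattern on an invented defect-log example; Spark or Hadoop would run the map, shuffle, and reduce phases across a cluster instead of one process.

```python
from functools import reduce
from itertools import groupby

def map_reduce(records, mapper, reducer):
    """Map-reduce skeleton: map each record to (key, value) pairs,
    group by key (the 'shuffle'), then reduce each group."""
    mapped = [kv for rec in records for kv in mapper(rec)]
    mapped.sort(key=lambda kv: kv[0])  # groupby needs sorted input
    return {k: reduce(reducer, (v for _, v in group))
            for k, group in groupby(mapped, key=lambda kv: kv[0])}

# Count defect types across hypothetical log records
logs = ["porosity", "crack", "porosity", "porosity", "crack"]
counts = map_reduce(logs, mapper=lambda r: [(r, 1)], reducer=lambda a, b: a + b)
print(counts)  # → {'crack': 2, 'porosity': 3}
```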