Transformers for Multi-Horizon Forecasting in an Industry 4.0 Use Case

Stanislav Vakaruk et al. Sensors (Basel). 2023 Mar 27;23(7):3516. doi: 10.3390/s23073516.

Abstract

Recently, a novel approach in the field of Industry 4.0 factory operations was proposed for a new generation of automated guided vehicles (AGVs) that are connected to a virtualized programmable logic controller (PLC) via a 5G multi-access edge-computing (MEC) platform to enable remote control. However, this approach faces a critical challenge: the 5G network may encounter communication disruptions that can lead to AGV deviations and, consequently, potential safety risks and workplace issues. To mitigate this problem, several works have proposed fixed-horizon forecasting techniques based on deep-learning models that anticipate AGV trajectory deviations and take corrective maneuvers accordingly. However, these methods offer limited prediction flexibility to the AGV operator and are not robust against network instability. To address these limitations, this study proposes a novel approach based on multi-horizon forecasting techniques to predict the deviation of remotely controlled AGVs. As its primary contribution, the work presents two new versions of the state-of-the-art transformer architecture that are well suited to the multi-horizon prediction problem. We conduct a comprehensive comparison between the proposed models and traditional deep-learning models, such as the long short-term memory (LSTM) neural network, to evaluate the performance and capabilities of the proposed models.
The results indicate that (i) the transformer-based models outperform LSTM in both multi-horizon and fixed-horizon scenarios, (ii) the prediction accuracy at a specific time-step of the best multi-horizon forecasting model is very close to that obtained by the best fixed-horizon forecasting model at the same step, (iii) models that use a time-sequence structure in their inputs tend to perform better in multi-horizon scenarios than their fixed-horizon counterparts and other multi-horizon models that do not consider a time topology in their inputs, and (iv) the proposed models can perform inference within the time constraints required for real-time decision making.
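The distinction between fixed-horizon and multi-horizon forecasting can be illustrated with a minimal sketch. Here a linear least-squares model stands in for the paper's transformers, and the toy guide-error series is invented; only the 60 s input window and 15 s horizon mirror the paper's best configuration. A fixed-horizon model maps an input window to a single future step, whereas a multi-horizon model emits every step up to the horizon in one pass.

```python
import numpy as np

rng = np.random.default_rng(0)

WINDOW = 60   # input time-window size in seconds (the paper's best setting)
HORIZON = 15  # predict t+1 ... t+15 s in a single forward pass

# Toy guide-error series standing in for the real AGV telemetry.
series = np.sin(np.linspace(0, 20, 500)) + 0.1 * rng.standard_normal(500)

def make_dataset(series, window, horizon):
    """Slice a 1-D series into (input window, multi-horizon target) pairs."""
    X, Y = [], []
    for t in range(len(series) - window - horizon):
        X.append(series[t : t + window])
        Y.append(series[t + window : t + window + horizon])
    return np.array(X), np.array(Y)

X, Y = make_dataset(series, WINDOW, HORIZON)

# A multi-horizon model maps the whole window to all HORIZON steps at once;
# a linear least-squares fit plays the role of the learned model here.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
pred = X @ W                               # shape: (samples, HORIZON)

# Per-step MAE across the horizon, the metric reported in the paper.
mae_per_step = np.abs(pred - Y).mean(axis=0)
print(pred.shape, mae_per_step.shape)
```

A fixed-horizon counterpart would regress only `Y[:, -1]`, giving the operator a single prediction at t + 15 s instead of the full predicted trajectory.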

Keywords: 5G; Industry 4.0; automated guided vehicles; deep learning; multi-access edge computing; multi-horizon forecasting; time series; transformer.

Conflict of interest statement

The authors have no relevant financial or non-financial interest to disclose.

Figures

Figure 1
Use case architecture representing the AGV, 5G RAN, 5G MEC, 5G CORE, and ML modules.
Figure 2
Overview of the steps of the followed method, including: Data Generation (see Section 5.1), Data Processing (see Section 5.2), Model Training (see Section 6), and Model Evaluation (see Section 7).
Figure 3
AGV circuit in 5TONIC.
Figure 4
A comparison of fixed- vs. multi-horizon models in terms of mean absolute error (MAE) for the prediction of the t + 15 s step. The results are based solely on the guide error as the input feature and a 60 s time window, which was determined to be the best configuration. The highest precision was attained by the transformer that attends to all features (TRA-FLAT) in its fixed-horizon configuration, with an MAE of 0.923. The best multi-horizon model was the transformer that attends only to the time dimension (TRA-TIME), with a slightly higher MAE of 0.936, a deviation of only about 1% from the best model, while offering greater robustness and flexibility.
Figure 5
Comparison of the most efficient multi-horizon architectures and input time-window sizes in terms of the average mean absolute error (MAE) across all predicted steps. The performance of all architectures improved as the input time-window size increased while using only the guide error as the input feature. The best-performing architecture, with an MAE of 0.918, was the transformer with attention to time features only (TRA-TIME) using a 60 s input time window.
Figure 6
Comparison of MAE values across the steps of the forecasting horizon. The horizontal axis represents the prediction second and the vertical axis the MAE. The best-performing architecture for multi-horizon forecasting was TRA-TIME with an average MAE of 0.918, followed by TRA-FLAT with 0.939 and LSTM with 0.951. The figure shows that the LSTM model outperformed TRA-FLAT after the tenth prediction second, which is significant because a fully loaded AGV requires at least 10 s to stop.
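The crossover behavior described in Figure 6, where LSTM overtakes TRA-FLAT after the tenth prediction second, can be checked mechanically on per-step MAE curves. The curves below are hypothetical stand-ins shaped to reproduce that crossover; the paper reports only the average MAE values.

```python
import numpy as np

# Hypothetical per-step MAE curves; TRA-FLAT is made to degrade faster
# at long horizons so that LSTM overtakes it after the tenth second.
steps = np.arange(1, 16)                       # prediction seconds t+1 ... t+15
mae_tra_flat = np.linspace(0.80, 1.08, 15)
mae_lstm = np.linspace(0.89, 1.03, 15)

def crossover_second(a, b, steps):
    """First prediction second at which curve `b` drops below curve `a`."""
    better = np.nonzero(b < a)[0]
    return int(steps[better[0]]) if better.size else None

STOP_TIME_S = 10  # a fully loaded AGV needs at least 10 s to stop
cross = crossover_second(mae_tra_flat, mae_lstm, steps)
print(cross, cross > STOP_TIME_S)
```

With these illustrative curves the crossover falls at second 11, i.e., beyond the stopping time, which is why the caption flags the crossover point as operationally significant.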
