Remaining useful life (RUL) prediction is a key enabler of efficient maintenance in the context of Industry 4.0. Data-driven approaches, in particular those employing deep neural networks (DNNs), have shown success in the RUL prediction task. However, although their architecture considerably affects performance, DNNs are usually handcrafted by human experts through a labor-intensive design process. To overcome this issue, we propose a neural architecture search (NAS) technique that explores a search space with a genetic algorithm (GA) and automatically discovers optimal Transformer architectures for RUL prediction. Our GA enables an efficient search by using a performance predictor, updated at every generation, which reduces the amount of network training needed. To our knowledge, this is the first work to optimize the architecture of Transformers for RUL prediction using evolutionary computation. We evaluate the found solutions on a widely used benchmark dataset, CMAPSS, in terms of RMSE and s-score. Compared with the state of the art, the Transformer obtained by our NAS method outperforms recent handcrafted DNNs in terms of RMSE and is comparable in terms of s-score. Our results demonstrate that the proposed method provides better prediction accuracy with less human effort than other data-driven approaches.
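The abstract describes a GA that explores a space of Transformer architectures, using a performance predictor, updated at every generation, to limit how many candidates must be fully trained. The following is a minimal illustrative sketch of that idea, not the authors' implementation: the search space, genome encoding, mutation operator, and the nearest-neighbour surrogate are all hypothetical choices, and full training is replaced by a synthetic fitness so the example is runnable.

```python
import random

# Hypothetical search space over a few Transformer hyperparameters.
SEARCH_SPACE = {
    "num_layers": [1, 2, 3, 4],
    "num_heads": [2, 4, 8],
    "d_model": [32, 64, 128],
    "ff_dim": [64, 128, 256],
}

def random_genome(rng):
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(genome, rng):
    # Resample one randomly chosen gene.
    child = dict(genome)
    key = rng.choice(list(SEARCH_SPACE))
    child[key] = rng.choice(SEARCH_SPACE[key])
    return child

def true_fitness(genome):
    # Stand-in for training the Transformer and measuring validation RMSE
    # (lower is better); a synthetic function so the sketch runs quickly.
    return abs(genome["num_layers"] * genome["d_model"] - 256) / 256

def surrogate(genome, archive):
    # Performance predictor: estimate fitness from the closest
    # already-evaluated genome (distance = number of differing genes).
    def dist(g):
        return sum(g[k] != genome[k] for k in SEARCH_SPACE)
    nearest = min(archive, key=lambda item: dist(item[0]))
    return nearest[1]

def search(generations=10, pop_size=8, seed=0):
    rng = random.Random(seed)
    pop = [random_genome(rng) for _ in range(pop_size)]
    archive = [(g, true_fitness(g)) for g in pop]  # fully evaluated genomes
    for _ in range(generations):
        children = [mutate(rng.choice(pop), rng) for _ in range(pop_size)]
        # The surrogate screens offspring; only the most promising half
        # undergoes (simulated) full training.
        children.sort(key=lambda g: surrogate(g, archive))
        evaluated = [(g, true_fitness(g)) for g in children[: pop_size // 2]]
        archive.extend(evaluated)  # the predictor is updated every generation
        pool = sorted(archive, key=lambda item: item[1])[:pop_size]
        pop = [g for g, _ in pool]
    return min(archive, key=lambda item: item[1])

best, best_fit = search()
```

The key saving is visible in the loop: each generation trains only half of the offspring, with the surrogate, refit on the growing archive, deciding which half.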
Evolutionary neural architecture search on transformers for RUL prediction / Mo, Hyunho; Iacca, Giovanni. - In: MATERIALS AND MANUFACTURING PROCESSES. - ISSN 1042-6914. - 38:15(2023), pp. 1881-1898. [10.1080/10426914.2023.2199499]
| File | Size | Format |
|---|---|---|
| paper_without_diff.pdf (Open Access since 14/04/2024; Type: Refereed author's manuscript (post-print); License: Creative Commons) | 1.33 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.