A Lightweight and Forgetting-Resistant Approach for NILM: Incremental Appliance Recognition Using NCM-Forest / Yan, Zhongzong; Wang, Ze; Hao, Pengfei; Nardello, Matteo; Brunelli, Davide; Wen, He. - In: IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT. - ISSN 1557-9662. - 2026, 75:(2026), pp. 1-11. [10.1109/TIM.2026.3662894]

A Lightweight and Forgetting-Resistant Approach for NILM: Incremental Appliance Recognition Using NCM-Forest

Matteo Nardello; Davide Brunelli
2026-01-01

Abstract

Traditional appliance recognition models in nonintrusive load monitoring (NILM) are constrained by a static label space, making them unable to recognize unknown or newly introduced appliance types. Incremental learning provides a potential solution by enabling continuous model adaptation; however, its application in NILM is often hindered by catastrophic forgetting, where learning new classes degrades performance on previously learned ones. To address this challenge, we propose a novel class-incremental learning (CIL) method based on a nearest class mean forest (NCM-Forest). The proposed approach redesigns the random forest (RF) structure by replacing axis-aligned splits with dynamic, centroid-based partitions. This design allows new classes to be incorporated seamlessly through centroid updates, while a partial subtree retraining strategy effectively balances stability and plasticity. Extensive experiments on three NILM datasets demonstrate that our method achieves robust and scalable incremental recognition with an accuracy of up to 93.33% ± 1.52%. Furthermore, deployment on edge devices confirms its practicality, featuring a low memory footprint, rapid model updates, and strong potential for real-time edge-based NILM applications.
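The core mechanism the abstract describes — assigning samples to the nearest class centroid and absorbing new classes by adding or updating centroids — can be illustrated with a minimal sketch. This is a hedged illustration of the general nearest-class-mean idea only, not the authors' NCM-Forest implementation (which embeds centroid-based splits inside forest nodes and uses partial subtree retraining); all names here are hypothetical.

```python
# Minimal sketch of nearest-class-mean (NCM) classification with
# incremental centroid updates, illustrating the idea behind NCM-Forest.
# Illustrative only; not the paper's implementation.
import numpy as np

class NCMClassifier:
    def __init__(self):
        self.centroids = {}  # class label -> running mean feature vector
        self.counts = {}     # class label -> number of samples seen

    def partial_fit(self, X, y):
        """Absorb samples (including brand-new classes) via centroid updates."""
        for x, label in zip(X, y):
            if label not in self.centroids:
                # A new appliance class simply adds a centroid; no retraining.
                self.centroids[label] = np.array(x, dtype=float)
                self.counts[label] = 1
            else:
                self.counts[label] += 1
                # Incremental mean update: m <- m + (x - m) / n
                self.centroids[label] += (x - self.centroids[label]) / self.counts[label]

    def predict(self, X):
        labels = list(self.centroids)
        means = np.stack([self.centroids[l] for l in labels])
        # Assign each sample to the class with the nearest centroid.
        d = np.linalg.norm(np.asarray(X, dtype=float)[:, None, :] - means[None], axis=2)
        return [labels[i] for i in d.argmin(axis=1)]
```

Because prediction depends only on stored centroids, previously learned classes are untouched when a new one is added, which is the stability property motivating the centroid-based design.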
2026
Yan, Zhongzong; Wang, Ze; Hao, Pengfei; Nardello, Matteo; Brunelli, Davide; Wen, He
Files in this product:
File: A_Lightweight_and_Forgetting-Resistant_Approach_for_NILM_Incremental_Appliance_Recognition_Using_NCM-Forest_compressed.pdf (archive administrators only)
Description: IEEE Transactions on Instrumentation and Measurement - article
Type: Editorial version (publisher's layout)
License: All rights reserved
Size: 9.81 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/484370
Citations
  • PMC: ND
  • Scopus: 0
  • Web of Science (ISI): 0
  • OpenAlex: ND