
Automated segmentation and scoring of lung ultrasound images of COVID-19 patients / Roshankhah, Roshan; Karbalaeisadegh, Yasamin; Greer, Hastings; Mento, Federico; Soldati, Gino; Smargiassi, Andrea; Inchingolo, Riccardo; Torri, Elena; Aylward, Stephen; Demi, Libertario; Muller, Marie. - In: THE JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA. - ISSN 1520-8524. - 148:4 (2020), pp. 2735-2735. (Paper presented at the ASA Meetings conference, held virtually in 2020) [10.1121/1.5147599].

Automated segmentation and scoring of lung ultrasound images of COVID-19 patients

Mento, Federico; Demi, Libertario
2020-01-01

Abstract

Point-of-care lung assessment with ultrasound is becoming increasingly relevant. The outbreak of coronavirus disease 2019 (COVID-19) has spread across the world at a very high rate, and the most severe cases are associated with lung damage such as ground-glass opacities and areas of lung consolidation, leading to acute respiratory distress. During the COVID-19 pandemic, the ability to detect and monitor the state of the lung is therefore critical. Structural changes in the COVID-19 lung modify the way ultrasound propagates through the tissue and are reflected in the appearance of lung ultrasound images: vertical artifacts known as B-lines appear and, in the more severe cases, can evolve into white lung patterns. Currently, these artifacts are assessed by trained physicians and sonographers, and the diagnosis is qualitative and operator dependent. We propose a segmentation method based on a convolutional neural network to automatically stage the progression of the disease and predict the severity of the lung damage. By classifying the images according to illness severity, we can define a set of scores, from healthy lung to the most severe case, and provide a reliable tool for establishing the severity of COVID-19.
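As a rough illustration of the kind of pipeline the abstract describes (CNN-based segmentation of ultrasound artifacts followed by a severity score), the sketch below shows a tiny encoder-decoder network and a frame-level score derived from the segmented artifact area. The architecture, the four-level score, and the area thresholds are assumptions made for illustration only; they are not taken from the paper.

import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Minimal encoder-decoder that predicts a per-pixel artifact mask
    (illustrative stand-in for the CNN segmentation step)."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),  # logits for the artifact (e.g. B-line) mask
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def severity_score(mask_logits, thresholds=(0.02, 0.10, 0.30)):
    """Map the fraction of the frame covered by segmented artifacts to a
    coarse 0-3 score; the thresholds are arbitrary placeholders."""
    frac = torch.sigmoid(mask_logits).gt(0.5).float().mean().item()
    return sum(frac > t for t in thresholds)

if __name__ == "__main__":
    net = TinySegNet()
    frame = torch.rand(1, 1, 128, 128)  # one grayscale ultrasound frame
    with torch.no_grad():
        score = severity_score(net(frame))
    print(f"predicted severity score: {score}")

In practice the segmentation network would be trained on expert-annotated lung ultrasound frames and the scoring rule calibrated against clinical assessments; the snippet only shows how the two stages (pixel-wise segmentation, then frame-level scoring) fit together.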
2020
ASA Meetings
Virtual
JASA
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/283632
Warning: the data displayed here have not been validated by the university.
