
Residual convolutional neural networks to automatically extract significant breast density features

Oliva P.;
2019-01-01

Abstract

In this paper, we present work on breast density classification performed with a deep residual neural network, and we discuss the analyses we plan to perform next. Breast density is one of the most important breast cancer risk factors: it represents the amount of fibroglandular tissue relative to fat tissue as seen on a mammographic exam. However, it is difficult to include in risk models because of its variability among women and its qualitative definition. We trained a deep CNN to perform breast density classification in two ways. First, we classified mammograms into two “super-classes”: dense and non-dense breast. Second, we trained the residual neural network to classify mammograms according to the four classes of the BI-RADS standard. Compared with the literature we are aware of, we obtained very good results in terms of accuracy and recall. In the near future, we plan to improve the robustness of our algorithm with respect to the mammographic systems used, and to include pathological exams as well. We then want to study and characterize the CNN-extracted features in order to identify the most significant ones for breast density. Finally, we want to study how to quantitatively measure how well the network captures the significant parts of the images.
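For readers who want a concrete starting point, the following is a minimal PyTorch sketch of the kind of residual-network classifier the abstract describes. The specific backbone (ResNet-50), the grayscale input stem, the input size, the optimizer, and the learning rate are all illustrative assumptions; the abstract only states that a deep residual CNN was trained for two-class and four-class (BI-RADS) breast density classification.

    import torch
    import torch.nn as nn
    from torchvision import models

    # Illustrative sketch (not the authors' exact setup): a ResNet-50
    # adapted to classify mammograms into the four BI-RADS density classes.
    NUM_CLASSES = 4  # set to 2 for the dense / non-dense "super-classes"

    model = models.resnet50(weights=None)
    # Mammograms are grayscale, so replace the 3-channel input stem
    # with a single-channel convolution.
    model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
    # Replace the final fully connected layer with a NUM_CLASSES-way classifier.
    model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    # One illustrative training step on a dummy batch.
    images = torch.randn(8, 1, 224, 224)          # stand-in mammogram crops
    labels = torch.randint(0, NUM_CLASSES, (8,))  # stand-in density labels
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()

Setting NUM_CLASSES to 2 reproduces the dense / non-dense setup described in the abstract, while 4 corresponds to the BI-RADS density classes.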
2019
ISBN: 978-3-030-29929-3 (print); 978-3-030-29930-9 (online)
Residual convolutional neural networks to automatically extract significant breast density features / Lizzi, F.; Laruina, F.; Oliva, P.; Retico, A.; Fantacci, M. E. - 1089:(2019), pp. 28-35. (Paper presented at the 1st Workshop on Deep-learning based Computer Vision for UAV, DL-UAV 2019, and the 1st Workshop on Visual Computing and Machine Learning for Biomedical Applications, ViMaBi 2019, held at the 18th International Conference on Computer Analysis of Images and Patterns, CAIP 2019, Italy, 2019) [10.1007/978-3-030-29930-9_3].


Use this identifier to cite or link to this document: https://hdl.handle.net/11388/231493
Citations
  • PubMed Central: n/a
  • Scopus: 7
  • Web of Science: 5