BI-RADS density categorization using deep neural networks

  • Ziba Gandomkar
  • Moayyad E. Suleiman
  • Delgermaa Demchig
  • Patrick C. Brennan
  • Mark F. McEntee

Research output: Chapter in Book/Report/Conference proceedings › Conference proceeding › peer-review

Abstract

The Breast Imaging Reporting and Data System (BI-RADS) density score is a qualitative measure and is therefore subject to inter- and intra-radiologist variability. In this study, we investigated the feasibility of fine-tuning a state-of-the-art deep neural network for (i) distinguishing fatty breasts (BI-RADS I and II) from dense ones (BI-RADS III and IV), (ii) classifying the low-risk group into BI-RADS I and II, and (iii) classifying the high-risk group into BI-RADS III and IV. To do so, 3813 images acquired from nine mammography units and three manufacturers were used to train an Inception-V3 network architecture. The network was pre-trained on the ImageNet dataset and then fine-tuned on our dataset using transfer learning. Before the images were fed into the input layer of Inception-V3, the breast tissue was segmented from the background and, in the mediolateral oblique view, the pectoral muscle was excluded from the image. Images were then cropped to the breast bounding box and resized to match the input layer of the network. The performance of the network was evaluated on a blinded test set of 150 mammograms acquired from 14 mammography units provided by six manufacturers. The reference density value for these images was obtained from the consensus of three radiologists. The network achieved an accuracy of 92.0% in high- versus low-risk classification. For the second and third classification tasks, the overall accuracies were 85.9% and 86.1%, respectively. When the results from all three classifiers were combined, the networks achieved an accuracy of 83.33% and a Cohen's kappa of 0.775 (95% CI: 0.694-0.856) for four-point density categorization. These results suggest that a deep learning-based computerized tool can be used to provide BI-RADS density scores.
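
The abstract does not state the implementation framework. As a minimal sketch, assuming TensorFlow/Keras, the setup described above (three ImageNet-pretrained Inception-V3 networks, each fine-tuned for one binary task and then cascaded into a four-point BI-RADS score) might look as follows; the function names, input handling, and hyperparameters are hypothetical and not taken from the paper:

```python
# Hypothetical sketch of the transfer-learning setup described in the
# abstract; the framework choice (TensorFlow/Keras), input size, optimizer,
# and all function names are assumptions, not the authors' code.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def build_binary_classifier() -> tf.keras.Model:
    """ImageNet-pretrained Inception-V3 with a new binary head."""
    base = tf.keras.applications.InceptionV3(
        weights="imagenet",         # pre-trained on ImageNet, as in the paper
        include_top=False,          # drop the 1000-class ImageNet head
        input_shape=(299, 299, 3),  # Inception-V3's native input size
    )
    x = layers.GlobalAveragePooling2D()(base.output)
    out = layers.Dense(1, activation="sigmoid")(x)   # one binary decision
    model = models.Model(base.input, out)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="binary_crossentropy", metrics=["accuracy"])
    return model

# One network per task from the abstract:
# (i)   fatty (BI-RADS I/II) vs dense (BI-RADS III/IV)
# (ii)  BI-RADS I vs II within the low-risk group
# (iii) BI-RADS III vs IV within the high-risk group
dense_vs_fatty = build_binary_classifier()
one_vs_two = build_binary_classifier()
three_vs_four = build_binary_classifier()

def categorize_birads(image: np.ndarray) -> int:
    """Cascade the three binary outputs into a four-point BI-RADS score.

    `image` is assumed to be already preprocessed as described in the
    abstract: breast segmented from the background, pectoral muscle
    removed in the MLO view, cropped to the breast bounding box, resized
    to 299x299, and intensity-scaled (e.g. with
    tf.keras.applications.inception_v3.preprocess_input).
    """
    x = image[np.newaxis]                        # add batch dimension
    if dense_vs_fatty.predict(x)[0, 0] < 0.5:    # low-density branch
        return 1 if one_vs_two.predict(x)[0, 0] < 0.5 else 2
    return 3 if three_vs_four.predict(x)[0, 0] < 0.5 else 4
```

Under these assumptions, `categorize_birads` returns a score from 1 to 4 for each mammogram; agreement with the radiologists' consensus could then be summarized with Cohen's kappa, for example via `sklearn.metrics.cohen_kappa_score`, as reported in the abstract.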

Original language: English
Title of host publication: Medical Imaging 2019
Subtitle of host publication: Image Perception, Observer Performance, and Technology Assessment
Editors: Robert M. Nishikawa, Frank W. Samuelson
Publisher: SPIE
ISBN (Electronic): 9781510625518
Publication status: Published - 2019
Externally published: Yes
Event: Medical Imaging 2019: Image Perception, Observer Performance, and Technology Assessment - San Diego, United States
Duration: 20 Feb 2019 - 21 Feb 2019

Publication series

Name: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
Volume: 10952
ISSN (Print): 1605-7422

Conference

Conference: Medical Imaging 2019: Image Perception, Observer Performance, and Technology Assessment
Country/Territory: United States
City: San Diego
Period: 20/02/19 - 21/02/19

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 3 - Good Health and Well-being

Keywords

  • Breast cancer
  • Breast cancer risk
  • GIST
  • Mammography
  • Prior mammograms
