Breast Cancer Image Classification Using Phase Features and Deep Ensemble Models
Main Authors:
Format: Article
Language: English
Published: MDPI AG, 2025-07-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/15/14/7879
Summary: Breast cancer is a leading cause of mortality among women worldwide. Early detection is crucial for increasing patient survival rates. Artificial intelligence, particularly convolutional neural networks (CNNs), has enabled the development of effective diagnostic systems by digitally processing mammograms. CNNs have been widely used for the classification of breast cancer in images, in many cases obtaining results comparable to those of medical specialists. This work presents a hybrid feature extraction approach for breast cancer detection that employs variants of the EfficientNetV2 network and an image representation based on phase features. First, a region of interest (ROI) is extracted from the mammogram. Next, a three-channel image is created from the local phase, amplitude, and orientation features of the ROI. A feature vector is constructed for the processed mammogram using the developed CNN model. The size of the feature vector is reduced using simple statistics, achieving a redundancy suppression of 99.65%. The reduced feature vector is classified as either malignant or benign using a classifier ensemble.
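The local phase, amplitude, and orientation features described above are commonly computed from the monogenic signal (a bandpass filter combined with the Riesz transform). The abstract does not give implementation details, so the sketch below is illustrative: the log-Gabor bandpass filter and its `wavelength`/`sigma` parameters are assumptions, not the authors' exact method.

```python
import numpy as np

def monogenic_features(img, wavelength=8.0, sigma=0.55):
    """Three-channel (phase, amplitude, orientation) representation of a
    2-D image via the monogenic signal: log-Gabor bandpass + Riesz transform.
    Filter choice and parameters are illustrative assumptions."""
    rows, cols = img.shape
    U, V = np.meshgrid(np.fft.fftfreq(cols), np.fft.fftfreq(rows))
    radius = np.hypot(U, V)
    radius[0, 0] = 1.0  # avoid division by zero at the DC component

    # Radial log-Gabor bandpass filter centred at frequency 1/wavelength
    f0 = 1.0 / wavelength
    log_gabor = np.exp(-(np.log(radius / f0) ** 2) / (2 * np.log(sigma) ** 2))
    log_gabor[0, 0] = 0.0  # zero DC response

    # Riesz transform frequency responses
    H1 = 1j * U / radius
    H2 = 1j * V / radius

    F = np.fft.fft2(img)
    even = np.real(np.fft.ifft2(F * log_gabor))       # bandpassed image
    odd1 = np.real(np.fft.ifft2(F * log_gabor * H1))  # Riesz x-component
    odd2 = np.real(np.fft.ifft2(F * log_gabor * H2))  # Riesz y-component

    amplitude = np.sqrt(even**2 + odd1**2 + odd2**2)
    phase = np.arctan2(np.hypot(odd1, odd2), even)    # local phase in [0, pi]
    orientation = np.arctan2(odd2, odd1)              # local orientation

    # Stack into the three-channel image fed to the CNN
    return np.dstack([phase, amplitude, orientation])

roi = np.random.rand(64, 64)  # stand-in for a mammogram ROI
features = monogenic_features(roi)
print(features.shape)  # (64, 64, 3)
```

Phase-based channels are attractive for mammography because local phase is largely invariant to illumination and contrast changes, which vary considerably across acquisition devices.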
Experimental results using a training/testing ratio of 70/30 on 15,506 mammography images from three datasets produced an accuracy of 86.28%, a precision of 78.75%, a recall of 86.14%, and an F1-score of 80.09% with the modified EfficientNetV2 model and stacking classifier. However, an accuracy of 93.47%, a precision of 87.61%, a recall of 93.19%, and an F1-score of 90.32% were obtained using only CSAW-M dataset images.
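The pipeline's last two stages, statistics-based feature reduction and a stacking classifier, can be sketched as follows. The abstract only says "simple statistics" and "classifier ensemble", so the per-channel mean/std/max reduction, the choice of base learners, and the logistic-regression meta-learner below are plausible assumptions, not the paper's exact configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def reduce_by_statistics(feature_maps):
    """Collapse per-channel CNN feature maps to summary statistics.
    feature_maps: (n_samples, channels, H, W).  Mean/std/max per channel
    is one simple choice that yields a very large size reduction."""
    return np.concatenate(
        [feature_maps.mean(axis=(2, 3)),
         feature_maps.std(axis=(2, 3)),
         feature_maps.max(axis=(2, 3))], axis=1)

rng = np.random.default_rng(0)
maps = rng.normal(size=(200, 64, 16, 16))  # stand-in for EfficientNetV2 maps
y = rng.integers(0, 2, size=200)           # benign (0) vs. malignant (1)

# 64*16*16 = 16,384 raw values -> 192 statistics (~98.8% reduction here)
X = reduce_by_statistics(maps)

# Stacking ensemble: base learners feed a logistic-regression meta-learner
stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
                ("svm", SVC(probability=True, random_state=0))],
    final_estimator=LogisticRegression(max_iter=1000))

# 70/30 train/test split, matching the evaluation protocol above
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
stack.fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, stack.predict(X_te)))
```

Reducing each feature map to a few statistics keeps one descriptor per channel while discarding spatial redundancy, which is what makes a suppression on the order of 99% achievable before classification.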
ISSN: 2076-3417