A Method for Identifying Cervical Abnormal Cells Based on Sample Benchmark Values
Main Authors:
Format: Article
Language: Chinese
Published: Harbin University of Science and Technology Publications, 2022-12-01
Series: Journal of Harbin University of Science and Technology
Online Access: https://hlgxb.hrbust.edu.cn/#/digest?ArticleID=2164
Summary: The identification of abnormal cervical cells with deep learning methods usually requires a large amount of training data, but these data inevitably draw abnormal cells from many different samples for model training, and so naturally lose the within-sample control between positive or abnormal cells and the normal cells of a single sample. As a result, recognition accuracy for abnormal cervical cells is low and the false-positive rate is high. To solve this problem, this paper proposes a method for identifying abnormal cervical cells based on sample benchmark values. First, the method identifies cervical cells and segments cervical cell nuclei with a Mask R-CNN model. Then, key nucleus indicators are calculated, the concept of benchmark cells is proposed, sample benchmark values are defined, and the diagnostic criteria are quantified. Finally, the abnormal-nucleus indicators and the model's output are used to reclassify abnormal cervical cells, identifying them by simulating a doctor who compares each cell's morphology with that of the normal cells in the same sample. Experiments show that the positive cell completion rate, positive cell detection accuracy, and sample detection accuracy on a cervical cell smear dataset reach 84.7%, 94.6%, and 92.4%, respectively.
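The sample-benchmark-value idea lends itself to a short illustration. The Python sketch below is not the authors' implementation: the paper's exact nucleus indicators, benchmark-cell selection, and thresholds are not reproduced in this record, so the indicator set (nucleus area and a bounding-box fill ratio), the median-based benchmark, and the 2x area threshold are all illustrative assumptions layered on top of an assumed upstream Mask R-CNN segmentation step.

```python
import numpy as np

def nucleus_indicators(mask: np.ndarray) -> dict:
    """Per-nucleus indicators from a binary mask, assumed to come from an
    upstream Mask R-CNN instance segmentation as in the paper. The paper's
    exact indicator set is not given here, so nucleus area and a
    bounding-box fill ratio serve as illustrative stand-ins."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        raise ValueError("empty mask: no nucleus pixels")
    area = float(len(ys))
    # Fraction of the bounding box the nucleus fills: a crude shape proxy.
    extent = area / ((np.ptp(ys) + 1) * (np.ptp(xs) + 1))
    return {"area": area, "extent": extent}

def sample_benchmark(normal_cells: list) -> dict:
    """Sample benchmark values: median indicators over the cells the model
    classified as normal in this sample, standing in for the paper's
    'benchmark cells'. Each test cell is compared against the benchmark
    of its own sample, not a global cutoff."""
    return {key: float(np.median([c[key] for c in normal_cells]))
            for key in normal_cells[0]}

def reclassify(cell: dict, benchmark: dict, area_ratio: float = 2.0) -> str:
    """Reclassification step: flag a cell whose nucleus area exceeds the
    sample benchmark by a fixed ratio, mimicking a doctor comparing each
    cell against normal cells from the same smear. Threshold is illustrative."""
    return "abnormal" if cell["area"] > area_ratio * benchmark["area"] else "normal"

# Example: two plausibly normal nuclei and one enlarged nucleus.
normal = [{"area": 210.0, "extent": 0.78}, {"area": 190.0, "extent": 0.81}]
bench = sample_benchmark(normal)
print(reclassify({"area": 520.0, "extent": 0.60}, bench))  # -> abnormal
```

Anchoring the threshold to a per-sample benchmark rather than a dataset-wide cutoff is what restores the within-sample control the abstract describes as missing from conventional training.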
ISSN: 1007-2683