• ACB-Net: Advanced segmentation with attention compensation branches

      Abstract: In recent years, image segmentation based on deep learning has made significant progress. However, existing methods do not sufficiently differentiate between deep and shallow features during feature extraction, which limits segmentation accuracy. To address this issue, an innovative segmentation network called ACB-Net is proposed, which improves segmentation performance through an attention compensation mechanism. Specifically, ACB-Net introduces ResPath to compensate for the limited semantic information in low-level features, employs a convolution-compensated CC-Transformer to capture the global context of deep features, and uses a dual-enhanced convolution block (DEC-Block) to fuse shallow and deep features effectively. In this way, the CNN reinforces local edge information while the attention mechanism compensates with global context, improving the model's ability to represent features at different scales. Experimental results on public dermatology datasets (ISIC 2017 and ISIC 2018) and polyp datasets (Kvasir-SEG, CVC-ColonDB, CVC-ClinicDB, ETIS-LaribPolypDB) demonstrate that ACB-Net significantly outperforms existing methods across multiple metrics. This mechanism of differentiating and complementing features provides an accurate and efficient solution for automatic image segmentation and offers significant reference value for deep learning applications in biomedical image analysis.
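The core idea the abstract describes — a CNN branch supplying local edge information while an attention branch compensates with global context, with the two fused afterwards — can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not ACB-Net itself: it uses a 1-D sequence of feature vectors, a hand-rolled depthwise convolution in place of ResPath/DEC-Block, and identity-projection self-attention in place of the CC-Transformer, purely to show the local-plus-global fusion pattern.

```python
import numpy as np

def local_conv(x, k):
    """Local branch: depthwise 1-D convolution over a (T, C) sequence.

    Captures local, edge-like structure; `k` is a 1-D kernel applied
    identically to every channel (edge-padded so the length is preserved).
    """
    pad = len(k) // 2
    xp = np.pad(x, ((pad, pad), (0, 0)), mode="edge")
    out = np.zeros_like(x)
    for i in range(x.shape[0]):
        # weighted sum of the kernel-sized window ending at position i
        out[i] = np.tensordot(k, xp[i:i + len(k)], axes=(0, 0))
    return out

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def global_attention(x):
    """Global branch: single-head self-attention with identity projections.

    Every position attends to all others, so each output row mixes in
    global context that a local convolution cannot see.
    """
    scores = x @ x.T / np.sqrt(x.shape[1])   # scaled dot-product scores
    return softmax(scores, axis=-1) @ x

def attention_compensated(x, k):
    """Fuse the local (CNN-like) and global (attention) branches by addition."""
    return local_conv(x, k) + global_attention(x)
```

In the real network the fusion would happen per decoder stage on 2-D feature maps and the branches would be learned modules; here additive fusion is chosen only because it is the simplest way to show both branches contributing to every output position.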

       
