UAST-RCNN: An Object Detection Algorithm for Occluded Pedestrians

CLC number: TP391; TP181; TN99

Funding: Supported by the National Natural Science Foundation of China (61876131) and the Scientific Research Program of the Tianjin Municipal Education Commission (2016CJ12)



Abstract:

To address the low detection accuracy and high miss rate for occluded pedestrians in pedestrian detection, an attention-based network, UAST-RCNN, is proposed as an improvement on the Faster-RCNN network. First, Swin-Transformer is selected as the backbone, whose windowed multi-head self-attention mechanism enlarges the global receptive field. Second, a hierarchical resampling module improves the feature pyramid and raises the quality of feature samples, and a progressive focal loss is introduced to balance positive and negative samples. Finally, in the preprocessing stage, improved data augmentation expands the CityPersons dataset for multi-scale training. Experimental results show a clear improvement over the original model on occluded pedestrians: average precision (AP) increases by 6.3% and the miss rate (MR) decreases by 4.1%. This verifies the feasibility of the proposed algorithm for pedestrian detection and shows that it can meet the detection requirements of occluded-pedestrian scenes.
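The abstract's "progressive focal loss" is the paper's own variant and its progressive weighting schedule is not described here, so the sketch below shows only the standard binary focal loss it builds on: hard examples (low probability on the true class) are weighted up, easy examples are weighted down, which is how positive and negative samples are rebalanced. All parameter names are illustrative, not taken from the paper.

```python
import math

def focal_loss(p, y, alpha=0.25, gamma=2.0):
    """Standard binary focal loss for a single prediction (a sketch,
    not the paper's progressive variant).

    p     -- predicted probability of the positive (pedestrian) class
    y     -- ground-truth label: 1 for pedestrian, 0 for background
    alpha -- class-balance weight for positives
    gamma -- focusing parameter; larger gamma down-weights easy examples
    """
    p_t = p if y == 1 else 1.0 - p            # probability of the true class
    alpha_t = alpha if y == 1 else 1.0 - alpha
    # (1 - p_t)^gamma shrinks the loss of well-classified (easy) samples
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)
```

With gamma = 0 and alpha = 1 this reduces to plain cross-entropy; with the defaults, an easy positive (p = 0.9) contributes far less loss than a hard positive (p = 0.1), so the abundant easy negatives no longer dominate training.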

Cite this article:

刘毅, 于畅洋, 李国燕, 潘玉恒. UAST-RCNN: An object detection algorithm for occluded pedestrians [J]. Journal of Electronic Measurement and Instrumentation, 2022, 36(12): 168-175.

History
  • Online publication date: 2023-03-29