Combined Multi-Attention and C-ASPP Network for Monocular 3D Object Detection
DOI:
Author:
Affiliation:

Xi'an Polytechnic University

Author biography:

Corresponding author:

CLC number:

Fund project:

Shaanxi Provincial Department of Science and Technology project (2018GY-173); Xi'an Science and Technology Bureau project (GXYD7.5)




Abstract:

To address the complex network structures of monocular 3D detectors and the imprecise object depth obtained from depth estimation, this paper proposes CDCN-3D, an end-to-end monocular 3D object detection network that couples depth estimation with multiple attention mechanisms. First, to capture salient object features, an adaptive spatial attention mechanism aggregates pixel features, strengthening local features and improving the network's representational capacity. Second, to reduce the loss of local information during depth estimation, an improved C-ASPP module lets each depth feature capture more precise direction-aware and position-sensitive information. Finally, a precise P-BEV maps the predicted 3D object information onto a 2D plane, and a single-stage detector produces the final detections. Experiments on the KITTI dataset show that CDCN-3D outperforms other monocular 3D detection networks, improving detection accuracy on the Car, Pedestrian, and Cyclist classes by 2.31%, 1.48%, and 1.14%, respectively.
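The final stage described above, projecting estimated 3D information onto a 2D bird's-eye-view (BEV) plane for a single-stage detector, can be illustrated with a minimal NumPy sketch. The grid ranges, cell size, and the function name `project_to_bev` are illustrative assumptions, not the paper's actual P-BEV implementation:

```python
import numpy as np

def project_to_bev(points_xyz, x_range=(0.0, 70.0), y_range=(-40.0, 40.0), cell=0.5):
    """Scatter estimated 3D points into a 2D BEV occupancy grid.

    points_xyz : (N, 3) array of (x, y, z); x = forward, y = lateral.
    Returns an (H, W) grid where each cell counts the points falling inside it.
    """
    h = int((y_range[1] - y_range[0]) / cell)   # rows span the lateral axis
    w = int((x_range[1] - x_range[0]) / cell)   # cols span the forward axis
    grid = np.zeros((h, w), dtype=np.float32)
    # Keep only points inside the BEV field of view.
    m = ((points_xyz[:, 0] >= x_range[0]) & (points_xyz[:, 0] < x_range[1]) &
         (points_xyz[:, 1] >= y_range[0]) & (points_xyz[:, 1] < y_range[1]))
    pts = points_xyz[m]
    cols = ((pts[:, 0] - x_range[0]) / cell).astype(int)
    rows = ((pts[:, 1] - y_range[0]) / cell).astype(int)
    # np.add.at accumulates correctly when several points land in one cell.
    np.add.at(grid, (rows, cols), 1.0)
    return grid
```

A 2D detector can then run on such a grid (or on richer per-cell features) exactly as on an image, which is what makes the single-stage detection head in the pipeline above straightforward.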

History
  • Received: 2023-03-23
  • Revised: 2023-05-30
  • Accepted: 2023-06-02
  • Published online:
  • Publication date: