Unsupervised monocular depth estimation based on stable photometric loss

Affiliation: Key Laboratory of Advanced Process Control for Light Industry (Ministry of Education), Jiangnan University, Wuxi 214122, China

CLC Number: TN911.73

Abstract:

The photometric loss plays a central role in training video-based unsupervised monocular depth estimation models. However, it typically produces large errors in difficult regions such as weakly textured areas and object edges, which makes the supervision signal during training highly unstable. To address this problem, a more robust unsupervised monocular depth estimation method is proposed. The method first combines a dual-branch encoder with a channel attention module to strengthen the single-frame depth network's ability to extract depth features. The single-frame results are then used to guide multi-frame depth estimation, improving its accuracy. On this basis, a new photometric loss function is designed: computing the photometric loss on image gradients eliminates the spurious supervision caused by local brightness changes, and the difference between neighboring pixels is used to identify blurry pixels. Finally, a binary mask excludes the false supervision that blurry pixels in the target frame and the reconstructed target frame would otherwise introduce. On the KITTI dataset, the method improves multiple metrics, including the average relative error, the squared relative error, and the root mean square error; the average relative error and the squared relative error are reduced to 0.075 and 0.548, respectively. The experimental results show that, compared with other advanced methods, the proposed method further improves the performance of existing models.
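    The gradient-domain photometric loss and the blur mask described in the abstract can be sketched as follows. This is a minimal PyTorch illustration, not the authors' implementation: the forward-difference gradient operator, the threshold tau for declaring a pixel blurry, and all function names are assumptions made for illustration.

        import torch
        import torch.nn.functional as F

        def image_gradients(img):
            # Forward differences along x and y, zero-padded back to the input
            # resolution (assumption: the paper may use a different operator).
            gx = F.pad(img[..., :, 1:] - img[..., :, :-1], (0, 1, 0, 0))
            gy = F.pad(img[..., 1:, :] - img[..., :-1, :], (0, 0, 0, 1))
            return gx, gy

        def sharp_mask(img, tau=0.01):
            # Binary mask: 1 where a pixel differs enough from its neighbors,
            # 0 where it is treated as blurry. The threshold tau is illustrative.
            gx, gy = image_gradients(img)
            strength = (gx.abs() + gy.abs()).mean(dim=1, keepdim=True)
            return (strength > tau).float()

        def gradient_photometric_loss(target, recon, tau=0.01):
            # Photometric loss computed on image gradients rather than raw
            # intensities, so local brightness changes cancel out; blurry pixels
            # on BOTH the target and the reconstructed frame are masked out.
            tgx, tgy = image_gradients(target)
            rgx, rgy = image_gradients(recon)
            mask = sharp_mask(target, tau) * sharp_mask(recon, tau)
            diff = ((tgx - rgx).abs() + (tgy - rgy).abs()).mean(dim=1, keepdim=True)
            return (mask * diff).sum() / mask.sum().clamp(min=1.0)

        # Example: a target frame and its view-synthesized reconstruction.
        target = torch.rand(2, 3, 192, 640)
        recon = torch.rand(2, 3, 192, 640)
        loss = gradient_photometric_loss(target, recon)

    A local brightness offset shifts a frame's intensities but not its gradients, which is why this formulation is insensitive to local illumination changes, while the product of the two masks removes supervision wherever either frame is blurry.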

History
  • Online: January 13, 2025