Motion recognition based on weighted three-view motion history images coupled with temporal segmentation
CLC Number: TP391.41; TN06

Abstract:

Aiming at the poor recognition accuracy caused by trunk self-occlusion in current human motion recognition algorithms, an action recognition algorithm based on weighted three-view motion history images coupled with temporal segmentation was proposed. First, to describe the shape and spatial distribution of an action effectively, a motion history image (MHI) was extracted from the video sequence. A Kinect camera was then used to capture depth images, from which the silhouette of the human target's action foreground was obtained. To handle the self-occlusion caused by body parts, the action-foreground silhouette was projected onto three-view (3V) planes to form the 3V-MHI, which improves the reliability of action extraction. From the 3V-MHI, a template for recording and observing motion trajectories was constructed, overcoming the information limitation of a single-view MHI. Next, from adjacent 3V-MHIs, the energy and direction of motion were calculated, and temporal segmentation was used to detect the beginning and end of each motion and output the segmentation result. In addition, the gradient magnitude of the MHI on each view plane was calculated and used as that plane's weight, yielding the weighted 3V-MHI. Finally, the extracted histogram motion template was matched against a pre-established database to complete action classification and recognition. Experiments show that the method effectively solves the self-occlusion problem and achieves high accuracy and robustness for motion recognition under complex backgrounds and illumination changes.
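To make the pipeline in the abstract concrete, the following is a minimal NumPy sketch of its core computations: the standard MHI update, gradient-magnitude weights for the three view planes, and a motion-energy cue for temporal segmentation. This is an illustrative reconstruction under stated assumptions, not the paper's implementation: the function names (update_mhi, view_weight, weighted_3v_mhi, motion_energy), the parameter values, and the frame-differencing foreground mask (standing in for the paper's Kinect depth-based foreground extraction) are all hypothetical.

    import numpy as np

    def motion_mask(prev_frame, frame, thresh=25):
        # Binary motion mask from simple frame differencing.
        # (Stand-in for the depth-based foreground extraction in the paper.)
        diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
        return diff > thresh

    def update_mhi(mhi, mask, tau=30.0, delta=1.0):
        # Standard MHI update: moving pixels are stamped with tau,
        # all other pixels decay by delta toward zero.
        return np.where(mask, tau, np.maximum(mhi - delta, 0.0))

    def view_weight(mhi):
        # Weight for one view plane: mean gradient magnitude of its MHI,
        # so planes with richer motion gradients contribute more.
        gy, gx = np.gradient(mhi)
        return float(np.mean(np.hypot(gx, gy)))

    def weighted_3v_mhi(mhis):
        # Combine the front/side/top MHIs into a weighted 3V-MHI,
        # normalizing the per-plane gradient weights to sum to one.
        w = np.array([view_weight(m) for m in mhis])
        w = w / w.sum() if w.sum() > 0 else np.full(len(mhis), 1.0 / len(mhis))
        return [wi * m for wi, m in zip(w, mhis)]

    def motion_energy(mhi_prev, mhi_curr):
        # Mean absolute change between adjacent MHIs; thresholding this
        # over time is one way to detect the start and end of an action.
        return float(np.mean(np.abs(mhi_curr - mhi_prev)))

In use, each incoming frame would update the three per-view MHIs, the motion energy would be tracked over time to segment actions, and the weighted 3V-MHI of a completed segment would be converted into the histogram template matched against the pre-established database.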

History
  • Online: November 20, 2023