VIDEO FRAME PROCESSING FOR MOTION COMPARISON

    Publication No.: EP3376437A1

    Publication date: 2018-09-19

    Application No.: EP18164976.5

    Filing date: 2017-07-13

    IPC classification: G06K9/00 G06T7/269 G06K9/62

    Abstract: A method is disclosed for processing a sequence of video frames showing motion of a subject in order to compare that motion with a reference motion. The method comprises storing at least one reference motion data frame defining a reference motion. Each reference motion data frame corresponds to respective first and second reference video frames in a sequence of video frames showing the reference motion, and comprises a plurality of optical flow vectors, each optical flow vector corresponding to a respective area segment defined in the first reference video frame and a corresponding area segment defined in the second reference video frame and defining the optical flow between those two area segments. The method further comprises receiving a sequence of video frames to be processed and processing at least one pair of the received video frames to generate a motion data frame defining motion of a subject between that pair. Each processed pair of received video frames is processed by: for each area segment of the reference video frames, determining a corresponding area segment in the first video frame of the pair and a corresponding area segment in the second video frame of the pair; for each determined pair of corresponding area segments, comparing the area segments and generating an optical flow vector defining the optical flow between them; and generating a motion data frame for the pair of received video frames, the motion data frame comprising the optical flow vectors generated for the determined pairs of corresponding area segments. The method further comprises comparing the at least one reference motion data frame defining the reference motion with the at least one generated motion data frame defining the motion of the subject, and generating a similarity metric for the motion of the subject and the reference motion.
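
    The per-segment motion data frame described in the abstract can be illustrated with a short Python sketch. The sketch below is not taken from the patent: the 8x8 segment grid, the exhaustive block-matching flow estimator, and the function name motion_data_frame are assumptions for illustration; the patent leaves the choice of optical flow estimator open.

        import numpy as np

        def motion_data_frame(frame_a, frame_b, grid=(8, 8), search=4):
            # Return a grid of per-segment flow vectors (dy, dx) between two
            # greyscale frames, estimated here by exhaustive block matching.
            # The same segment grid is assumed for the reference frames, so
            # generated and reference motion data frames line up segment by
            # segment.
            h, w = frame_a.shape
            gh, gw = grid
            sh, sw = h // gh, w // gw                     # segment size
            vectors = np.zeros((gh, gw, 2), dtype=np.float32)
            for i in range(gh):
                for j in range(gw):
                    y0, x0 = i * sh, j * sw
                    block = frame_a[y0:y0 + sh, x0:x0 + sw].astype(np.float32)
                    best, best_err = (0.0, 0.0), np.inf
                    for dy in range(-search, search + 1):
                        for dx in range(-search, search + 1):
                            y1, x1 = y0 + dy, x0 + dx
                            if y1 < 0 or x1 < 0 or y1 + sh > h or x1 + sw > w:
                                continue
                            cand = frame_b[y1:y1 + sh, x1:x1 + sw].astype(np.float32)
                            err = float(np.mean((block - cand) ** 2))
                            if err < best_err:
                                best, best_err = (dy, dx), err
                    vectors[i, j] = best
            return vectors

        # Build one motion data frame per consecutive pair of received frames.
        frames = [np.random.randint(0, 256, (64, 64), dtype=np.uint8) for _ in range(3)]
        motion_data = [motion_data_frame(a, b) for a, b in zip(frames, frames[1:])]

    Because the segments are defined once and reused for both the reference and the received frames, the two sets of motion data frames can later be compared vector by vector.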

    VIDEO FRAME PROCESSING FOR MOTION COMPARISON

    Publication No.: EP3376436A1

    Publication date: 2018-09-19

    Application No.: EP18164970.8

    Filing date: 2017-07-13

    IPC classification: G06K9/00 G06T7/269 G06K9/62

    Abstract: A method is disclosed for processing a sequence of video frames showing motion of a subject in order to compare that motion with a reference motion. The method comprises storing at least one reference motion data frame defining a reference motion. Each reference motion data frame corresponds to respective first and second reference video frames in a sequence of video frames showing the reference motion, and comprises a plurality of optical flow vectors, each optical flow vector corresponding to a respective area segment defined in the first reference video frame and a corresponding area segment defined in the second reference video frame and defining the optical flow between those two area segments. The method further comprises receiving a sequence of video frames to be processed and processing at least one pair of the received video frames to generate a motion data frame defining motion of a subject between that pair. Each processed pair of received video frames is processed by: for each area segment of the reference video frames, determining a corresponding area segment in the first video frame of the pair and a corresponding area segment in the second video frame of the pair; for each determined pair of corresponding area segments, comparing the area segments and generating an optical flow vector defining the optical flow between them; and generating a motion data frame for the pair of received video frames, the motion data frame comprising the optical flow vectors generated for the determined pairs of corresponding area segments. The method further comprises comparing the at least one reference motion data frame defining the reference motion with the at least one generated motion data frame defining the motion of the subject, and generating a similarity metric for the motion of the subject and the reference motion.
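
    The final step, generating a similarity metric between the reference motion data frames and the generated ones, could look like the Python sketch below. It is not taken from the patent: the choice of mean cosine similarity, the zero-motion handling, and the simple frame-by-frame alignment are assumptions for illustration.

        import numpy as np

        def frame_similarity(ref_mdf, gen_mdf, eps=1e-6):
            # Mean cosine similarity between corresponding per-segment optical
            # flow vectors of a reference motion data frame and a generated
            # one.  Segments where both vectors are (near) zero are treated as
            # a perfect match: both motions agree that nothing moved there.
            ref = ref_mdf.reshape(-1, 2).astype(np.float64)
            gen = gen_mdf.reshape(-1, 2).astype(np.float64)
            ref_n = np.linalg.norm(ref, axis=1)
            gen_n = np.linalg.norm(gen, axis=1)
            cos = np.sum(ref * gen, axis=1) / np.maximum(ref_n * gen_n, eps)
            cos[(ref_n < eps) & (gen_n < eps)] = 1.0
            return float(np.mean(cos))

        def sequence_similarity(ref_mdfs, gen_mdfs):
            # Similarity metric for the whole motion: average the per-frame
            # similarity over the frames the two sequences have in common.
            # A real system might first align the sequences (e.g. by
            # resampling or dynamic time warping) before averaging.
            n = min(len(ref_mdfs), len(gen_mdfs))
            scores = [frame_similarity(r, g) for r, g in zip(ref_mdfs[:n], gen_mdfs[:n])]
            return sum(scores) / n

        # Example: 1.0 means the per-segment flow directions match exactly;
        # values near 0 (or negative) indicate dissimilar motion.
        ref = [np.ones((8, 8, 2)) for _ in range(4)]
        gen = [np.ones((8, 8, 2)) * 0.5 for _ in range(4)]
        print(sequence_similarity(ref, gen))   # -> 1.0 (same direction, different speed)

    Cosine similarity scores only the direction of motion in each segment; a metric that should also penalise differences in speed would need to incorporate vector magnitudes as well.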