PANOPTIC SEGMENTATION FORECASTING FOR AUGMENTED REALITY
ABSTRACT:
Panoptic segmentation forecasting predicts future positions of foreground objects and background objects separately. An egomotion model may be implemented to estimate egomotion of the camera. Pixels in frames of captured video are classified as either foreground or background. The foreground pixels are grouped into foreground objects. A foreground motion model forecasts motion of the foreground objects to a future timestamp. A background motion model backprojects the background pixels into point clouds in a three-dimensional space. The background motion model predicts future positions of the point clouds based on the estimated egomotion. The background motion model may further generate novel point clouds to fill in occluded space. With the predicted future positions, the foreground objects and the background pixels are combined into a single panoptic segmentation forecast. An augmented reality mobile game may utilize the panoptic segmentation forecast to accurately portray movement of virtual elements in relation to the real-world environment.
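The two-branch pipeline described above can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the function names, the constant-velocity foreground model, and the rigid-transform egomotion model for the background are all assumptions introduced here for clarity.

```python
import numpy as np

# Hypothetical sketch of the forecasting pipeline: foreground objects and
# background points are forecast separately, then combined into one output.

def forecast_foreground(positions, velocities, dt):
    """Assumed constant-velocity forecast of foreground object centroids."""
    return positions + velocities * dt

def forecast_background(points, egomotion, dt):
    """Move backprojected background points by a rigid camera-egomotion model.

    egomotion: (R, t) with R a 3x3 rotation and t a translation per unit time.
    """
    R, t = egomotion
    return points @ R.T + t * dt

def panoptic_forecast(fg_pos, fg_vel, bg_points, egomotion, dt):
    """Combine the separately forecast foreground and background branches."""
    return {
        "foreground": forecast_foreground(fg_pos, fg_vel, dt),
        "background": forecast_background(bg_points, egomotion, dt),
    }

# Example: one foreground object moving along +x; camera translating along +z.
fg_pos = np.array([[1.0, 0.0, 5.0]])
fg_vel = np.array([[0.5, 0.0, 0.0]])
bg_pts = np.array([[0.0, 0.0, 10.0], [2.0, 1.0, 8.0]])
ego = (np.eye(3), np.array([0.0, 0.0, 1.0]))

out = panoptic_forecast(fg_pos, fg_vel, bg_pts, ego, dt=2.0)
# Foreground advances by its velocity; background shifts by the egomotion.
```

Occlusion filling (generating novel point clouds for newly revealed space) is omitted here, since the abstract does not specify how those points are synthesized.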