High Fidelity Canonical Texture Mapping from Single-View Images

    Publication Number: US20240428500A1

    Publication Date: 2024-12-26

    Application Number: US18338060

    Filing Date: 2023-06-20

    Applicant: Google LLC

    Abstract: Provided are systems and methods for creating 3D representations from one or more images of objects. The approach trains a machine-learned correspondence network that maps the 3D locations of pixels into a 2D canonical coordinate space. This network can map texture values from ground-truth or synthetic images of the object into the 2D space, creating a texture data set. When a new synthetic image is generated from a specific pose, the 3D locations of its pixels can be mapped into the 2D space, allowing texture values to be retrieved and applied to the new image. The system also enables users to edit the texture data, facilitating texture edits and transfers across objects.
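To make the texture-mapping pipeline in the abstract concrete, here is a minimal NumPy sketch of the two phases it describes: splatting observed pixel colors into a 2D canonical space, then retrieving them for a new view. The patent's correspondence network is learned; the analytic spherical unwrapping below (`correspondence_uv`) is a hypothetical stand-in used only to illustrate the interface, and all function names are assumptions, not the patent's terminology.

```python
import numpy as np

def correspondence_uv(points_3d):
    # Hypothetical stand-in for the learned correspondence network:
    # maps 3D surface points (here, on a unit sphere) to 2D canonical
    # (u, v) coordinates via spherical unwrapping.
    x, y, z = points_3d[:, 0], points_3d[:, 1], points_3d[:, 2]
    u = np.arctan2(y, x) / (2 * np.pi) + 0.5          # longitude -> [0, 1)
    v = np.arcsin(np.clip(z, -1, 1)) / np.pi + 0.5    # latitude  -> [0, 1]
    return np.stack([u, v], axis=1)

def splat_texture(points_3d, colors, res=64):
    # Build the canonical texture data set: write each observed pixel's
    # color into the 2D canonical space at its mapped (u, v) location.
    uv = correspondence_uv(points_3d)
    tex = np.zeros((res, res, 3))
    idx = np.clip((uv * res).astype(int), 0, res - 1)
    tex[idx[:, 1], idx[:, 0]] = colors
    return tex

def lookup_texture(tex, points_3d):
    # For a new synthetic view: map its visible 3D points into the
    # canonical space and retrieve the stored texture values.
    res = tex.shape[0]
    uv = correspondence_uv(points_3d)
    idx = np.clip((uv * res).astype(int), 0, res - 1)
    return tex[idx[:, 1], idx[:, 0]]
```

Because edits land in the shared canonical space, modifying `tex` directly (or swapping in another object's texture) propagates to every rendered pose, which is the mechanism behind the texture editing and transfer the abstract mentions.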
