1.
Publication number: US11657527B2
Publication date: 2023-05-23
Application number: US16424363
Filing date: 2019-05-28
Applicant: X Development LLC
Inventor: Yunfei Bai, Yuanzheng Gong
IPC: H04N5/268, H04N5/247, H04N5/232, H04N13/111, H04N13/161, H04N13/282, B25J9/16, G06T7/12, G06T7/521, G06T7/13, G06K9/62, G06T5/00
CPC classification number: B25J9/1697, B25J9/1661, G06K9/6267, G06T5/008, G06T7/12, G06T7/13, G06T7/521, H04N5/247, G06T2207/10024, G06T2207/10028, G06T2207/10048
Abstract: Generating edge-depth values for an object, utilizing the edge-depth values in generating a 3D point cloud for the object, and utilizing the generated 3D point cloud for generating a 3D bounding shape (e.g., a 3D bounding box) for the object. Edge-depth values for an object are depth values that are determined from frame(s) of vision data (e.g., left/right images) that capture the object, and that are determined to correspond to an edge of the object (an edge from the perspective of the frame(s) of vision data). Techniques that utilize edge-depth values for an object (exclusively, or in combination with other depth values for the object) in generating 3D bounding shapes can enable accurate 3D bounding shapes to be generated for partially or fully transparent objects. The increased accuracy of such 3D bounding shapes directly improves the performance of a robot that utilizes them in performing various tasks.
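A minimal sketch of the final step the abstract describes: once edge-depth values have been back-projected into 3D points, an axis-aligned 3D bounding box can be taken as the per-axis extremes of those points. This is an illustrative simplification, not the patent's method; the function and variable names are hypothetical.

```python
def edge_points_to_bounding_box(edge_points):
    """Given an iterable of (x, y, z) points back-projected from
    edge-depth values, return (min_corner, max_corner) of the
    axis-aligned 3D bounding box enclosing them."""
    xs, ys, zs = zip(*edge_points)  # split coordinates per axis
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

# Hypothetical edge points, e.g. sampled along a transparent cup's
# rim and base, where non-edge depth readings would be unreliable.
points = [(0.10, 0.00, 0.50), (0.14, 0.00, 0.50),
          (0.10, 0.09, 0.52), (0.14, 0.09, 0.52)]
lo, hi = edge_points_to_bounding_box(points)
```

Because only edge points enter the computation, unreliable depth readings from the transparent interior of the object never distort the box.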
2.
Publication number: US20200376675A1
Publication date: 2020-12-03
Application number: US16424363
Filing date: 2019-05-28
Applicant: X Development LLC
Inventor: Yunfei Bai, Yuanzheng Gong
Abstract: Generating edge-depth values for an object, utilizing the edge-depth values in generating a 3D point cloud for the object, and utilizing the generated 3D point cloud for generating a 3D bounding shape (e.g., a 3D bounding box) for the object. Edge-depth values for an object are depth values that are determined from frame(s) of vision data (e.g., left/right images) that capture the object, and that are determined to correspond to an edge of the object (an edge from the perspective of the frame(s) of vision data). Techniques that utilize edge-depth values for an object (exclusively, or in combination with other depth values for the object) in generating 3D bounding shapes can enable accurate 3D bounding shapes to be generated for partially or fully transparent objects. The increased accuracy of such 3D bounding shapes directly improves the performance of a robot that utilizes them in performing various tasks.
3.
Publication number: US10592552B1
Publication date: 2020-03-17
Application number: US16374836
Filing date: 2019-04-04
Applicant: X Development LLC
Inventor: Emily Cooper, David Deephanphongs, Yuanzheng Gong, Thomas Buschmann, Matthieu Guilbert
Abstract: Methods, apparatus, systems, and computer-readable media for assigning a real-time clock domain timestamp to sensor frames from a sensor component that operates in a non-real-time time domain. In some implementations, a real-time component receives capture output instances that each indicate capturing of a corresponding sensor data frame by the sensor component. In response to a capture output instance, the real-time component or an additional real-time component assigns a real-time timestamp to the capture output instance, where the real-time timestamp is based on the real-time clock domain. Separately, a non-real-time component receives the corresponding sensor data frames captured by the sensor component, along with corresponding metadata. For each sensor data frame, it is determined whether there is a real-time timestamp that corresponds to the sensor data frame and, if so, the real-time timestamp is assigned to the sensor data frame.
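The matching step the abstract describes can be sketched as a lookup: the real-time component records a real-time-clock timestamp per capture event, and frames arriving later over the non-real-time path are matched to those timestamps via metadata such as a frame sequence number. This is an illustrative sketch under that assumption, not the patent's implementation; all names are hypothetical.

```python
def assign_realtime_timestamps(capture_events, frames):
    """capture_events: dict mapping frame sequence number -> real-time
    timestamp recorded when the capture event was observed.
    frames: list of dicts, each with a 'seq' metadata field and payload.
    Returns frames annotated with 'rt_timestamp' where a match exists."""
    annotated = []
    for frame in frames:
        ts = capture_events.get(frame["seq"])  # match by sequence number
        if ts is not None:
            frame = {**frame, "rt_timestamp": ts}  # assign real-time stamp
        annotated.append(frame)
    return annotated

# Capture events logged by the real-time component (seq -> timestamp, ms).
events = {0: 1000.0, 1: 1033.4, 2: 1066.7}
# Frames delivered via the non-real-time path; seq 5 has no capture event.
frames = [{"seq": 0, "data": "..."},
          {"seq": 2, "data": "..."},
          {"seq": 5, "data": "..."}]
result = assign_realtime_timestamps(events, frames)
```

Frames without a matching capture event are passed through unannotated, mirroring the abstract's "if so" condition.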
4.
Publication number: US10296602B1
Publication date: 2019-05-21
Application number: US15490711
Filing date: 2017-04-18
Applicant: X Development LLC
Inventor: Emily Cooper, Matthieu Guilbert, Thomas Buschmann, David Deephanphongs, Yuanzheng Gong
Abstract: Methods, apparatus, systems, and computer-readable media for assigning a real-time clock domain timestamp to sensor frames from a sensor component that operates in a non-real-time time domain. In some implementations, a real-time component receives capture output instances that each indicate capturing of a corresponding sensor data frame by the sensor component. In response to a capture output instance, the real-time component or an additional real-time component assigns a real-time timestamp to the capture output instance, where the real-time timestamp is based on the real-time clock domain. Separately, a non-real-time component receives the corresponding sensor data frames captured by the sensor component, along with corresponding metadata. For each sensor data frame, it is determined whether there is a real-time timestamp that corresponds to the sensor data frame and, if so, the real-time timestamp is assigned to the sensor data frame.