DETERMINING AND UTILIZING CORRECTIONS TO ROBOT ACTIONS

    Publication Number: US20190001489A1

    Publication Date: 2019-01-03

    Application Number: US15640914

    Filing Date: 2017-07-03

    Abstract: Methods, apparatus, and computer-readable media for determining and utilizing human corrections to robot actions. In some implementations, in response to determining a human correction of a robot action, a correction instance is generated that includes sensor data, captured by one or more sensors of the robot, that is relevant to the corrected action. The correction instance can further include determined incorrect parameter(s) utilized in performing the robot action and/or correction information that is based on the human correction. The correction instance can be utilized to generate training example(s) for training one or more model(s), such as neural network model(s), corresponding to those used in determining the incorrect parameter(s). In various implementations, the training is based on correction instances from multiple robots. After a revised version of a model is generated, the revised version can thereafter be utilized by one or more of the multiple robots.
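
    The abstract describes a data flow in which human corrections are packaged as correction instances, converted into training examples, and used to retrain a model that is then redistributed to a fleet of robots. The following is a minimal Python sketch of that flow under stated assumptions: the schema, the names CorrectionInstance, to_training_examples, and retrain_from_fleet, and the model.fit call are hypothetical illustrations, not details taken from the patent.

```python
from dataclasses import dataclass
from typing import Any, Dict, List

@dataclass
class CorrectionInstance:
    """Data captured when a human corrects a robot action (hypothetical schema)."""
    sensor_data: Dict[str, Any]           # sensor data relevant to the corrected action
    incorrect_parameters: Dict[str, Any]  # parameter(s) the robot used that proved incorrect
    correction_info: Dict[str, Any]       # information derived from the human correction

def to_training_examples(instance: CorrectionInstance) -> List[Dict[str, Any]]:
    """Turn a correction instance into supervised training example(s):
    the stored sensor data is the model input, the correction supplies the target."""
    return [{"input": instance.sensor_data, "target": instance.correction_info}]

def retrain_from_fleet(instances: List[CorrectionInstance], model: Any) -> Any:
    """Aggregate correction instances gathered from multiple robots, retrain the
    model, and return the revised version for redistribution to the fleet.
    `model.fit` is a stand-in for whatever training routine is actually used."""
    examples = [ex for inst in instances for ex in to_training_examples(inst)]
    model.fit([e["input"] for e in examples], [e["target"] for e in examples])
    return model
```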

    Generating and utilizing spatial affordances for an object in robotics applications

    Publication Number: US10853646B1

    Publication Date: 2020-12-01

    Application Number: US16453145

    Filing Date: 2019-06-26

    Abstract: Methods, apparatus, systems, and computer-readable media are provided for generating spatial affordances for an object, in an environment of a robot, and utilizing the generated spatial affordances in one or more robotics applications directed to the object. Various implementations relate to applying vision data as input to a trained machine learning model, processing the vision data using the trained machine learning model to generate output defining one or more spatial affordances for an object captured by the vision data, and controlling one or more actuators of a robot based on the generated output. Various implementations additionally or alternatively relate to training such a machine learning model.
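
    The abstract describes processing vision data with a trained machine learning model to produce spatial affordances for an object, then controlling a robot's actuators based on that output. Below is a minimal Python sketch of such a pipeline; AffordanceModel, select_grasp_point, and the interpretation of the output as a grasp-point score map are illustrative assumptions, not details from the patent.

```python
import numpy as np

class AffordanceModel:
    """Stand-in for a trained neural network that maps vision data to spatial
    affordance scores (e.g. where an object could be grasped)."""
    def predict(self, vision_data: np.ndarray) -> np.ndarray:
        # A real model would run a forward pass; here we return a random
        # score map with the spatial resolution of the input frame.
        return np.random.rand(*vision_data.shape[:2])

def select_grasp_point(vision_data: np.ndarray, model: AffordanceModel) -> tuple:
    """Pick the highest-scoring affordance location as a grasp target.
    A robot controller would map this image coordinate to a workspace pose
    and command the arm's actuators accordingly (not shown here)."""
    scores = model.predict(vision_data)
    row, col = np.unravel_index(np.argmax(scores), scores.shape)
    return int(row), int(col)

if __name__ == "__main__":
    frame = np.zeros((480, 640, 3), dtype=np.uint8)  # placeholder camera frame
    print(select_grasp_point(frame, AffordanceModel()))
```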

    Determining and utilizing corrections to robot actions

    Publication Number: US10562181B2

    Publication Date: 2020-02-18

    Application Number: US15640914

    Filing Date: 2017-07-03

    Abstract: Methods, apparatus, and computer-readable media for determining and utilizing human corrections to robot actions. In some implementations, in response to determining a human correction of a robot action, a correction instance is generated that includes sensor data, captured by one or more sensors of the robot, that is relevant to the corrected action. The correction instance can further include determined incorrect parameter(s) utilized in performing the robot action and/or correction information that is based on the human correction. The correction instance can be utilized to generate training example(s) for training one or more model(s), such as neural network model(s), corresponding to those used in determining the incorrect parameter(s). In various implementations, the training is based on correction instances from multiple robots. After a revised version of a model is generated, the revised version can thereafter be utilized by one or more of the multiple robots.

    Generating and utilizing spatial affordances for an object in robotics applications

    Publication Number: US10354139B1

    Publication Date: 2019-07-16

    Application Number: US15724067

    Filing Date: 2017-10-03

    Abstract: Methods, apparatus, systems, and computer-readable media are provided for generating spatial affordances for an object, in an environment of a robot, and utilizing the generated spatial affordances in one or more robotics applications directed to the object. Various implementations relate to applying vision data as input to a trained machine learning model, processing the vision data using the trained machine learning model to generate output defining one or more spatial affordances for an object captured by the vision data, and controlling one or more actuators of a robot based on the generated output. Various implementations additionally or alternatively relate to training such a machine learning model.
