Abstract:
A device can receive live video of a real-world, physical environment on a touch sensitive surface. One or more objects can be identified in the live video. An information layer can be generated related to the objects. In some implementations, the information layer can include annotations made by a user through the touch sensitive surface. The information layer and live video can be combined in a display of the device. Data can be received from one or more onboard sensors indicating that the device is in motion. The sensor data can be used to synchronize the live video and the information layer as the perspective of the video camera view changes due to the motion. The live video and information layer can be shared with other devices over a communication link.
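The synchronization step described above can be sketched in code. This is a minimal, hypothetical illustration, not the patented implementation: it assumes a simple pinhole camera model and small rotations reported by an onboard gyroscope, and all names and the focal-length value are illustrative.

```python
# Hypothetical sketch: keep a screen-space annotation aligned with its
# object as the device rotates, using onboard sensor (gyroscope) deltas.
# Assumes small rotations and a pinhole camera model; values illustrative.

import math

def pixel_shift(yaw_delta_rad, pitch_delta_rad, focal_px):
    """Approximate screen-space shift caused by a small camera rotation."""
    dx = focal_px * math.tan(yaw_delta_rad)
    dy = focal_px * math.tan(pitch_delta_rad)
    return dx, dy

def resync_annotation(annotation_xy, yaw_delta_rad, pitch_delta_rad,
                      focal_px=1000.0):
    """Move an annotation so it stays over its object after device motion."""
    dx, dy = pixel_shift(yaw_delta_rad, pitch_delta_rad, focal_px)
    x, y = annotation_xy
    # When the camera pans right, the scene (and its annotation)
    # appears to move left on screen, so subtract the shift.
    return (x - dx, y - dy)
```

For example, a pan of 0.01 rad with a 1000 px focal length shifts an annotation by roughly 10 px; applying such corrections each frame keeps the information layer registered to the live video.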
Abstract:
Various embodiments of a wirelessly powered local computing environment are described. The wirelessly powered local computing environment includes at least a near field magnetic resonance (NFMR) power supply arranged to wirelessly provide power to any of a number of suitably configured devices. In the described embodiments, the devices arranged to receive power wirelessly from the NFMR power supply must be located in a region known as the near field, which extends no further than a distance D of a few times a characteristic size of the NFMR power supply transmission device. Typically, the distance D can be on the order of 1 meter or so.
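The sizing rule above can be expressed as a short calculation. This is an illustrative sketch of the stated relationship only; the multiplier of 3 is an assumption standing in for "a few times", not a value given in the text.

```python
# Hypothetical sketch of the near-field sizing rule: a receiver must sit
# within a distance D of a few times the characteristic size of the NFMR
# transmission device. The factor of 3 is an illustrative assumption.

def near_field_extent_m(characteristic_size_m, factor=3.0):
    """Estimate how far the usable near field extends from the transmitter."""
    return factor * characteristic_size_m
```

A resonator with a characteristic size of roughly 0.3 m would then give an extent of about 0.9 m, consistent with the abstract's statement that D is on the order of 1 meter.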