Abstract:
In one implementation, a computer program product can be tangibly embodied on a non-transitory computer-readable storage medium and include instructions that, when executed, are configured to detect a gesture defined by an interaction of a user within a working volume defined above a surface. Based on the detected gesture, a gesture cursor control mode can be initiated within the computing device such that the user can manipulate the cursor by moving a portion of the hand of the user within the working volume. A location of the portion of the hand of the user relative to the surface can be identified within the working volume and a cursor can be positioned within a display portion of the computing device at a location corresponding to the identified location of the portion of the hand of the user within the working volume.
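The mapping from a tracked hand position within the working volume to a cursor position on the display can be sketched as a simple scale-and-clamp. This is an illustrative assumption, not the claimed implementation; all function and parameter names are invented for demonstration.

```python
def map_to_cursor(hand_x, hand_y, volume_w, volume_d, screen_w, screen_h):
    """Scale an (x, y) hand position within the working volume's footprint
    (width volume_w, depth volume_d, in meters) to pixel coordinates on a
    screen_w x screen_h display, clamping to the screen bounds."""
    px = max(0, min(screen_w - 1, round(hand_x / volume_w * (screen_w - 1))))
    py = max(0, min(screen_h - 1, round(hand_y / volume_d * (screen_h - 1))))
    return px, py
```

For example, a fingertip at the far-right edge of a 0.3 m wide working volume would map to the right edge of a 1920-pixel-wide display.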
Abstract:
A method to provide simultaneous interaction with content while not disturbing the content being provided is disclosed. Content may be provided to a group of users. At least one of the users may make a gesture. The gesture may be associated with a user identifier and with a content identifier. An event may be stored based on the gesture from the at least one of the users, the user identifier, and the content identifier. The event may be selected from the group consisting of: a vote, a purchase decision, a modification of content, an adjustment of a device setting, or a bookmark. A notice may be provided to the at least one user to indicate that the action requested by the gesture was performed.
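The event-storage step above, keying a gesture-triggered event by user identifier and content identifier, can be sketched as follows. The event-type names and record fields are assumptions for illustration only.

```python
# Event types listed in the abstract: vote, purchase decision, modification
# of content, adjustment of a device setting, bookmark.
ALLOWED_EVENTS = {"vote", "purchase", "modify", "adjust_setting", "bookmark"}

def store_event(event_log, event_type, user_id, content_id):
    """Validate the event type, append an event record associating the
    gesture with its user and content identifiers, and return the record."""
    if event_type not in ALLOWED_EVENTS:
        raise ValueError(f"unsupported event type: {event_type}")
    record = {"event": event_type, "user": user_id, "content": content_id}
    event_log.append(record)
    return record
```

A notice back to the user (the final step of the method) could then be driven off the returned record.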
Abstract:
Aspects of the present disclosure relate to controlling the functions of various devices based on spatial relationships. In one example, a system may include a depth and visual camera and a computer (networked or local) for processing data from the camera. The computer may be connected (wired or wirelessly) to any number of devices that can be controlled by the system. A user may use a mobile device to define a location in space relative to the camera. The location in space may then be associated with a controlled device as well as one or more control commands. When the location in space is subsequently occupied, the one or more control commands may be used to control the controlled device. In this regard, a user may switch a device on or off, increase volume or speed, etc. simply by occupying the location in space.
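The association between a location in space and a controlled device can be sketched as a lookup over user-defined regions: a binding maps a region in camera space to a device and a command, and occupancy of the region dispatches the command. The spherical region geometry and all names here are illustrative assumptions.

```python
import math

def define_location(bindings, center, radius, device, command):
    """Associate a spherical region (center, radius) in camera coordinates
    with a controlled device and a control command."""
    bindings.append({"center": center, "radius": radius,
                     "device": device, "command": command})

def commands_for_point(bindings, point):
    """Return (device, command) pairs for every region the point occupies."""
    hits = []
    for b in bindings:
        if math.dist(point, b["center"]) <= b["radius"]:
            hits.append((b["device"], b["command"]))
    return hits
```

A user stepping into the defined region would thus, for example, toggle a lamp on or off.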
Abstract:
A function of a device, such as volume, may be controlled using a combination of gesture recognition and an interpolation scheme. A distance between two objects, such as a user's hands, may be determined at a first time point and a second time point. The difference between the distances calculated at the two time points may be mapped onto a plot of determined difference versus a value of the function to set the function of the device to the mapped value.
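The interpolation scheme above can be sketched as a linear map from the change in hand separation onto the function's value range, clamped at the endpoints. The ranges and names here are illustrative assumptions, not the disclosed plot.

```python
def map_distance_delta(d1, d2, delta_range, value_range):
    """Linearly map the change in distance (d2 - d1), measured between two
    time points, from delta_range onto value_range, clamping at the ends."""
    lo_d, hi_d = delta_range
    lo_v, hi_v = value_range
    t = (d2 - d1 - lo_d) / (hi_d - lo_d)   # normalized position in [0, 1]
    t = max(0.0, min(1.0, t))
    return lo_v + t * (hi_v - lo_v)
```

Spreading the hands 4 cm apart within a ±5 cm sensitivity window would, for instance, map onto a volume value near the top of a 0-100 scale.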
Abstract:
A privacy indicator is provided that shows whether sensor data are being processed in a private or non-private mode. When sensor data are used only for controlling a device locally, it may be in a private mode, which may be shown by setting the privacy indicator to a first color. When sensor data are being sent to a remote site, it may be in a non-private mode, which may be shown by setting the privacy indicator to a second color. The privacy mode may be determined by processing a command in accordance with a privacy policy of determining if the command is on a privacy whitelist, blacklist, greylist or is not present in a privacy command library. A non-private command may be blocked.
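The privacy-policy check described above can be sketched as a classification of each command against the whitelist, blacklist, greylist, and command library. The list contents and return labels are assumptions for illustration.

```python
def classify_command(cmd, whitelist, blacklist, greylist, library):
    """Classify a sensor-driven command under the sketched privacy policy."""
    if cmd in whitelist:
        return "private"       # processed locally; indicator shows first color
    if cmd in blacklist:
        return "blocked"       # non-private command that is never sent
    if cmd in greylist:
        return "review"        # deferred to an explicit user decision
    if cmd not in library:
        return "blocked"       # not in the command library; fail closed
    return "non-private"       # sent to a remote site; second indicator color
```

The indicator color would then follow directly from the classification.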
Abstract:
In one general aspect, a method can include receiving, by a first computing device from a virtual reality (VR) headset, data indicative of a position of a second computing device, rendering, by the first computing device, an aspect of the second computing device for inclusion in a VR space based on the position of the second computing device, and integrating the rendered aspect of the second computing device with content for display as integrated content in the VR space. The method can further include providing the integrated content to the VR headset for display on a screen included in the VR headset, receiving data indicative of an interaction of a user with the second computing device, and based on the received data indicative of the interaction of the user with the second computing device, altering the content for display as integrated content in the VR space.
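The dataflow of the method above can be sketched in two steps: rendering an aspect of the second computing device at its reported position and integrating it with the VR content, then altering the content when an interaction arrives. All structures and field names are illustrative assumptions.

```python
def integrate_frame(position, content):
    """Render a placeholder aspect of the second device at the reported
    position and merge it with the VR content for display."""
    aspect = {"kind": "rendered_device", "position": position}
    return {"content": content, "overlays": [aspect]}

def handle_interaction(integrated, interaction):
    """Alter the displayed content based on a reported user interaction
    with the second computing device."""
    updated = dict(integrated)
    updated["content"] = f"{integrated['content']}+{interaction}"
    return updated
```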
Abstract:
Systems, methods, and media for causing an action to be performed on a user device are provided. In some implementations, the systems comprise: a first user device comprising at least one hardware processor that is configured to: detect a second user device in proximity to the first user device; receive a user input indicative of an action to be performed; determine a plurality of candidate devices that are capable of performing the action, wherein the plurality of candidate devices includes the second user device; determine a plurality of device types corresponding to the plurality of candidate devices; determine a plurality of priorities associated with the plurality of candidate devices based at least in part on the plurality of device types; select a target device from the plurality of candidate devices based at least in part on the plurality of priorities; and cause the action to be performed by the target device.
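The selection step above, choosing a target device from the candidates based on priorities derived from device types, can be sketched as a keyed maximum. The type-to-priority table is an illustrative assumption.

```python
TYPE_PRIORITY = {"tv": 3, "speaker": 2, "phone": 1}  # higher value wins

def select_target(candidates):
    """Pick the candidate whose device type carries the highest priority.

    `candidates` is a list of (device_name, device_type) pairs; unknown
    device types default to the lowest priority."""
    if not candidates:
        raise ValueError("no candidate devices")
    return max(candidates, key=lambda c: TYPE_PRIORITY.get(c[1], 0))[0]
```

A "play video" request detected near both a speaker and a TV would, under this table, be routed to the TV.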
Abstract:
Described is a system and technique for providing the ability for a user to interact with one or more devices by performing gestures that mimic real-world physical analogies. More specifically, the techniques described herein provide the ability for a user to interact with a device while limiting conscious gesturing for a computer component by camouflaging computer-recognizable gestures within manipulations of a physical object.