Abstract:
Systems and methods for proximity detection between electronic devices are disclosed. A first electronic device and a second electronic device transmit signals to a proximity server, which determines whether the first electronic device may be proximate the second electronic device. The proximity server transmits a signal to the first electronic device and the second electronic device, and in response to the signal, each of the first and second electronic devices activates an environmental sensor, collects at least one sample of environmental data, extracts at least one feature set from the environmental data, generates an obscured feature set from the feature set, and transmits the obscured feature set to the proximity server. The proximity server uses the first obscured feature set and the second obscured feature set to determine whether the first electronic device and the second electronic device are proximate.
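The obscured-feature comparison described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the salted-hash obscuring scheme, the quantization step, and the overlap threshold are all assumptions chosen for clarity.

```python
import hashlib

def obscure_features(features, salt, step=0.5):
    """Quantize each feature and hash it with a shared session salt, so the
    proximity server receives only obscured values, never raw sensor data."""
    return {hashlib.sha256(f"{salt}:{i}:{round(v / step)}".encode()).hexdigest()
            for i, v in enumerate(features)}

def are_proximate(obscured_a, obscured_b, threshold=0.6):
    """Server-side check: two devices sensing the same environment should
    produce largely overlapping obscured feature sets."""
    overlap = len(obscured_a & obscured_b) / max(len(obscured_a | obscured_b), 1)
    return overlap >= threshold

# Two devices in the same room sample nearly identical ambient features;
# a distant device samples a different environment.
dev1 = obscure_features([0.2, 1.4, 3.1, 0.9], salt="session42")
dev2 = obscure_features([0.21, 1.38, 3.12, 0.88], salt="session42")
far  = obscure_features([5.0, 0.1, 9.7, 2.2], salt="session42")
```

Because the server sees only salted hashes, it can judge overlap without learning the underlying environmental samples.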
Abstract:
Apparatus and methods may provide for an interactive display projection with surface interactivity analysis. An interactive display projector is provided along with one or more of a camera or an electromagnetic radiation source to scan plural surfaces within a projection range of the interactive display projector. Logic, implemented at least partly in configurable or fixed-functionality hardware, may process reflected electromagnetic radiation to determine one or more of size, distance, texture, reflectivity, or angle of the scanned plural surfaces with respect to the interactive display projector, and determine, based on the processing, interactivity of one or more of the plural surfaces for an interactive display.
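One way the interactivity determination might combine the measured surface properties is a simple weighted score. This is an illustrative sketch only; the thresholds, weights, and 0-to-1 score are assumptions, not terms of the disclosure.

```python
def surface_interactivity(size_m2, distance_m, reflectivity, angle_deg):
    """Score a scanned surface's suitability for an interactive display
    from size, distance, reflectivity, and angle to the projector.
    All thresholds and weights below are illustrative assumptions."""
    if distance_m > 5.0 or size_m2 < 0.1:
        return 0.0  # outside projection range, or too small to use
    score = 1.0
    score *= max(0.0, 1.0 - abs(angle_deg) / 60)  # prefer near-normal surfaces
    score *= min(1.0, reflectivity / 0.8)         # dim surfaces wash the image out
    score *= min(1.0, size_m2 / 1.0)              # cap the benefit of large surfaces
    return round(score, 3)
```

A surface facing the projector squarely scores highest; tilting it or moving it out of range drives the score toward zero.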
Abstract:
Systems, apparatuses and methods may leverage technology that identifies sensor data, automatically determines a change in an affective state of one or more individuals based at least in part on the sensor data and conducts an update to a game score based on the change in the affective state. In one example, the game score is associated with a player and the one or more individuals are individuals other than the player.
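The score update driven by a change in affective state might be sketched as below. The valence scale, the linear weighting, and the function name are assumptions for illustration; the disclosure does not specify how the change maps to points.

```python
def update_game_score(score, valence_before, valence_after, weight=10):
    """Adjust a player's score by the change in the affective state of
    observing individuals (e.g., mean valence inferred from sensor data).
    The 0-1 valence scale and linear weight are illustrative assumptions."""
    delta = valence_after - valence_before
    return score + round(weight * delta)

# Sensors indicate the audience's valence rose from 0.3 to 0.7 while the
# player acted, so the player gains points.
new_score = update_game_score(100, 0.3, 0.7)  # 104
```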
Abstract:
A portable consumer device, such as a mobile phone or a tablet computer, may be configured to collect measurement data associated with at least one of movement of the portable consumer device or orientation of the portable consumer device, as well as data associated with an external environment of the portable consumer device. The collected data may be evaluated in order to determine motion of the portable consumer device over time and contextual information associated with the portable consumer device. A user activity may be determined based upon the determined motion and the determined contextual information. As desired, the user activity may be evaluated in association with a suitable application scenario, such as a gaming application scenario.
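The fusion of motion and contextual data into a user activity could take the form of a rule-based classifier like the following. The specific sensors, thresholds, and activity labels are illustrative assumptions, not claimed values.

```python
def classify_activity(accel_magnitude, ambient_db, gps_speed_mps):
    """Toy fusion of device motion (accelerometer magnitude, in g) with
    context (ambient sound level, GPS speed) to label a user activity.
    All thresholds are illustrative assumptions."""
    if gps_speed_mps > 8.0:
        return "driving"       # moving far faster than a human runs
    if accel_magnitude > 2.0 and gps_speed_mps > 2.0:
        return "running"       # strong periodic motion plus ground speed
    if accel_magnitude > 1.2:
        return "walking"
    if ambient_db > 70:
        return "at an event"   # device still, but loud surroundings
    return "stationary"
```

A gaming application scenario could then consume the label, e.g. awarding in-game credit only while the activity is "running".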
Abstract:
Systems and techniques for tracking caloric expenditure using sensor-driven fingerprints are described herein. A set of outputs may be obtained from a plurality of sensors. A fingerprint may be generated using the set of outputs. The fingerprint may correspond to an activity observed by the plurality of sensors. The generated fingerprint may be compared to a set of fingerprints stored in a database. Each fingerprint of the set of fingerprints may correspond to a respective caloric expenditure. A caloric expenditure may be calculated for the activity based on the comparison. An exercise profile of a user may be updated using the caloric expenditure.
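The fingerprint comparison might be sketched as a nearest-neighbor lookup against stored fingerprints, as below. The two-component fingerprint (mean and spread of sensor outputs), the example database, and the per-minute caloric rates are all assumptions for illustration.

```python
import math

def generate_fingerprint(sensor_outputs):
    """Reduce a set of sensor outputs to a simple (mean, spread) feature
    vector; a real system would use richer features."""
    mean = sum(sensor_outputs) / len(sensor_outputs)
    spread = math.sqrt(sum((x - mean) ** 2 for x in sensor_outputs)
                       / len(sensor_outputs))
    return (mean, spread)

def estimate_calories(fingerprint, database, duration_min):
    """Find the stored fingerprint closest to the observed one and apply
    that activity's per-minute caloric rate over the duration."""
    activity, (_, kcal_per_min) = min(
        database.items(),
        key=lambda kv: math.dist(fingerprint, kv[1][0]))
    return activity, kcal_per_min * duration_min

# Hypothetical database: {activity: ((mean, spread), kcal per minute)}
DB = {"walking": ((1.1, 0.3), 4.0),
      "running": ((2.4, 0.9), 11.0),
      "cycling": ((1.8, 0.5), 8.0)}

fp = generate_fingerprint([2.2, 2.6, 2.4, 3.3, 1.5])
activity, kcal = estimate_calories(fp, DB, duration_min=30)
```

The resulting caloric figure is what would be written into the user's exercise profile.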
Abstract:
Embodiments may comprise logic such as hardware and/or code to map content of a device such as a mobile device, a laptop, a desktop, or a server, to a two-dimensional field or table, and to map user poses or movements to coordinates within the table to offer quick access to the content by a user. Many embodiments, for example, utilize three wireless peripherals such as a watch, ring, and headset connected to a mobile Internet device (MID) comprising an audible user interface and an auditory mapper to access the content. The audible user interface may communicatively couple with the peripherals to receive pose data that describes the motion or movements associated with one or more of the peripherals and to provide feedback such as audible items and, in some embodiments, other feedback.
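The two mappings described above (content into a 2-D table, and poses onto table coordinates) can be sketched as follows. The grid size, the yaw/pitch ranges, and the content items are illustrative assumptions.

```python
def build_content_grid(items, cols=3):
    """Map a flat content list into a 2-D table so that each item has a
    (row, col) coordinate a pose can select."""
    return {(i // cols, i % cols): item for i, item in enumerate(items)}

def pose_to_cell(yaw_deg, pitch_deg, rows, cols):
    """Map pose angles from a peripheral onto grid coordinates.
    Assumed illustrative ranges: yaw -45..45 deg, pitch -30..30 deg."""
    col = min(cols - 1, max(0, int((yaw_deg + 45) / 90 * cols)))
    row = min(rows - 1, max(0, int((pitch_deg + 30) / 60 * rows)))
    return (row, col)

grid = build_content_grid(["email", "music", "maps", "news", "photos", "calls"])
cell = pose_to_cell(yaw_deg=40, pitch_deg=-25, rows=2, cols=3)
```

An auditory mapper would then speak the item at the selected cell, giving eyes-free access to the content.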
Abstract:
A system and a method for providing a shared augmented reality presentation are disclosed. A group presentation server communicates with one or more wearable computing devices. The group presentation server coordinates the outputs of the various wearable computing devices to present a shared augmented reality presentation to members of a group, where every member of the group experiences a unique perspective on the presentation.
Abstract:
A data processing system includes components for providing a pleasant user experience. Those components may include a family interaction engine that provides a family channel. The family interaction engine may provide for creation of a user group. The family channel may present content of interest to multiple users in the user group. When a user is detected near the data processing system, the family interaction engine may automatically present content of interest to that user. When used for presenting media content, the data processing system may also cause supplemental data to automatically be presented, wherein the supplemental data is relevant to the media content and to a predetermined interest of the user. The data processing system may also provide a ranked list of applications for potential activation by the user. The applications may be ordered based on the current context. Other embodiments are described and claimed.
Abstract:
Techniques to project an image from a wearable computing device are provided. A wearable computing device includes a projector configured to project an image into a user's field of view based on output from one or more sensors and/or images captured by a camera. The wearable computing device can also include a touch input device. The wearable computing device can project an image responsive to a user's touch based on signals received from the touch input device.
Abstract:
Systems and methods may provide for a programmable array of tactile elements in which the active elements may be dynamically altered in time and space, and in dependence upon the orientation of the device of which the array is a part. That device may be part of a wearable device, such as a hat, smart watch, smart glasses, glove, wristband, or other garment.
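Orientation-dependent alteration of the active elements might work as in the sketch below, which rotates a tactile pattern in 90-degree steps so the cue stays world-aligned as the wearable turns. The 4x4 grid and quarter-turn quantization are illustrative assumptions.

```python
def active_elements(pattern, orientation_deg, grid=4):
    """Rotate a set of active (row, col) cells on a grid x grid tactile
    array by the wearable's orientation, quantized to 90-degree steps,
    so the haptic cue keeps its world-frame direction."""
    steps = round(orientation_deg / 90) % 4
    cells = set(pattern)
    for _ in range(steps):
        # one clockwise quarter turn: (r, c) -> (c, grid-1-r)
        cells = {(c, grid - 1 - r) for r, c in cells}
    return cells
```

For example, a cue along the top edge of the array migrates to the right edge when the device is rotated 90 degrees, so the user still feels it "north".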