Abstract:
Apparatuses, methods, and storage media for modifying augmented reality in response to user interaction are described. In one instance, the apparatus for modifying augmented reality may include a processor, a scene capture camera coupled with the processor to capture a physical scene, and an augmentation management module to be operated by the processor. The augmentation management module may obtain and analyze the physical scene, generate one or more virtual articles to augment a rendering of the physical scene based on a result of the analysis, track user interaction with the rendered augmented scene, and modify or complement the virtual articles in response to the tracked user interaction. Other embodiments may be described and claimed.
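A minimal sketch of the capture/analyze/augment/track loop this abstract describes; all names here (AugmentationManager, VirtualArticle, the event shape) are illustrative assumptions, not an API specified by the abstract:

    class VirtualArticle:
        def __init__(self, kind, position):
            self.kind = kind          # e.g. "label" or "detail_panel"
            self.position = position  # where the article is rendered in the scene

    class AugmentationManager:
        def __init__(self, scene_camera):
            self.scene_camera = scene_camera
            self.articles = []

        def analyze(self, frame):
            # Stand-in for scene analysis (surface/object detection).
            return {"objects": []}

        def generate_articles(self, analysis):
            # Generate virtual articles from the analysis result.
            self.articles = [VirtualArticle("label", obj) for obj in analysis["objects"]]

        def on_interaction(self, event):
            # Modify or complement articles in response to tracked user interaction.
            if event.get("type") == "tap":
                self.articles.append(VirtualArticle("detail_panel", event["position"]))

        def step(self):
            frame = self.scene_camera.capture()         # capture the physical scene
            self.generate_articles(self.analyze(frame))
            return frame, self.articles                 # composited by the renderer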
Abstract:
Technologies for immersive sensory experience sharing include one or more experience computing devices, an experience server, and a distance computing device. Each experience computing device captures sensor data indicative of a local sensory experience from one or more sensors and transmits the sensor data to the experience server. Sensors may include audiovisual sensors, touch sensors, and chemical sensors. The experience server analyzes the sensor data to generate combined sensory experience data and transmits the combined sensory experience data to the distance computing device. The experience server may identify one or more activities associated with the local sensory experience. The distance computing device renders a sensory experience based on the combined sensory experience data. The distance computing device may monitor a user response, generate user preferences based on the user response, and transmit the user preferences to the experience server. Other embodiments are described and claimed.
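A sketch of the capture-combine-render pipeline described above, assuming hypothetical message shapes and function names (SensorReading, combine, render); the activity classifier is a placeholder:

    from dataclasses import dataclass, field

    @dataclass
    class SensorReading:
        device_id: str
        modality: str   # "audiovisual", "touch", or "chemical"
        payload: bytes

    @dataclass
    class CombinedExperience:
        activities: list = field(default_factory=list)  # identified activities
        readings: list = field(default_factory=list)

    def combine(readings):
        """Experience-server step: merge per-device sensor data, tag activities."""
        combined = CombinedExperience(readings=list(readings))
        if any(r.modality == "audiovisual" for r in readings):
            combined.activities.append("unclassified_activity")  # placeholder classifier
        return combined

    def render(combined, report_preferences):
        """Distance-device step: render each modality and feed preferences back."""
        for r in combined.readings:
            print(f"rendering {r.modality} data from {r.device_id}")  # stand-in for actuators
        report_preferences({"preferred_modalities": ["audiovisual"]})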
Abstract:
A device comprising a directional antenna obtains an interaction profile for an augmentable object and augments a sensory experience of the augmentable object according to the interaction profile.
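A hypothetical sketch of that flow; strongest_bearing_id(), fetch_profile(), and the profile fields are all assumptions introduced for illustration:

    def augment_pointed_object(antenna, fetch_profile, renderer):
        object_id = antenna.strongest_bearing_id()   # object the antenna is aimed at
        profile = fetch_profile(object_id)           # e.g. {"overlay": ..., "sound": ...}
        if "overlay" in profile:
            renderer.show(profile["overlay"])        # augment the visual experience
        if "sound" in profile:
            renderer.play(profile["sound"])          # augment the auditory experience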
Abstract:
Various embodiments are generally directed to techniques for providing an augmented reality view in which eye movements are employed to identify items of possible interest for which indicators are visually presented in the augmented reality view. An apparatus to present an augmented reality view includes a processor component; a presentation component for execution by the processor component to visually present images captured by a camera on a display, and to visually present an indicator identifying an item of possible interest in the captured images on the display overlying the visual presentation of the captured images; and a correlation component for execution by the processor component to track eye movement to determine a portion of the display gazed at by an eye, and to correlate the portion of the display to the item of possible interest. Other embodiments are described and claimed.
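The correlation step reduces to mapping a tracked gaze point on the display to the item whose on-screen region contains it; a sketch under that reading, with the item encoding and drawing calls assumed:

    def correlate_gaze(gaze_xy, items):
        """items: list of (item, box) with box = (x, y, w, h) in display coordinates."""
        gx, gy = gaze_xy
        for item, (x, y, w, h) in items:
            if x <= gx <= x + w and y <= gy <= y + h:
                return item     # gazed-at portion of the display overlaps this item
        return None

    def present_frame(display, camera_image, items, gaze_xy):
        display.draw(camera_image)                  # captured images as the backdrop
        item = correlate_gaze(gaze_xy, items)
        if item is not None:
            display.draw_indicator(item)            # indicator overlying the view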
Abstract:
Technologies for adjusting a perspective of a captured image for display on a mobile computing device include capturing a first image of a user by a first camera and a second image of a real-world environment by a second camera. The mobile computing device determines a position of an eye of the user relative to the mobile computing device based on the first captured image and a distance of an object in the real-world environment from the mobile computing device based on the second captured image. The mobile computing device generates a back projection of the real-world environment captured by the second camera to the display based on the determined distance of the object in the real-world environment relative to the mobile computing device, the determined position of the user's eye relative to the mobile computing device, and at least one device parameter of the mobile computing device.
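The back projection amounts to intersecting the eye-to-object ray with the display plane (similar triangles), so the display behaves like a transparent window. The coordinate convention below (device coordinates, z toward the scene, display at z = 0) is an assumption for illustration:

    def back_project(eye, point, display_z=0.0):
        """eye, point: (x, y, z) in device coordinates, z toward the scene.
        Returns the (x, y) on the display plane where `point` should be drawn."""
        ex, ey, ez = eye
        px, py, pz = point
        t = (display_z - ez) / (pz - ez)     # where the eye->point ray hits the display
        return (ex + t * (px - ex), ey + t * (py - ey))

    # Example: eye 30 cm behind the display, object 1 m in front and 10 cm right.
    print(back_project(eye=(0.0, 0.0, -0.3), point=(0.1, 0.0, 1.0)))  # ~(0.023, 0.0)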
Abstract:
Computer-readable storage media, computing devices and methods are discussed herein. In embodiments, a computing device may include one or more display devices, a digital content module coupled with the one or more display devices, and an augmentation module coupled with the digital content module and the one or more display devices. The digital content module may be configured to cause a portion of textual content to be rendered on the one or more display devices. The textual content may be associated with a digital scene that may be utilized to augment the textual content. The augmentation module may be configured to dynamically adapt the digital scene, based at least in part on a real-time video feed, to be rendered on the one or more display devices to augment the textual content. Other embodiments may be described and/or claimed.
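A sketch of the two modules described above: a content module that renders a text portion, and an augmentation module that adapts the associated scene to a live video frame. Module and method names (render_portion, adapt_to) are illustrative assumptions:

    class DigitalContentModule:
        def __init__(self, display, text, scene):
            self.display, self.text, self.scene = display, text, scene

        def render_portion(self, start, end):
            self.display.draw_text(self.text[start:end])
            return self.scene   # the digital scene associated with this portion

    class AugmentationModule:
        def __init__(self, display):
            self.display = display

        def augment(self, scene, video_frame):
            # Dynamically adapt the scene to the real-time feed, e.g. matching
            # lighting or inserting elements detected in the frame.
            adapted = scene.adapt_to(video_frame)
            self.display.draw_scene(adapted)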
Abstract:
This disclosure pertains to machine object determination based on human interaction. In general, a device such as a robot may be capable of interacting with a person (e.g., user) to select an object. The user may identify the target object for the device, which may determine whether the target object is known. If the device determines that the target object is known, the device may confirm the target object to the user. If the device determines that the target object is not known, the device may then determine a group of characteristics for use in determining the object from among potential target objects, and may select the characteristic that most substantially reduces the number of potential target objects. After the characteristic is determined, the device may formulate an inquiry to the user utilizing the characteristic. Characteristics may be selected until the device determines the target object and confirms it to the user.
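The selection step amounts to a twenty-questions style greedy search: ask about whichever characteristic best splits the remaining candidates, so the worst-case answer still eliminates as many as possible. A sketch under that reading, with the candidate encoding and ask() callback assumed:

    def pick_characteristic(candidates, characteristics):
        def worst_case(ch):
            yes = sum(1 for c in candidates if c.get(ch))
            return max(yes, len(candidates) - yes)   # candidates left after worst answer
        return min(characteristics, key=worst_case)

    def determine_object(candidates, characteristics, ask):
        remaining = list(candidates)
        chars = set(characteristics)
        while len(remaining) > 1 and chars:
            ch = pick_characteristic(remaining, chars)
            chars.discard(ch)
            answer = ask(f"Is the object {ch}?")     # inquiry formulated to the user
            remaining = [c for c in remaining if bool(c.get(ch)) == answer]
        return remaining[0] if remaining else None

    # Example: three candidate objects with boolean characteristics.
    objs = [{"name": "cup", "red": True, "metal": False},
            {"name": "can", "red": True, "metal": True},
            {"name": "box", "red": False, "metal": False}]
    target = determine_object(objs, ["red", "metal"], ask=lambda q: True)  # -> "can"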
Abstract:
Systems and methods may provide for receiving a short range signal from a sensor that is collocated with a short range display and using the short range signal to detect a user interaction. Additionally, a display response may be controlled with respect to a long range display based on the user interaction. In one example, the user interaction includes one or more of an eye gaze, a hand gesture, a face gesture, a head position or a voice command, that indicates one or more of a switch between the short range display and the long range display, a drag and drop operation, a highlight operation, a click operation or a typing operation.
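A sketch of that routing: an interaction detected by the sensor collocated with the short range display drives a response on the long range display. The event shape and display methods are assumptions for illustration:

    def handle_short_range_signal(event, short_display, long_display):
        kind = event["kind"]     # "eye_gaze", "hand_gesture", "face_gesture",
                                 # "head_position", or "voice_command"
        action = event["action"]
        if action == "switch":
            long_display.show(short_display.current_view())   # move focus across displays
        elif action == "drag_and_drop":
            long_display.place(event["item"], event["target"])
        elif action in ("highlight", "click", "typing"):
            long_display.dispatch(action, event.get("payload"))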