Abstract:
A controller (110) for controlling an execution of a game program by a processor for enabling an interactive game to be played by a user includes a body (111) having a section to be oriented towards a screen when a progress of a game provided via execution of the game program is displayed upon the screen, and at least one photonically detectable (“PD”) element (e.g. 122, 124, 126, and/or 128) assembled with the body, a position of the photonically detectable element within an image being recordable by an image capture device (112) when the section is oriented at least partly towards the screen, wherein positions of the PD element at different points in time are quantifiable to quantify movement of the body in space.
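As a rough illustration of the last clause, the Python sketch below shows how two recorded positions of the PD element could be turned into a displacement and speed for the body; the function name and the pixels-per-centimetre scale are assumptions for the example, not details from the abstract.

    def quantify_movement(pos_t0, pos_t1, dt, pixels_per_cm=10.0):
        """Estimate in-plane movement of the body from two recorded pixel
        positions (x, y) of the photonically detectable element."""
        dx = (pos_t1[0] - pos_t0[0]) / pixels_per_cm   # horizontal displacement, cm (assumed scale)
        dy = (pos_t1[1] - pos_t0[1]) / pixels_per_cm   # vertical displacement, cm (assumed scale)
        speed = (dx ** 2 + dy ** 2) ** 0.5 / dt        # cm per second
        return dx, dy, speed

    # Example: the PD element moves from (320, 240) to (350, 220) in one 1/60 s frame.
    print(quantify_movement((320, 240), (350, 220), dt=1 / 60))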
Abstract:
A game controller includes an image capture unit; a body; at least one input device assembled with the body, the input device manipulable by a user to register an input from the user; an inertial sensor operable to produce information for quantifying a movement of said body through space; at least one light source assembled with the body; and a processor coupled to the image capture unit and the inertial sensor. The processor is configured to track the body by analyzing a signal from the inertial sensor and analyzing an image of the light source from the image capture unit. The processor is configured to establish a gearing between movement of the body and actions to be applied by a computer program.
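The gearing mentioned in the final sentence can be pictured as a scale factor between the tracked body movement and the action applied by the program. The Python sketch below is a minimal illustration under that reading; the function name and gearing values are assumed for the example rather than taken from the abstract.

    def apply_gearing(body_displacement_cm, gearing=2.5):
        """Map a tracked displacement of the controller body to the displacement
        applied to an on-screen object, scaled by a gearing ratio."""
        return body_displacement_cm * gearing

    # A gearing above 1 amplifies small hand motions, a gearing below 1 damps them,
    # and the program may vary the ratio with game state.
    print(apply_gearing(4.0))        # 4 cm of hand motion -> 10 units in the game
    print(apply_gearing(4.0, 0.5))   # the same motion, damped to 2 units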
Abstract:
A hand-held electronic device, method of operation and computer readable medium are disclosed. The device may include a case having one or more major surfaces. A visual display and a touch interface are disposed on at least one of the major surfaces. A processor is operably coupled to the visual display and touch screen. Instructions executable by the processor may be configured to a) present an image on the visual display containing one or more active elements; b) correlate one or more active portions of the touch interface to one or more corresponding active elements in the image on the visual display; and c) adjust a layout of content shown on the display according to a probability of one or more actions that may be taken with the one or more active elements.
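Step (c) can be illustrated with a small Python sketch; the element records, the probability field, and the resizing rule below are assumptions made for the example, not details from the abstract.

    def adjust_layout(active_elements, min_size=40, max_size=120):
        """Scale each active element's touch target between min_size and max_size
        pixels in proportion to the probability of the action it triggers."""
        for element in active_elements:
            p = element["action_probability"]
            element["size"] = min_size + p * (max_size - min_size)
        return active_elements

    elements = [
        {"name": "play",  "action_probability": 0.7},
        {"name": "pause", "action_probability": 0.2},
        {"name": "eject", "action_probability": 0.1},
    ]
    print(adjust_layout(elements))   # the likeliest action gets the largest target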
Abstract:
Methods and systems for processing input by a computing device are presented. One method includes operations for receiving images of a control device that includes an object section, and for determining a location of the control device utilizing image analysis for each captured image. Additionally, the movement of the control device is tracked based on the determined locations, where the tracking of the movement includes receiving inertial sensor information obtained by sensors in the control device, and determining an orientation of the control device based on the sensor information. Additionally, the method includes an operation for translating the movement and orientation of the control device into input for a game executing in the computing device, where the input is translated into a motion and orientation of an object in the game based on the movement of the control device.
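The final translation step might look roughly like the Python sketch below, which assumes a per-frame sample holding a camera-derived location and an inertially derived orientation; the class and field names are illustrative, not from the abstract.

    from dataclasses import dataclass

    @dataclass
    class ControlSample:
        location: tuple       # (x, y, z) from image analysis of the object section
        orientation: tuple    # (yaw, pitch, roll) from the inertial sensor information

    def translate_to_game_input(previous, current):
        """Turn the tracked movement and orientation of the control device into
        a motion and an orientation to apply to the in-game object."""
        motion = tuple(c - p for c, p in zip(current.location, previous.location))
        return {"motion": motion, "orientation": current.orientation}

    prev = ControlSample((0.0, 1.0, 2.0), (0.0, 0.0, 0.0))
    curr = ControlSample((0.1, 1.0, 1.8), (5.0, -2.0, 0.0))
    print(translate_to_game_input(prev, curr))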
Abstract:
A hand-held electronic device, method of operation and computer readable medium are disclosed. The device may include a processor operably coupled to a visual display and a touch screen. Instructions executable by the processor may be configured to a) present an image on the visual display containing one or more active elements; b) correlate one or more active portions of the touch interface to one or more corresponding active elements in the image on the visual display; and c) adjust a layout of content shown on the display according to a probability of one or more actions that may be taken with the one or more active elements.
Abstract:
Streaming content may be delivered through a combination of broadcast and a backchannel. A desired streamlet may be selected from a broadcast packet of information and presented with a display. A remainder of a data stream associated with the streamlet may be requested and received via the backchannel while the desired streamlet is being presented. The remaining data stream may then be presented with the display.
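The delivery flow can be sketched in Python as below; the packet fields and the fetch_via_backchannel helper are hypothetical stand-ins, and the remainder is requested sequentially for brevity, whereas the abstract describes requesting it while the streamlet is still being presented.

    def fetch_via_backchannel(stream_id):
        # Hypothetical stand-in for a unicast (e.g. HTTP) request over the backchannel.
        return ["remaining segment 1", "remaining segment 2"]

    def present(segment):
        print("presenting:", segment)

    def play(broadcast_packet):
        streamlet = broadcast_packet["streamlet"]   # selected from the broadcast packet
        present(streamlet)                          # start presenting immediately
        remainder = fetch_via_backchannel(broadcast_packet["stream_id"])
        for segment in remainder:                   # continue with the rest of the stream
            present(segment)

    play({"streamlet": "first 10 seconds", "stream_id": "channel-7"})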
Abstract:
A controller is provided. The controller includes at least one button, an object integrated with a body, the object defined from a translucent plastic material. Further included are an inertial sensor, an LED device defined to illuminate the object, and a circuit for interpreting input data from the at least one button and the inertial sensor and for communicating data wirelessly. The circuit is further configured to interface with the LED device to trigger illumination of the LED, switching it from an un-illuminated state to an illuminated color. The LED device is activated depending on data received from a computing system, the activation occurring in response to a state interpreted by a computer program during execution.
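A minimal Python sketch of the LED activation behaviour follows, assuming the computing system sends a small wireless message naming the colour to show; the message format and method names are illustrative assumptions, not details from the abstract.

    class ControllerCircuit:
        def __init__(self):
            self.led_color = None    # un-illuminated state

        def on_wireless_data(self, message):
            """Switch the LED from un-illuminated to the colour chosen by the
            computing system, e.g. in response to an interpreted game state."""
            if message.get("command") == "illuminate":
                self.led_color = message["color"]
                print("LED illuminated:", self.led_color)

    circuit = ControllerCircuit()
    circuit.on_wireless_data({"command": "illuminate", "color": "blue"})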