Abstract:
A method for playing back a program which includes one or more interactive applications. A program which is stored, either in the form of a data stream or as a set of files, is played back from a storage device. When playback is started, applications which are detected are launched. Applications are detected in the playback of pushed content when they become available in the playback stream. Applications are detected in the playback of pulled content by comparing the validity ranges of the applications to a current playback index. When special playback modes are used, signals which affect the lifecycle of an application are detected, and corresponding signals are generated to maintain the proper state of the application. During these special playback modes, the resulting notification signals may be conveyed to applications which are configured to operate during these modes. Applications which are not configured to operate during these special playback modes may be terminated when the special modes are initiated and restarted when the special modes end.
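As a rough illustration of the pulled-content case described above, the following Python sketch compares each application's validity range against the current playback index and terminates applications that are not configured for a special playback mode, relaunching them once the mode ends. The class, field, and function names are hypothetical and are assumptions made for illustration only; the abstract does not specify an implementation.

```python
from dataclasses import dataclass

@dataclass
class Application:
    # Hypothetical fields; the abstract only speaks of "validity ranges" and of
    # whether an application is configured to operate during special playback modes.
    name: str
    validity_start: float      # playback index at which the application becomes valid
    validity_end: float        # playback index at which the application stops being valid
    trick_mode_capable: bool   # may the application keep running during special modes?
    running: bool = False

def update_applications(apps, playback_index, in_special_mode):
    """Launch or terminate applications for the current playback index and mode."""
    for app in apps:
        in_range = app.validity_start <= playback_index <= app.validity_end
        if in_range and not app.running:
            if not in_special_mode or app.trick_mode_capable:
                app.running = True
                print(f"launch {app.name}")
        elif app.running and (not in_range or (in_special_mode and not app.trick_mode_capable)):
            app.running = False
            print(f"terminate {app.name}")

# Example: a special playback mode begins while the playback index is 42.0.
apps = [Application("quiz", 10.0, 120.0, trick_mode_capable=False),
        Application("ticker", 0.0, 300.0, trick_mode_capable=True)]
update_applications(apps, playback_index=42.0, in_special_mode=False)  # both launch
update_applications(apps, playback_index=42.0, in_special_mode=True)   # "quiz" terminates
update_applications(apps, playback_index=42.0, in_special_mode=False)  # "quiz" restarts
```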
Abstract:
A media server machine may be configured to provide media content within a datastream. This datastream may be provided to a media device that is configured to present the media content on a display. Also, this datastream may contemporaneously contain an “app-sync indicator” for the media content. The app-sync indicator is a data structure that signals the media device to launch an application on a companion device. By providing the app-sync indicator contemporaneously with the media content in the datastream, the launching of the application on the companion device may be synchronized with the media content. The app-sync indicator may specify the application to be launched. Also, the app-sync indicator may specify supplemental content to be presented by the launched application on the companion device.
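One way to picture the app-sync indicator is as a small data structure multiplexed alongside the media payload; when the media device encounters it in the datastream, it forwards a launch request to the companion device. The field names, the JSON transport, and the helper functions in this Python sketch are assumptions for illustration and are not drawn from the abstract.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class AppSyncIndicator:
    # Hypothetical layout; the abstract says the indicator may specify the
    # application to launch and the supplemental content to be presented.
    app_id: str                 # application to launch on the companion device
    supplemental_url: str       # supplemental content for that application
    media_timestamp: float      # position in the media content to synchronize against

def handle_datastream_item(item, companion_send):
    """Media device loop: present media payloads, forward app-sync indicators."""
    if isinstance(item, AppSyncIndicator):
        # Signal the companion device to launch the named application.
        companion_send(json.dumps({"cmd": "launch", **asdict(item)}))
    else:
        present_on_display(item)

def present_on_display(payload):
    print(f"presenting media payload: {payload!r}")

# Example datastream: media frames interleaved with an app-sync indicator.
stream = ["frame-001", "frame-002",
          AppSyncIndicator("trivia-app", "https://example.com/extras", 12.5),
          "frame-003"]
for item in stream:
    handle_datastream_item(item, companion_send=lambda msg: print("to companion:", msg))
```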
Abstract:
A method to create interactivity between a main device and a secondary device. The method comprises receiving a main stream comprising a signal indicating the availability of an ongoing interactive experience related to audio/video content; extracting interactive data from the main stream; obtaining, by the main device, a main interactive application related to the interactive data; obtaining, by the secondary device, a secondary interactive application related to the interactive data; loading the main interactive application into a software module of the main device; executing the main interactive application with all or part of the interactive data; collecting, by the main interactive application of the main device, results of user interactions made on the secondary device during execution of the secondary interactive application; processing the received user interactions by the main interactive application to produce a result; and displaying the result on the screen together with the audio/video content.
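The flow in this abstract can be sketched as the main device extracting interactive data from the stream, each device obtaining its respective application, and the main application collecting interaction results from the secondary application and producing a result to display with the content. The classes, the toy scoring rule, and the queue-based message passing in this Python sketch are illustrative assumptions, not the claimed implementation.

```python
import queue

class MainInteractiveApp:
    """Runs on the main device; collects interactions from the secondary device."""
    def __init__(self, interactive_data):
        self.interactive_data = interactive_data
        self.score = 0

    def process_interaction(self, interaction):
        # Produce a result from the user's interaction (toy scoring rule).
        if interaction.get("answer") == self.interactive_data.get("correct_answer"):
            self.score += 1
        return {"score": self.score}

class SecondaryInteractiveApp:
    """Runs on the secondary device; forwards user interactions to the main device."""
    def __init__(self, outbox):
        self.outbox = outbox

    def on_user_input(self, answer):
        self.outbox.put({"answer": answer})

# The main stream signals an ongoing interactive experience with this data (assumed format).
interactive_data = {"question": "Who scored first?", "correct_answer": "Team A"}

interactions = queue.Queue()
main_app = MainInteractiveApp(interactive_data)
secondary_app = SecondaryInteractiveApp(interactions)

secondary_app.on_user_input("Team A")          # user interacts on the secondary device
while not interactions.empty():
    result = main_app.process_interaction(interactions.get())
    print("display with A/V content:", result)  # e.g. overlay on the main screen
```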
Abstract:
A server version of an interactive application, executed by a processing device of a first mobile device that is communicatively connected to a main video rendering device, collects first data generated by a first client version of the interactive application executed by the same processing device. The server version of the interactive application generates first displayable content relating to the first client version of the interactive application based on the first data, wherein the first displayable content is rendered on a display of the first mobile device. The server version of the interactive application generates second displayable content based in part on the first data and additional data relating to the server version of the interactive application, wherein the first displayable content is different from the second displayable content. The server version of the interactive application transmits the second displayable content to the main video rendering device to be rendered on a main video display.
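A minimal sketch of this arrangement: the same mobile device runs a client version that generates data and a server version that turns that data into two different renderings, one for the mobile display and one transmitted to the main video rendering device. All class and method names in this Python sketch are hypothetical assumptions.

```python
class ClientVersion:
    """Client version of the interactive application on the mobile device."""
    def generate_data(self):
        return {"player": "alice", "move": "rock"}

class ServerVersion:
    """Server version on the same mobile device; produces two distinct views."""
    def __init__(self):
        self.additional_data = {"round": 3, "opponent_move": "scissors"}

    def content_for_mobile_display(self, client_data):
        # First displayable content: relates only to this client's state.
        return f"You played {client_data['move']}"

    def content_for_main_display(self, client_data):
        # Second displayable content: combines client data with server-side data.
        return (f"Round {self.additional_data['round']}: "
                f"{client_data['player']} played {client_data['move']} "
                f"vs {self.additional_data['opponent_move']}")

client = ClientVersion()
server = ServerVersion()
data = client.generate_data()
print("render on mobile display:", server.content_for_mobile_display(data))
print("transmit to main video rendering device:", server.content_for_main_display(data))
```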
Abstract:
Methods of presenting video and audio perspectives of audio/video content are described. In an example embodiment, audio/video content having a plurality of video perspectives and a plurality of audio perspectives of an event is received. Responsive to a first command from a media application configured to allow a user to select among the plurality of video perspectives, a selected first video perspective of the event is provided for presentation. Responsive to a second command from the media application, which is also configured to allow the user to select among the plurality of audio perspectives, a selected first audio perspective of the event is associated with the selected first video perspective of the event, and the selected first audio perspective is provided for presentation in association with the selected first video perspective of the event.
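The selection logic described above can be sketched as a small state holder that pairs a chosen audio perspective with the chosen video perspective in response to commands from the media application. The class name, perspective names, and methods in this Python sketch are assumptions for illustration only.

```python
class PerspectivePlayer:
    """Tracks which video and audio perspectives of an event are being presented."""
    def __init__(self, video_perspectives, audio_perspectives):
        self.video_perspectives = video_perspectives
        self.audio_perspectives = audio_perspectives
        self.current_video = None
        self.current_audio = None

    def select_video(self, name):
        # First command: choose among the plurality of video perspectives.
        if name in self.video_perspectives:
            self.current_video = name

    def select_audio(self, name):
        # Second command: associate an audio perspective with the current video.
        if name in self.audio_perspectives:
            self.current_audio = name

    def present(self):
        print(f"presenting video '{self.current_video}' with audio '{self.current_audio}'")

player = PerspectivePlayer(video_perspectives=["sideline-cam", "goal-cam"],
                           audio_perspectives=["home-announcer", "stadium-ambience"])
player.select_video("goal-cam")
player.select_audio("home-announcer")
player.present()
```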
Abstract:
A method for providing video and audio perspectives of audio/video content is presented. In an example embodiment, a media content item is received that includes audio/video content having at least one video perspective and a plurality of audio perspectives for the at least one video perspective. The at least one video perspective is provided with a first one of the plurality of audio perspectives for presentation on a display device. During the providing of the at least one video perspective with the first one of the plurality of audio perspectives, a first command is received from a media application configured to allow a user to select among the plurality of audio perspectives. Responsive to the first command, the video perspective is provided along with a selected second one of the plurality of audio perspectives, in substitution for the first one of the plurality of audio perspectives, for presentation on the display device.
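For the single-video, multiple-audio case in this abstract, the substitution can be sketched as re-pointing the audio source while the video perspective continues uninterrupted. The function, the frame/track representation, and the command mapping in this Python sketch are illustrative assumptions.

```python
def play(video_frames, audio_tracks, commands):
    """Present one video perspective; substitute audio perspectives on command.

    `commands` maps a frame index to the audio perspective selected at that point
    (a stand-in for the command received from the media application).
    """
    current_audio = next(iter(audio_tracks))        # start with a first audio perspective
    for i, frame in enumerate(video_frames):
        if i in commands and commands[i] in audio_tracks:
            current_audio = commands[i]             # substitute the selected perspective
        print(f"frame {frame} + audio '{current_audio}': {audio_tracks[current_audio][i]}")

video = ["v0", "v1", "v2", "v3"]
audio = {"broadcast": ["b0", "b1", "b2", "b3"],
         "referee-mic": ["r0", "r1", "r2", "r3"]}
play(video, audio, commands={2: "referee-mic"})     # switch the audio perspective at frame 2
```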
Abstract:
Disclosed are methods and systems for controlling the playback and recording of television programming containing interactive applications. In particular, the disclosed methods and systems detail how “trick modes” can be handled when playing applications that are distributed with the television programming.