Abstract:
In a networked client/server system, media content is streamed from a server computer to a client computer. A media file format is used to store data for multiple timeline-altered streams and provides support for switching between the different timeline-altered streams during their presentation. According to one embodiment, a time code stream includes multiple data objects mapping corresponding timeline-altered stream data units to primary stream presentation times, and an index table mapping primary stream presentation times to timeline-altered stream byte offsets.
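As an illustrative sketch only (the class and field names below are assumptions, not taken from the abstract), the time code stream can be modeled as a list of mapping objects plus an index table, which together let a player switch streams and resume at the correct byte offset:

```python
from bisect import bisect_right
from dataclasses import dataclass

@dataclass
class TimeCodeObject:
    # Presentation time (ms) of a data unit in the timeline-altered stream,
    # paired with the presentation time of the corresponding primary-stream unit.
    altered_time_ms: int
    primary_time_ms: int

@dataclass
class IndexEntry:
    # Primary-stream presentation time mapped to a byte offset in the
    # timeline-altered stream, so playback can seek after a stream switch.
    primary_time_ms: int
    byte_offset: int

class TimeCodeStream:
    def __init__(self, objects: list[TimeCodeObject], index: list[IndexEntry]):
        self.objects = sorted(objects, key=lambda o: o.altered_time_ms)
        self.index = sorted(index, key=lambda e: e.primary_time_ms)

    def primary_time(self, altered_time_ms: int) -> int:
        """Map a position in this timeline-altered stream back to primary time."""
        times = [o.altered_time_ms for o in self.objects]
        i = max(bisect_right(times, altered_time_ms) - 1, 0)
        return self.objects[i].primary_time_ms

    def seek_offset(self, primary_time_ms: int) -> int:
        """Find the byte offset in this timeline-altered stream for a primary time."""
        times = [e.primary_time_ms for e in self.index]
        i = max(bisect_right(times, primary_time_ms) - 1, 0)
        return self.index[i].byte_offset

# Switching from stream A to stream B at A's current position:
# primary = stream_a.primary_time(current_altered_time)
# offset  = stream_b.seek_offset(primary)   # resume B from this byte offset
```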
Abstract:
A program distribution system includes a plurality of set-top boxes that receive broadcast programming and segmentation data from content and information providers. The segmentation information indicates portions of programs that are to be included in skimmed or condensed versions of the received programming, and is produced using manual or automated methods. Automated methods include the use of ancillary production data to detect the most important parts of a program. A user interface allows a user to control time scale modification and skimming during playback, and also allows the user to easily browse to different points within the current program.
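A minimal sketch of how a set-top box might consume such segmentation data; the record layout, field names, and priority scheme are assumptions rather than the actual broadcast format:

```python
from dataclasses import dataclass

@dataclass
class Segment:
    start_s: float      # segment boundaries within the program (seconds)
    end_s: float
    priority: int       # importance rank derived from manual or automated marking

def skim_playlist(segments: list[Segment], max_priority: int) -> list[Segment]:
    """Return, in program order, only the segments marked important enough
    for the requested skim level."""
    return [s for s in sorted(segments, key=lambda s: s.start_s)
            if s.priority <= max_priority]

# A set-top box rendering a condensed version might keep only priority-1 segments:
# playlist = skim_playlist(program_segments, max_priority=1)
```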
Abstract:
A Common Annotation Framework includes, in an embodiment, an annotation having a context anchor that identifies a resource and a position in the resource that the annotation pertains to, and a content anchor that identifies data that is annotating the resource. The annotation can also be extended with client application-defined data and/or functionality, and the framework can be extended with one or more of application-defined objects, methods, and annotation stores.
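A rough sketch of the annotation structure described above, with illustrative field names that are not drawn from the framework's actual schema:

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class ContextAnchor:
    # Identifies the annotated resource and the position within it.
    resource_uri: str
    position: str          # e.g. a character range, timecode, or region descriptor

@dataclass
class ContentAnchor:
    # Identifies the data that is annotating the resource.
    content_uri: str
    media_type: str        # e.g. "text/plain", "audio/wav"

@dataclass
class Annotation:
    context: ContextAnchor
    content: ContentAnchor
    # Client application-defined extensions (opaque to the framework).
    extensions: dict[str, Any] = field(default_factory=dict)

note = Annotation(
    context=ContextAnchor("http://example.com/doc.html", "chars=120-168"),
    content=ContentAnchor("http://example.com/notes/42.txt", "text/plain"),
    extensions={"author": "alice", "priority": "high"},
)
```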
Abstract:
An automated system and method for broadcasting meetings over a computer network. The meeting is filmed using an omni-directional camera system and can be presented to a viewer both live and on-demand. The system of the present invention includes an automated camera management system for controlling the camera system and an analysis module for determining the location of meeting participants in the meeting environment. The method of the present invention includes using the system of the present invention to broadcast an event to a viewer over a computer network. In particular, the method includes filming the event using an omni-directional camera system. Next, the method determines the location of each event participant in the event environment. Finally, a viewer is provided with a user interface for viewing the broadcast event. This user interface allows the viewer to choose which event participant the viewer would like to view.
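One hedged sketch of the final step, mapping a selected participant's detected location to a view within the omni-directional image; the azimuth-to-column mapping and all names are assumptions about the panoramic unwrapping, not details given in the abstract:

```python
def view_center_px(participant_azimuth_deg: float, pano_width_px: int) -> int:
    """Map a participant's azimuth around the omni-directional camera to the
    horizontal pixel column at the center of that participant in the panorama.
    The client UI can then crop a fixed-width window around this column."""
    return int((participant_azimuth_deg % 360.0) / 360.0 * pano_width_px) % pano_width_px

# A viewer who selects a participant gets a crop centered on that column
# (locations is a hypothetical map from participant to detected azimuth):
# center = view_center_px(locations["participant 3"], pano_width_px=3600)
```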
Abstract:
Audio/video programming content is made available to a receiver from a content provider, and meta data is made available to the receiver from a meta data provider. The meta data corresponds to the programming content, and identifies, for each of multiple portions of the programming content, an indicator of a likelihood that the portion is an exciting portion of the content. In one implementation, the meta data includes probabilities that segments of a baseball program are exciting, and is generated by analyzing the audio data of the baseball program for both excited speech and baseball hits. The meta data can then be used to generate a summary for the baseball program.
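A minimal sketch of how per-segment excitement probabilities might drive summary generation; the record layout and the greedy selection rule are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class SegmentMetaData:
    start_s: float          # segment start within the program (seconds)
    end_s: float
    excitement: float       # probability the segment is "exciting" (0..1)

def build_summary(segments: list[SegmentMetaData],
                  target_duration_s: float) -> list[SegmentMetaData]:
    """Pick the most exciting segments until the target duration is filled,
    then return them in program order."""
    chosen, used = [], 0.0
    for seg in sorted(segments, key=lambda s: s.excitement, reverse=True):
        length = seg.end_s - seg.start_s
        if used + length > target_duration_s:
            continue
        chosen.append(seg)
        used += length
    return sorted(chosen, key=lambda s: s.start_s)

# e.g. condensing a baseball broadcast into a 10-minute highlight reel:
# summary = build_summary(segments, target_duration_s=600)
```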
Abstract:
Indications of which participant is providing information during a multi-party conference. Each participant has equipment to display information being transferred during the conference. A sourcing signaler residing in the participant equipment provides a signal that indicates the identity of its participant when this participant is providing information to the conference. The source indicators of the other participant equipment receive the signal and cause a UI to indicate that the participant identified by the received signal is providing information (e.g., the UI can cause the identifier to change appearance). An audio discriminator is used to distinguish an acoustic signal generated by a person speaking from one generated in a band-limited manner. The audio discriminator analyzes the spectrum of detected audio signals and generates several parameters from the spectrum and from past determinations to determine the source of an audio signal on a frame-by-frame basis.
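The abstract does not name the spectral parameters, so the sketch below stands in a high-band energy ratio smoothed over past frames as one plausible frame-by-frame discriminator; all names, thresholds, and the cutoff frequency are assumptions:

```python
import numpy as np

def classify_frames(samples: np.ndarray, sample_rate: int,
                    frame_len: int = 1024, cutoff_hz: float = 3400.0,
                    ratio_threshold: float = 0.05, smooth: float = 0.8):
    """Label each frame as live speech or band-limited audio.

    Band-limited sources (e.g. telephone audio) carry little energy above
    ~3.4 kHz; live speech captured by a full-band microphone does. The
    decision is smoothed with previous frames' scores, mirroring the use
    of past determinations."""
    labels, score = [], 0.0
    for start in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[start:start + frame_len]
        spectrum = np.abs(np.fft.rfft(frame * np.hanning(frame_len)))
        freqs = np.fft.rfftfreq(frame_len, d=1.0 / sample_rate)
        high_ratio = spectrum[freqs > cutoff_hz].sum() / (spectrum.sum() + 1e-12)
        frame_score = 1.0 if high_ratio > ratio_threshold else 0.0
        score = smooth * score + (1.0 - smooth) * frame_score
        labels.append("speech" if score > 0.5 else "band-limited")
    return labels
```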
Abstract:
A physically-modulated friction stylus system and method for physically modulating friction between a stylus tip and a surface of a computing device to emulate the “feel” of different types of writing instruments writing on different types of surfaces (such as pen on paper or a paintbrush on canvas). The actual friction between the stylus and the surface is modulated to produce the “feel.” The friction is physically modulated “on the fly,” meaning that friction can be modulated while the stylus tip is in contact with the surface and while the stylus is moving. The friction is modulated depending on the location of the stylus on the surface and the posture and orientation of the stylus. In addition, the friction can be modulated based on the direction and velocity with which the stylus tip is moving across the surface. Audio may also be used to improve the emulation experience.
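A hedged sketch of one possible mapping from stylus state to a friction command; the actual modulation law is not given in the abstract, so the formula, the names, and the roughness lookup are illustrative assumptions:

```python
from dataclasses import dataclass
import math

@dataclass
class StylusState:
    x: float            # tip location on the surface
    y: float
    vx: float           # tip velocity components
    vy: float
    tilt_deg: float     # stylus posture/orientation relative to the surface

def friction_level(state: StylusState, surface_roughness: float) -> float:
    """Return a normalized friction command (0..1) for the actuator.

    Rougher virtual surfaces (e.g. canvas vs. glossy paper) raise the base
    friction; tilt and speed scale it, loosely emulating how real writing
    instruments drag across real surfaces."""
    speed = math.hypot(state.vx, state.vy)
    tilt_factor = 0.5 + 0.5 * abs(math.cos(math.radians(state.tilt_deg)))
    speed_factor = 1.0 / (1.0 + 0.02 * speed)   # higher speed -> lower apparent drag
    return max(0.0, min(1.0, surface_roughness * tilt_factor * speed_factor))

# Each digitizer report updates the actuator command; roughness_at is a
# hypothetical lookup of surface texture keyed to the stylus location:
# command = friction_level(state, roughness_at(state.x, state.y))
```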
Abstract:
A “Concurrent Projector-Camera” uses an image projection device in combination with one or more cameras to enable various techniques that provide visually flicker-free projection of images or video, while real-time image or video capture is occurring in that same space. The Concurrent Projector-Camera provides this projection in a manner that eliminates video feedback into the real-time image or video capture. More specifically, the Concurrent Projector-Camera dynamically synchronizes on-state temporal compression of the projector lighting (or light-control points) with on-state temporal shifting during each image frame projection to open a “capture time slot” for image capture during which no image is being projected. This capture time slot represents a tradeoff between image capture time and decreased brightness of the projected image. Examples of image projection devices include LED-LCD based projection devices, DLP-based projection devices using LED or laser illumination in combination with micromirror arrays, etc.
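A small sketch of the per-frame timing arithmetic, assuming a simple duty-cycle model in which the compressed on-state is shifted to the start of the frame; the function name and that placement choice are assumptions:

```python
def frame_schedule(frame_period_ms: float, compression: float) -> dict[str, tuple[float, float]]:
    """Compute per-frame timing for a compressed, shifted projector on-state.

    The on-state is compressed to `compression` of the frame period and shifted
    to the start of the frame; the remainder becomes the capture time slot,
    during which the camera can expose without projector light in the scene."""
    on_ms = frame_period_ms * compression
    return {
        "projector_on": (0.0, on_ms),               # shorter duty cycle
        "capture_slot": (on_ms, frame_period_ms),   # camera exposure window
    }

# 60 Hz projection with a 75% duty cycle leaves a ~4.2 ms capture slot per frame:
# frame_schedule(1000.0 / 60.0, compression=0.75)
```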
Abstract:
Members working on the same project can access similar resources at substantially the same time to facilitate active participation in the project. A meeting associated with the project can be given a unique identifier that can allow the project members to access the meeting or other content and view similar documents or other content as it is discussed in the meeting. As information is edited, modified, created, etc., the members can be selectively presented with the information. A common repository can provide the members with an area or platform in which the project material can be accessed, discussed, or otherwise worked on by the project members, allowing for collaboration on the project details.