Abstract:
A component of an entropy encoding stage of a block processing pipeline (e.g., a CABAC encoder) may, for a block of pixels in a video frame, accumulate counts indicating the number of times each of two possible symbols is used in encoding a syntax element bin. An empirical probability for each symbol, an estimated entropy, and an estimated rate cost for encoding the bin may be computed, dependent on the symbol counts. A pipeline stage that precedes the entropy encoding stage may, upon receiving another block of pixels for the video frame, calculate and use the estimated rate cost when making encoding decisions for the other block of pixels based on a cost function that includes a rate cost term. The symbol counts or empirical probabilities may be passed to the earlier pipeline stage or written to a shared memory, from which components of the earlier stage may obtain them.
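As a rough illustration of the rate-estimation idea described above, the sketch below (Python, with hypothetical names) turns accumulated symbol counts for a bin into an empirical probability, an estimated entropy, and a per-bin rate estimate that an earlier pipeline stage could plug into its cost function. The entropy model and the function name are illustrative assumptions, not the claimed implementation.

```python
import math

def estimate_bin_rate_cost(count_zero: int, count_one: int) -> float:
    """Estimate the rate cost, in bits, of coding one bin of a syntax
    element from accumulated counts of its two possible symbols.
    Hypothetical helper: the names and the simple entropy model are
    illustrative assumptions, not the claimed implementation."""
    total = count_zero + count_one
    if total == 0:
        return 1.0                       # no statistics yet: assume equiprobable symbols
    p_zero = count_zero / total          # empirical probability of symbol 0
    p_one = 1.0 - p_zero                 # empirical probability of symbol 1
    entropy = 0.0                        # estimated entropy, bits per bin
    for p in (p_zero, p_one):
        if p > 0.0:
            entropy -= p * math.log2(p)
    return entropy                       # use the entropy as the per-bin rate estimate

# An earlier pipeline stage could fold this into its cost function, e.g.:
#   cost = distortion + lambda_rd * estimate_bin_rate_cost(count_zero, count_one)
```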
Abstract:
A knight's order processing method for block processing pipelines in which the next block input to the pipeline is taken from the row below and one or more columns to the left in the frame. The knight's order method may provide spacing between adjacent blocks in the pipeline to facilitate feedback of data from a downstream stage to an upstream stage. The rows of blocks in the input frame may be divided into sets of rows that constrain the knight's order method to maintain locality of neighbor block data. Invalid blocks may be input to the pipeline at the left of the first set of rows and at the right of the last set of rows, and the sets of rows may be treated as if they are horizontally arranged rather than vertically arranged, to maintain continuity of the knight's order algorithm.
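The traversal itself is easy to picture in code. The sketch below generates block coordinates for one set of rows in knight's order under one plausible parameterization (one row down, two columns left per step, a four-row set), with out-of-frame coordinates standing in for the invalid blocks padded at the edges; the exact step sizes and row-set height are assumptions, not a reading of the claims.

```python
def knights_order(frame_cols: int, row_set_height: int = 4, left_step: int = 2):
    """Yield (row, col, is_valid) block coordinates for one set of rows in
    knight's order: each step is one row down and `left_step` columns left,
    and reaching the bottom row wraps back to the top row far enough to the
    right that the scan advances one column per cycle.  Coordinates outside
    the frame are the invalid blocks padded at the edges."""
    pad = left_step * (row_set_height - 1)          # invalid columns needed at each edge
    for top_col in range(0, frame_cols + pad):      # column of the top-row block each cycle
        for r in range(row_set_height):             # walk down the knight's diagonal
            c = top_col - left_step * r
            yield r, c, 0 <= c < frame_cols

# Example: order of the first few valid blocks for an 8-column frame.
#   [(r, c) for r, c, valid in knights_order(8) if valid][:6]
```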
Abstract:
An electronic device may include an electronic display to display an image based on processed image data. The electronic device may also include image processing circuitry to generate the processed image data. The image processing circuitry may receive input image data corresponding to an image in a first perspective and warp the input image data from the first perspective to a second perspective, generating warped image data. Additionally, the image processing circuitry may determine one or more occluded regions in the second perspective and determine fill-data corresponding to the occluded regions. The processed image data may be generated by combining the warped image data and the fill-data.
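A minimal sketch of the final combining step, assuming NumPy arrays and a boolean occlusion mask; the names and the mask representation are illustrative, not the claimed implementation.

```python
import numpy as np

def compose_reprojected_frame(warped: np.ndarray,
                              occluded: np.ndarray,
                              fill: np.ndarray) -> np.ndarray:
    """Combine warped image data with fill data for the occluded regions.

    warped   : H x W x C image reprojected into the second perspective
    occluded : H x W boolean mask, True where no source pixel mapped
    fill     : H x W x C fill data (e.g., an inpainted or lower-detail layer)
    """
    out = warped.copy()
    out[occluded] = fill[occluded]      # patch only the disoccluded pixels
    return out
```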
Abstract:
In one implementation, a method includes receiving a warped image representing simulated reality (SR) content (e.g., to be displayed in a display space), the warped image having a plurality of pixels at respective locations uniformly spaced in a grid pattern in a warped space, wherein the plurality of pixels are respectively associated with a plurality of respective pixel values and a plurality of respective scaling factors indicating a plurality of respective resolutions at a plurality of respective locations of the SR content (e.g., in the display space). The method includes processing the warped image in the warped space based on the plurality of respective scaling factors to generate a processed warped image and transmitting the processed warped image.
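One way to picture "processing in the warped space based on the scaling factors" is a filter whose warped-space footprint shrinks where each pixel stands for many display pixels, so its effective display-space footprint stays roughly constant. The sketch below assumes per-pixel horizontal and vertical scaling factors and a simple box filter; both choices are illustrative.

```python
import numpy as np

def box_filter_in_warped_space(warped: np.ndarray,
                               scale_x: np.ndarray,
                               scale_y: np.ndarray,
                               display_radius: int = 2) -> np.ndarray:
    """Box-filter a warped image directly in warped space.  Each pixel's
    scaling factors say how many display pixels it represents, so the
    warped-space radius is shrunk where the factors are large to keep the
    display-space footprint roughly constant."""
    h, w = warped.shape[:2]
    out = np.empty_like(warped)
    for y in range(h):
        for x in range(w):
            rx = max(1, int(round(display_radius / scale_x[y, x])))
            ry = max(1, int(round(display_radius / scale_y[y, x])))
            y0, y1 = max(0, y - ry), min(h, y + ry + 1)
            x0, x1 = max(0, x - rx), min(w, x + rx + 1)
            out[y, x] = warped[y0:y1, x0:x1].mean(axis=(0, 1))  # average over the footprint
    return out
```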
Abstract:
A system may include a display for displaying an image frame that is divided into regions having respective resolutions based on display image data. The system may also include image processing circuitry to generate the display image data based on multi-resolution image data of the image frame. Generating the display image data may include determining an enhancement to be applied to a portion of the multi-resolution image data and adjusting the determined enhancement to be applied to the portion of the multi-resolution image data based on boundary data associated with locations of boundaries between the regions.
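A hedged sketch of the boundary adjustment, assuming the enhancement is an additive per-pixel delta (e.g., a sharpening term) and that a distance-to-boundary map is available; the linear roll-off and its width are illustrative choices, not the claimed adjustment.

```python
import numpy as np

def adjust_enhancement_near_boundaries(image: np.ndarray,
                                       enhancement: np.ndarray,
                                       dist_to_boundary: np.ndarray,
                                       rolloff_px: float = 8.0) -> np.ndarray:
    """Attenuate a per-pixel enhancement near the boundaries between
    resolution regions so the seams are not emphasized.

    image            : H x W x C multi-resolution image data
    enhancement      : H x W x C additive enhancement (e.g., sharpening delta)
    dist_to_boundary : H x W distance, in pixels, to the nearest region boundary
    """
    weight = np.clip(dist_to_boundary / rolloff_px, 0.0, 1.0)   # 0 at a boundary, 1 far from it
    return image + weight[..., None] * enhancement              # scaled enhancement added back
```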
Abstract:
A mixed reality system that includes a device and a base station that communicate via a wireless connection. The device may include sensors that collect information about the user's environment and about the user. The information collected by the sensors may be transmitted to the base station via the wireless connection. The base station renders frames or slices based at least in part on the sensor information received from the device, encodes the frames or slices, and transmits the compressed frames or slices to the device for decoding and display. The base station may provide more computing power than conventional stand-alone systems, and the wireless connection does not tether the device to the base station as in conventional tethered systems. The system may implement methods and apparatus to maintain a target frame rate through the wireless link and to minimize latency in frame rendering, transmittal, and display.
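The base-station side of that loop can be sketched as follows, with hypothetical callables standing in for the radio and codec interfaces and an assumed 90 Hz target rate (the abstract only names "a target frame rate").

```python
import time

TARGET_FPS = 90.0                       # assumed target rate for illustration
FRAME_BUDGET = 1.0 / TARGET_FPS

def base_station_loop(receive_sensor_data, render_slices, encode, transmit):
    """Render from the latest device sensor data, encode, and send slices
    back over the wireless link, pacing the loop to the target frame rate.
    The four callables are hypothetical stand-ins for the radio and codec
    interfaces, not an actual API."""
    while True:
        start = time.monotonic()
        sensors = receive_sensor_data()                 # pose / environment data from the device
        for encoded in map(encode, render_slices(sensors)):
            transmit(encoded)                           # send each slice as soon as it is ready
        # Sleep only for what is left of the frame budget to hold the target rate.
        time.sleep(max(0.0, FRAME_BUDGET - (time.monotonic() - start)))
```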
Abstract:
An electronic device may include an electronic display to display an image based on compensated image data in a panel space. The electronic device may also include image processing circuitry to generate the compensated image data. Further, generating the compensated image data may include determining a first inverse mapping of a pixel grid from the panel space to a rendering space and determining a forward mapping of the pixel grid from the rendering space to the panel space based on the first inverse mapping. The forward mapping may include corrections for multiple different warp operations stacked in a single warp operation. Additionally, the image processing circuitry may apply the forward mapping to input image data to generate the compensated image data.
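A rough sketch of the "stacked warp" idea: several per-pixel corrections are composed into one panel-space-to-rendering-space lookup, which is then used in a single resampling pass. How the claimed forward mapping is derived from the inverse mapping is not reproduced here; the composition-plus-single-resample structure is the assumption being illustrated.

```python
import numpy as np

def stack_warps(panel_grid: np.ndarray, corrections) -> np.ndarray:
    """Compose several warp corrections (e.g., lens distortion, chromatic
    shift, dynamic adjustment) into one panel-space -> rendering-space
    lookup by chaining them over the panel pixel grid.

    panel_grid  : H x W x 2 panel pixel coordinates
    corrections : callables, each mapping an H x W x 2 coordinate array
                  one correction closer to rendering space
    """
    coords = panel_grid.astype(np.float64)
    for correct in corrections:          # all warps stacked into a single mapping
        coords = correct(coords)
    return coords

def resample_once(rendered: np.ndarray, lookup: np.ndarray) -> np.ndarray:
    """Apply the stacked mapping in a single resampling pass (nearest
    neighbor for brevity; real hardware would filter)."""
    xs = np.clip(np.round(lookup[..., 0]).astype(int), 0, rendered.shape[1] - 1)
    ys = np.clip(np.round(lookup[..., 1]).astype(int), 0, rendered.shape[0] - 1)
    return rendered[ys, xs]              # panel-space compensated image data
```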
Abstract:
Systems and methods for improving determination of encoded image data using a video encoding pipeline. The pipeline includes a first transcode engine that entropy encodes a first portion of a bin stream to determine a first bit stream, which includes first encoded image data indicating a first coding group row, and that determines first characteristic data corresponding to the first bit stream to facilitate communicating a combined bit stream. The pipeline also includes a second transcode engine that, while the first transcode engine entropy encodes the first portion of the bin stream, entropy encodes a second portion of the bin stream to determine a second bit stream, which includes second encoded image data indicating a second coding group row, and that determines second characteristic data corresponding to the second bit stream to facilitate communicating the combined bit stream, which includes the first bit stream and the second bit stream, to a decoding device.
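The parallel structure can be sketched as two workers that each transcode one coding-group row of bins and report characteristic data (here just the length of the resulting bit stream) used to splice the combined bit stream; `entropy_encode` is a hypothetical stand-in for the bin-to-bit transcoder, and real characteristic data may include more than the length.

```python
from concurrent.futures import ThreadPoolExecutor

def transcode_rows_in_parallel(bin_stream_rows, entropy_encode):
    """Entropy encode the bins of each coding-group row on its own engine
    (here, a thread), collect per-row characteristic data, and splice the
    per-row bit streams into one combined bit stream for the decoder.
    `entropy_encode` is a hypothetical bin-to-bit transcoder returning bytes."""
    def engine(row_bins):
        bits = entropy_encode(row_bins)              # per-row bit stream
        return bits, {"length": len(bits)}           # characteristic data used for splicing

    with ThreadPoolExecutor(max_workers=2) as pool:  # the two engines run concurrently
        results = list(pool.map(engine, bin_stream_rows))

    combined = b"".join(bits for bits, _ in results)     # combined bit stream
    characteristics = [meta for _, meta in results]
    return combined, characteristics
```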