Abstract:
An example embodiment may involve causing a page of a document to be printed on a printing device, wherein the printing device is in an AM halftoning mode and prints the page using an AM halftone; displaying, on a display unit, a graphical user interface, wherein the graphical user interface includes a selectable option to switch the printing device from the AM halftoning mode to an FM halftoning mode; receiving an indication that the selectable option has been selected; possibly in response to receiving the indication that the selectable option has been selected, causing the printing device to switch from the AM halftoning mode to the FM halftoning mode; and causing the page of the document to be printed again on the printing device, wherein the printing device is in the FM halftoning mode and prints the page using an FM halftone.
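As a minimal illustration only, the following Python sketch models the described flow of printing a page in the AM halftoning mode, switching to the FM halftoning mode when the selectable option is chosen, and reprinting the page. The Printer class and all function and attribute names are assumptions, not taken from the embodiment itself.

    # Sketch (assumed names) of switching a printer from AM to FM halftoning
    # in response to a GUI selection, then reprinting the same page.
    class Printer:
        def __init__(self):
            self.halftone_mode = "AM"          # device starts in the AM halftoning mode

        def print_page(self, page):
            print(f"printing {page!r} with an {self.halftone_mode} halftone")

    def handle_halftone_option(printer, page, option_selected):
        """Reprint `page` with an FM halftone if the GUI option was selected."""
        printer.print_page(page)               # first pass uses the current (AM) mode
        if option_selected:                    # indication that the selectable option was chosen
            printer.halftone_mode = "FM"       # switch the device to the FM halftoning mode
            printer.print_page(page)           # reprint the same page using an FM halftone

    handle_halftone_option(Printer(), "page 1", option_selected=True)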
Abstract:
An example embodiment may involve obtaining (i) an a×b attribute macro-cell, and (ii) a×b pixel macro-cells for each of a luminance plane, a first color plane, and a second color plane of an input image. The a×b attribute macro-cell may contain 4 non-overlapping m×n attribute cells, and the a×b pixel macro-cells may each contain 4 non-overlapping m×n pixel cells. The example embodiment may also involve determining 4 attribute-plane output values that represent the 4 non-overlapping m×n attribute cells, 1 to 4 luminance-plane output values that represent the a×b pixel macro-cell of the luminance plane, a first color-plane output value that represents the a×b pixel macro-cell of the first color plane, and a second color-plane output value that represents the a×b pixel macro-cell of the second color plane. The example embodiment may further involve writing an interleaved representation of the output values to a computer-readable output medium.
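As a rough sketch only (the embodiment does not specify an encoding format), the following Python example packs the 4 attribute-plane output values, the 1 to 4 luminance-plane output values, and the two color-plane output values into a single interleaved byte sequence. The function name, packing order, and byte format are assumptions.

    import struct

    def interleave_macro_cell(attr_vals, luma_vals, cb_val, cr_val):
        """Pack one macro-cell as attribute values, then luminance values, then
        the two color-plane values; the interleaving order is an assumption."""
        assert len(attr_vals) == 4             # one value per m x n attribute cell
        assert 1 <= len(luma_vals) <= 4        # 1 to 4 luminance-plane output values
        payload = list(attr_vals) + list(luma_vals) + [cb_val, cr_val]
        return struct.pack(f"{len(payload)}B", *payload)

    out = interleave_macro_cell([1, 1, 0, 0], [128, 130], 90, 160)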
Abstract:
An example embodiment may involve obtaining a digital image containing a pixel block. An AM halftone may be configured to be applied to the digital image by default. The example embodiment may also involve deriving, from the pixel block, a bitmap defining foreground and non-foreground pixels of the pixel block. The example embodiment may also involve sequentially scanning horizontal lines of the bitmap to identify clusters of foreground pixels. Each pixel in a particular cluster of the clusters of foreground pixels may be either (i) the only pixel in the particular cluster, or (ii) vertically or horizontally adjacent to another pixel in the particular cluster. The example embodiment may also involve, possibly based on the clusters of foreground pixels identified in the bitmap, applying an FM halftone to the digital image, and causing the digital image to be printed with the applied FM halftone.
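The cluster identification described above corresponds to a line-by-line connected-component scan with vertical and horizontal (4-connected) adjacency. The Python sketch below shows one way such a scan could work; the union-find bookkeeping and all names are assumptions rather than the embodiment's actual implementation.

    # Sketch of identifying 4-connected clusters of foreground pixels by scanning
    # the bitmap one horizontal line at a time (names and structure are assumed).
    def find_clusters(bitmap):
        labels = {}                            # (row, col) -> provisional cluster id
        parent = []                            # union-find forest over cluster ids

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        def union(i, j):
            parent[find(i)] = find(j)

        for r, row in enumerate(bitmap):
            for c, px in enumerate(row):
                if not px:                     # skip non-foreground pixels
                    continue
                left = labels.get((r, c - 1))  # horizontally adjacent neighbor
                up = labels.get((r - 1, c))    # vertically adjacent neighbor
                if left is None and up is None:
                    parent.append(len(parent)) # start a new single-pixel cluster
                    labels[(r, c)] = len(parent) - 1
                else:
                    labels[(r, c)] = left if left is not None else up
                    if left is not None and up is not None:
                        union(left, up)        # merge clusters joined by this pixel

        clusters = {}                          # group pixels by their root cluster id
        for px, lab in labels.items():
            clusters.setdefault(find(lab), []).append(px)
        return list(clusters.values())

    clusters = find_clusters([[1, 1, 0],
                              [0, 1, 0],
                              [0, 0, 1]])      # -> two clusters of foreground pixels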
Abstract:
An example embodiment may involve obtaining an a×b pixel macro-cell from an input image. Pixels in the a×b pixel macro-cell may have respective pixel values and may be associated with respective tags. It may be determined whether at least e of the respective tags indicate that their associated pixels represent edges in the input image. Based on this determination, either a first encoding or a second encoding of the a×b pixel macro-cell may be selected. The first encoding may weigh pixels that represent edges in the input image more heavily than pixels that do not represent edges in the input image, and the second encoding might not consider whether pixels represent edges. The selected encoding may be performed and written to a computer-readable output medium.
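A minimal Python sketch of this selection step is shown below, assuming the tags are simple strings, that the first encoding weighs edge pixels more heavily by averaging only those pixels, and that the second encoding averages all pixels. The threshold e and both encoders are illustrative assumptions.

    # Sketch of choosing between two encodings based on how many pixels in the
    # macro-cell are tagged as edges; the threshold e and both encoders are assumed.
    def encode_macro_cell(pixels, tags, e=4):
        edge_count = sum(1 for t in tags if t == "edge")
        if edge_count >= e:
            # first encoding: edge-tagged pixels dominate the representative value
            edge_vals = [p for p, t in zip(pixels, tags) if t == "edge"]
            return round(sum(edge_vals) / len(edge_vals))
        # second encoding: plain average, edge tags are ignored
        return round(sum(pixels) / len(pixels))

    value = encode_macro_cell([10, 200, 205, 12], ["flat", "edge", "edge", "flat"], e=2)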
Abstract:
An example embodiment may involve obtaining an m×n pixel cell from an input image. Each of the m×n pixels in the m×n pixel cell may be associated with at least one color value. An m×n attribute cell may be determined, elements of which may be associated in a one-to-one fashion with respective pixels in the m×n pixel cell. The m×n pixel cell may be compressed in a lossy fashion, and the m×n attribute cell may be compressed in a lossless fashion. Compression of the m×n pixel cell may be based on at least part of the m×n attribute cell. An interleaved representation of the compressed m×n pixel cell and the compressed m×n attribute cell may be written to an output medium.
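As an illustrative sketch only, the Python example below pairs a lossy pixel-cell encoding that consults the attribute cell with a lossless (zlib) attribute-cell encoding, and writes the two results interleaved. The specific codecs, the way the attribute data influences the lossy step, and the output layout are all assumptions.

    import zlib

    # Sketch pairing a lossy pixel-cell encoding with a lossless attribute-cell
    # encoding and writing them interleaved; the codecs and layout are assumed.
    def compress_cell(pixel_cell, attr_cell):
        # lossy: quantize pixels more coarsely where the attribute element is 0
        # (an assumed use of the attribute data; the embodiment only says the
        # pixel-cell compression may be based on part of the attribute cell)
        lossy = bytes(
            p if a else (p // 16) * 16
            for row_p, row_a in zip(pixel_cell, attr_cell)
            for p, a in zip(row_p, row_a)
        )
        lossless = zlib.compress(bytes(a for row in attr_cell for a in row))
        # interleave: attribute block first, then the pixel block (order assumed)
        return len(lossless).to_bytes(2, "big") + lossless + lossy

    blob = compress_cell([[12, 200], [13, 199]], [[0, 1], [0, 1]])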
Abstract:
An example embodiment may involve obtaining an a×b pixel macro-cell from an image with one or more color planes, and an a×b attribute macro-cell. The a×b pixel macro-cell may contain 4 non-overlapping m×n pixel cells, and the a×b attribute macro-cell may contain 4 non-overlapping m×n attribute cells. The pixels in the a×b pixel macro-cell may be associated with respective color values. The example embodiment may also involve determining 4 attribute output values associated respectively with the 4 non-overlapping m×n attribute cells. The example embodiment may further involve determining 1 to 4 color-plane output values for the non-overlapping m×n pixel cells, and writing an interleaved representation of the 4 attribute output values and the determined color-plane output values.
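A possible reading of the 1 to 4 color-plane output values is sketched below in Python: a single value is emitted when all 4 pixel cells reduce to the same representative value, and one value per cell otherwise. This criterion, the per-cell reductions, and the interleaving order are assumptions made for illustration.

    # Sketch (assumed logic) of producing 1 to 4 color-plane output values for the
    # 4 pixel cells, then interleaving them with the 4 attribute output values.
    def macro_cell_outputs(pixel_cells, attr_cells):
        attr_out = [max(cell) for cell in attr_cells]        # assumed per-cell attribute value
        cell_means = [round(sum(cell) / len(cell)) for cell in pixel_cells]
        # emit a single value when all 4 cells agree, otherwise one value per cell
        color_out = cell_means[:1] if len(set(cell_means)) == 1 else cell_means
        return attr_out + color_out                          # interleaved representation (order assumed)

    out = macro_cell_outputs(
        pixel_cells=[[10, 12], [11, 9], [10, 10], [12, 11]],
        attr_cells=[[0, 0], [0, 1], [0, 0], [0, 0]],
    )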
Abstract:
A method includes receiving, at a dynamic random access memory (DRAM) device, a single READ-THEN-CLEAR command. The single READ-THEN-CLEAR command has a column address of a column in an array of memory cells. Particular data content is stored in memory cells associated with the column address. The method also includes, in response to receiving the single READ-THEN-CLEAR command, reading the particular data content and clearing the particular data content after reading the particular data content.
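The following Python sketch illustrates the READ-THEN-CLEAR semantics behaviorally (it is not a hardware or timing model): a single operation returns the content stored at a column address and then clears that content. The class and method names are assumptions.

    # Behavioral sketch of READ-THEN-CLEAR: one command returns the data stored
    # at a column address and then clears it (names and structure are assumed).
    class DramColumnArray:
        def __init__(self, columns, column_size):
            self.cells = {c: bytearray(column_size) for c in range(columns)}

        def write(self, column, data):
            self.cells[column][:len(data)] = data

        def read_then_clear(self, column):
            data = bytes(self.cells[column])            # read the stored content
            self.cells[column] = bytearray(len(data))   # clear it after the read
            return data

    array = DramColumnArray(columns=8, column_size=4)
    array.write(3, b"\x01\x02\x03\x04")
    first = array.read_then_clear(3)    # returns the stored data
    second = array.read_then_clear(3)   # returns zeros: the content was cleared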
Abstract:
Techniques and computing devices related to modifying images are provided. A computing device can receive an order to modify pixels of an image. The computing device can include at least a pixel processor and software snippets that are executable on the pixel processor. The computing device can determine parameter values based on the order. The computing device can select a set of software snippets from the software snippets based on the parameter values. The computing device can load the set of software snippets onto the pixel processor. The pixel processor can execute the loaded set of software snippets to modify the pixels. The computing device can generate an output that includes a depiction of the image that includes at least one of the modified pixels.
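A minimal Python sketch of the snippet-selection flow is shown below, assuming the order carries a list of requested operations and that per-pixel lambdas stand in for software snippets executed on the pixel processor. The snippet catalog and all names are illustrative assumptions.

    # Sketch (assumed names) of selecting a set of software snippets from
    # parameter values of an order and executing them on pixel values.
    SNIPPETS = {
        "invert": lambda px: 255 - px,
        "brighten": lambda px: min(px + 40, 255),
        "threshold": lambda px: 255 if px > 127 else 0,
    }

    def select_snippets(order):
        # parameter values derived from the order decide which snippets are loaded
        return [SNIPPETS[name] for name in order.get("operations", []) if name in SNIPPETS]

    def run_on_pixels(snippets, pixels):
        # stands in for executing the loaded snippets on the pixel processor
        for snippet in snippets:
            pixels = [snippet(px) for px in pixels]
        return pixels

    order = {"operations": ["brighten", "threshold"]}
    output = run_on_pixels(select_snippets(order), [10, 120, 240])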