Abstract:
A device such as a color printer includes a main memory, a cache memory, and a convolutional neural network configured to convert pixels from a first color space to a second color space. The convolutional neural network is organized into execution-separable layers and loaded one or more layers at a time (depending on cache size) from the main memory to the cache memory, whereby the pixels are processed through each of the layers in the cache memory, and layers that have completed processing are evicted to make room for caching the next layer(s) of the network.
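A minimal sketch of the layer-at-a-time caching scheme described above. All names here (Layer, CACHE_CAPACITY, run_network, and the toy layers) are illustrative assumptions, not the patented implementation.

from collections import OrderedDict
from typing import Callable, List

Pixels = List[float]                 # stand-in for a block of pixel data
Layer = Callable[[Pixels], Pixels]   # an execution-separable layer

CACHE_CAPACITY = 2                   # assumed number of layers that fit in the cache


def run_network(layers_in_main_memory: List[Layer], pixels: Pixels) -> Pixels:
    cache: "OrderedDict[int, Layer]" = OrderedDict()
    for idx, layer in enumerate(layers_in_main_memory):
        # Evict the oldest finished layer when the cache is full.
        if len(cache) >= CACHE_CAPACITY:
            cache.popitem(last=False)
        # "Load" the next layer from main memory into the cache.
        cache[idx] = layer
        # Process the pixels through the newly cached layer.
        pixels = cache[idx](pixels)
    return pixels


if __name__ == "__main__":
    # Toy usage: two "layers" nudging pixel values toward another color space.
    layers = [lambda px: [v * 0.5 for v in px], lambda px: [v + 0.1 for v in px]]
    print(run_network(layers, [0.2, 0.4, 0.6]))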
Abstract:
Systems and methods upscale an input image by a final upscaling factor. The systems and methods employ a first module implementing a super resolution neural network with feature extraction layers and multiple sets of upscaling layers sharing the feature extraction layers. The multiple sets of upscaling layers upscale the input image according to different respective upscaling factors to produce respective first module outputs. The systems and methods select the first module output with the respective upscaling factor closest to the final upscaling factor. If the respective upscaling factor for the selected first module output is equal to the final upscaling factor, the systems and methods output the selected first module output. Otherwise, the systems and methods provide the selected first module output to a second module that upscales the selected first module output to produce a second module output corresponding to the input image upscaled by the final upscaling factor.
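A hedged sketch of the selection logic described above; the module callables and factor values are placeholders rather than the actual networks, and the residual-factor computation is an assumption about how the second module closes the gap.

from typing import Any, Callable, Dict

Image = Any  # stand-in for image data


def upscale(image: Image,
            final_factor: float,
            first_module_heads: Dict[float, Callable[[Image], Image]],
            second_module: Callable[[Image, float], Image]) -> Image:
    # Pick the upscaling head whose factor is closest to the final factor.
    chosen_factor = min(first_module_heads, key=lambda f: abs(f - final_factor))
    first_output = first_module_heads[chosen_factor](image)

    if chosen_factor == final_factor:
        return first_output
    # Otherwise refine with the second module to reach the final factor.
    residual_factor = final_factor / chosen_factor
    return second_module(first_output, residual_factor)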
Abstract:
Embodiments provide an automated approach for generating unbiased synthesized image-label pairs for colorization training of retro photographs. Modern grayscale images with corresponding color images are translated to images with the characteristics of retro photographs, thereby producing training data that pairs images with the characteristics of retro photographs with corresponding color images. This training data can then be employed to train a deep learning model to colorize retro photographs more effectively.
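An illustrative sketch of the pairing pipeline, assuming a hypothetical degrade_to_retro step that stands in for the grayscale-to-retro translation.

from typing import Any, Iterable, List, Tuple

Image = Any


def degrade_to_retro(grayscale: Image) -> Image:
    """Hypothetical translation: add grain, fading, vignetting, etc."""
    ...


def build_training_pairs(
        modern_pairs: Iterable[Tuple[Image, Image]]) -> List[Tuple[Image, Image]]:
    """Turn (grayscale, color) pairs into (retro-style, color) training pairs."""
    training_data = []
    for grayscale, color in modern_pairs:
        retro_like = degrade_to_retro(grayscale)
        training_data.append((retro_like, color))
    return training_data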
Abstract:
Methods and apparatus for training and utilizing an artificial neural network (ANN) are provided. A computing device can receive training documents including text. The computing device can parse the training documents to determine training data items. Each training data item can include a training label related to text within the training documents and location information indicating a location of text related to the training label. An ANN can be trained to recognize text using the training data items and training input that includes the training documents. After training the ANN, a request to predict text in application documents that differ from the training documents can be received. The application documents can include second text. A prediction of the second text can be determined by applying the trained ANN to the application documents. After determining the prediction of the second text, information related to the second text can be provided.
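A sketch of the kind of training-data item the abstract describes; the field names and the parse_documents helper are assumptions chosen for illustration.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class TrainingDataItem:
    label: str                             # training label related to document text
    location: Tuple[int, int, int, int]    # e.g. bounding box (x, y, w, h) of that text


def parse_documents(documents: List[dict]) -> List[TrainingDataItem]:
    """Extract (label, location) items from pre-annotated training documents."""
    items = []
    for doc in documents:
        for field in doc.get("fields", []):
            items.append(TrainingDataItem(label=field["label"],
                                          location=tuple(field["bbox"])))
    return items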
Abstract:
A memory control method uses a memory including a plurality of bank groups, each having a plurality of banks. The memory control method includes masking write control data and read control data based on an intra-bank-group constraint period, which is a command-to-command interval during which processing is restricted within the same bank group, and an inter-bank-group constraint period, which is a command-to-command interval during which processing is restricted between different bank groups, and storing an unmasked command in an arbitration queue. Arbitration raises the priority order of control data requesting processing on the bank group that has been accessed last among the plurality of bank groups.
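A simplified sketch of the masking and arbitration idea; the Command structure, the constraint values, and the queue handling are invented for illustration only.

from dataclasses import dataclass
from typing import List, Optional

INTRA_GROUP_GAP = 4   # assumed intra-bank-group command-to-command constraint (cycles)
INTER_GROUP_GAP = 2   # assumed inter-bank-group constraint (cycles)


@dataclass
class Command:
    bank_group: int
    is_write: bool


def mask_and_enqueue(cmd: Command, now: int, last_group: Optional[int],
                     last_issue_cycle: int, queue: List[Command]) -> None:
    # The tighter constraint applies when the command hits the same bank group.
    gap = INTRA_GROUP_GAP if cmd.bank_group == last_group else INTER_GROUP_GAP
    if now - last_issue_cycle >= gap:      # command is not masked
        queue.append(cmd)


def arbitrate(queue: List[Command], last_group: Optional[int]) -> Optional[Command]:
    # Raise the priority of commands addressing the most recently accessed bank group.
    for cmd in queue:
        if cmd.bank_group == last_group:
            queue.remove(cmd)
            return cmd
    return queue.pop(0) if queue else None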
Abstract:
An example embodiment may involve causing a page of a document to be printed on a printing device, wherein the printing device is in an AM halftoning mode and prints the page using an AM halftone; displaying, on a display unit, a graphical user interface, wherein the graphical user interface includes a selectable option to switch the printing device from the AM halftoning mode to an FM halftoning mode; receiving an indication that the selectable option has been selected; possibly in response to receiving the indication that the selectable option has been selected, causing the printing device to switch from the AM halftoning mode to the FM halftoning mode; and causing the page of the document to be printed again on the printing device, wherein the printing device is in the FM halftoning mode and prints the page using an FM halftone.
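A condensed sketch of the reprint flow; the PrintingDevice class and its methods are hypothetical and only mirror the steps listed in the abstract.

class PrintingDevice:
    def __init__(self) -> None:
        self.halftoning_mode = "AM"

    def print_page(self, page: str) -> None:
        print(f"Printing {page!r} using an {self.halftoning_mode} halftone")

    def switch_mode(self, mode: str) -> None:
        self.halftoning_mode = mode


def reprint_with_fm(device: PrintingDevice, page: str, option_selected: bool) -> None:
    device.print_page(page)            # first pass in AM halftoning mode
    if option_selected:                # user selected the GUI option
        device.switch_mode("FM")
        device.print_page(page)        # second pass in FM halftoning mode


if __name__ == "__main__":
    reprint_with_fm(PrintingDevice(), "page 1", option_selected=True)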
Abstract:
An example embodiment may involve obtaining (i) an a×b attribute macro-cell, and (ii) a×b pixel macro-cells for each of a luminance plane, a first color plane, and a second color plane of an input image. The a×b pixel macro-cells may each contain 4 non-overlapping m×n pixel cells. The example embodiment may also involve determining 4 attribute-plane output values that represent the 4 non-overlapping m×n attribute cells, 1 to 4 luminance-plane output values that represent the a×b pixel macro-cell of the luminance plane, a first color-plane output value to represent the a×b pixel macro-cell of the first color plane, and a second color-plane output value to represent the a×b pixel macro-cell of the second color plane. The example embodiment may further involve writing an interleaved representation of the output values to a computer-readable output medium.
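A rough sketch of one way the interleaved output could be laid out; the ordering and value ranges here are assumptions, not the patented format.

from typing import List


def interleave_macro_cell(attribute_vals: List[int],   # 4 attribute-plane values
                          luminance_vals: List[int],   # 1 to 4 luminance-plane values
                          first_color_val: int,
                          second_color_val: int) -> bytes:
    assert len(attribute_vals) == 4 and 1 <= len(luminance_vals) <= 4
    # One possible interleaving: attributes, then luminance, then the two chroma values.
    stream = attribute_vals + luminance_vals + [first_color_val, second_color_val]
    return bytes(stream)


# Example: a macro-cell reduced to 4 attribute, 2 luminance, and 2 color-plane bytes.
print(interleave_macro_cell([1, 1, 2, 2], [128, 130], 90, 95).hex())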
Abstract:
A scanning control for a photocopier includes a pair of identically weighted networks configured to perform feature extraction on images and a memory storing registered security patterns. A match head of the scanning control receives an image pair, wherein a first image of the image pair is generated by a scanning element of the photocopier and a second image of the image pair is obtained from the registered security patterns, and outputs a match score for the image pair. The output of the match head is coupled to control operation of the scanning element.
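A minimal numpy sketch of a pair of identically weighted feature extractors feeding a match head; the single linear "network", the cosine-similarity score, and the threshold are placeholders, not the actual scanning control.

import numpy as np

rng = np.random.default_rng(0)
SHARED_WEIGHTS = rng.standard_normal((64, 16))   # same weights used by both branches


def extract_features(image_vec: np.ndarray) -> np.ndarray:
    # Toy stand-in for a real feature-extraction network; expects a length-64 vector.
    return np.tanh(image_vec @ SHARED_WEIGHTS)


def match_score(scanned: np.ndarray, registered: np.ndarray) -> float:
    a, b = extract_features(scanned), extract_features(registered)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))


def allow_scan(scanned: np.ndarray, registered_patterns: list,
               threshold: float = 0.95) -> bool:
    # Block the scanning element if any registered security pattern matches.
    return all(match_score(scanned, p) < threshold for p in registered_patterns)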
Abstract:
An example system includes a processor and a non-transitory computer-readable medium having stored therein instructions that are executable to cause the system to perform various functions. The functions include obtaining an image associated with a print job, and providing the image as input to a convolutional neural network. The convolutional neural network includes a residual network, upscaling layers, and classification layers configured to detect whether the image is an artificial image having a computer-generated image gradient. The functions also include determining, based on an output of the classification layers, that the image is an artificial image having a computer-generated image gradient. Further, the functions include, based on determining that the image is an artificial image having a computer-generated image gradient, providing the image to an upscaling module of a print pipeline for upscaling rather than using an output of the upscaling layers for the upscaling.
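A sketch of the routing decision described above; classify_gradient, cnn_upscale, and pipeline_upscale are hypothetical stand-ins for the classification layers, the network's upscaling layers, and the print pipeline's upscaling module.

from typing import Any, Callable

Image = Any


def route_upscaling(image: Image,
                    classify_gradient: Callable[[Image], bool],
                    cnn_upscale: Callable[[Image], Image],
                    pipeline_upscale: Callable[[Image], Image]) -> Image:
    # The classification layers decide whether the image gradient is
    # computer generated (artificial) rather than photographic.
    if classify_gradient(image):
        # Artificial gradient: hand the image to the print pipeline's
        # upscaling module instead of using the CNN's upscaling layers.
        return pipeline_upscale(image)
    return cnn_upscale(image)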