Abstract:
A method and apparatus for removing blur in an image are disclosed. The blur is caused by relative motion between the imaging device and the object being imaged. A set of differences between the pixel values in the image is calculated and divided into two groups: a first group of differences in pixel values due to noise, and a second group of differences in pixel values due to noise and motion. An estimate of the motion blur is determined using the second group of differences, and that estimate is then used to remove the blur from the image.
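A minimal Python sketch of the idea described above, assuming a grayscale image held in a NumPy array: horizontal pixel differences are split into a noise-only group and a noise-plus-motion group by a simple threshold, a horizontal blur length is estimated from the motion group, and the blur is removed with a Wiener-style inverse filter. The threshold factor, the blur-length heuristic, and the box-shaped point spread function are illustrative assumptions, not the disclosed procedure.

```python
import numpy as np

def estimate_blur_length(image, noise_factor=3.0):
    diffs = np.abs(np.diff(image.astype(float), axis=1))   # horizontal pixel differences
    noise_level = np.median(diffs)                          # first group: differences due to noise
    motion = diffs[diffs > noise_factor * noise_level]      # second group: noise plus motion
    if motion.size == 0:
        return 1                                            # no detectable motion blur
    # Assumed heuristic: a step edge of contrast C smeared over L pixels
    # yields per-pixel differences of roughly C / L, so L ~ C / mean difference.
    return max(1, int(round(diffs.max() / motion.mean())))

def remove_blur(image, length, k=0.01):
    psf = np.zeros(image.shape[1])
    psf[:length] = 1.0 / length                             # assumed horizontal box PSF
    H = np.fft.fft(psf)
    G = np.fft.fft(image.astype(float), axis=1)
    F = G * np.conj(H) / (np.abs(H) ** 2 + k)               # Wiener-style inverse filter
    return np.real(np.fft.ifft(F, axis=1))
```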
Abstract:
A system for creating a data-bearing image can include a reference image generator configured to apply a clustered-dot halftone screen to a continuous-tone image. The resulting reference halftone image includes carrier cells in which a pixel cluster can be shifted to at least two shift positions in the carrier cell. The system also includes a payload encoder configured to segment the data payload and encode each data segment into one of the carrier cells by shifting its pixel cluster to one of the shift positions. The system also includes an output device configured to output the resulting data-bearing halftone image.
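A toy Python sketch of the encoding step, assuming 4x4 carrier cells whose 2x2 pixel cluster is shifted one pixel left or right to carry one bit; the cell and cluster geometry and the two shift positions are assumptions made for illustration.

```python
import numpy as np

CELL = 4  # assumed carrier-cell size in pixels

def encode_payload(reference, bits):
    """Shift the pixel cluster in successive carrier cells left (0) or right (1)."""
    out = reference.copy()
    rows, cols = out.shape[0] // CELL, out.shape[1] // CELL
    for i, bit in enumerate(bits[: rows * cols]):
        r, c = divmod(i, cols)
        cell = out[r * CELL:(r + 1) * CELL, c * CELL:(c + 1) * CELL]
        cell[:] = np.roll(cell, -1 if bit == 0 else 1, axis=1)   # move cluster to a shift position
    return out

# Usage: a toy reference halftone with a centered 2x2 cluster in each of four cells.
ref = np.zeros((8, 8), dtype=np.uint8)
ref[1:3, 1:3] = ref[1:3, 5:7] = ref[5:7, 1:3] = ref[5:7, 5:7] = 1
stegatone = encode_payload(ref, [1, 0, 1, 1])
```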
Abstract:
Candidate redeye areas (24) are determined in an input image (20). In this process, a respective set of one or more redeye metric values (28) is associated with each of the candidate redeye areas (24). Candidate face areas (30) are ascertained in the input image (20). In this process, a respective set of one or more face metric values (34) is associated with each of the candidate face areas (30). A respective joint metric vector (78) is assigned to each of the candidate redeye areas (24). The joint metric vector (78) includes metric values that are derived from the respective set of redeye metric values (28) and the set of face metric values (34) associated with a selected one of the candidate face areas (30). Each of one or more of the candidate redeye areas (24) is classified as either a redeye artifact or a non-redeye artifact based on the respective joint metric vector (78) assigned to the candidate redeye area (24).
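A hedged Python sketch of the joint-metric step: each redeye candidate's metric values are concatenated with the metric values of an associated face candidate, and the joint vector is classified. The face-selection rule (nearest face center) and the linear classifier are assumptions; a trained model would supply the weights.

```python
import numpy as np

def joint_metric_vector(redeye_metrics, redeye_center, face_candidates):
    # Assumed selection rule: use the face candidate whose center is closest.
    dists = [np.linalg.norm(np.subtract(f["center"], redeye_center)) for f in face_candidates]
    face = face_candidates[int(np.argmin(dists))]
    return np.concatenate([redeye_metrics, face["metrics"]])

def classify(joint_vector, weights, bias=0.0):
    # Placeholder linear classifier over the joint metric vector.
    return "redeye artifact" if joint_vector @ weights + bias > 0 else "non-redeye artifact"
```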
Abstract:
A forensic verification system (1100) extracts a print signature via a print signature extractor (1110) from an interior of a halftone contained in an image. The system (1100) utilizes a comparator (1120) to compare the print signature to a reference signature stored in a registry to determine differences between the print signature and the reference signature. The system (1100) utilizes a forensic analyzer (1130) to perform a forensic analysis on the signatures based on the comparison to authenticate the image.
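A small Python sketch of the comparison step, assuming the print signature and the reference signature are one-dimensional feature vectors of equal length; the correlation statistic and the pass/fail threshold are assumptions, not the system's forensic analysis.

```python
import numpy as np

def compare_signatures(print_sig, reference_sig):
    diff = print_sig - reference_sig                     # differences between the signatures
    corr = np.corrcoef(print_sig, reference_sig)[0, 1]   # similarity statistic
    return diff, corr

def authenticate(print_sig, reference_sig, min_corr=0.9):
    _, corr = compare_signatures(print_sig, reference_sig)
    return corr >= min_corr                              # assumed authentication rule
```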
Abstract:
Document copying systems and methods include initiating a copy job (20) for copying a document (1) having document content. A log (22) of the copy job is generated, and the document is scanned to create a document content image (24). The log is encoded using an input image to produce a data-bearing halftone image (26) which is merged with the document content image (28). The merged data-bearing halftone image and the document content image are printed (30) to produce a copy (2) of the document.
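An illustrative Python sketch of the copy flow only: the copy-job log is serialized, handed to a caller-supplied halftone encoder, and the resulting data-bearing halftone is merged into a margin of the scanned content image before printing. The encode_halftone callable and the bottom-right placement are hypothetical placeholders.

```python
import json
import numpy as np

def build_copy_page(content_image, copy_log, encode_halftone):
    payload = json.dumps(copy_log).encode()        # log of the copy job
    halftone = encode_halftone(payload)            # data-bearing halftone image (2-D array)
    merged = content_image.copy()
    h, w = halftone.shape
    merged[-h:, -w:] = halftone                    # assumed merge: place halftone in a margin
    return merged                                  # merged image ready for printing
```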
Abstract:
Examples disclosed herein relate to compressing an image by resizing the image and compressing the resized image based on frequency content. A processor may resize the image to a target size if the pixel area of the image is greater than the target pixel area plus a resizing tolerance. The processor may compress the image using a first data removal rule for a portion of the image in a first frequency range and a second data removal rule for a portion of the image in a second frequency range.
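A Python sketch of the two rules described above, assuming a DCT-based representation: the image is resized only when its pixel area exceeds the target area plus the tolerance, and transform coefficients are removed more aggressively in an assumed high-frequency band than in the low-frequency band. The decimation stand-in for resizing, the band split, and both thresholds are illustrative.

```python
import numpy as np
from scipy.fft import dctn, idctn

def maybe_resize(image, target_area, tolerance):
    area = image.shape[0] * image.shape[1]
    if area <= target_area + tolerance:
        return image                               # within tolerance: keep original size
    step = max(1, int(round((area / target_area) ** 0.5)))
    return image[::step, ::step]                   # crude decimation stands in for resizing

def compress(image, low_thresh=1.0, high_thresh=10.0, band=0.25):
    coeffs = dctn(image.astype(float), norm="ortho")
    r, c = int(band * coeffs.shape[0]), int(band * coeffs.shape[1])
    low = np.zeros(coeffs.shape, dtype=bool)
    low[:r, :c] = True
    coeffs[low & (np.abs(coeffs) < low_thresh)] = 0    # first data removal rule (low frequencies)
    coeffs[~low & (np.abs(coeffs) < high_thresh)] = 0  # second data removal rule (high frequencies)
    return idctn(coeffs, norm="ortho")
```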
Abstract:
Methods, and apparatus for performing methods, for classifying an image. Methods include determining a corresponding set of metrics for each region of two or more regions of a pattern of regions of an image, and classifying the image in response to at least the corresponding set of metrics for each of the two or more regions of the pattern of regions.
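A minimal Python sketch of the described classification, assuming a 3x3 grid as the pattern of regions and mean/standard deviation as each region's metric set; the grid, the metrics, the threshold, and the class labels are placeholders.

```python
import numpy as np

def region_metrics(image, grid=(3, 3)):
    h, w = image.shape
    metrics = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            region = image[r * h // grid[0]:(r + 1) * h // grid[0],
                           c * w // grid[1]:(c + 1) * w // grid[1]]
            metrics.append((region.mean(), region.std()))   # metric set for this region
    return metrics

def classify_image(image):
    mean_of_stds = np.mean([std for _, std in region_metrics(image)])
    return "high-detail" if mean_of_stds > 50 else "low-detail"   # assumed rule and labels
```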
Abstract:
A forensic verification system (700) extracts a print signature via a print signature extractor (710) from the boundary of a halftone contained in an image. The system (700) compares the print signature to a reference signature stored in a registry via a comparator (720) to determine differences between the print signature and the reference signature. The system (700) performs a forensic-level statistical image analysis via a forensic analyzer (730) on the print signature and the reference signature based on the comparison to authenticate the printed media.
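A hedged Python sketch of a boundary-based signature and a simple statistical comparison; extracting boundary pixels with a one-pixel shift and using a Fisher-z correlation statistic are illustrative assumptions, not the system's forensic-level analysis.

```python
import numpy as np

def boundary_signature(halftone):
    mask = halftone > 0
    boundary = mask & ~np.roll(mask, 1, axis=0)      # crude upper-boundary pixels of the halftone
    return halftone[boundary].astype(float)          # signature sampled along the boundary

def statistical_compare(print_sig, reference_sig):
    n = min(print_sig.size, reference_sig.size)
    a, b = print_sig[:n], reference_sig[:n]
    corr = np.corrcoef(a, b)[0, 1]
    z = np.arctanh(np.clip(corr, -0.999, 0.999)) * np.sqrt(max(n - 3, 1))  # Fisher z statistic
    return corr, z
```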
Abstract:
An example provides a system and method of robust alignment and payload recovery for data-bearing images. The method includes digitizing a printed version of a stegatone, computing the transformation parameters of the stegatone, and processing individual local regions of the stegatone to determine local transformation parameters. The method also includes performing an alignment evaluation to compute a metric value that represents the quality of a local alignment between a reference halftone and the stegatone. Further, the method includes selecting alignment parameters based on optimization of the metric value, mapping the shift of the clustered dots in each cell relative to the reference halftone, and recovering the payload by decoding the stegatone.
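A Python sketch of the alignment-evaluation step only, assuming normalized cross-correlation as the quality metric and a small integer-shift search for the local transformation parameters; the window size and search range are assumptions.

```python
import numpy as np

def alignment_metric(region, reference):
    a = region - region.mean()
    b = reference - reference.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return (a * b).sum() / denom if denom else 0.0    # normalized correlation as the metric value

def best_local_shift(stegatone, reference, r0, c0, size=32, search=2):
    ref = reference[r0:r0 + size, c0:c0 + size].astype(float)
    best, best_score = (0, 0), -np.inf
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = r0 + dr, c0 + dc
            if r < 0 or c < 0:
                continue
            region = stegatone[r:r + size, c:c + size].astype(float)
            if region.shape != ref.shape:
                continue
            score = alignment_metric(region, ref)
            if score > best_score:                     # keep the best-scoring local alignment
                best_score, best = score, (dr, dc)
    return best, best_score                            # local shift and its metric value
```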
Abstract:
A machine-implemented method of processing an input image includes receiving user input that manually identifies a location in the input image that corresponds to a potential redeye artifact. A set of detected redeye artifacts in the input image is received. One of the detected redeye artifacts that is closest to the manually identified location is identified. The identified detected redeye artifact is stored in a list of redeye artifacts that are identified with manual assistance if the identified detected redeye artifact is within a threshold distance from the manually identified location.
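A small Python sketch of the matching step described above: the detected artifact closest to the user's click is kept only if it lies within a threshold distance. The Euclidean distance and the threshold value are assumptions.

```python
import math

def match_manual_click(click, detected_artifacts, threshold=25.0):
    """click: (x, y) location; detected_artifacts: list of (x, y) artifact centers."""
    if not detected_artifacts:
        return None
    nearest = min(detected_artifacts, key=lambda a: math.dist(a, click))
    return nearest if math.dist(nearest, click) <= threshold else None

# Usage: append a match to the list of artifacts identified with manual assistance.
assisted = []
hit = match_manual_click((120, 84), [(118, 86), (300, 40)])
if hit is not None:
    assisted.append(hit)
```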