Abstract:
A search information acquisition unit acquires a plurality of pieces of search pose information, each generated for one of a plurality of target images and indicating a pose of a person included in that target image. An exclusion information acquisition unit acquires exclusion pose information indicating a pose of a person included in an exclusion query image. The exclusion query image is an image that serves as a query for images to be excluded from a search result, and includes at least one person. An exclusion score computation unit computes an exclusion score for each of the plurality of pieces of search pose information. The exclusion score indicates a degree of similarity of the search pose information to the exclusion pose information. An exclusion image selection unit uses the exclusion scores to select, from the plurality of target images, images to be excluded from the search result.
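A minimal Python sketch of the exclusion flow described above, assuming each pose is represented as a fixed-length keypoint vector and that the exclusion score is a cosine similarity with a threshold as the reference; these representations are illustrative assumptions, not part of the abstract.

from typing import Dict, List
import math

def cosine_similarity(a: List[float], b: List[float]) -> float:
    # Assumed similarity measure between two pose vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def select_excluded_images(search_poses: Dict[str, List[float]],
                           exclusion_pose: List[float],
                           threshold: float = 0.9) -> List[str]:
    """Return target-image IDs whose exclusion score meets the (assumed) threshold."""
    excluded = []
    for image_id, pose in search_poses.items():
        exclusion_score = cosine_similarity(pose, exclusion_pose)
        if exclusion_score >= threshold:
            excluded.append(image_id)
    return excluded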
Abstract:
A query acquisition unit (610) acquires a plurality of pieces of query information. The query information is generated for each of a plurality of query images and indicates a feature of the query image. A similar image selection unit (620) uses the plurality of pieces of query information to select, as a similar image, an image whose degree of similarity to at least one of the query images satisfies a reference. A display control unit (630) displays the similar images selected by the similar image selection unit (620) on a display unit. When the similar image selection unit (620) selects a plurality of similar images, the display control unit (630) sets a display position or a display order of each similar image on the display unit by using the number of query images for which that similar image satisfies the reference.
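An illustrative Python sketch of the ordering step, assuming per-query similarity scores are already available and that a larger count of satisfied queries corresponds to an earlier display position; the data layout here is hypothetical.

from typing import Dict, List

def order_similar_images(similarity: Dict[str, Dict[str, float]],
                         reference: float) -> List[str]:
    """similarity[image_id][query_id] -> degree of similarity to that query image."""
    counts = {}
    for image_id, per_query in similarity.items():
        n = sum(1 for s in per_query.values() if s >= reference)
        if n > 0:  # the image is similar to at least one query image
            counts[image_id] = n
    # Higher count of satisfied queries -> earlier display order (assumed rule).
    return sorted(counts, key=counts.get, reverse=True)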
Abstract:
A query acquisition unit (610) acquires pose information (hereinafter described as query pose information) that serves as a query and indicates a pose of a person. A search information acquisition unit (620) acquires a plurality of pieces of search pose information. The search pose information is pose information about an image to be searched (hereinafter described as a target image), and is stored in a database for each of a plurality of target images. A selection unit (630) selects, from among the plurality of pieces of search pose information, two or more pieces whose degree of similarity to the query pose information satisfies a reference. This selection substantially amounts to selecting target images that are similar to the image associated with the query pose information (hereinafter described as a query image).
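A small Python sketch of this selection, under the assumption that pose information is a keypoint vector and that the degree of similarity is the inverse of the Euclidean distance; both choices are assumptions made only for illustration.

import math
from typing import Dict, List

def pose_similarity(a: List[float], b: List[float]) -> float:
    # Assumed degree of similarity: higher when the poses are closer.
    return 1.0 / (1.0 + math.dist(a, b))

def select_search_poses(search_poses: Dict[str, List[float]],
                        query_pose: List[float],
                        reference: float) -> List[str]:
    """Return IDs of target images whose search pose information satisfies the reference."""
    # Selecting search pose information effectively selects the corresponding target images.
    return [target_id for target_id, pose in search_poses.items()
            if pose_similarity(pose, query_pose) >= reference]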
Abstract:
A data management apparatus (2000) is accessible to a first storage region (50) and a second storage region (60). The first storage region (50) stores tree structure data (10). The tree structure data (10) have, as nodes, data sets (20), each being a set of data (40). The second storage region (60) stores data sets (20) that are not included in the tree structure data (10). The data management apparatus (2000) acquires data (40) to be inserted into a data set (20), and either inserts the data (40) into a data set (20) already stored in the first storage region (50) or the second storage region (60), or generates a new data set (20) in the second storage region (60) and inserts the data (40) into the generated data set (20). Further, when a predetermined condition is satisfied for a data set (20) stored in the second storage region (60), the data management apparatus (2000) inserts one or more such data sets (20) into the tree structure data (10).
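A simplified Python sketch of the two-region management described above. The "predetermined condition" is assumed here to be a data set reaching a fixed size, and the tree insertion is reduced to appending to a node list; both are placeholders, not the abstract's actual mechanism.

from typing import Any, List

MAX_SET_SIZE = 64  # hypothetical stand-in for the predetermined condition

class DataSet:
    def __init__(self):
        self.items: List[Any] = []

class DataManager:
    def __init__(self):
        self.tree_nodes: List[DataSet] = []    # first storage region (tree structure data)
        self.pending_sets: List[DataSet] = []  # second storage region (not yet in the tree)

    def insert(self, datum: Any) -> None:
        # Insert into an existing pending data set, or generate a new one.
        if not self.pending_sets or len(self.pending_sets[-1].items) >= MAX_SET_SIZE:
            self.pending_sets.append(DataSet())
        self.pending_sets[-1].items.append(datum)
        self._maybe_promote()

    def _maybe_promote(self) -> None:
        # When a pending data set satisfies the condition, insert it into the tree.
        still_pending = []
        for ds in self.pending_sets:
            if len(ds.items) >= MAX_SET_SIZE:
                self.tree_nodes.append(ds)  # stand-in for a real tree insertion
            else:
                still_pending.append(ds)
        self.pending_sets = still_pending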
Abstract:
A data processing apparatus (1) of the present invention includes a unit that retrieves a predetermined subject from moving image data. The data processing apparatus includes a person extraction unit (10) that analyzes moving image data to be analyzed and extracts, from among the persons detected in the moving image data, a person whose appearance frequency in the moving image data satisfies a predetermined condition, and an output unit (20) that outputs information regarding the extracted person.
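An illustrative Python sketch of the extraction step, assuming a person detector has already produced (frame, person ID) detections and that the predetermined condition is a minimum ratio of frames in which the person appears; both assumptions are for illustration only.

from collections import Counter
from typing import Iterable, List, Tuple

def extract_frequent_persons(detections: Iterable[Tuple[int, str]],
                             total_frames: int,
                             min_ratio: float = 0.1) -> List[str]:
    """detections: (frame_index, person_id) pairs produced by a person detector."""
    frames_per_person = Counter()
    seen = set()
    for frame_index, person_id in detections:
        if (frame_index, person_id) not in seen:  # count each person once per frame
            seen.add((frame_index, person_id))
            frames_per_person[person_id] += 1
    return [pid for pid, n in frames_per_person.items()
            if n / total_frames >= min_ratio]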
Abstract:
According to the present invention, an analysis system (10) is provided that includes a generation unit (11) that generates, for each processing object, frequency data indicating a temporal change in an occurrence frequency of a predetermined event, and an extraction unit (12) that extracts, as a possible abnormal object, a processing object whose frequency data exhibit a first feature.
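A Python sketch under assumptions: the frequency data are taken to be per-object counts of the event in time buckets, and the "first feature" is taken to be a spike well above the object's own average; the abstract fixes neither choice.

from collections import defaultdict
from typing import Dict, Iterable, List, Tuple

def build_frequency_data(events: Iterable[Tuple[str, int]]) -> Dict[str, List[int]]:
    """events: (object_id, time_bucket) occurrences of the predetermined event."""
    data = defaultdict(lambda: defaultdict(int))
    for object_id, bucket in events:
        data[object_id][bucket] += 1
    return {oid: [counts[b] for b in sorted(counts)] for oid, counts in data.items()}

def possible_abnormal_objects(freq: Dict[str, List[int]],
                              spike_factor: float = 3.0) -> List[str]:
    # Flag objects whose peak frequency exceeds spike_factor times their mean (assumed rule).
    flagged = []
    for object_id, series in freq.items():
        mean = sum(series) / len(series)
        if mean > 0 and max(series) >= spike_factor * mean:
            flagged.append(object_id)
    return flagged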
Abstract:
Provided is an analysis apparatus (10) including a person extraction unit (11) that analyzes video data to extract persons, a time calculation unit (12) that calculates, for each extracted person, a continuous appearance time period for which the person has been continuously present in a predetermined area and a reappearance time interval until the person reappears in the predetermined area, and an inference unit (13) that infers a characteristic of each extracted person on the basis of the continuous appearance time period and the reappearance time interval.
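A hypothetical Python sketch of the time calculation and inference, assuming sorted, non-empty timestamps of sightings per person, a gap tolerance for treating sightings as one continuous appearance, and an invented inference rule; none of these specifics come from the abstract.

from typing import List, Tuple

def appearance_statistics(sightings: List[float],
                          gap_tolerance: float = 5.0) -> Tuple[List[float], List[float]]:
    """sightings: sorted timestamps (seconds) at which the person was detected in the area."""
    periods, intervals = [], []
    start = prev = sightings[0]
    for t in sightings[1:]:
        if t - prev > gap_tolerance:      # the person left and later reappeared
            periods.append(prev - start)  # continuous appearance time period
            intervals.append(t - prev)    # reappearance time interval
            start = t
        prev = t
    periods.append(prev - start)
    return periods, intervals

def infer_characteristic(periods: List[float], intervals: List[float]) -> str:
    # Example rule only: long stays with short gaps suggest staff or a resident,
    # short stays with long gaps suggest a passer-by.
    avg_stay = sum(periods) / len(periods)
    avg_gap = sum(intervals) / len(intervals) if intervals else float("inf")
    return "staff_or_resident" if avg_stay > 600 and avg_gap < 3600 else "visitor"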
Abstract:
A display processing apparatus includes an acquisition unit (101) and an output processing unit (102). The acquisition unit (101) acquires, for each of a plurality of pieces of similarity data extracted by a similarity search based on query data, a similarity degree value indicating similarity to the query data. The output processing unit (102) outputs a display in which a region indicating the query data is positioned at one end, and regions indicating the respective pieces of similarity data are positioned, on a line extending from that end in a predetermined shape, at positions corresponding to their respective similarity degree values.
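A minimal Python layout sketch, assuming a straight horizontal line and similarity values normalized to [0, 1]; the "predetermined shape" in the abstract is not limited to a straight line, so this is only one possible arrangement.

from typing import Dict, List, Tuple

def layout_positions(similarities: Dict[str, float],
                     origin: Tuple[float, float] = (0.0, 0.0),
                     line_length: float = 1000.0) -> List[Tuple[str, float, float]]:
    """Place the query region at the origin and each result region at a distance
    that grows as its similarity decreases (more similar -> closer to the query)."""
    x0, y0 = origin
    regions = [("query", x0, y0)]
    for data_id, sim in similarities.items():
        distance = (1.0 - sim) * line_length
        regions.append((data_id, x0 + distance, y0))
    return regions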
Abstract:
An abnormality detection apparatus (2000) handles, as processing targets, tasks allocated to a plurality of processing servers (3200) in a distributed system (3000) including the processing servers (3200). A history acquisition unit (2020) acquires progress history information, which is information regarding the progress of the plurality of tasks at a plurality of recording time points. A target range determination unit (2040) determines a target range. A distribution calculation unit (2060) calculates a task speed distribution, which is a probability distribution of the processing speeds of the tasks, using the progress history information regarding the plurality of tasks. An abnormality determination unit (2080) compares the processing speed of a task to be determined with the task speed distribution, thereby determining whether or not the processing speed of that task is abnormal.
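A Python sketch under assumptions: the task speed distribution within the target range is modeled as a normal distribution, and a task is flagged when its speed falls more than k standard deviations below the mean; the abstract does not prescribe this particular model or threshold.

import statistics
from typing import List

def is_abnormal_speed(history_speeds: List[float],
                      speed_to_check: float,
                      k: float = 3.0) -> bool:
    """history_speeds: task processing speeds derived from the progress history
    within the target range (assumed already computed)."""
    mean = statistics.mean(history_speeds)
    stdev = statistics.pstdev(history_speeds)
    if stdev == 0:
        return speed_to_check != mean
    # Flag only unusually slow tasks (assumed definition of "abnormal").
    return speed_to_check < mean - k * stdev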
Abstract:
A monitoring apparatus (101) includes an acquisition unit (102) and a processing unit (103). The acquisition unit (102) acquires external appearance information relating to the external appearance of a moving body included in images captured by a plurality of photographing apparatuses (152) installed in a city. The processing unit (103) executes statistical processing using the external appearance information.
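An illustrative Python sketch only: it aggregates counts of moving bodies by an appearance attribute per photographing apparatus; the actual attributes and statistics used by the apparatus are not specified in the abstract.

from collections import Counter
from typing import Dict, Iterable, Tuple

def aggregate_appearance(records: Iterable[Tuple[str, str]]) -> Dict[str, Counter]:
    """records: (camera_id, appearance_label) pairs extracted from captured images."""
    stats: Dict[str, Counter] = {}
    for camera_id, label in records:
        stats.setdefault(camera_id, Counter())[label] += 1
    return stats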