Abstract:
A projection system performs projection so that a projection range of a first projection portion and a projection range of a second projection portion partially overlap each other, and includes a processor configured to: perform a control of shifting from the first state as defined herein to the second state as defined herein; and perform a control of adjusting a relative position between a first projection range of the first projection portion and a second projection range of the second projection portion in accordance with a received instruction.
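As a purely illustrative sketch (not part of the abstract), the adjustment of the relative position in accordance with a received instruction might look like the following Python fragment; the directional instruction names and the one-pixel shift step are assumptions, not taken from the disclosure.

# Hypothetical sketch: shift the second projection range relative to the first
# by a fixed step each time a directional instruction is received.
SHIFT_STEP = 1  # assumed step size, in pixels, per received instruction

def adjust_relative_position(offset, instruction):
    """offset: (dx, dy) of the second projection range relative to the first."""
    moves = {"left": (-SHIFT_STEP, 0), "right": (SHIFT_STEP, 0),
             "up": (0, -SHIFT_STEP), "down": (0, SHIFT_STEP)}
    mx, my = moves.get(instruction, (0, 0))
    return (offset[0] + mx, offset[1] + my)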
Abstract:
A projection apparatus projects a first image via a first projection portion and a second image via a second projection portion in a partially overlapping manner, and includes a processor configured to perform a projection control on a first region of the first image, which is set as an overlapping region with the second image, and a second region of the second image, which is set as an overlapping region with the first image. The processor is configured to: in a case where a first operation of providing an instruction to change the first region and the second region is received, perform at least one of a control of projecting the first image or a control of projecting the second image; and in a case where a second operation of providing an instruction to confirm the first region is received, finish the control corresponding to the first operation.
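A minimal sketch of the two-operation control flow described above, assuming a hypothetical OverlapController class and projector objects with a project() method; none of these names come from the disclosure.

class OverlapController:
    def __init__(self, projector_a, projector_b):
        self.projector_a = projector_a  # first projection portion
        self.projector_b = projector_b  # second projection portion
        self.adjusting = False          # True while a first operation is in effect

    def on_first_operation(self, new_first_region, new_second_region):
        # First operation: instruction to change the first and second regions.
        self.adjusting = True
        # Perform at least one of the two projection controls (here both,
        # purely for illustration).
        self.projector_a.project(region=new_first_region)
        self.projector_b.project(region=new_second_region)

    def on_second_operation(self):
        # Second operation: instruction to confirm the first region.
        if self.adjusting:
            self.adjusting = False  # finish the control from the first operation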
Abstract:
A shake correction control device includes an acquisition unit and an operation control unit. The acquisition unit acquires, for each predetermined time, a related amount related to an operation recommendation condition under which an operation of a mechanical correction unit, which corrects a shake of a subject image by mechanically moving at least one of a correction optical system or an imaging element, is recommended. The operation control unit controls the operation of the mechanical correction unit and an operation of an electronic correction unit, which corrects the shake by performing image processing on an image obtained by imaging performed by the imaging element. In a case where a state in which the related amount acquired by the acquisition unit satisfies the operation recommendation condition does not continue for a setting period defined by a time series of a plurality of consecutive times, the operation control unit performs a control for operating only the electronic correction unit out of the mechanical correction unit and the electronic correction unit.
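The switching rule can be sketched as follows; the threshold form of the operation recommendation condition, the sample-count form of the setting period, and the unit interfaces are assumptions made only for illustration.

from collections import deque

class ShakeCorrectionController:
    def __init__(self, setting_period_samples, recommendation_threshold):
        # One "related amount" sample is acquired at each predetermined time.
        self.history = deque(maxlen=setting_period_samples)
        self.threshold = recommendation_threshold

    def update(self, related_amount, mechanical_unit, electronic_unit):
        self.history.append(related_amount)
        full = len(self.history) == self.history.maxlen
        # Operation recommendation condition assumed here to be a simple
        # threshold on the related amount.
        if full and all(a >= self.threshold for a in self.history):
            # Condition held over the whole setting period: the mechanical
            # correction unit may operate (together with the electronic one,
            # as an assumption).
            mechanical_unit.enable()
            electronic_unit.enable()
        else:
            # Condition did not continue for the setting period: operate only
            # the electronic correction unit.
            mechanical_unit.disable()
            electronic_unit.enable()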
Abstract:
A derivation unit derives, for each of a plurality of second captured images included in a moving image acquired by an acquisition unit, irradiation position pixel coordinates that specify the position of a pixel corresponding to an irradiation position of directional light in real space with respect to a subject, on the basis of the corresponding distance acquired by the acquisition unit. An execution unit executes a predetermined process, as a process to be executed in a position specifiable state, with respect to each of the plurality of second captured images, in a case of a position specifiable state in which the position of the pixel specified by the irradiation position pixel coordinates derived by the derivation unit is the position of a pixel that is specifiable at positions corresponding to each other in the respective first and second captured images.
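Reduced to pseudocode, the per-frame flow reads roughly as below; every helper passed in is hypothetical and stands in for processing the abstract names only at a high level.

def process_second_images(second_images, first_image, acquire_distance,
                          derive_irradiation_pixel, is_specifiable,
                          predetermined_process):
    for second_image in second_images:
        # Distance acquired for this second captured image of the moving image.
        distance = acquire_distance(second_image)
        # Pixel coordinates of the directional-light irradiation position,
        # derived from that distance.
        pixel_xy = derive_irradiation_pixel(distance, second_image)
        # Position specifiable state: the pixel can be located at corresponding
        # positions in both the first and the second captured image.
        if is_specifiable(pixel_xy, first_image, second_image):
            predetermined_process(second_image, pixel_xy)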
Abstract:
An execution unit selectively executes a first derivation process and a second derivation process. In a case where the position of the pixel specified by the irradiation position pixel coordinates is the position of a pixel that is specifiable at positions corresponding to each other in the respective first and second captured images, the first derivation process derives an imaging position distance on the basis of a plurality of pixel coordinates of three or more pixels present in the same planar region as an irradiation position in each of the first and second captured images, irradiation position real space coordinates, a focal length of an imaging lens, and dimensions of imaging pixels. The second derivation process derives the imaging position distance on the basis of the irradiation position real space coordinates, the irradiation position pixel coordinates, the focal length, and the dimensions of the imaging pixels.
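The selection between the two derivation processes amounts to a single branch, sketched below; both processes are passed in as callables because the underlying mathematics is not given in the abstract.

def derive_imaging_position_distance(specifiable, plane_pixel_coords,
                                     irradiation_real_xyz, irradiation_pixel_xy,
                                     focal_length, pixel_dimensions,
                                     first_derivation, second_derivation):
    if specifiable and len(plane_pixel_coords) >= 3:
        # First derivation process: three or more pixel coordinates in the same
        # planar region as the irradiation position, plus the irradiation
        # position real-space coordinates, focal length, and pixel dimensions.
        return first_derivation(plane_pixel_coords, irradiation_real_xyz,
                                focal_length, pixel_dimensions)
    # Second derivation process: irradiation position real-space and pixel
    # coordinates, focal length, and pixel dimensions.
    return second_derivation(irradiation_real_xyz, irradiation_pixel_xy,
                             focal_length, pixel_dimensions)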
Abstract:
A received light signal of measurement light that is reflected from a subject and is then incident on a distance image sensor is acquired from the distance image sensor, and a distance image is generated on the basis of the acquired received light signal. The position of a leading end of a medical instrument inserted into the subject is acquired. A leading end position image indicating the position of the leading end of the medical instrument in the subject is acquired. A projection image, which is projected onto the subject and corresponds to a surface shape of a corresponding part of the subject that corresponds to the position of the leading end, is generated from the leading end position image on the basis of the shape of the subject detected from the distance image and the position of the leading end. The projection image is projected onto the corresponding part.
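The processing chain can be pictured as the following pipeline; the sensor objects and helper functions are placeholders invented for this sketch, not names from the disclosure.

def project_instrument_tip(distance_sensor, tip_sensor, projector,
                           generate_distance_image, detect_subject_shape,
                           make_tip_image, fit_to_surface):
    # 1. Distance image from the received light signal of the measurement light.
    signal = distance_sensor.read_received_light_signal()
    distance_image = generate_distance_image(signal)
    # 2. Position of the leading end of the medical instrument in the subject,
    #    and the leading end position image.
    tip_position = tip_sensor.read_tip_position()
    tip_image = make_tip_image(tip_position)
    # 3. Projection image matched to the surface shape of the corresponding
    #    part of the subject, based on the detected subject shape and the
    #    tip position.
    subject_shape = detect_subject_shape(distance_image)
    projection_image = fit_to_surface(tip_image, subject_shape, tip_position)
    # 4. Project onto the corresponding part of the subject.
    projector.project(projection_image, target=tip_position)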
Abstract:
A distance measurement device includes a detection unit, an optical path forming unit, a first reduction unit that reduces, based on a detection result of the detection unit, influence of variation of the optical axis of the image formation optical system on a subject image received as light by a light receiving section, a second reduction unit that is disposed in a part different from the common optical path and reduces variation of the optical axis of the directional light based on the detection result of the detection unit, and a control unit that, in the case of operating the first reduction unit and the second reduction unit at the same time, controls the first reduction unit and the second reduction unit to reduce variation of an irradiation position of the directional light in the subject image received as light by the light receiving section.
Abstract:
A distance measurement device includes an emission unit, a detection unit, a first reduction unit that reduces, based on a detection result of the detection unit, influence of variation of an optical axis of the image formation optical system on a subject image received as light by a light receiving section, a second reduction unit that reduces variation of an optical axis of the directional light with respect to the subject based on the detection result of the detection unit, and a control unit that, in the case of operating the first reduction unit and the second reduction unit at the same time, controls the first reduction unit and the second reduction unit to reduce variation of an irradiation position of the directional light in the subject image received as light by the light receiving section.
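For the two preceding abstracts, the simultaneous-operation rule can be sketched as below; the unit interfaces and the way the correction is shared between the two units are assumptions made only to make the sketch concrete.

def control_reduction_units(detection_result, first_unit, second_unit,
                            operate_simultaneously):
    if not operate_simultaneously:
        return  # the non-simultaneous case is not described in the abstracts
    # Both reduction units are driven from the same detection result so that
    # their combined effect reduces variation of the directional-light
    # irradiation position in the subject image received by the light
    # receiving section.
    first_unit.reduce(detection_result)   # image-side (first) reduction unit
    second_unit.reduce(detection_result)  # directional-light-side (second) unit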
Abstract:
It is possible to determine whether or not a specific shape candidate obtained from a captured standard image has a corresponding shape in real space. A shape surrounded by straight lines is detected from the standard image as a candidate having a rectangular shape. A target object image, which would be obtained by imaging the target object represented by the candidate having the rectangular shape at a viewing angle different from the viewing angle of the standard image, is generated. The generated target object image is detected from a reference image obtained by imaging the target object at the different viewing angle. If the target object image is detected, the target object represented by the candidate having the rectangular shape is determined to be a rectangle.
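The determination can be sketched in three steps; detect_quadrilateral, synthesize_view, and find_in_image are hypothetical helpers standing in for the detection, generation, and matching operations named above.

def is_rectangle(standard_image, reference_image,
                 detect_quadrilateral, synthesize_view, find_in_image):
    # Candidate: a shape surrounded by straight lines in the standard image.
    candidate = detect_quadrilateral(standard_image)
    if candidate is None:
        return False
    # Generate the target object image: how the candidate's target object
    # would appear at a viewing angle different from the standard image's.
    target_object_image = synthesize_view(candidate, standard_image)
    # If that generated image is detected in the reference image (captured at
    # the different viewing angle), the target object is determined to be a
    # rectangle.
    return find_in_image(target_object_image, reference_image)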
Abstract:
The present invention provides an image assessment device capable of accurately and promptly assessing an image pair to be used for 3D measurement from a plurality of captured images. An image assessment device according to the invention includes a first captured image selection device, a first captured image information acquisition device, a to-be-measured object distance acquisition device, a to-be-measured object position calculation device, a second captured image selection device, a second captured image information acquisition device, an imaging range calculation device, and an assessment device that determines whether or not a calculated to-be-measured object position is within a calculated imaging range, and assesses that a first captured image and a second captured image are an image pair if it determines that the calculated to-be-measured object position is within the calculated imaging range.
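The final assessment step is essentially a containment test; the axis-aligned box representation of the imaging range below is an assumption chosen only to make the sketch runnable.

def assess_image_pair(object_position, imaging_range):
    # object_position: (x, y, z) of the object to be measured.
    # imaging_range: ((xmin, ymin, zmin), (xmax, ymax, zmax)) covered by the
    # second captured image.
    lo, hi = imaging_range
    inside = all(l <= p <= h for p, l, h in zip(object_position, lo, hi))
    # The first and second captured images are assessed as an image pair for
    # 3D measurement only when the object position is within the imaging range.
    return inside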