Distance measurement device, control method for distance measurement, and control program for distance measurement

- FUJIFILM CORPORATION

A distance measurement device includes an imaging unit, a measurement unit, a change unit that is capable of changing an angle at which directional light is emitted, a deriving unit that derives, within a captured image, an in-image irradiation position corresponding to an irradiation position of the directional light onto the subject which is used in measurement, based on the distance measured by the measurement unit and the angle, and a control unit that, in a case where the in-image irradiation position is out of a default range within the captured image, controls the measurement unit to measure the distance and controls the deriving unit to derive the in-image irradiation position based on the distance measured by the measurement unit and the angle changed by the change unit until the in-image irradiation position falls in the default range.

Description
CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation application of International Application No. PCT/JP2016/063582, filed May 2, 2016, the disclosure of which is incorporated herein by reference in its entirety. Further, this application claims priority from Japanese Patent Application No. 2015-171421 filed Aug. 31, 2015, the disclosure of which is incorporated herein by reference in its entirety.

BACKGROUND OF THE INVENTION

1. Field of the Invention

A technology of the present disclosure relates to a distance measurement device, a control method for distance measurement, and a control program for distance measurement.

2. Description of the Related Art

Initially, in the present specification, distance measurement means that a distance from a distance measurement device to a subject which is a measurement target is measured. In the present specification, a captured image means an image acquired by imaging the subject by an imaging unit that images the subject. In the present specification, irradiation-position pixel coordinates mean two-dimensional coordinates for specifying the position of a pixel, among the pixels included in the captured image, which corresponds to the irradiation position of directional light in a real space by the distance measurement device, on the assumption that distance measurement is performed by using a distance measurement device that measures the distance based on a time during which the directional light (for example, a laser beam) emitted by an emission unit toward the subject supposed to be a distance measurement target travels in a reciprocating motion. In the present specification, an in-image irradiation position means a position acquired as a position within the captured image which corresponds to the irradiation position of the directional light in the real space by the distance measurement device. In other words, the in-image irradiation position means the position of the pixel, among the pixels included in the captured image, which is specified by the irradiation-position pixel coordinates.

In recent years, a distance measurement device provided with an imaging unit has been developed. In such a type of distance measurement device, a subject is irradiated with a laser beam, and the subject is captured in a state in which the subject is irradiated with the laser beam. The captured image acquired by imaging the subject is presented to a user, and thus, an irradiation position of the laser beam is ascertained by the user through the captured image.

In recent years, a distance measurement device having a function of deriving a dimension of a target within an image in a real space as in a measurement device described in JP2014-232095A has been also developed.

The measurement device described in JP2014-232095A includes a unit that displays an isosceles trapezoid shape of a structure having an isosceles trapezoid portion captured by the imaging unit and a unit that specifies four vertices of the displayed isosceles trapezoid shape and acquires coordinates of the four specified vertices. The measurement device described in JP2014-232095A specifies a distance between two points on a plane including the isosceles trapezoid shape or a distance from the imaging unit to one point on the plane, acquires a shape of the structure from the coordinates of the four vertices and a focal length, and acquires a size of the structure from the specified distance.

Incidentally, in a case where a dimension of a target within the captured image acquired by imaging the subject by the imaging unit is derived, a plurality of pixels corresponding, within the captured image, to a real-space region as the deriving target is designated by the user. The dimension of the region in the real space which is designated by the user is derived based on the distance measured by the distance measurement device. Thus, in a case where the dimension of the region in the real space specified by the plurality of designated pixels is to be accurately derived, it is preferable that the in-image irradiation position be derived with high accuracy and that the acquired in-image irradiation position together with the distance be ascertained by the user.
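As a rough illustration of this kind of dimension deriving, the following sketch estimates the real-space length of a segment designated by two pixels from the measured distance, assuming a simple pinhole camera model and a region lying on a plane roughly facing the imaging unit; the function and parameter names (derive_dimension_mm, focal_length_mm, pixel_pitch_mm) are hypothetical and are not taken from JP2014-232095A or the present disclosure.

```python
def derive_dimension_mm(pixel_a, pixel_b, distance_mm,
                        focal_length_mm, pixel_pitch_mm):
    """Approximate real-space length (mm) between two designated pixels,
    assuming both lie on a plane facing the camera at distance_mm."""
    # Pixel separation of the designated region in the captured image.
    du = pixel_b[0] - pixel_a[0]
    dv = pixel_b[1] - pixel_a[1]
    separation_px = (du ** 2 + dv ** 2) ** 0.5
    # Length of the segment as imaged on the sensor.
    sensor_length_mm = separation_px * pixel_pitch_mm
    # Pinhole model: scene length / distance = sensor length / focal length.
    return sensor_length_mm * distance_mm / focal_length_mm

# Example: pixels 400 px apart, subject 20 m away, 50 mm lens, 5 um pitch.
print(derive_dimension_mm((100, 200), (500, 200), 20_000, 50.0, 0.005))
# -> 800.0, i.e. the designated region is about 0.8 m long
```

This also makes the caveat below concrete: if the designated region does not actually lie on such a plane, or if the measured distance belongs to a plane different from that of the designated pixels, the derived dimension can differ greatly from the actual one.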

SUMMARY OF THE INVENTION

However, JP2014-232095A does not describe a unit that derives the in-image irradiation position with high accuracy.

The user designates a region as the dimension deriving target by referring to the in-image irradiation position; however, in a case where the in-image irradiation position and the irradiation position of the laser beam in the real space lie on planes having different orientations and positions, the derived dimension is completely different from the actual dimension.

In a case where a colored laser beam whose irradiation position can be visually perceived within a distance of about several meters from the distance measurement device is used as the laser beam, the in-image irradiation position may be visually specified and designated from the captured image depending on a diameter and/or intensity of the laser beam. However, for example, in a case where a structure separated from a building site by several tens of meters or several hundreds of meters is irradiated with the laser beam in the daytime, it is difficult to visually specify the in-image irradiation position from the captured image. A method of specifying the in-image irradiation position from differences between a plurality of captured images acquired in a sequence of time is also conceivable; however, in a case where the structure separated from the building site by several tens of meters or several hundreds of meters is irradiated with the laser beam, it is still difficult to specify the in-image irradiation position. In a case where the in-image irradiation position cannot be specified, the user performs the distance measurement without recognizing whether or not the subject assumed as the distance measurement target is irradiated with the laser beam.

Embodiments of the present invention have been made in view of such circumstances, and provide a distance measurement device, a control method for distance measurement, and a control program for distance measurement which are capable of performing distance measurement in a state in which an in-image irradiation position is in a default range within a captured image.

A distance measurement device according to a first aspect of the present invention comprises an imaging unit that images a subject, a measurement unit that measures a distance to the subject by emitting directional light which is light having directivity to the subject and receiving reflection light of the directional light, a change unit that is capable of changing an angle at which the directional light is emitted, a deriving unit that derives an in-image irradiation position, which corresponds to an irradiation position of the directional light onto the subject which is used in measurement performed by the measurement unit, within a captured image acquired by imaging the subject by the imaging unit based on the angle and the distance measured by the measurement unit, and a control unit that controls the measurement unit to measure the distance, and controls the deriving unit to derive the in-image irradiation position based on the distance measured by the measurement unit and the angle changed by the change unit, until the in-image irradiation position falls in a default range within the captured image in a case where the in-image irradiation position is out of the default range.

Therefore, according to the distance measurement device according to the first aspect of the present invention, it is possible to perform the distance measurement in a state in which the in-image irradiation position is in the default range within a captured image.
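For illustration only, the interplay of the measurement unit, deriving unit, change unit, and control unit described in the first aspect might be sketched as follows, assuming a pinhole camera model, a laser emission unit mounted at a fixed lateral offset from the imaging optical axis, and a hypothetical interface (measure(), change_angle(), offset_mm, and so on); none of these names or conventions appear in the present disclosure.

```python
import math

def in_image_irradiation_x(distance_mm, beta_rad, offset_mm,
                           focal_length_mm, pixel_pitch_mm, center_x_px):
    """Derive the horizontal pixel coordinate of the in-image irradiation
    position from the measured distance and the emission angle beta_rad."""
    # Lateral position of the laser spot at the subject, in camera coordinates.
    lateral_mm = offset_mm + distance_mm * math.tan(beta_rad)
    # Project the spot onto the sensor and convert to a pixel coordinate.
    return center_x_px + (focal_length_mm * lateral_mm / distance_mm) / pixel_pitch_mm

def adjust_until_in_range(measure, change_angle, beta_rad, default_range,
                          max_steps=1000, **camera):
    """Control-unit loop: measure the distance and re-derive the in-image
    irradiation position until it falls in the default range, changing the
    emission angle between attempts."""
    lo_px, hi_px = default_range
    for _ in range(max_steps):
        distance_mm = measure()  # the measurement unit measures the distance
        x_px = in_image_irradiation_x(distance_mm, beta_rad, **camera)
        if lo_px <= x_px <= hi_px:
            return distance_mm, x_px  # position now inside the default range
        # Step the angle toward the range (assumes x_px grows with beta_rad),
        # anticipating the direction control of the third aspect below.
        target_px = (lo_px + hi_px) / 2
        beta_rad = change_angle(beta_rad, math.copysign(1e-3, target_px - x_px))
    raise RuntimeError("in-image irradiation position did not reach the default range")
```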

According to a second aspect of the present invention, in the distance measurement device according to the first aspect of the present invention, the control unit controls the measurement unit to measure the distance, controls the change unit to change the angle by driving a power source, and controls the deriving unit to derive the in-image irradiation position based on the distance measured by the measurement unit and the angle changed by the change unit, until the in-image irradiation position falls in the default range in a case where the in-image irradiation position is out of the default range.

Therefore, according to the distance measurement device according to the second aspect of the present invention, it is possible to reduce an effort to position the in-image irradiation position within the default range compared to a case where the angle is changed by the change unit without using the power source.

According to a third aspect of the present invention, in the distance measurement device according to the second aspect of the present invention, the control unit controls the power source to generate a power for causing the change unit to change the angle in a direction in which a distance between the in-image irradiation position and the default range decreases based on a positional relation between the latest in-image irradiation position and the default range.

Therefore, according to the distance measurement device according to the third aspect of the present invention, it is possible to position the in-image irradiation position within the default range within the captured image with high accuracy compared to a case where the power for causing the change unit to change the angle is not generated by the power source regardless of the positional relation between the latest in-image irradiation position and the default range.

According to a fourth aspect of the present invention, in the distance measurement device according to the first aspect of the present invention, the measurement unit includes an emission unit that emits the directional light, and the change unit includes a rotation mechanism that changes the angle by manually rotating at least the emission unit of the measurement unit.

Therefore, according to the distance measurement device according to the fourth aspect of the present invention, it is possible to easily reflect an intention of the user on the change of the angle at which the directional light is emitted compared to a case where the rotation mechanism for manually rotating the emission unit is not provided.

According to a fifth aspect of the present invention, in the distance measurement device according to any one of the first to fourth aspects of the present invention, the control unit performs the control for a period during which a plurality of captured images acquired by continuously imaging the subject by the imaging unit in a sequence of time is continuously displayed on a first display unit.

Therefore, according to the distance measurement device according to the fifth aspect of the present invention, it is possible to perform the distance measurement in a state in which the in-image irradiation position is in the default range within a captured image while referring to the state of the subject.

According to a sixth aspect of the present invention, the distance measurement device according to any one of the first to fourth aspects of the present invention further comprises: a performing unit that performs at least one of focus adjustment or exposure adjustment on the subject, and a reception unit that receives an imaging preparation instruction to cause the performing unit to start to perform at least one of the focus adjustment or the exposure adjustment before actual exposing is performed by the imaging unit. The control unit performs the control in a case where the imaging preparation instruction is received by the reception unit.

Therefore, according to the distance measurement device according to the sixth aspect of the present invention, it is possible to prevent the in-image irradiation position from entering a state in which the in-image irradiation position is not in the default range at the time of the actual exposing compared to a case where the control unit does not perform the control in a case where the imaging preparation instruction is received by the reception unit.

According to a seventh aspect of the present invention, in the distance measurement device according to any one of the first to fourth aspects of the present invention, the control unit controls the measurement unit to intermittently measure the distance, and the control unit performs the control in a case where a dissimilarity between a distance used in the deriving of the in-image irradiation position performed in a previous stage by the deriving unit and a latest distance measured by the measurement unit is equal to or greater than a threshold value.

Therefore, according to the distance measurement device according to the seventh aspect of the present invention, it is possible to easily maintain the state in which the in-image irradiation position is in the default range within the captured image compared to a case where the control unit does not perform the control in a case where the dissimilarity is equal to or greater than the threshold value.
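Under the same illustrative assumptions, the seventh aspect could be sketched as a guard that re-runs the adjustment only when the intermittently measured distance has drifted sufficiently; the names below are hypothetical.

```python
def maybe_readjust(latest_distance_mm, reference_distance_mm,
                   threshold_mm, readjust):
    """Re-run the adjustment control only when the dissimilarity between the
    latest distance and the distance used in the previous deriving is equal
    to or greater than the threshold value."""
    if abs(latest_distance_mm - reference_distance_mm) >= threshold_mm:
        readjust()                 # e.g., the control loop sketched earlier
        return latest_distance_mm  # the latest distance becomes the reference
    return reference_distance_mm
```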

According to an eighth aspect of the present invention, in the distance measurement device according to any one of the first to seventh aspects of the present invention, the control unit controls a second display unit to display the captured image, and further controls such that the latest in-image irradiation position derived by the deriving unit is displayed so as to be specified in a display region of the captured image.

Therefore, according to the distance measurement device according to the eighth aspect of the present invention, the user can easily ascertain the latest in-image irradiation position compared to a case where the latest in-image irradiation position is not displayed so as to be specified in the display region of the captured image.

According to a ninth aspect of the present invention, in the distance measurement device according to any one of the first to eighth aspects of the present invention, the control unit controls a third display unit to display the captured image, and further controls such that the default range is displayed so as to be specified in a display region of the captured image.

Therefore, according to the distance measurement device according to the ninth aspect of the present invention, the user can easily ascertain the position of the default range in the display region of the captured image compared to a case where the default range is not displayed so as to be specified in the display region of the captured image.

According to a tenth aspect of the present invention, in the distance measurement device according to any one of the first to ninth aspects of the present invention, the control unit controls a first notification unit to notify that the in-image irradiation position is within the default range in a case where the in-image irradiation position is within the default range.

Therefore, according to the distance measurement device according to the tenth aspect of the present invention, the user can easily recognize that the in-image irradiation position is in the default range compared to a case where the notification indicating that the in-image irradiation position is in the default range is not performed in a case where the in-image irradiation position is in the default range.

According to an eleventh aspect of the present invention, in the distance measurement device according to any one of the first to tenth aspects of the present invention, the control unit controls a second notification unit to notify that the in-image irradiation position is out of the default range in a case where the in-image irradiation position is out of the default range.

Therefore, according to the distance measurement device according to the eleventh aspect of the present invention, the user can easily recognize that the in-image irradiation position is out of the default range compared to a case where the notification indicating that the in-image irradiation position is out of the default range is not performed in a case where the in-image irradiation position is out of the default range.

A control method for distance measurement according to a twelfth aspect of the present invention comprises deriving an in-image irradiation position, which corresponds to an irradiation position, onto a subject, of directional light which is light having directivity and which is used in measurement performed by a measurement unit that measures a distance to the subject by emitting the directional light to the subject and receiving reflection light of the directional light, within a captured image acquired by imaging the subject by an imaging unit that images the subject, based on the distance measured by the measurement unit and an angle changed by a change unit that is capable of changing the angle at which the directional light is emitted, the imaging unit, the measurement unit, and the change unit being included in a distance measurement device, and controlling the measurement unit to measure the distance and controlling the deriving to be performed based on the distance measured by the measurement unit and the angle changed by the change unit until the in-image irradiation position falls in a default range within the captured image in a case where the in-image irradiation position is out of the default range.

Therefore, according to the control method for distance measurement according to the twelfth aspect of the present invention, it is possible to perform the distance measurement in a state in which the in-image irradiation position is in the default range within a captured image.

A control program for distance measurement according to a thirteenth aspect of the present invention causes a computer to execute a process comprising deriving an in-image irradiation position, which corresponds to an irradiation position, onto a subject, of directional light which is light having directivity and which is used in measurement performed by a measurement unit that measures a distance to the subject by emitting the directional light to the subject and receiving reflection light of the directional light, within a captured image acquired by imaging the subject by an imaging unit that images the subject, based on the distance measured by the measurement unit and an angle changed by a change unit that is capable of changing the angle at which the directional light is emitted, the imaging unit, the measurement unit, and the change unit being included in a distance measurement device, and controlling the measurement unit to measure the distance and controlling the deriving to be performed based on the distance measured by the measurement unit and the angle changed by the change unit until the in-image irradiation position falls in a default range within the captured image in a case where the in-image irradiation position is out of the default range.

Therefore, according to the control program for distance measurement according to the thirteenth aspect of the present invention, it is possible to perform the distance measurement in a state in which the in-image irradiation position is in the default range within a captured image.

According to the embodiment of the present invention, an effect that distance measurement can be performed in a state in which an in-image irradiation position is in a default range within a captured image is obtained.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a front view showing an example of an external appearance of a distance measurement device according to first to fourth embodiments.

FIG. 2 is a conceptual diagram (schematic side view) showing an example of a schematic configuration of a longitudinal rotation mechanism provided in the distance measurement device according to the first to fifth embodiments.

FIG. 3 is a conceptual diagram (schematic front view) showing an example of a schematic configuration of a horizontal rotation mechanism provided in the distance measurement device according to the first to fifth embodiments.

FIG. 4 is a block diagram showing an example of a hardware configuration of main parts of the distance measurement device according to the first to third embodiments.

FIG. 5 is a time chart showing an example of a measurement sequence using the distance measurement device according to the first to fifth embodiments.

FIG. 6 is a time chart showing an example of a laser trigger, a light-emitting signal, a light-receiving signal, and a count signal required in a case where measurement using the distance measurement device according to the first to fifth embodiments is performed once.

FIG. 7 is a graph showing an example of a histogram (a histogram in a case where a lateral axis represents a distance (measurement value) to the subject and a longitudinal axis represents the number of times the measurement is performed) of measurement values acquired in the measurement sequence using the distance measurement device according to the first to fifth embodiments.

FIG. 8 is a block diagram showing an example of a hardware configuration of a main control unit included in the distance measurement device according to the first to fourth embodiments.

FIG. 9 is an explanatory diagram for describing a method of measuring a dimension (length) of a designated region.

FIG. 10 is a functional block diagram showing an example of functions of main parts realized by a CPU of the main control unit included in the distance measurement device according to the first to fourth embodiments.

FIG. 11 is a flowchart showing an example of a flow of a distance measurement process according to the first to fourth embodiments.

FIG. 12 is a flowchart subsequent to the flowchart shown in FIG. 11.

FIG. 13 is a flowchart subsequent to the flowchart shown in FIG. 11.

FIG. 14 is a conceptual diagram showing an example of a correspondence table according to the first to third embodiments.

FIG. 15 is an explanatory diagram for describing a parameter that influences an in-image irradiation position.

FIG. 16 is a screen diagram showing an example of a first intention check screen according to the first to fifth embodiments.

FIG. 17 is a screen diagram showing an example of a provisional measurement and provisional imaging guide screen according to the first to fifth embodiments.

FIG. 18 is a screen diagram showing an example of a re-performing guide screen according to the first to fifth embodiments.

FIG. 19 is a screen diagram showing an example of a second intention check screen according to the first to fifth embodiments.

FIG. 20 is a screen diagram showing an example of a screen in a state in which an actual image, a distance, and an irradiation position mark are displayed on a display unit according to the first to fifth embodiments.

FIG. 21 is a conceptual diagram showing an example in which a distance is in a correspondence information distance range, is out of a first correspondence information distance range, and is out of a second correspondence information distance range according to the first to fifth embodiments.

FIG. 22 is a screen diagram showing an example of a screen in a state in which an actual image, a distance, an irradiation position mark, and a warning and recommendation message are displayed on the display unit according to the first to fifth embodiments.

FIG. 23 is a flowchart showing an example of a flow of an irradiation position adjustment process according to the first embodiment.

FIG. 24 is a screen diagram showing an example of a live view image and a frame displayed on the display unit by performing the irradiation position adjustment process.

FIG. 25 is a screen diagram showing an example of a live view image, a frame, an irradiation position mark, and a message corresponding to out-of-default-range information displayed on the display unit by performing the irradiation position adjustment process.

FIG. 26 is a screen diagram showing an example of a live view image, a frame, an irradiation position mark, and a message corresponding to in-default-range information displayed on the display unit by performing the irradiation position adjustment process.

FIG. 27 is a flowchart showing an example of a flow of an irradiation position adjustment process according to the second embodiment.

FIG. 28 is a flowchart showing an example of a flow of an irradiation position adjustment process according to the third embodiment.

FIG. 29 is a flowchart showing an example of a flow of an irradiation position adjustment process according to the fourth embodiment.

FIG. 30 is a block diagram showing an example of a hardware configuration of main parts of the distance measurement device according to the fourth embodiment.

FIG. 31 is a block diagram showing an example of a hardware configuration of main parts of the distance measurement device according to the fifth embodiment.

FIG. 32 is a screen diagram showing an example of a screen including an actual measurement and actual imaging button, a provisional measurement and provisional imaging button, an imaging system operation mode switching button, a wide angle instruction button, a telephoto instruction button, and an irradiation position adjustment button displayed as soft keys on a display unit of a smart device according to the fifth embodiment.

FIG. 33 is a conceptual diagram showing an example of an aspect in which a distance measurement program and an irradiation position adjustment program are installed in the distance measurement device from a storage medium that stores a distance measurement program and an irradiation position adjustment program according to the first to fifth embodiments.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, an example of an embodiment related to a technology of the present disclosure will be described with reference to the accompanying drawings. In the present embodiment, a distance between a distance measurement device and a subject as a measurement target is simply referred to as a distance for the sake of convenience in description. In the present embodiment, an angle of view (an angle of view on a subject image indicating the subject) on the subject is simply referred to as an "angle of view".

First Embodiment

For example, a distance measurement device 10A according to the first embodiment includes a distance measurement unit 12 and an imaging device 14 as shown in FIG. 1. In the present embodiment, the distance measurement unit 12 and a distance measurement control unit 68 (see FIG. 4) are an example of a measurement unit according to the technology of the present disclosure, and the imaging device 14 is an example of an imaging unit according to the technology of the present disclosure.

The imaging device 14 includes a lens unit 16 and an imaging device main body 18, and the lens unit 16 is detachably attached to the imaging device main body 18.

A hot shoe 20 is provided on a top surface of the imaging device main body 18, and the distance measurement unit 12 is detachably attached to the hot shoe 20.

The distance measurement device 10A has a distance measurement system function of measuring a distance by causing the distance measurement unit 12 to emit a laser beam for distance measurement, and an imaging system function of causing the imaging device 14 to acquire a captured image by imaging the subject. Hereinafter, the captured image acquired by imaging the subject by using the imaging device 14 by utilizing the imaging system function is simply referred to as an "image" or a "captured image" for the sake of convenience in description.

The distance measurement device 10A performs one measurement sequence (see FIG. 5) according to one instruction by utilizing the distance measurement system function, and ultimately outputs one distance by performing the one measurement sequence. In the present embodiment, actual measurement and provisional measurement are selectively performed by utilizing the distance measurement system function according to an instruction of a user in a distance measurement process to be described below (see FIGS. 11 to 13). The actual measurement means measurement in which a distance measured by utilizing the distance measurement system function is actually used, and the provisional measurement means measurement performed in a preparation stage of increasing the accuracy of the actual measurement.

The distance measurement device 10A has, as an operation mode of the imaging system function, a still image imaging mode and a video imaging mode. The still image imaging mode is an operation mode for imaging a still image, and the video imaging mode is an operation mode of imaging a motion picture. The still image imaging mode and the video imaging mode are selectively set according to an instruction of the user.

In the present embodiment, the actual imaging and the provisional imaging are selectively performed by utilizing the imaging system function according to an instruction of the user in the distance measurement process to be described below (see FIGS. 11 to 13). The actual imaging is imaging performed in synchronization with the actual measurement, and the provisional imaging is imaging performed in synchronization with the provisional measurement. Hereinafter, for the sake of convenience in description, an image acquired through the actual imaging is referred to as an “actual captured image”, and an image acquired through the provisional imaging is referred to as a “provisional captured image”. In a case where it is not necessary to distinguish between the “actual captured image” and the “provisional captured image”, the actual captured image and the provisional captured image are referred to as an “image” or a “captured image”. Hereinafter, for the sake of convenience in description, the “actual captured image” is also referred to as an “actual image”, and the “provisional captured image” is also referred to as a “provisional image”.

For example, the imaging device main body 18 includes a longitudinal rotation mechanism 13 as shown in FIG. 2. The longitudinal rotation mechanism 13 receives a power generated by a motor 17 (see FIG. 4) to be described below, and rotates the hot shoe 20 in a front-view longitudinal direction with a front end portion of the hot shoe 20 as a rotational axis. Accordingly, the hot shoe 20 to which the distance measurement unit 12 is attached is rotated by the longitudinal rotation mechanism 13 in the front-view longitudinal direction, and thus, an orientation of the distance measurement unit 12 is changed in the front-view longitudinal direction (for example, an A direction represented in FIG. 2). For the sake of convenience in description, although it has been described in the example shown in FIG. 2 that the hot shoe 20 is rotated in the front-view longitudinal direction such that a rear end portion of the hot shoe 20 is buried within the imaging device main body 18, the technology of the present disclosure is not limited thereto. For example, the hot shoe 20 may be rotated in the front-view longitudinal direction such that the rear end portion of the hot shoe 20 is pushed up from the imaging device main body 18. Hereinafter, for the sake of convenience in description, the front-view longitudinal direction is simply referred to as a "longitudinal direction".

For example, the imaging device main body 18 includes a horizontal rotation mechanism 15, as shown in FIG. 3. The horizontal rotation mechanism 15 receives a power generated by a motor 19 (see FIG. 4) to be described below, and rotates the hot shoe 20 in a front-view horizontal direction with a central point of the hot shoe 20 in plan view as a rotational axis. Accordingly, the hot shoe 20 to which the distance measurement unit 12 is attached is rotated by the horizontal rotation mechanism 15 in the front-view horizontal direction, and thus, an orientation of the distance measurement unit 12 is changed in the front-view horizontal direction (for example, a B direction represented in FIG. 3). Hereinafter, for the sake of convenience in description, the front-view horizontal direction is simply referred to as a "horizontal direction".

Hereinafter, in a case where it is not necessary to distinguish between the longitudinal rotation mechanism 13 and the horizontal rotation mechanism 15, they are referred to as a "rotation mechanism" without reference numerals for the sake of convenience in description.

For example, the distance measurement unit 12 includes an emission unit 22, a light receiving unit 24, and a connector 26, as shown in FIG. 4.

The connector 26 is able to be connected to the hot shoe 20, and the distance measurement unit 12 is operated under the control of the imaging device main body 18 in a state in which the connector 26 is connected to the hot shoe 20.

The emission unit 22 includes a laser diode (LD) 30, a condenser lens (not shown), an object lens 32, and an LD driver 34.

The condenser lens and the object lens 32 are provided along an optical axis of a laser beam emitted by the LD 30, and the condenser lens and the object lens 32 are arranged in order along the optical axis from the LD 30.

The LD 30 emits a laser beam for distance measurement which is an example of directional light according to the technology of the present disclosure. The laser beam emitted by the LD 30 is a colored laser beam. For example, as long as the subject is separated from the emission unit 22 in a range of about several meters, an irradiation position of the laser beam is visually recognized in a real space, and is visually recognized from the captured image acquired by the imaging device 14.

The condenser lens concentrates the laser beam emitted by the LD 30, and causes the concentrated laser beam to pass. The object lens 32 faces the subject, and emits the laser beam that passes through the condenser lens to the subject.

The LD driver 34 is connected to the connector 26 and the LD 30, and drives the LD 30 in order to emit the laser beam according to an instruction of the imaging device main body 18.

The light receiving unit 24 includes a photodiode (PD) 36, an object lens 38, and a light-receiving signal processing circuit 40. The object lens 38 is disposed on a light receiving surface side of the PD 36. After the laser beam emitted by the emission unit 22 reaches the subject, a reflection laser beam, which is the laser beam reflected from the subject, is incident on the object lens 38. The object lens 38 allows the reflection laser beam to pass, and guides the reflection laser beam to the light receiving surface of the PD 36. The PD 36 receives the reflection laser beam that passes through the object lens 38, and outputs an analog signal corresponding to the light reception amount as a light-receiving signal.

The light-receiving signal processing circuit 40 is connected to the connector 26 and the PD 36, amplifies the light-receiving signal input from the PD 36 by an amplifier (not shown), and performs analog-to-digital (A/D) conversion on the amplified light-receiving signal. The light-receiving signal processing circuit 40 outputs the light-receiving signal digitized through the A/D conversion to the imaging device main body 18.

The imaging device 14 includes mounts 42 and 44. The mount 42 is provided at the imaging device main body 18, and the mount 44 is provided at the lens unit 16. The lens unit 16 is attached to the imaging device main body 18 so as to be replaceable by coupling the mount 42 to the mount 44.

The lens unit 16 includes an imaging lens 50, a zoom lens 52, a zoom lens moving mechanism 54, and a motor 56.

Subject light which is reflected from the subject is incident on the imaging lens 50. The imaging lens 50 allows the subject light to pass, and guides the subject light to the zoom lens 52.

The zoom lens 52 is attached to the zoom lens moving mechanism 54 so as to slide along the optical axis. The motor 56 is connected to the zoom lens moving mechanism 54. The zoom lens moving mechanism 54 receives the power of the motor 56, and causes the zoom lens 52 to slide along the optical axis direction.

The motor 56 is connected to the imaging device main body 18 through the mounts 42 and 44, and the driving of the motor 56 is controlled according to a command from the imaging device main body 18. In the present embodiment, a stepping motor is used as an example of the motor 56. Accordingly, the motor 56 is operated in synchronization with pulsed power according to a command from the imaging device main body 18.

The imaging device main body 18 includes an imaging element 60, a main control unit 62, an image memory 64, an image processing unit 66, a distance measurement control unit 68, motors 17 and 19, motor drivers 21, 23, and 72, an imaging element driver 74, an image signal processing circuit 76, and a display control unit 78. The imaging device main body 18 includes a touch panel interface (I/F) 79, a reception I/F 80, and a media I/F 82. The longitudinal rotation mechanism 13, the horizontal rotation mechanism 15, the motors 17 and 19, and the motor drivers 21 and 23 are examples of a change unit according to the technology of the present disclosure. For example, the change unit according to the technology of the present disclosure means a mechanism capable of changing an emission angle β to be described below.

The main control unit 62, the image memory 64, the image processing unit 66, the distance measurement control unit 68, the motor drivers 21, 23, and 72, the imaging element driver 74, the image signal processing circuit 76, and the display control unit 78 are connected to a busline 84. The touch panel I/F 79, the reception I/F 80, and the media I/F 82 are also connected to the busline 84.

The imaging element 60 is a complementary metal oxide semiconductor (CMOS) type image sensor, and includes a color filter (not shown). The color filter includes a G filter corresponding to green (G), an R filter corresponding to red (R), and a B filter corresponding to blue (B) which contribute to the acquisition of a brightness signal. The imaging element 60 includes a plurality of pixels (not shown) arranged in a matrix shape, and any filter of the R filter, the G filter, and the B filter included in the color filter is allocated to each pixel.

The subject light that passes through the zoom lens 52 is formed on an imaging surface which is the light receiving surface of the imaging element 60, and electric charges corresponding to the light reception amount of the subject light are accumulated in the pixels of the imaging element 60. The imaging element 60 outputs the charges accumulated in the pixels, as image signals indicating an image corresponding to a subject image acquired by forming the subject light on the imaging surface.

For example, the motor 17 is connected to the longitudinal rotation mechanism 13, and the longitudinal rotation mechanism 13 receives the power of the motor 17 and rotates the hot shoe 20 in the longitudinal direction. For example, the distance measurement unit 12 is rotated in the direction of an arrow A, as shown in FIG. 2. The motor 19 is connected to the horizontal rotation mechanism 15, and the horizontal rotation mechanism 15 receives the power of the motor 19 and rotates the hot shoe 20 in the horizontal direction. For example, the distance measurement unit 12 is rotated in the direction of an arrow B, as shown in FIG. 3.

The main control unit 62 controls the entire distance measurement device 10A through the busline 84.

The motor driver 21 controls the motor 17 according to an instruction of the main control unit 62. The motor driver 23 controls the motor 19 according to an instruction of the main control unit 62. The motors 17 and 19 are examples of a power source according to the technology of the present disclosure.

The motor driver 72 is connected to the motor 56 through the mounts 42 and 44, and controls the motor 56 according to an instruction of the main control unit 62.

In the present embodiment, a stepping motor is used as an example of each of the motors 17, 19, and 56. Accordingly, the motors 17, 19, and 56 are operated in synchronization with pulsed power according to commands from the main control unit 62.
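Because a stepping motor advances by a fixed angle per drive pulse, an angle change commanded by the main control unit 62 amounts to a pulse count, as in the following hedged sketch; the step angle (which would depend on the motor and any gearing) is an illustrative assumption.

```python
def pulses_for_angle(delta_angle_deg, step_angle_deg=0.0072):
    """Number of drive pulses for a desired rotation, rounded to the nearest
    whole step; the residual error is at most half a step."""
    return round(delta_angle_deg / step_angle_deg)

# Example: rotating the hot shoe 0.5 degrees at a hypothetical
# 0.0072 deg/pulse takes about 69 pulses.
print(pulses_for_angle(0.5))  # -> 69
```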

The imaging device 14 has an angle-of-view changing function. The angle-of-view changing function is a function of changing the angle of view on the subject by moving the zoom lens 52. In the present embodiment, the angle-of-view changing function is realized by the zoom lens 52, the zoom lens moving mechanism 54, the motor 56, the motor driver 72, and the main control unit 62. Although it has been described in the present embodiment that the optical angle-of-view changing function using the zoom lens 52 is employed, the technology of the present disclosure is not limited thereto, and an electronic angle-of-view changing function that does not use the zoom lens 52 may be employed.

The imaging element driver 74 is connected to the imaging element 60, and supplies drive pulses to the imaging element 60 under the control of the main control unit 62. The pixels of the imaging element 60 are driven according to the drive pulses supplied by the imaging element driver 74.

The image signal processing circuit 76 is connected to the imaging element 60, and reads image signals corresponding to one frame for every pixel out of the imaging element 60 under the control of the main control unit 62. The image signal processing circuit 76 performs various processing tasks such as correlative double sampling processing, automatic gain adjustment, and A/D conversion on the readout image signals. The image signal processing circuit 76 outputs the image signals digitized by performing the various processing tasks on the image signals to the image memory 64 for every frame at a specific frame rate (for example, tens of frames per second) prescribed by a clock signal supplied from the main control unit 62. The image memory 64 provisionally retains the image signals input from the image signal processing circuit 76.

The imaging device main body 18 includes a display unit 86, a touch panel 88, a reception device 90, and a memory card 92.

An alarm unit and the display unit 86, which is an example of a first display unit, a second display unit, a third display unit, a first notification unit, and a second notification unit according to the technology of the present disclosure, are connected to the display control unit 78, and display various information items under the control of the display control unit 78. The display unit 86 is realized by, for example, a liquid crystal display (LCD).

The touch panel 88 is layered on a display screen of the display unit 86, and senses touch using a pointer such as a finger of the user and/or a touch pen. The touch panel 88 is connected to the touch panel I/F 79, and outputs positional information indicating a position touched by the pointer to the touch panel I/F 79. The touch panel I/F 79 activates the touch panel 88 according to an instruction of the main control unit 62, and outputs the positional information input from the touch panel 88 to the main control unit 62.

The reception device 90 includes an actual measurement and actual imaging button 90A, a provisional measurement and provisional imaging button 90B, an imaging system operation mode switching button 90C, a wide angle instruction button 90D, a telephoto instruction button 90E, and an irradiation position adjustment button 90F, and receives various instructions from the user. The reception device 90 is connected to the reception I/F 80, and the reception I/F 80 outputs an instruction content signal indicating the content of the instruction received by the reception device 90 to the main control unit 62.

The actual measurement and actual imaging button 90A is a pressing type button that receives an instruction to start the actual measurement and the actual imaging. The provisional measurement and provisional imaging button 90B is a pressing type button that receives an instruction to start the provisional measurement and the provisional imaging. The imaging system operation mode switching button 90C is a pressing type button that receives an instruction to switch between the still image imaging mode and the video imaging mode.

The wide angle instruction button 90D is a pressing type button that receives an instruction to change the angle of view to a wide angle, and a degree of the angle of view changed to the wide angle is determined in an allowable range depending on a pressing time during which the wide angle instruction button 90D is continuously pressed.

The telephoto instruction button 90E is a pressing type button that receives an instruction to change the angle of view to a telephoto angle, and a degree of the angle of view changed to the telephoto angle is determined in an allowable range depending on a pressing time during which the telephoto instruction button 90E is continuously pressed.

The irradiation position adjustment button 90F is a pressing type button that receives an instruction to adjust an in-image irradiation position. In a case where the irradiation position adjustment button 90F is pressed, an irradiation position adjustment process (see FIG. 23) to be described below is started to be performed.

Hereinafter, the actual measurement and actual imaging button and the provisional measurement and provisional imaging button are referred to as a “release button” for the sake of convenience in description in a case where it is not necessary to distinguish between the actual measurement and actual imaging button 90A and the provisional measurement and provisional imaging button 90B. Hereinafter, the wide angle instruction button and the telephoto instruction button are referred to as an “angle-of-view instruction button” for the sake of convenience in description in a case where it is not necessary to distinguish between the wide angle instruction button 90D and the telephoto instruction button 90E.

In the distance measurement device 10A according to the first embodiment, a manual focus mode and an auto focus mode are selectively set according to an instruction of the user through the reception device 90 in the still image imaging mode.

In the auto focus mode, the release button which is an example of a reception unit according to the technology of the present disclosure receives two-step pressing operations including an imaging preparation instruction state and an imaging instruction state. For example, the imaging preparation instruction state refers to a state in which the release button is pressed down from a waiting position to an intermediate position (half pressed position), and the imaging instruction state refers to a state in which the release button is pressed down to a finally pressed-down position (fully pressed position) beyond the intermediate position.

Hereinafter, for the sake of convenience in description, a “state in which the release button is pressed down from the waiting position to the half pressed position” is referred to as a “half pressed state”, and a “state in which the release button is pressed down from the waiting position to the fully pressed position” is referred to as a “fully pressed state”.

In the auto focus mode, after an imaging condition is adjusted by setting the release button to be in the half pressed state, actual exposing is subsequently performed by setting the release button to be in the fully pressed state. That is, in a case where the release button is set to be in the half pressed state before the actual exposing is performed, an automatic exposure (AE) function works, and thus, the exposure is adjusted. Thereafter, the focus is adjusted by performing an auto-focus (AF) function, and the actual exposing is performed in a case where the release button is set to be in the fully pressed state.

In this example, the actual exposing refers to exposing performed in order to acquire a still image file to be described below. In the present embodiment, the exposing means, in addition to the actual exposing, exposing performed in order to acquire a live view image to be described below and exposing performed in order to acquire a motion picture file to be described below. Hereinafter, for the sake of convenience in description, these are simply referred to as "exposing" in a case where it is not necessary to distinguish between these exposing tasks.

In the present embodiment, the main control unit 62, which is an example of a performing unit according to the technology of the present disclosure, performs the exposure adjustment using the AE function and the focus adjustment using the AF function. Although it has been described in the present embodiment that the exposure adjustment and the focus adjustment are performed by the main control unit 62, the technology of the present disclosure is not limited thereto, and only one of the exposure adjustment or the focus adjustment may be performed by the main control unit 62.

The image processing unit 66 acquires image signals for every frame from the image memory 64 at a specific frame rate, and performs various processing tasks such as gamma correction, luminance and color difference conversion, and compression processing on the acquired image signals.

The image processing unit 66 outputs the image signals acquired by performing various processing tasks to the display control unit 78 for every frame at a specific frame rate. The image processing unit 66 outputs the image signals acquired by performing various processing tasks to the main control unit 62 according to a request of the main control unit 62.

The display control unit 78 outputs the image signals input from the image processing unit 66 to the display unit 86 for every frame at a specific frame rate under the control of the main control unit 62.

The display unit 86 displays images and character information. The display unit 86 displays, as a live view image, the images indicated by the image signals input from the display control unit 78 at a specific frame rate. The live view image is continuous frame images acquired by performing imaging in continuous frames, that is, a plurality of images acquired by continuous imaging performed by the imaging device 14 in a sequence of time; the live view image is also referred to as a live preview image. The display unit 86 also displays a still image which is a single frame image acquired by performing imaging in a single frame. The display unit 86 also displays a playback image and/or a menu screen in addition to the live view image.

Although the image processing unit 66 and the display control unit 78 are realized by an application specific integrated circuit (ASIC) in the present embodiment, the technology of the present disclosure is not limited thereto. For example, the image processing unit 66 and the display control unit 78 may be realized by a field-programmable gate array (FPGA). The image processing unit 66 may be realized by a computer including a central processing unit (CPU), a read only memory (ROM), and a random access memory (RAM). The display control unit 78 may also be realized by a computer including a CPU, a ROM, and a RAM. The image processing unit 66 and the display control unit 78 may be realized by a combination of a hardware configuration and a software configuration.

In a case where an instruction to image the still image is received by the release button in the still image imaging mode, the main control unit 62 causes the imaging element 60 to perform exposing corresponding to one frame by controlling the imaging element driver 74. The main control unit 62 acquires the image signals acquired by exposing one frame from the image processing unit 66, and generates a still image file having a specific still image format by performing a compression process on the acquired image signals. For example, the specific still image format refers to the Joint Photographic Experts Group (JPEG) format.

In a case where an instruction to image the motion picture is received by the release button in the video imaging mode, the main control unit 62 acquires, from the image processing unit 66 for every frame at a specific frame rate, the image signals output to the display control unit 78 for use as the live view image. The main control unit 62 generates a motion picture file having a specific motion picture format by performing the compression process on the image signals acquired from the image processing unit 66. For example, the specific motion picture format refers to the Moving Picture Experts Group (MPEG) format. Hereinafter, the still image file and the motion picture file are referred to as the image file for the sake of convenience in description in a case where it is not necessary to distinguish between the still image file and the motion picture file.

The media I/F 82 is connected to the memory card 92, and records and reads the image file in and out of the memory card 92 under the control of the main control unit 62. The main control unit 62 performs a decompression process on the image file read out of the memory card 92 by the media I/F 82, and displays the decompressed image file as a playback image on the display unit 86.

The main control unit 62 stores distance measurement information including at least one of distance information input from the distance measurement control unit 68 or dimension information indicating a dimension derived by utilizing a dimension deriving function to be described below in association with the image file in the memory card 92 through the media I/F 82. The distance measurement information together with the image file is read out of the memory card 92 by the main control unit 62 through the media I/F 82. In a case where the distance information is included in the distance measurement information read out by the main control unit 62, the distance indicated by the distance information together with the playback image which is the associated image file is displayed on the display unit 86. In a case where the dimension information is included in the distance measurement information read out by the main control unit 62, the dimension indicated by the dimension information together with the playback image which is the associated image file is displayed on the display unit 86.

The distance measurement control unit 68 controls the distance measurement unit 12 under the control of the main control unit 62. In the present embodiment, the distance measurement control unit 68 is realized by an ASIC, but the technology of the present disclosure is not limited thereto. For example, the distance measurement control unit 68 may be realized by an FPGA. The distance measurement control unit 68 may be realized by a computer including a CPU, a ROM, and a RAM. The distance measurement control unit 68 may be realized by a combination of a hardware configuration and a software configuration.

The hot shoe 20 is connected to the busline 84. Under the control of the main control unit 62, the distance measurement control unit 68 controls the emission of the laser beam by the LD 30 by controlling the LD driver 34, and acquires the light-receiving signal from the light-receiving signal processing circuit 40. The distance measurement control unit 68 derives the distance to the subject based on the timing when the laser beam is emitted and the timing when the light-receiving signal is acquired, and outputs distance information indicating the derived distance to the main control unit 62.
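The underlying time-of-flight relation is that the laser beam travels to the subject and back, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch, with illustrative timestamps:

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_m(emit_time_s, receive_time_s):
    """Distance to the subject from one emission/reception timing pair."""
    round_trip_s = receive_time_s - emit_time_s
    return round_trip_s * SPEED_OF_LIGHT_M_PER_S / 2

# Example: a 2 microsecond round trip corresponds to about 300 m.
print(distance_m(0.0, 2e-6))  # -> 299.792458
```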

The measurement of the distance to the subject using the distance measurement control unit 68 will be described in more detail.

For example, one measurement sequence using the distance measurement device 10A is prescribed by a voltage adjustment period, an actual measurement period, and a suspension period, as shown in FIG. 5.

The voltage adjustment period is a period during which driving voltages of the LD 30 and the PD 36 are adjusted. The actual measurement period is a period during which the distance to the subject is actually measured. For the actual measurement period, an operation of causing the LD 30 to emit the laser beam and causing the PD 36 to receive the reflection laser beam is repeated several hundreds of times, and the distance to the subject is derived based on the timing when the laser beam is emitted and the timing when the light-receiving signal is acquired. The suspension period is a period during which the driving of the LD 30 and the PD 36 is suspended. Thus, in one measurement sequence, the measurement of the distance to the subject is performed several hundreds of times.

In the present embodiment, each of the voltage adjustment period, the actual measurement period, and the suspension period is hundreds of milliseconds.

For example, as shown in FIG. 6, count signals that prescribe a timing when the distance measurement control unit 68 outputs an instruction to emit the laser beam and a timing when the distance measurement control unit 68 acquires the light-receiving signal are supplied to the distance measurement control unit 68. In the present embodiment, the count signals are generated by the main control unit 62 and are supplied to the distance measurement control unit 68, but the present embodiment is not limited thereto. The count signals may be generated by a dedicated circuit such as a time counter connected to the busline 84, and may be supplied to the distance measurement control unit 68.

The distance measurement control unit 68 outputs a laser trigger for emitting the laser beam to the LD driver 34 in response to the count signal. The LD driver 34 drives the LD 30 to emit the laser beam in response to the laser trigger.

In the example shown in FIG. 6, a time during which the laser beam is emitted is tens of nanoseconds. A time during which the laser beam emitted to the subject far away from the emission unit 22 by several kilometers is received as the reflection laser beam by the PD 36 is “several kilometers×2/light speed”=several microseconds. Accordingly, for example, it takes a time of several microseconds as a minimum necessary time to measure the distance to the subject far away by several kilometers, as shown in FIG. 5.
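As a quick check of the figures above, the round-trip time follows directly from the distance and the light speed. A minimal sketch (Python; the light speed constant and the example distance are illustrative assumptions):

```python
# Minimal sketch of the round-trip time estimate above; the light speed
# value and the example distance are illustrative assumptions.
LIGHT_SPEED_M_PER_S = 3.0e8  # approximate speed of light

def round_trip_time_s(distance_m: float) -> float:
    """Time for the laser beam to reach the subject and return to the PD 36."""
    return distance_m * 2 / LIGHT_SPEED_M_PER_S

print(round_trip_time_s(1000.0))  # subject 1 km away -> about 6.7 microseconds
```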

In the present embodiment, for example, a time during which the measurement is performed once is several milliseconds with consideration for the time during which the laser beam travels in a reciprocating motion, as shown in FIG. 5. However, since the time during which the laser beam travels in the reciprocating motion varies depending on the distance to the subject, the measurement time per measurement may vary depending on an assumed distance.

For example, in a case where the distance to the subject is derived based on the measurement values acquired through the measurement performed several hundreds of times in one measurement sequence, the distance measurement control unit 68 derives the distance to the subject by analyzing a histogram of the measurement values acquired through the measurement performed several hundreds of times.

For example, in the histogram of the measurement values acquired through the measurement performed several hundreds of times in one measurement sequence as shown in FIG. 7, a lateral axis represents the distance to the subject, and a longitudinal axis represents the number of times the measurement is performed. The distance corresponding to the maximum value of the number of times the measurement is performed is derived as the distance measurement result by the distance measurement control unit 68. The histogram shown in FIG. 7 is merely an example, and the histogram may be generated based on the time during which the laser beam travels in the reciprocating motion (an elapsed time from when the laser beam is emitted to when the laser beam is received) and/or ½ of the time during which the laser beam travels in the reciprocating motion instead of the distance to the subject.
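In other words, the distance measurement result is the mode of the histogram. A minimal sketch of this mode-picking step, assuming a hypothetical bin width of 0.05 meters:

```python
from collections import Counter

def derive_distance_m(measurements_m, bin_width_m=0.05):
    """Derive the distance as in FIG. 7: bin the several hundreds of
    measurement values and return the bin measured most often."""
    bins = Counter(round(m / bin_width_m) for m in measurements_m)
    best_bin, _count = bins.most_common(1)[0]
    return best_bin * bin_width_m

# A few illustrative measurement values from one measurement sequence.
samples = [12.31, 12.29, 12.30, 12.32, 12.30, 15.02, 12.28, 12.31]
print(derive_distance_m(samples))  # about 12.3
```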

For example, the main control unit 62 includes the CPU 100 which is an example of a deriving unit and a control unit according to the technology of the present disclosure, as shown in FIG. 8. The main control unit 62 includes a primary storage unit 102 and a secondary storage unit 104. The CPU 100 controls the entire distance measurement device 10A. The primary storage unit 102 is a volatile memory used as a work area when various programs are executed. A RAM is used as an example of the primary storage unit 102. The secondary storage unit 104 is a non-volatile memory that previously stores various parameters and/or control programs for controlling the activation of the distance measurement device 10A. An electrically erasable programmable read only memory (EEPROM) and/or a flash memory is used as an example of the secondary storage unit 104. The CPU 100, the primary storage unit 102, and the secondary storage unit 104 are connected to each other through the busline 84.

Incidentally, the distance measurement device 10A has the dimension deriving function. For example, as shown in FIG. 9, the dimension deriving function refers to a function of deriving a length L of a region in a real space included in the subject based on addresses u1 and u2 of the designated pixels and a distance D measured by the distance measurement device 10A or deriving an area based on the length L. For example, the “designated pixels” refer to pixels of the imaging element 60 corresponding to two points designated by the user on the live view image. For example, the length L is derived from the following Expression (1). In Expression (1), p is a pitch between pixels included in the imaging element 60, u1 and u2 are addresses of the pixels designated by the user, and f is a focal length of the imaging lens 50.

[Expression 1]

L = D × {p(u1 − u2)/f}   (1)

Expression (1) is an expression used on the assumption that a target as a dimension deriving target is captured in a state in which the target faces the imaging lens 50 in front view. Accordingly, for example, in a case where the subject including the target as the dimension deriving target is captured in a state in which the target does not face the imaging lens 50 in front view, a projection conversion process is performed. For example, the projection conversion process refers to a process of converting the captured image acquired through the imaging and/or an image of a square portion of the captured image into a facing view image based on the square image included in the captured image by using a known technology such as affine transformation. The facing view image refers to an image in a state in which the subject faces the imaging lens 50 in front view. The addresses u1 and u2 of the pixels of the imaging element 60 are designated through the facing view image, and the length L is derived from Expression (1).
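Under the front-view assumption just stated, Expression (1) reduces to a one-line computation. The following sketch uses hypothetical values for the pixel pitch p, the focal length f, and the designated addresses u1 and u2; the absolute value keeps the derived length positive regardless of the order of designation:

```python
def derive_length_m(distance_d_m, pitch_p_m, u1, u2, focal_length_f_m):
    """Expression (1): L = D * {p(u1 - u2) / f}."""
    return distance_d_m * (pitch_p_m * abs(u1 - u2)) / focal_length_f_m

# Hypothetical values: 5 um pixel pitch, 35 mm focal length, subject 10 m
# away, designated pixels 800 addresses apart in the row direction.
print(derive_length_m(10.0, 5e-6, 1200, 400, 35e-3))  # about 1.14 m
```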

As stated above, it is preferable that an in-image irradiation position is derived with high accuracy and is ascertained together with the distance by the user in order to accurately derive the length L of the region in the real space based on the addresses u1 and u2. The reason is that, in a case where the in-image irradiation position and the irradiation position of the laser beam in the real space are positions on planes having different orientations and positions, the derived length L is completely different from the actual length.

In a case where a colored laser beam of which an irradiation position is able to be visually perceived within a distance of about several meters from the distance measurement device 10A is used as the laser beam, the in-image irradiation position may be visually specified and designated from the captured image depending on a diameter and/or intensity of the laser beam. However, for example, in a case where a structure separated from a building site by several tens of meters or several hundreds of meters is irradiated with the laser beam in daytime, it is difficult to visually specify the in-image irradiation position from the captured image. A method of specifying the in-image irradiation position from a difference between a plurality of captured images acquired in a sequence of time is also considered. However, in a case where the structure separated from the building site by several tens of meters or several hundreds of meters is irradiated with the laser beam, it is difficult to specify the in-image irradiation position. In a case where the in-image irradiation position is not able to be specified, the user performs the distance measurement while the user does not recognize whether or not the subject assumed as the distance measurement target is irradiated with the laser beam.

For example, in the distance measurement device 10A, the secondary storage unit 104 stores a distance measurement program 106 and an irradiation position adjustment program 107, as shown in FIG. 8. The irradiation position adjustment program 107 is an example of a distance measurement control program according to the technology of the present disclosure.

For example, the CPU 100 is operated as a deriving unit 100A shown in FIG. 10 by reading the distance measurement program 106 out of the secondary storage unit 104, loading the readout distance measurement program into the primary storage unit 102, and executing the distance measurement program 106.

The deriving unit 100A acquires the correspondence relation between an in-provisional-image irradiation position and a distance which are provisionally measured by the distance measurement unit 12 and the distance measurement control unit 68 by using the laser beam corresponding to the in-provisional-image irradiation position. The deriving unit 100A derives an in-actual-image irradiation position, which corresponds to the irradiation position of the laser beam used in the actual measurement using the distance measurement unit 12 and the distance measurement control unit 68, within the actual image acquired by performing the actual imaging by the imaging device 14 based on the acquired correspondence relation. The in-provisional-image irradiation position refers to a position, which corresponds to the irradiation position of the laser beam onto the subject, within a provisional image acquired by performing the provisional imaging on the subject by the imaging device 14 whenever each of a plurality of distances is provisionally measured by the distance measurement unit 12 and the distance measurement control unit 68.

In the present embodiment, the irradiation-position pixel coordinates of the in-actual-image irradiation position, the in-provisional-image irradiation position, and an in-live-view-image irradiation position are derived by the CPU 100, and the in-image irradiation position is specified from the derived irradiation-position pixel coordinates. Hereinafter, the in-actual-image irradiation position, the in-provisional-image irradiation position, and the in-live-view-image irradiation position are simply referred to as the “in-image irradiation position” in a case where it is not necessary to distinguish among these positions, for the sake of convenience in description.

The in-live-view-image irradiation position means a position, which corresponds to the irradiation position of the laser beam used in the measurement, within the live view image acquired through the imaging using the imaging device 14. The in-live-view-image irradiation position is an example of the in-image irradiation position according to the present invention, and is derived by the same deriving method as the deriving method of the in-actual-image irradiation position described above.

For example, the CPU 100 is operated as the deriving unit 100A and a control unit 100B shown in FIG. 10 by reading the irradiation position adjustment program 107 out of the secondary storage unit 104, loading the readout irradiation position adjustment program into the primary storage unit 102, and executing the irradiation position adjustment program 107.

The deriving unit 100A derives the in-live-view-image irradiation position based on the distance measured by the distance measurement unit 12 and the distance measurement control unit 68 and an emission angle β to be described below.

In a case where the derived in-live-view-image irradiation position is out of a default range within the live view image, the control unit 100B performs predetermined control until the in-live-view-image irradiation position falls in the default range. The predetermined control means control for causing the distance measurement unit 12 and the distance measurement control unit 68 to measure the distance and causing the deriving unit 100A to derive the in-live-view-image irradiation position based on the distance measured by the distance measurement unit 12 and the distance measurement control unit 68 and the emission angle β.

Next, the actions of the distance measurement device 10A will be described.

Initially, a distance measurement process realized by executing the distance measurement program 106 in the CPU 100 in a case where a power switch of the distance measurement device 10A is turned on will be described with reference to FIGS. 11 to 13. Hereinafter, a case where the live view image is displayed on the display unit 86 will be described for the sake of convenience in description. Hereinafter, the irradiation position of the laser beam onto the subject in the real space is referred to as a “real-space irradiation position” for the sake of convenience in description.

Although it will be described below that an in-image irradiation position in an X direction which is a front-view left-right direction for the imaging surface of the imaging element 60 included in the imaging device 14 is derived for the sake of convenience in description, an in-image irradiation position in a Y direction which is a front-view upper-lower direction for the imaging surface of the imaging element 60 included in the imaging device 14 is similarly derived. As mentioned above, the in-image irradiation positions ultimately output by deriving the in-image irradiation positions in the X direction and the Y direction are expressed by two-dimensional coordinates.

Hereinafter, for the sake of convenience in description, the front-view left-right direction for the imaging surface of the imaging element 60 included in the imaging device 14 is referred to as the “X direction” or a “row direction”, and the front-view upper-lower direction for the imaging surface of the imaging element 60 included in the imaging device 14 is referred to as the “Y direction” or a “column direction”.

In the distance measurement process shown in FIG. 11, the deriving unit 100A initially determines whether or not a parameter changing factor occurs in step 200. The parameter changing factor refers to a factor for changing parameters that influence the in-image irradiation position.

In the present embodiment, the parameters refer to a half angle of view α, an emission angle β, and an inter-reference-point distance d, as shown in FIG. 15. The half angle of view α refers to half of the angle of view on the subject captured by the imaging device 14. The emission angle β refers to an angle at which the laser beam is emitted from the emission unit 22. The inter-reference-point distance d refers to a distance between a first reference point P1 prescribed for the imaging device 14 and a second reference point P2 prescribed for the distance measurement unit 12. A main point of the imaging lens 50 is used as an example of the first reference point P1. A point previously set as an origin of coordinates capable of specifying a position of the distance measurement unit 12 in a three-dimensional space is used as an example of the second reference point P2. Specifically, one of the front-view left and right ends of the object lens 38 is used, or, in a case where a housing (not shown) of the distance measurement unit 12 has a cuboid shape, one vertex of the housing is used.

In the present embodiment, the parameter changing factor refers to, for example, replacement of the lens, the replacement of the distance measurement unit, a change in the angle of view, and a change in the emission direction. Thus, the determination result is positive in a case where at least one of the replacement of the lens, the replacement of the distance measurement unit, the change in the angle of view, and the change in the emission direction occurs in step 200.

The replacement of the lens refers to the replacement of only the imaging lens 50 of the lens unit 16 and the replacement of the lens unit 16 itself. The replacement of the distance measurement unit refers to the replacement of only the object lens 32 of the distance measurement unit 12, the replacement of only the object lens 38 of the distance measurement unit 12, and the replacement of the distance measurement unit 12 itself. The change in the angle of view refers to a change in the angle of view by the movement of the zoom lens 52 by pressing the angle-of-view instruction button. The change in the emission direction refers to a change in the direction in which the laser beam is emitted by the emission unit 22.

In a case where the parameter changing factor occurs in step 200, the determination result is positive, and the process proceeds to step 202.

For example, in step 202, the deriving unit 100A displays a first intention check screen 110 on the display unit 86 as shown in FIG. 16. Thereafter, the process proceeds to step 204.

The first intention check screen 110 is a screen for checking the user's intention of whether or not to display an irradiation position mark 116 (see FIG. 20) which is a mark indicating the in-actual-image irradiation position in a specifiable manner within a display region of the actual image. In the example shown in FIG. 16, a message of “do you display the irradiation position mark?” is displayed on the first intention check screen 110. In the example shown in FIG. 16, a soft key of “yes” designated for announcing an intention to display the irradiation position mark 116 and a soft key of “no” designated for announcing an intention not to display the irradiation position mark 116 are also displayed on the first intention check screen 110.

In step 204, the deriving unit 100A determines whether or not to display the irradiation position mark 116. In a case where the irradiation position mark 116 is displayed in step 204, that is, in a case where the soft key of “yes” of the first intention check screen 110 is pressed through the touch panel 88, the determination result is positive, and the process proceeds to step 208. In a case where the irradiation position mark 116 is not displayed in step 204, that is, in a case where the soft key of “no” of the first intention check screen 110 is pressed through the touch panel 88, the determination result is negative, and the process proceeds to step 290 shown in FIG. 12.

In step 290 shown in FIG. 12, the deriving unit 100A determines whether or not the actual measurement and actual imaging button 90A is turned on. In a case where the actual measurement and actual imaging button 90A is turned on in step 290, the determination result is positive, and the process proceeds to step 292.

In step 292, the deriving unit 100A performs the actual measurement by controlling the distance measurement control unit 68. The deriving unit 100A performs the actual imaging by controlling the imaging element driver 74 and the image signal processing circuit 76. Thereafter, the process proceeds to step 294.

In step 294, the deriving unit 100A displays the actual image which is the image acquired by performing the actual imaging and the distance acquired by performing the actual measurement on the display unit 86. Thereafter, the process proceeds to step 200 shown in FIG. 11.

Meanwhile, in a case where the actual measurement and actual imaging button 90A is not turned on in step 290, the determination result is negative, and the process proceeds to step 296.

In step 296, the deriving unit 100A determines whether or not an end condition which is a condition in which the actual distance measurement process is ended is satisfied. For example, in the present distance measurement process, the end condition refers to a condition in which an instruction to end the actual distance measurement process is received through the touch panel 88 and/or a condition in which a predetermined time (for example, one minute) elapses after the determination result in step 290 is negative.

In a case where the end condition is not satisfied in step 296, the determination result is negative, and the process proceeds to step 290. In a case where the end condition is satisfied in step 296, the determination result is positive, and the actual distance measurement process is ended.

Meanwhile, for example, the deriving unit 100A displays a provisional measurement and provisional imaging guide screen 112 on the display unit 86 as shown in FIG. 17 in step 208 shown in FIG. 11. Thereafter, the process proceeds to step 210.

In the actual distance measurement process, the process is performed in any operation mode of a first operation mode in which the provisional measurement and the provisional imaging are performed and a second operation mode which is an operation mode other than the first operation mode. In other words, the operation mode other than the first operation mode means an operation mode different from the first operation mode, and refers to an operation mode in which the provisional measurement and the provisional imaging are not performed. In step 208, transition from the second operation mode to the first operation mode is indicated to the user by displaying the provisional measurement and provisional imaging guide screen 112. In the present embodiment, the processes of steps 208 to 226 correspond to the process of the first operation mode, and the processes of the steps other than steps 208 to 226 correspond to the process of the second operation mode.

The provisional measurement and provisional imaging guide screen 112 is a screen for guiding the user with information indicating that the provisional measurement and the provisional imaging are to be performed multiple times (for example, three times in the present embodiment) while changing the emission direction of the laser beam. In the example shown in FIG. 17, a message of “please, perform the provisional measurement and provisional imaging three times while changing the emission direction of the laser beam” is displayed on the provisional measurement and provisional imaging guide screen 112.

In step 210, the deriving unit 100A determines whether or not the provisional measurement and provisional imaging button 90B is turned on. In a case where the provisional measurement and provisional imaging button 90B is not turned on in step 210, the determination result is negative, and the process proceeds to step 212. In a case where the provisional measurement and provisional imaging button 90B is turned on in step 210, the determination result is positive, and the process proceeds to step 214.

In step 212, the deriving unit 100A determines whether or not the end condition is satisfied. In a case where the end condition is not satisfied in step 212, the determination result is negative, and the process proceeds to step 210. In a case where the end condition is satisfied in step 212, the determination result is positive, and the actual distance measurement process is ended.

In step 214, the deriving unit 100A performs the provisional measurement by controlling the distance measurement control unit 68. The deriving unit 100A performs the provisional imaging by controlling the imaging element driver 74 and the image signal processing circuit 76. Thereafter, the process proceeds to step 216.

In step 216, the deriving unit 100A stores the provisional image which is the image acquired by performing the provisional imaging and the distance acquired by performing the provisional measurement in the primary storage unit 102. Thereafter, the process proceeds to step 218.

In step 218, the deriving unit 100A determines whether or not the provisional measurement and the provisional imaging are performed three times by determining whether or not the provisional measurement and provisional imaging button 90B is turned on three times. In a case where the provisional measurement and the provisional imaging are not performed three times in step 218, the determination result is negative, and the process proceeds to step 210. In a case where the provisional measurement and the provisional imaging are performed three times in step 218, the determination result is positive, and the process proceeds to step 220.

Subsequently, the CPU 100 determines whether or not the relation between a plurality of provisionally measured distances (for example, three distances) is a predetermined relation satisfying that these distances effectively contribute to the construction (generation) of correspondence information to be described below which is used in the deriving of the in-actual-image irradiation position. That is, in step 220, the deriving unit 100A determines whether or not the three distances stored in the primary storage unit 102 in step 216 are effective distances. The effective distances refer to distances having the relation satisfying that the three distances stored in the primary storage unit 102 effectively contribute to the construction (generation) of the correspondence information to be described below in the deriving of the in-actual-image irradiation position. For example, the relation satisfying that the distances effectively contribute to the construction of the correspondence information to be described below in the deriving of the in-actual-image irradiation position means a relation satisfying that the three distances are separated from each other by a predetermined distance or more (for example, 0.3 meters or more).
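Interpreting "separated from each other" as a pairwise condition, the effectiveness check of step 220 can be sketched as follows (the 0.3-meter threshold is the example given above):

```python
from itertools import combinations

MIN_SEPARATION_M = 0.3  # the example predetermined distance given above

def are_effective_distances(distances_m):
    """True when every pair of provisionally measured distances is
    separated by the predetermined distance or more."""
    return all(abs(a - b) >= MIN_SEPARATION_M
               for a, b in combinations(distances_m, 2))

print(are_effective_distances([4.2, 5.0, 6.1]))  # True
print(are_effective_distances([4.2, 4.3, 6.1]))  # False: 4.2 and 4.3 too close
```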

In a case where three distances stored in the primary storage unit 102 in step 216 are not effective distances in step 220, the determination result is negative, and the process proceeds to step 222. In a case where the three distances stored in the primary storage unit 102 in step 216 are effective distances in step 220, the determination result is positive, and the process proceeds to step 224.

For example, in step 222, the deriving unit 100A displays a re-performing guide screen 114 on the display unit 86 as shown in FIG. 18. Thereafter, the process proceeds to step 210.

The re-performing guide screen 114 is a screen for guiding the user through the re-performing of the provisional measurement and the provisional imaging. In the example shown in FIG. 18, a message of “effective distances are not able to be measured. please, perform the provisional measurement and provisional imaging three times while changing the emission direction of the laser beam” is displayed on the re-performing guide screen 114.

In step 224, the deriving unit 100A specifies the in-provisional-image irradiation position for every provisional image stored in the primary storage unit 102 in step 216. Thereafter, the process proceeds to step 226. For example, the in-provisional-image irradiation position is specified from a difference between the image acquired before the provisional measurement and the provisional imaging are performed (for example, the previous frame) in the live view image and the provisional image acquired by performing the provisional imaging. The user can visually recognize the irradiation position of the laser beam from the provisional image in a case where the distance at which the provisional measurement is performed is about several meters. In this case, the irradiation position visually recognized from the provisional image may be designated by the user through the touch panel 88, and the designated position may be specified as the in-provisional-image irradiation position.
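A minimal sketch of the frame-difference approach in step 224, assuming grayscale frames held in NumPy arrays and taking the single pixel with the largest brightness change (a real implementation would likely threshold or centroid the difference):

```python
import numpy as np

def specify_in_provisional_image_position(previous_frame, provisional_image):
    """Locate the in-provisional-image irradiation position as the pixel
    whose brightness changes most between the frame captured before the
    provisional measurement and the provisional image."""
    diff = np.abs(provisional_image.astype(np.int32)
                  - previous_frame.astype(np.int32))
    row, col = np.unravel_index(np.argmax(diff), diff.shape)
    return col, row  # pixel address in the X direction, Y direction
```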

In step 226, the deriving unit 100A generates correspondence information which is an example of the above-described correspondence relation, and stores the generated correspondence information in the secondary storage unit 104 for every parameter changing factor. Thereafter, the process proceeds to step 228 shown in FIG. 13.

The correspondence information refers to information acquired by associating the in-provisional-image irradiation position with the distance of the distances stored in the primary storage unit 102 in step 216 which corresponds to the provisional image related to the in-provisional-image irradiation position for each in-provisional-image irradiation position specified in step 224.

For example, in the present embodiment, the correspondence information is stored as a correspondence table 98 in the secondary storage unit 104, as shown in FIG. 14. The correspondence table 98 is updated by storing the generated correspondence information whenever the correspondence information is generated in step 226. In the correspondence table 98, the correspondence information is associated with the parameter changing factor of which the occurrence is determined in step 200. In the example shown in FIG. 14, the replacement of the lens, the replacement of the distance measurement unit, the change in the angle of view, and the change in the emission direction are used as examples of the parameter changing factor. (1), (2), and (3) shown in FIG. 14 are identification codes for identifying that these factors are parameter changing factors occurring at different timings.

Although three correspondence information items are associated with each of the replacement of the lens, the replacement of the distance measurement unit, and the change in the emission direction in the example shown in FIG. 14, the technology of the present disclosure is not limited thereto. For example, in a case where the parameter changing factor occurs once, the correspondence information items acquired by performing the provisional measurement and the provisional imaging multiple times for the parameter changing factor occurring once are associated with one parameter changing factor. For example, in a case where the provisional measurement and the provisional imaging are performed two times for the parameter changing factor occurring once, two correspondence information items are associated with one parameter changing factor.

In step 228, the deriving unit 100A determines whether or not the actual measurement and actual imaging button 90A is turned on. In a case where the actual measurement and actual imaging button 90A is turned on in step 228, the determination result is positive, and the process proceeds to step 230. In a case where the actual measurement and actual imaging button 90A is not turned on in step 228, the determination result is negative, and the process proceeds to step 244.

In step 230, the deriving unit 100A performs the actual measurement by controlling the distance measurement control unit 68. The deriving unit 100A performs the actual imaging by controlling the imaging element driver 74 and the image signal processing circuit 76. Thereafter, the process proceeds to step 232.

In step 232, the deriving unit 100A determines whether or not specific correspondence information is stored in the correspondence table 98. The specific correspondence information refers to the correspondence information of the correspondence information items acquired in the past which corresponds to the distance acquired by performing the actual measurement through the process in step 230.

For example, the correspondence information items acquired in the past refer to the correspondence information items which are associated with the corresponding parameter changing factor and are stored in the correspondence table 98. For example, the correspondence information corresponding to the distance acquired by performing the actual measurement refers to the correspondence information associated with a distance matching the distance which is acquired by performing the actual measurement within a predetermined error. For example, the predetermined error is a fixed value of ±0.1 meters, and the technology of the present disclosure is not limited thereto. The predetermined error may be a variable value changed according to an instruction of the user through the touch panel 88.
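A sketch of the lookup in step 232, assuming the correspondence information items are held as (distance, in-provisional-image irradiation position) pairs and using the ±0.1-meter fixed error given above:

```python
DEFAULT_ERROR_M = 0.1  # the example fixed error given above

def find_specific_correspondence(items, measured_m, error_m=DEFAULT_ERROR_M):
    """Return the first stored (distance, irradiation position) item whose
    distance matches the actually measured distance within +/- error_m,
    or None when no specific correspondence information is stored."""
    for distance_m, position_px in items:
        if abs(distance_m - measured_m) <= error_m:
            return distance_m, position_px
    return None

table = [(4.20, 612), (5.03, 655), (6.10, 701)]  # hypothetical entries
print(find_specific_correspondence(table, 5.00))  # (5.03, 655)
```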

In a case where the specific correspondence information is not stored in the correspondence table 98 in step 232, the determination result is negative, and the process proceeds to step 234. In a case where the specific correspondence information is stored in the correspondence table 98 in step 232, the determination result is positive, and the process proceeds to step 236.

In step 234, the deriving unit 100A derives the parameter based on the latest correspondence information of the correspondence information items which are related to the corresponding parameter changing factor and are stored in the correspondence table 98, and associates the derived parameter with the latest correspondence information. Thereafter, the process proceeds to step 238. For example, the “latest correspondence information” refers to the correspondence information generated lately in step 226. The parameter derived in step 234 is an uncertain parameter in a current point of time, and varies for every parameter changing factor as represented in the following Table 1.

TABLE 1

  parameter changing factor                   parameter
  replacement of lens                         half angle of view α, emission angle β
  replacement of distance measurement unit    emission angle β, inter-reference-point distance d
  change in angle of view                     half angle of view α
  change in emission direction                emission angle β

The number of uncertain parameters may be one to three. For example, in the example shown in Table 1, in a case where both the replacement of the distance measurement unit and the change in the angle of view are performed, the number of uncertain parameters is three such as the half angle of view α, the emission angle β, and the inter-reference-point distance d. In a case where only the replacement of the lens is performed, the number of uncertain parameters is two such as the half angle of view α and the emission angle β. In a case where only the replacement of the distance measurement unit is performed, the number of uncertain parameters is two such as the emission angle β, and the inter-reference-point distance d. In a case where only the change in the angle of view is performed, the number of uncertain parameters is one such as the half angle of view α. In a case where only the change in the emission direction is performed, the number of uncertain parameters is one such as the emission angle β.

For example, the parameters are derived from the following Expressions (2) to (4) in step 234. In Expressions (2) and (3), a distance D is a distance specified from the latest correspondence information, and distances specified from the latest correspondence information are distances D1, D2, and D3 in a case where the latest correspondence information is the correspondence information related to the change in the angle of view (1) in the example shown in FIG. 14. In Expression (4), “row-direction pixels of the irradiation positions” are in-image irradiation positions in a row direction, and “half of the number of row-direction pixels” is half of the number of pixels in the row direction in the imaging element 60. For example, in the present embodiment, the half angle of view α is derived from the following Expression (5). In Expression (5), “f” is a focal length. For example, it is preferable that the focal length f substituted into Expression (5) is a focal length used in the actual imaging of step 230.

[Expression 2]

Δx = d − D cos β   (2)

[Expression 3]

X = D sin β tan α   (3)

[Expression 4]

(row-direction pixel of the irradiation position) : (half of the number of row-direction pixels) = Δx : X   (4)

[Expression 5]

α = atan{(dimension of imaging pixel)/(2 × f)}   (5)
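Read together, Expressions (2) to (4) map a measured distance D and the parameters α, β, and d onto a signed row-direction pixel offset measured from the image center. A minimal sketch with hypothetical sensor and parameter values:

```python
import math

def half_angle_of_view(sensor_dimension_m, focal_length_m):
    """Expression (5): alpha = atan{(dimension of imaging pixel) / (2f)}."""
    return math.atan(sensor_dimension_m / (2 * focal_length_m))

def row_direction_pixel(distance_d_m, alpha, beta, d_m, half_row_pixels):
    """Expressions (2) to (4); the result is a signed offset from the
    image center in the row direction."""
    delta_x = d_m - distance_d_m * math.cos(beta)        # Expression (2)
    x = distance_d_m * math.sin(beta) * math.tan(alpha)  # Expression (3)
    return half_row_pixels * delta_x / x                 # Expression (4)

# Hypothetical values: 23.6 mm sensor width, 35 mm focal length,
# beta = 88 degrees, d = 0.1 m, 4000-pixel-wide imaging element.
alpha = half_angle_of_view(23.6e-3, 35e-3)
print(row_direction_pixel(10.0, alpha, math.radians(88.0), 0.1, 2000))
```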

In step 234, the in-provisional-image irradiation positions specified from the latest correspondence information of the correspondence information items stored in the correspondence table 98 are the “row-direction pixels of the irradiation positions”. In the example shown in FIG. 14, in a case where the latest correspondence information is the correspondence information related to the change in the angle of view (1), the in-provisional-image irradiation positions specified from the latest correspondence information are X1, X2, and X3. The distances specified from the latest correspondence information of the correspondence information items stored in the correspondence table 98 are used as the distance D in Expressions (2) and (3) for every corresponding in-provisional-image irradiation position (corresponding “row-direction pixel of the irradiation position”). The parameters that most closely reproduce the “row-direction pixels of the irradiation positions” are derived by the deriving unit 100A.

Now, an example in which a part of the correspondence table 98 shown in FIG. 14 is used in the deriving method of the parameter will be described. For example, in a case where the correspondence information items related to the change in the angle of view (1) and the replacement of the distance measurement unit (1) which are examples of the parameter changing factor are used as the latest correspondence information items, the latest correspondence information items are distances D1, D2, D3, D16, D17, and D18 and the in-provisional-image irradiation positions X1, X2, X3, X16, X17, and X18.

That is, each of the in-provisional-image irradiation positions X1, X2, X3, X16, X17, and X18 is used as the “row-direction pixel of the irradiation position” in Expression (4), and the correspondingly numbered distance of the distances D1, D2, D3, D16, D17, and D18 is used as the distance D in Expressions (2) and (3). The half angle of view α, the emission angle β, and the inter-reference-point distance d closest to the in-provisional-image irradiation positions X1, X2, X3, X16, X17, and X18 are derived from Expressions (2) to (4).
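The deriving of step 234 is therefore a small fitting problem: find the (α, β, d) that, through Expressions (2) to (4), best reproduces the observed (Dn, Xn) pairs. A coarse grid-search sketch (a real implementation would presumably refine this, for example with least squares; the search ranges and the pixel count are assumptions):

```python
import math

def fit_parameters(observations, alpha_range, beta_range, d_range,
                   half_row_pixels=2000, steps=40):
    """Grid-search the half angle of view, emission angle, and
    inter-reference-point distance that best reproduce the observed
    (distance D, row-direction pixel X) pairs via Expressions (2)-(4).
    Ranges must exclude degenerate values (alpha, beta near 0)."""
    def grid(lo, hi):
        return [lo + (hi - lo) * i / (steps - 1) for i in range(steps)]

    best, best_err = None, float("inf")
    for alpha in grid(*alpha_range):
        for beta in grid(*beta_range):
            for d in grid(*d_range):
                err = 0.0
                for dist, pixel in observations:
                    delta_x = d - dist * math.cos(beta)
                    x = dist * math.sin(beta) * math.tan(alpha)
                    err += (half_row_pixels * delta_x / x - pixel) ** 2
                if err < best_err:
                    best, best_err = (alpha, beta, d), err
    return best
```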

In step 236, the deriving unit 100A derives the parameter based on the specific correspondence information. Thereafter, the process proceeds to step 238. The parameter derived in step 236 is a parameter associated with the specific correspondence information, and is, for example, a parameter associated with the correspondence information by performing the process of step 234 in the past.

The parameter derived in step 236 may be a parameter associated with the correspondence information by performing the process of step 234 in the past, and the deriving unit 100A may derive the parameter again by using Expressions (2) to (4) based on the specific correspondence information.

In step 238, the deriving unit 100A derives the in-actual-image irradiation position based on the parameter derived in step 234 or step 236. Thereafter, the process proceeds to step 240.

For example, the in-actual-image irradiation position is derived from Expressions (2) to (4) in step 238. That is, the parameter derived in step 234 or step 236 is substituted into Expressions (2) to (4), and the distance acquired by performing the actual measurement in step 230 is substituted as the distance D into Expressions (2) to (4). Accordingly, the “row-direction pixel of the irradiation position” is derived as the in-actual-image irradiation position.

For example, in step 240, the deriving unit 100A displays the actual image, the distance, and the irradiation position mark 116 on the display unit 86 as shown in FIG. 20. Thereafter, the process proceeds to step 242.

The actual image displayed on the display unit 86 by performing the process of step 240 is an image acquired by performing the actual imaging in step 230.

The distance displayed on the display unit 86 by performing the process of step 240 is a distance acquired by performing the actual measurement in step 230.

The irradiation position mark 116 displayed on the display unit 86 by performing the process of step 240 is a mark indicating the in-actual-image irradiation position derived by performing the process of step 238.

In step 242, the deriving unit 100A determines whether or not the distance acquired by performing the actual measurement in step 230 is in a correspondence information distance range. A case where the distance acquired by performing the actual measurement in step 230 is not in the correspondence information distance range means that the distance acquired by performing the actual measurement in step 230 is out of the correspondence information distance range.

A case where the distance is in the correspondence information distance range means that the distance is within a range of the distance specified from the correspondence information used in step 234 or step 236. In contrast, a case where the distance is out of the correspondence information distance range means that the distance is not in the range of the distance specified from the correspondence information used in step 234 or step 236. The case where the distance is out of the correspondence information distance range is distinguished between a case where the distance is out of a first correspondence information distance range and a case where the distance is out of a second correspondence information distance range.

For example, in a case where the relation between distances D100, D101, and D102 specified from the correspondence information used in step 234 or step 236 is “D100<D101<D102” as shown in FIG. 21, the case where the distance is in the correspondence information distance range and the case where the distance is out of the correspondence information distance range are defined as follows.

That is, in the example shown in FIG. 21, the case where the distance is in the correspondence information distance range corresponds to a range of the distance D100 or more and the distance D102 or less. The case where the distance is out of the first correspondence information distance range corresponds to a range of less than the distance D100. The case where the distance is out of the second correspondence information distance range corresponds to a range of more than the distance D102.
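A sketch of the classification performed in step 242 against the example of FIG. 21:

```python
def classify_measured_distance(measured_m, d100_m, d102_m):
    """Classify the actually measured distance against the correspondence
    information distance range [D100, D102] of FIG. 21."""
    if measured_m < d100_m:
        return "out of the first correspondence information distance range"
    if measured_m > d102_m:
        return "out of the second correspondence information distance range"
    return "in the correspondence information distance range"

print(classify_measured_distance(3.9, 4.2, 6.1))  # out of the first ... range
```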

In a case where the distance acquired by performing the actual measurement in step 230 is in the correspondence information distance range in step 242, the determination result is positive, and the process proceeds to step 244. In a case where the distance acquired by performing the actual measurement in step 230 is out of the correspondence information distance range in step 242, the determination result is negative, and the process proceeds to step 246.

For example, in step 246, the deriving unit 100A displays a warning and recommendation message 120 on the display unit 86 such that the warning and recommendation message 120 is superimposed on the actual image, as shown in FIG. 22. Thereafter, the process proceeds to step 248.

The warning and recommendation message 120 is a message for warning the user that there is a high possibility that the laser beam will not be applied to a position in the real space which corresponds to the position of the irradiation position mark 116 and recommending the provisional measurement and the provisional imaging to the user.

In the example shown in FIG. 22, a warning message of “the irradiation position mark has low accuracy (reliability)” is included in the warning and recommendation message 120. In the example shown in FIG. 22, a recommendation message of “it is recommended that the provisional measurement and the provisional imaging are performed in a range of ○○ meters to ΔΔ meters” is included in the warning and recommendation message 120.

The “range of ○○ meters to ΔΔ meters” included in the recommendation message is a range out of the first correspondence information distance range or a range out of the second correspondence information distance range. That is, in a case where the distance acquired by performing the actual measurement in step 230 is out of the first correspondence information distance range, a specific range out of the first correspondence information distance range is employed. In a case where the distance acquired by performing the actual measurement in step 230 is out of the second correspondence information distance range, a specific range out of the second correspondence information distance range is employed.

The specific range means a range of the distance recommended in the provisional measurement based on the relation between the distance acquired by performing the actual measurement in step 230 and the correspondence information distance range. For example, the specific range is a range which is uniquely determined from a predetermined table or calculation expression depending on a degree of deviation of the distance acquired by performing the actual measurement in step 230 from a specific value in the correspondence information distance range. The specific value in the correspondence information distance range may be a center value or an average value in the correspondence information distance range. For example, the specific range out of the first correspondence information distance range may be a range which is uniquely determined depending on a difference between the distance D100 shown in FIG. 21 and the distance acquired by performing the actual measurement in step 230. For example, the specific range out of the second correspondence information distance range may be a range which is uniquely determined depending on a difference between the distance D102 shown in FIG. 21 and the distance acquired by performing the actual measurement in step 230. Instead of the “specific range”, a “plurality of default distances” may be used. For example, three or more distances separated from each other at equal spaces within the specific range acquired as described above may be used as the plurality of default distances, and a plurality of distances recommended in the provisional measurement may be used.
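A sketch of the “plurality of default distances” alternative mentioned above, picking three distances separated at equal spaces within a given specific range (the example range is hypothetical):

```python
def default_distances_m(range_lo_m, range_hi_m, count=3):
    """Pick 'count' distances separated at equal spaces within the
    specific range, to be recommended for the provisional measurement."""
    step = (range_hi_m - range_lo_m) / (count - 1)
    return [range_lo_m + i * step for i in range(count)]

print(default_distances_m(7.0, 10.0))  # [7.0, 8.5, 10.0]
```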

For example, although the warning and recommendation message 120 is presented to the user in step 246 by being visually displayed on the display unit 86, the technology of the present disclosure is not limited thereto. For example, the message may be presented to the user by being output as sound by a sound playback device (not shown) provided at the distance measurement device 10A, or may be presented through both visual display and audible indication.

For example, in step 248, the deriving unit 100A displays a second intention check screen 118 on the display unit 86 as shown in FIG. 19. Thereafter, the process proceeds to step 250.

The second intention check screen 118 is a screen for checking the user's intention of whether or not to increase the accuracy of the irradiation position of the laser beam, that is, the accuracy of the irradiation position mark 116. In the example shown in FIG. 19, a message of “do you want to increase the accuracy of the irradiation position mark?” is displayed on the second intention check screen 118. In the example shown in FIG. 19, a soft key of “yes” designated for announcing an intention to increase the accuracy of the irradiation position mark 116 is displayed on the second intention check screen 118. In the example shown in FIG. 19, a soft key of “no” designated for announcing an intention not to increase the accuracy of the irradiation position mark 116 is displayed on the second intention check screen 118.

In step 250, the deriving unit 100A determines whether or not to increase the accuracy of the irradiation position mark 116. In a case where the accuracy of the irradiation position mark 116 is increased in step 250, that is, in a case where the soft key of “yes” of the second intention check screen 118 is pressed through the touch panel 88, the determination result is positive, and the process proceeds to step 208. In a case where the accuracy of the irradiation position mark 116 is not increased in step 250, that is, in a case where the soft key of “no” of the second intention check screen 118 is pressed through the touch panel 88, the determination result is negative, and the process proceeds to step 244.

Meanwhile, in a case where the parameter changing factor does not occur in step 200 shown in FIG. 11, the determination result is negative, and the process proceeds to step 252.

In step 252, the deriving unit 100A determines whether or not the correspondence information is stored in the correspondence table 98.

In a case where the correspondence information is not stored in the correspondence table 98 in step 252, the determination result is negative, and the process proceeds to step 200. In a case where the correspondence information is stored in the correspondence table 98 in step 252, the determination result is positive, and the process proceeds to step 228.

Meanwhile, the deriving unit 100A determines whether or not the end condition is satisfied in step 244 shown in FIG. 13. In a case where the end condition is not satisfied in step 244, the determination result is negative, and the process proceeds to step 200. In a case where the end condition is satisfied in step 244, the determination result is positive, and the actual distance measurement process is ended.

Next, the irradiation position adjustment process realized by executing the irradiation position adjustment program 107 in the CPU 100 in a case where the irradiation position adjustment button 90F is pressed in a state in which the live view image is displayed on the display unit 86 will be described with reference to FIG. 23.

Although it will be described below that the irradiation position of the laser beam in the X direction is adjusted by operating the horizontal rotation mechanism 15 for the sake of convenience in description, the irradiation position of the laser beam in the Y direction is similarly adjusted. The adjustment of the irradiation position of the laser beam in the Y direction is realized by operating the longitudinal rotation mechanism 13. Hereinafter, a case where the live view image is displayed on the display unit 86 at a specific frame rate will be described for the sake of convenience in description.

In the irradiation position adjustment process shown in FIG. 23, the control unit 100B initially determines whether or not a default time comes in step 400. For example, the default time means a time that comes whenever the live view image is displayed for three frames. The default time is not limited to the time corresponding to every three frames of the live view image; the number of frames in which the live view image is displayed may not be three, or the default time may be prescribed by a predetermined time such as 3 seconds or 5 seconds. The default time may be a time previously determined according to an instruction received through the touch panel 88.

In a case where the default time comes in step 400, the determination result is positive, and the process proceeds to step 402. In a case where the default time does not come in step 400, the determination result is negative, and the process proceeds to step 416.

In step 402, the control unit 100B performs the measurement of the distance by controlling the distance measurement control unit 68. The control unit 100B performs the imaging by controlling the imaging element driver 74 and the image signal processing circuit 76. Thereafter, the process proceeds to step 404.

In step 404, the control unit 100B causes the deriving unit 100A to derive the in-live-view-image irradiation position based on the latest parameter. Thereafter, the process proceeds to step 406. For example, the latest parameter is a parameter used in the deriving of the in-actual-image irradiation position in a case where the in-image irradiation position derived last before the process of step 404 is performed is the in-actual-image irradiation position derived by performing the process of step 238 (see FIG. 13). For example, in a case where the process of step 412 to be described below is performed after the process of previous step 404, the latest parameter is a combination of the parameters other than the emission angle β among the parameters used in the deriving of the latest in-live-view-image irradiation position and the emission angle β updated in step 412.

For example, the in-live-view-image irradiation position is derived from Expressions (2) to (4) in step 404. That is, the latest parameter is substituted into Expressions (2) to (4), and the distance acquired by performing the measurement in step 402 is substituted as the distance D into Expressions (2) to (4). Accordingly, the “row-direction pixel of the irradiation position” is derived as the in-live-view-image irradiation position.

For example, as shown in FIGS. 25 and 26, the control unit 100B may control the display unit 86 to display an irradiation position mark 116A which is a mark indicating the in-live-view-image irradiation position derived by performing the process of step 404 in a display region of the live view image. Therefore, according to the distance measurement device 10A, the user can easily ascertain the latest in-live-view-image irradiation position compared to a case where the irradiation position mark 116A is not displayed.

In a case where the irradiation position mark 116A is displayed, the control unit 100B may control the display unit 86 to display such that the irradiation position mark 116A is turned on and off and the irradiation position mark 116 is not turned on and off in order to distinguish the irradiation position mark 116A from the irradiation position mark 116 shown in FIG. 20.

In step 406, the control unit 100B determines whether or not the in-live-view-image irradiation position derived by the deriving unit 100A by performing the process of step 404 is in the default range. For example, a case where the in-live-view-image irradiation position is in the default range means that the in-live-view-image irradiation position is present inside a circular frame 117 of which a radius from the center of the captured image is a predetermined length (for example, several millimeters in the present embodiment), as shown in FIG. 24. The frame 117 may be a frame surrounding a specific partial region in the display region of the captured image. Although it has been described in the present embodiment that the frame 117 is displayed in the display region of the captured image, the technology of the present disclosure is not limited thereto, and the frame 117 may not be displayed. The display and the non-display of the frame 117 performed by the display unit 86 may be selectively switched by the control unit 100B according to an instruction received through the touch panel 88.
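In pixel terms, the determination of step 406 is a point-in-circle test against the center of the captured image (the pixel radius here stands in for the “predetermined length” above):

```python
def is_in_default_range(position_px, center_px, radius_px):
    """True when the in-live-view-image irradiation position lies inside
    the circular frame 117 centered on the captured image."""
    dx = position_px[0] - center_px[0]
    dy = position_px[1] - center_px[1]
    return dx * dx + dy * dy <= radius_px * radius_px

print(is_in_default_range((980, 540), (960, 540), 50))  # True
```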

In a case where the in-live-view-image irradiation position is out of the default range in step 406, the determination result is negative, and the process proceeds to step 408.

In step 408, the control unit 100B displays out-of-default-range information on the display unit 86 such that the out-of-default-range information is superimposed on the live view image. Thereafter, the process proceeds to step 410. The out-of-default-range information is information indicating that the in-live-view-image irradiation position derived by the deriving unit 100A by performing the process of step 404 is out of the default range.

For example, as the out-of-default-range information, there is a message 119 of "the irradiation position of the laser beam is not present in the central portion of the image" displayed on the display unit 86, as shown in FIG. 25. This message 119 is merely an example. For example, in a case where the frame 117 is displayed, a message of "the frame is not irradiated with the laser beam" may be displayed as the out-of-default-range information on the display unit 86. The message is not limited to being visually displayed on the display unit 86, and may be audibly presented by being output as sound by a sound playback device (not shown). Permanent visual display using an image forming device (not shown) may be performed, or at least two of the visual display, the audible indication, or the permanent visual display may be performed.

As stated above, the out-of-default-range information is displayed by the display unit 86 by performing the process of step 408, and thus, notification indicating that the in-live-view-image irradiation position is out of the default range is presented to the user. That is, the display unit 86 is operated as the second notification unit according to the technology of the present disclosure by performing the process of step 408.

In step 410, the control unit 100B rotates the distance measurement unit 12 in a default direction by a default rotation amount (an example of a default amount according to the technology of the present disclosure) by controlling the horizontal rotation mechanism 15 through the motor driver 23. Thereafter, the process proceeds to step 412.

For example, the default rotation amount means a constant rotation amount. For example, the default rotation amount is a rotation amount needed to change the emission angle β by a predetermined angle (for example, 3 degrees).

The default direction is a direction in which a distance between the in-live-view-image irradiation position derived by the deriving unit 100A by performing the process of step 404 and the center of the frame 117 decreases. Thus, the default direction is uniquely determined from a relation between the in-live-view-image irradiation position derived by the deriving unit 100A by performing the process of step 404 and the center of the captured image which is the center of the frame 117.

In step 412, the control unit 100B updates the emission angle β according to the rotation direction and the rotation amount of the distance measurement unit 12 rotated by performing the process of step 410. Thereafter, the process proceeds to step 400.
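
Steps 410 and 412 can be pictured as a single small routine: choose the rotation direction that moves the irradiation position toward the center of the frame 117, rotate by the default rotation amount, and fold that rotation back into the emission angle β. A sketch in which the sign convention and the 3-degree step are illustrative only:

    DEFAULT_STEP_DEG = 3.0  # default rotation amount expressed as a change of beta

    def rotate_and_update(beta_deg, position_col, frame_center_col):
        # Step 410: the default direction is the one that decreases the distance
        # between the irradiation position and the center of the frame 117.
        direction = -1.0 if position_col > frame_center_col else 1.0
        # Step 412: update beta according to the rotation direction and amount.
        return beta_deg + direction * DEFAULT_STEP_DEG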

In a case where the in-live-view-image irradiation position is in the default range in step 406, the determination result is positive, and the process proceeds to step 414.

In step 414, the control unit 100B displays in-default-range information on the display unit 86 such that the in-default-range information is superimposed on the live view image. Thereafter, the process proceeds to step 416. The in-default-range information is information indicating that the in-live-view-image irradiation position derived by the deriving unit 100A by performing the process of step 404 is in the default range.

For example, as the in-default-range information, there is a message 121 of "the irradiation position of the laser beam is present in the central portion of the image" displayed on the display unit 86, as shown in FIG. 26. This message 121 is merely an example. For example, in a case where the frame 117 is displayed, a message of "the frame is irradiated with the laser beam" may be displayed as the in-default-range information on the display unit 86. The message is not limited to being visually displayed on the display unit 86, and may be audibly presented by being output as sound by a sound playback device (not shown). Permanent visual display using an image forming device (not shown) may be performed, or at least two of the visual display, the audible indication, or the permanent visual display may be performed.

As mentioned above, the in-default-range information is displayed on the display unit 86 by performing the process of step 414, and thus, notification indicating that the in-live-view-image irradiation position is in the default range is presented to the user. That is, the display unit 86 is operated as the first notification unit according to the technology of the present disclosure by performing the process of step 414.

In step 416, the control unit 100B determines whether or not an end condition, which is a condition for ending the actual irradiation position adjustment process, is satisfied. In the actual irradiation position adjustment process, the end condition is, for example, a condition in which the irradiation position adjustment button 90F is pressed again and/or a condition in which a predetermined time (for example, 1 minute) elapses after the performing of the actual irradiation position adjustment process is started.

In a case where the end condition is not satisfied in step 416, the determination result is negative, and the process proceeds to step 400. In a case where the end condition is satisfied in step 416, the determination result is positive, and the actual irradiation position adjustment process is ended.

As described above, in the distance measurement device 10A, in a case where the in-live-view-image irradiation position is out of the default range within the captured image (step 406: N), the measurement is performed by the distance measurement control unit 68 until the in-live-view-image irradiation position is positioned within the frame 117 (step 402). The in-live-view-image irradiation position is derived based on the distance measured by the distance measurement control unit 68 and the latest parameter including the latest emission angle β (step 404).

Therefore, according to the distance measurement device 10A, it is possible to perform the distance measurement in a state in which the in-live-view-image irradiation position is positioned within the frame 117.
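
Taken together, steps 400 to 416 amount to the following loop. This is a schematic sketch only; measure, derive, in_default_range, notify, rotate_and_update, and end_condition are hypothetical stand-ins for the processing described above, and the timing check of step 400 is omitted:

    def irradiation_position_adjustment(measure, derive, in_default_range,
                                        notify, rotate_and_update, end_condition):
        while not end_condition():              # step 416
            distance = measure()                # step 402
            position = derive(distance)         # step 404
            if in_default_range(position):      # step 406: Y
                notify("in default range")      # step 414
            else:                               # step 406: N
                notify("out of default range")  # step 408
                rotate_and_update(position)     # steps 410 and 412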

In the distance measurement device 10A, in a case where the in-live-view-image irradiation position is out of the default range within the captured image, the measurement is performed by the distance measurement control unit 68, and the emission angle β is changed by the rotation mechanism by driving the motors 17 and 19 until the in-live-view-image irradiation position is positioned within the frame 117. The in-live-view-image irradiation position is derived based on the distance measured by the distance measurement control unit 68 and the latest parameter including the latest emission angle β.

Therefore, according to the distance measurement device 10A, it is possible to reduce an effort to position the in-live-view-image irradiation position within the frame 117 compared to a case where the emission angle β is changed without using the motors 17 and 19 and the rotation mechanism.

In the distance measurement device 10A, a power for changing the emission angle β to the default direction is generated by the motors 17 and 19 based on the positional relation between the latest in-live-view-image irradiation position and the center of the frame 117, and thus, the emission angle β is changed (steps 410 and 412).

Therefore, according to the distance measurement device 10A, it is possible to position the in-live-view-image irradiation position within the frame 117 with high accuracy compared to a case where the power for changing the emission angle β is not generated by the motors 17 and 19 regardless of the positional relation between the latest in-live-view-image irradiation position and the center of the frame 117.

In the distance measurement device 10A, the irradiation position adjustment process is performed for a period during which the live view image is displayed on the display unit 86.

Therefore, according to the distance measurement device 10A, it is possible to perform the distance measurement in a case where the in-live-view-image irradiation position is positioned within the frame 117 while referring to the state of the subject.

In the distance measurement device 10A, the captured image is displayed as the live view image, and the frame 117 is displayed in the display region of the live view image.

Therefore, according to the distance measurement device 10A, the user can easily ascertain the position of the frame 117 in the display region of the live view image compared to a case where the frame 117 is not displayed in the display region of the live view image.

In the distance measurement device 10A, in a case where the in-live-view-image irradiation position is positioned within the frame 117 (step 406: Y), the message 121 is displayed in the display region of the live view image (see FIG. 26).

Therefore, according to the distance measurement device 10A, the user can easily recognize that the in-live-view-image irradiation position is positioned within the frame 117 compared to a case where the notification indicating that the in-live-view-image irradiation position is positioned within the frame 117 is not presented.

In the distance measurement device 10A, in a case where the in-live-view-image irradiation position is out of the frame 117 (step 406: N), the message 119 is displayed in the display region of the live view image (see FIG. 25).

Therefore, according to the distance measurement device 10A, the user can easily recognize that the in-live-view-image irradiation position is out of the frame 117 compared to a case where the notification indicating that the in-live-view-image irradiation position is out of the frame 117 is not presented.

Although it has been described in the first embodiment that the control unit 100B acquires the direction in which the distance measurement unit 12 is rotated based on the positional relation between the latest in-live-view-image irradiation position and the center of the frame 117, the technology of the present disclosure is not limited thereto. For example, the control unit 100B may acquire the direction in which the distance measurement unit 12 is rotated based on the positional relation between the latest in-live-view-image irradiation position and one specific point among the quadrant points of the frame 117. As stated above, the control unit 100B may acquire the direction in which the distance measurement unit 12 is rotated based on the positional relation between the latest in-live-view-image irradiation position and the frame 117.

Although it has been described in the first embodiment that the frame 117 is positioned in the central portion within the captured image, the frame 117 may be positioned at one of the end portions of the captured image in the left-right direction, or at one of the ends of the captured image in the up-down direction. The position of the frame 117 may be fixed, or may be changed according to, for example, an instruction received through the touch panel 88. Likewise, the size of the frame 117 may be fixed, or may be changed according to, for example, an instruction received through the touch panel 88.

Although it has been described in the first embodiment that the frame 117 has the circular shape, the technology of the present disclosure is not limited thereto, and the frame may be, for example, a frame having another shape formed as a closed region, such as an oval shape, a square shape, or a triangular shape.

Although it has been described in the first embodiment that the emission angle β is updated according to the rotation of the distance measurement unit 12, the technology of the present disclosure is not limited thereto, and the inter-reference-point distance d together with the emission angle β may also be updated. For example, in a case where the inter-reference-point distance d is updated, the in-live-view-image irradiation position may be derived based on the latest parameter including the updated inter-reference-point distance d in step 404 shown in FIG. 23.

Second Embodiment

Although it has been described in the first embodiment that the in-live-view-image irradiation position is derived regardless of a dissimilarity between the distances before and after the measurement is performed, it will be described in a second embodiment that whether or not to derive the in-live-view-image irradiation position is determined depending on the dissimilarity between the distances before and after the measurement is performed. In the second embodiment, since the same constituent elements as the constituent elements described in the first embodiment will be assigned the same references, the description thereof will be omitted, and only portions different from those of the first embodiment will be described.

A distance measurement device 10B (see FIGS. 1 and 4) according to the second embodiment is different from the distance measurement device 10A in that an irradiation position adjustment process shown in FIG. 27 is performed instead of the irradiation position adjustment process shown in FIG. 23.

The distance measurement device 10B according to the second embodiment is different from the distance measurement device 10A in that an irradiation position adjustment program 132 instead of the irradiation position adjustment program 107 is stored in the secondary storage unit 104 (see FIG. 8).

Next, an irradiation position adjustment process which is realized as the action of the distance measurement device 10B by performing the irradiation position adjustment program 132 in the CPU 100 will be described with reference to FIG. 27. The same steps as those of the flowcharts shown in FIG. 23 will be assigned the same step numbers, and thus, the description thereof will be omitted. Hereinafter, for the sake of convenience in description, it will be described on the assumption that the process of step 238 of the distance measurement process described in the first embodiment is already performed.

The irradiation position adjustment process shown in FIG. 27 is different from the irradiation position adjustment process shown in FIG. 23 in that step 403 is provided between step 402 and step 404.

In step 403, the control unit 100B derives a distance dissimilarity, and determines whether or not the derived distance dissimilarity exceeds a threshold value. For example, in a case where the process of step 404 is already performed, the distance dissimilarity is a dissimilarity between the distance used in the previous deriving task of the in-live-view-image irradiation position performed by the deriving unit 100A and the latest distance measured by performing the process of step 402.

In step 403, in a case where the process of step 404 is already performed, an absolute value of a difference between the distance used in the previous deriving task of the in-live-view-image irradiation position performed by the deriving unit 100A and the latest distance measured by performing the process of step 402 is used as an example of the distance dissimilarity.

For example, in a case where the process of step 404 is not performed yet, the distance dissimilarity is a dissimilarity between the distance used in the deriving of the in-actual-image irradiation position performed by the deriving unit 100A and the latest distance measured by performing the process of step 402.

In step 403, in a case where the process of step 404 is not performed yet, an absolute value of a difference between the distance used in the deriving of the in-actual-image irradiation position performed by the deriving unit 100A and the latest distance measured by performing the process of step 402 is used as the example of the distance dissimilarity.

Although the absolute value of the difference is used as the example of the distance dissimilarity, the technology of the present disclosure is not limited thereto. For example, in a case where the process of step 404 is not performed yet, a ratio of the latest distance measured by performing the process of step 402 to the distance used in the deriving of the in-actual-image irradiation position performed by the deriving unit 100A may be used as the distance dissimilarity. For example, in a case where the process of step 404 is already performed, a ratio of the latest distance measured by performing the process of step 402 to the distance used in the previous deriving task of the in-live-view-image irradiation position performed by the deriving unit 100A may be used as the distance dissimilarity.

In a case where the distance dissimilarity exceeds the threshold value in step 403, the determination result is positive, and the process proceeds to step 404. In a case where the distance dissimilarity is equal to or less than the threshold value in step 403, the determination result is negative, and the process proceeds to step 400.
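
In other words, step 403 acts as a gate in front of step 404. A minimal sketch, assuming the absolute difference is used as the distance dissimilarity (the ratio-based variant mentioned above would replace the subtraction):

    def should_rederive(previous_distance, latest_distance, threshold):
        # Step 403: proceed to step 404 only when the distance has changed
        # appreciably since it was last used to derive an irradiation position.
        dissimilarity = abs(latest_distance - previous_distance)
        return dissimilarity > threshold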

As described above, in the distance measurement device 10B, the distance is intermittently measured by performing the processes of step 400 and step 402. In a case where the latest distance dissimilarity exceeds the threshold value (step 403: Y), the processes subsequent to step 404 are performed.

Therefore, according to the distance measurement device 10B, it is possible to easily maintain the in-live-view-image irradiation position in the frame 117 compared to a case where the processes subsequent to step 404 are not performed even in a case where the distance dissimilarity exceeds the threshold value.

Third Embodiment

Although it has been described in the second embodiment that the in-live-view-image irradiation position is able to be adjusted under the condition in which the default time comes, it will be described in a third embodiment that the in-live-view-image irradiation position is able to be adjusted under the condition in which the release button is half pressed. In the third embodiment, since the same constituent elements as the constituent elements described in the first and second embodiments will be assigned the same references, the description thereof will be omitted, and only portions different from those of the first and second embodiments will be described.

A distance measurement device 10C (see FIGS. 1 and 4) according to the third embodiment is different from the distance measurement device 10B in that an irradiation position adjustment process shown in FIG. 28 is performed instead of the irradiation position adjustment process shown in FIG. 27.

The distance measurement device 10C according to the third embodiment is different from the distance measurement device 10B in that an irradiation position adjustment program 134 instead of the irradiation position adjustment program 132 is stored in the secondary storage unit 104 (see FIG. 8).

Next, an irradiation position adjustment process which is realized as the action of the distance measurement device 10C by performing the irradiation position adjustment program 134 in the CPU 100 will be described with reference to FIG. 28. The same steps as those of the flowcharts shown in FIG. 27 will be assigned the same step numbers, and thus, the description thereof will be omitted.

The irradiation position adjustment process shown in FIG. 28 is different from the irradiation position adjustment process shown in FIG. 27 in that step 450 is provided instead of step 400.

In step 450, the control unit 100B determines whether or not the release button is in the half pressed state. In a case where the release button is in the half pressed state in step 450, the determination result is positive, and the process proceeds to step 402. In a case where the release button is not in the half pressed state in step 450, the determination result is negative, and the process proceeds to step 416.
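
Step 450 therefore replaces the timing condition of step 400 with a half-press condition. A schematic sketch, with release_half_pressed, adjust_once, and end_condition as hypothetical stand-ins for the processing of FIG. 28:

    def irradiation_position_adjustment_half_press(release_half_pressed,
                                                   adjust_once, end_condition):
        while not end_condition():        # step 416
            if release_half_pressed():    # step 450
                adjust_once()             # steps 402 to 412 as in FIG. 27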

As described above, in the distance measurement device 10C, in a case where the release button is in the half pressed state (step 450: Y), the processes subsequent to step 402 are performed.

Therefore, according to the distance measurement device 10C, it is possible to prevent the in-live-view-image irradiation position from entering a state in which the in-live-view-image irradiation position is not positioned within the frame 117 at the time of the actual exposing compared to a case where the processes subsequent to step 402 are not performed in a case where the release button is in the half pressed state.

Fourth Embodiment

Although it has been described in the first to third embodiments that the distance measurement unit 12 is rotated by activating the rotation mechanism by the power generated by the motors 17 and 19, it will be described in a fourth embodiment that the distance measurement unit 12 is manually rotated. In the fourth embodiment, since the same constituent elements as the constituent elements described in the first to third embodiments will be assigned the same references, the description thereof will be omitted, and only portions different from those of the first to third embodiments will be described.

For example, as shown in FIG. 29, a distance measurement device 10D according to the fourth embodiment is different from the distance measurement device 10C in that the imaging device 139 instead of the imaging device 14 is provided. The imaging device 139 is different from the imaging device 14 in that an imaging device main body 180 instead of the imaging device main body 18 is provided. The imaging device main body 180 is different from the imaging device main body 18 in that a rotary encoder 25 is provided instead of the motor 17 and the motor driver 21. The distance measurement device 10D according to the fourth embodiment is different from the distance measurement device 10C in that a rotary encoder 27 is provided instead of the motor 19 and the motor driver 23.

The rotary encoder 25 is connected to the longitudinal rotation mechanism 13 and the busline 84, and detects the rotation direction and the rotation amount of the hot shoe 20 rotated by the longitudinal rotation mechanism 13. The main control unit 62 acquires the rotation direction and the rotation amount detected by the rotary encoder 25. The rotary encoder 27 is connected to the horizontal rotation mechanism 15 and the busline 84, and detects the rotation direction and the rotation amount of the hot shoe 20 rotated by the horizontal rotation mechanism 15. The main control unit 62 acquires the rotation direction and the rotation amount detected by the rotary encoder 27.

A distance measurement device 10D according to the fourth embodiment is different from the distance measurement device 10C in that an irradiation position adjustment process shown in FIG. 30 is performed instead of the irradiation position adjustment process shown in FIG. 28.

The distance measurement device 10D according to the fourth embodiment is different from the distance measurement device 10C in that an irradiation position adjustment program 136 instead of the irradiation position adjustment program 134 is stored in the secondary storage unit 104 (see FIG. 8).

Next, an irradiation position adjustment process which is realized as the action of the distance measurement device 10D by performing the irradiation position adjustment program 136 in the CPU 100 will be described with reference to FIG. 30. The same steps as those of the flowcharts shown in FIG. 28 will be assigned the same step numbers, and thus, the description thereof will be omitted. Hereinafter, for the sake of convenience in description, it will be assumed that the distance measurement unit 12 is not rotated by the power of the motors 17 and 19, the rotation mechanism is manually activated, and the distance measurement unit 12 is rotated according to the rotation operation of the rotation mechanism.

The irradiation position adjustment process shown in FIG. 30 is different from the irradiation position adjustment process shown in FIG. 28 in that step 460 is provided instead of step 410 and step 462 is provided instead of step 412.

In step 460, the control unit 100B determines whether or not the distance measurement unit 12 is rotated. In a case where the distance measurement unit 12 is not rotated in step 460, the determination result is negative, and the process proceeds to step 416. In a case where the distance measurement unit 12 is rotated in step 460, the determination result is positive, and the process proceeds to step 462.

In step 462, the control unit 100B updates the emission angle β according to the rotation direction and the rotation amount of the distance measurement unit 12. Thereafter, the process proceeds to step 450.
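
Steps 460 and 462 can be sketched as follows, assuming a hypothetical rotary-encoder API that reports the rotation direction and amount detected by the rotary encoder 25 or 27 (the read() call and its return values are illustrative, not the actual interface):

    def update_beta_manually(beta_deg, rotary_encoder):
        # Steps 460 and 462: detect a manual rotation of the distance measurement
        # unit 12 and fold it into the emission angle beta.
        direction, amount_deg = rotary_encoder.read()   # hypothetical API
        if amount_deg == 0.0:
            return beta_deg, False                      # step 460: not rotated
        return beta_deg + direction * amount_deg, True  # step 462: updated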

As described above, in the distance measurement device 10D, in a case where the distance measurement unit 12 is manually rotated and the in-live-view-image irradiation position is out of the frame 117, the distance is measured by the distance measurement control unit 68 until the in-live-view-image irradiation position is positioned within the frame 117. The in-live-view-image irradiation position is derived by the deriving unit 100A based on the measured distance and the emission angle β.

Therefore, according to the distance measurement device 10D, it is possible to easily reflect an intention of the user on the change of the emission angle β compared to a case where the distance measurement unit 12 is not able to be manually rotated.

Fifth Embodiment

Although it has been described in the first embodiment that the distance measurement device 10A is realized by the distance measurement unit 12 and the imaging device 14, a distance measurement device 10E realized by the distance measurement unit 12, an imaging device 140, and a smart device 142 will be described in a fifth embodiment.

In the fifth embodiment, since the same constituent elements as those of the above-described embodiments will be assigned the same references, the description thereof will be omitted, and only portions different from those of the above-described embodiments will be described. Hereinafter, the distance measurement program and the irradiation position adjustment programs are referred to as the “program” for the sake of convenience in description in a case where it is not necessary to distinguish between the distance measurement program 106 and the irradiation position adjustment programs 107, 132, 134, and 136. Hereinafter, the irradiation position adjustment programs are referred to as the “irradiation position adjustment program” without being assigned the references for the sake of convenience in description in a case where it is not necessary to distinguish between the irradiation position adjustment programs 107, 132, 134, and 136.

For example, as shown in FIG. 31, the distance measurement device 10E according to the fifth embodiment is different from the distance measurement device 10A according to the first embodiment in that the imaging device 140 instead of the imaging device 14 is provided. The distance measurement device 10E is different from the distance measurement device 10A in that the smart device 142 is provided.

The imaging device 140 is different from the imaging device 14 in that an imaging device main body 143 instead of the imaging device main body 18 is provided.

The imaging device main body 143 is different from the imaging device main body 18 in that a wireless communication unit 144 and a wireless communication antenna 146 are provided.

The wireless communication unit 144 is connected to the busline 84 and the wireless communication antenna 146. The main control unit 62 outputs transmission target information which is information of a target transmitted to the smart device 142 to the wireless communication unit 144.

The wireless communication unit 144 transmits, as a radio wave, the transmission target information input from the main control unit 62 to the smart device 142 through the wireless communication antenna 146. In a case where a radio wave from the smart device 142 is received by the wireless communication antenna 146, the wireless communication unit 144 acquires a signal corresponding to the received radio wave, and outputs the acquired signal to the main control unit 62.

The smart device 142 includes a CPU 148, a primary storage unit 150, and a secondary storage unit 152. The CPU 148, the primary storage unit 150, and the secondary storage unit 152 are connected to a busline 162.

The CPU 148 controls the entire distance measurement device 10E including the smart device 142. The primary storage unit 150 is a volatile memory used as a work area in a case where various programs are executed. A RAM is used as an example of the primary storage unit 150. The secondary storage unit 152 is a non-volatile memory that stores various parameters or control programs for controlling the entire activation of the distance measurement device 10E including the smart device 142. A flash memory or an EEPROM is used as an example of the secondary storage unit 152.

The smart device 142 includes a display unit 154, a touch panel 156, a wireless communication unit 158, and a wireless communication antenna 160.

The display unit 154 is connected to the busline 162 through a display control unit (not shown), and displays various information items under the control of the display control unit. For example, the display unit 154 is realized by an LCD.

The touch panel 156 is layered on a display screen of the display unit 154, and senses touch using a pointer. The touch panel 156 is connected to the busline 162 through a touch panel I/F (not shown), and outputs positional information indicating a position touched by the pointer. The touch panel I/F activates the touch panel according to an instruction of the CPU 148, and outputs the positional information input from the touch panel 156 to the CPU 148.

The soft keys corresponding to the actual measurement and actual imaging button 90A, the provisional measurement and provisional imaging button 90B, the imaging system operation mode switching button 90C, the wide angle instruction button 90D, the telephoto instruction button 90E, and the irradiation position adjustment button 90F which are described above are displayed on the display unit 154.

For example, as shown in FIG. 32, an actual measurement and actual imaging button 90A1 functioning as the actual measurement and actual imaging button 90A is displayed as a soft key on the display unit 154, and is pressed by the user through the touch panel 156.

For example, a provisional measurement and provisional imaging button 90B1 functioning as the provisional measurement and provisional imaging button 90B is displayed as a soft key on the display unit 154, and is pressed by the user through the touch panel 156.

For example, an imaging system operation mode switching button 90C1 functioning as the imaging system operation mode switching button 90C is displayed as a soft key on the display unit 154, and is pressed by the user through the touch panel 156.

For example, a wide angle instruction button 90D1 functioning as the wide angle instruction button 90D is displayed as a soft key on the display unit 154, and is pressed by the user through the touch panel 156.

For example, a telephoto instruction button 90E1 functioning as the telephoto instruction button 90E is displayed as a soft key on the display unit 154, and is pressed by the user through the touch panel 156.

For example, an irradiation position adjustment button 90F1 functioning as the irradiation position adjustment button 90F is displayed as a soft key on the display unit 154, and is pressed by the user through the touch panel 156.

The wireless communication unit 158 is connected to the busline 162 and the wireless communication antenna 160. The wireless communication unit 158 transmits, as a radio wave, a signal input from the CPU 148 to the imaging device main body 143 through the wireless communication antenna 160. In a case where a radio wave from the imaging device main body 143 is received by the wireless communication antenna 160, the wireless communication unit 158 acquires a signal corresponding to the received radio wave, and outputs the acquired signal to the CPU 148. Accordingly, the imaging device main body 143 is controlled by the smart device 142 by performing wireless communication with the smart device 142.

The secondary storage unit 152 stores the program. The CPU 148 reads the distance measurement program out of the secondary storage unit 152, loads the readout distance measurement program into the primary storage unit 150, and executes the distance measurement program. Thus, the distance measurement process described in the first embodiment is realized.

The CPU 148 reads the irradiation position adjustment program out of the secondary storage unit 152, loads the readout irradiation position adjustment program into the primary storage unit 150, and executes the irradiation position adjustment program. Thus, the irradiation position adjustment process described in the first to fourth embodiments is realized.

As described above, in the distance measurement device 10E, the correspondence information acquired by associating the in-provisional-image irradiation position with the distance which corresponds to the in-provisional-image irradiation position and is provisionally measured by using the laser beam is acquired by the CPU 148 of the smart device 142 whenever each of the plurality of distances is provisionally measured. The in-actual-image irradiation position is derived based on the acquired correspondence information by the CPU 148 of the smart device 142. Therefore, according to the distance measurement device 10E, it is possible to derive the in-actual-image irradiation position with high accuracy compared to a case where the actual measurement and the actual imaging are performed without performing the provisional measurement and the provisional imaging. In the distance measurement device 10E, the CPU 148 is operated as the deriving unit 100A and the control unit 100B by executing the irradiation position adjustment program (see FIG. 10). Therefore, according to the distance measurement device 10E, it is possible to acquire the same effects as the effects acquired by performing the irradiation position adjustment process described in the first to fourth embodiments.

According to the distance measurement device 10E, it is possible to reduce a load applied to the imaging device 140 in acquiring the effects described in the above-described embodiments compared to a case where the distance measurement process and the irradiation position adjustment process are performed by the imaging device 140.

Although it has been described in the above-described embodiments that the program is read out of the secondary storage unit 104 (152), it is not necessary to store the program in the secondary storage unit 104 (152) from the beginning. For example, as shown in FIG. 33, the program may be stored in an arbitrary portable storage medium 500 such as a solid state drive (SSD) or a universal serial bus (USB) memory. In this case, the program stored in the storage medium 500 is installed on the distance measurement device 10A (10B, 10C, 10D, or 10E), and the installed program is executed by the CPU 100 (148).

The program may be stored in a storage unit of another computer or a server device connected to the distance measurement device 10A (10B, 10C, 10D, or 10E) through a communication network (not shown), and the program may be downloaded according to a request of the distance measurement device 10A (10B, 10C, 10D, or 10E). In this case, the downloaded program is executed by the CPU 100 (148).

Although it has been described in the above-described embodiments that various information items such as the actual image, the provisional image, the distance, the in-actual-image irradiation position, and the provisional measurement and provisional imaging guide screen 112 are displayed on the display unit 86 (154), the technology of the present disclosure is not limited thereto. For example, various information items may be displayed on a display unit of an external device used while being connected to the distance measurement device 10A (10B, 10C, 10D, or 10E). A personal computer or an eyeglass type or wristwatch type wearable terminal device is used as an example of the external device.

Although it has been described in the above-described embodiments that various information items are visually displayed by the display unit 86 (154), the technology of the present disclosure is not limited thereto. For example, instead of the visual display, an audible indication such as an output of sound from a sound playback device may be performed, or a permanent visual display such as an output of a printed article from a printer may be performed. Alternatively, at least two of the visual display, the audible indication, or the permanent visual display may be performed.

Although it has been described in the above-described embodiments that the first intention check screen 110, the provisional measurement and provisional imaging guide screen 112, the re-performing guide screen 114, the irradiation position marks 116 and 116A, the frame 117, the second intention check screen 118, and the messages 119 and 121 are displayed on the display unit 86 (154), the technology of the present disclosure is not limited thereto. For example, the first intention check screen 110, the provisional measurement and provisional imaging guide screen 112, the re-performing guide screen 114, the second intention check screen 118, and the messages 119 and 121 may be displayed on a display unit (not shown) different from the display unit 86 (154), and the irradiation position marks 116 and 116A and the frame 117 may be displayed on the display unit 86 (154). Only at least one of the first intention check screen 110, the provisional measurement and provisional imaging guide screen 112, the re-performing guide screen 114, the irradiation position marks 116 and 116A, the frame 117, the second intention check screen 118, or the messages 119 and 121 may be displayed on a display unit different from the display unit 86 (154). The first intention check screen 110, the provisional measurement and provisional imaging guide screen 112, the re-performing guide screen 114, the irradiation position marks 116 and 116A, the frame 117, the second intention check screen 118, and the messages 119 and 121 may be individually displayed on a plurality of display units including the display unit 86 (154).

Although it has been described in the first embodiment that a power for changing the emission angle β by the default rotation amount is used as the power for changing the emission angle β, a rotation amount other than the default rotation amount (for example, a rotation amount that changes every time) may be used. However, it is preferable that the power for changing the emission angle β is a power for changing the emission angle β by the default rotation amount.

As stated above, according to the distance measurement device 10A, since the power for changing the emission angle β by the default rotation amount is generated by the motors 17 and 19, easy control is realized compared to a case where the power for changing the emission angle β by a rotation amount other than the default rotation amount is generated by the motors 17 and 19.

In the first embodiment, a default rotation amount determined such that a movement amount of the in-live-view-image irradiation position is less than an outer diameter of the frame 117 may be used. Therefore, according to the distance measurement device 10A, it is possible to prevent the irradiation position mark 116A from exceeding the frame 117 compared to a case where the emission angle β is changed by the default rotation amount determined such that the movement amount of the in-live-view-image irradiation position is equal to or greater than the outer diameter of the frame 117.
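
Under the same hypothetical pinhole model sketched earlier, changing β by Δβ radians moves the in-live-view-image irradiation position by roughly (f / pixel_pitch) × Δβ pixels, so such a default rotation amount can be bounded as follows (all constants illustrative):

    import math

    def max_default_step_deg(f, pixel_pitch, frame_outer_diameter_px):
        # Keep the per-step movement of the irradiation position below the
        # outer diameter of the frame 117.
        max_step_rad = frame_outer_diameter_px * pixel_pitch / f
        return math.degrees(max_step_rad)

    # For example, f = 0.02 m, pixel_pitch = 5e-6 m, and a 60-pixel frame
    # diameter give a maximum step of about 0.86 degrees.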

Although it has been described in the above-described embodiments that the laser beam is used as the light for distance measurement, the technology of the present disclosure is not limited thereto. Directional light, which is light having directivity, may be used. For example, the measurement light may be directional light acquired by a light emitting diode (LED) or a super luminescent diode (SLD). It is preferable that the directivity of the directional light is directivity of the same degree as the directivity of the laser beam. For example, it is preferable that the directivity of the directional light is directivity capable of being used in the distance measurement in a range of several meters to several kilometers.

The distance measurement process and the irradiation position adjustment process described in the above-described embodiments are merely examples. Accordingly, an unnecessary step may be removed, a new step may be added, or the processing order may be changed without departing from the gist. The processes included in the distance measurement process may be realized by only a hardware configuration such as an ASIC, or may be realized by a combination of a software configuration and a hardware configuration using a computer.

The disclosures of Japanese Patent Application No. 2015-171421 filed on Aug. 31, 2015 are hereby incorporated by reference in their entireties.

All documents, patent applications, and technical standards described in the present specification are herein incorporated by reference to the same extent as if such individual document, patent application, and technical standard were specifically and individually indicated to be herein incorporated by reference.

The above-described embodiments are further disclosed in the following appendix.

(Appendix 1)

A distance measurement device comprises an imaging unit that images a subject image indicating a subject, a measurement unit that measures a distance to the subject by emitting directional light which is light having directivity to the subject and receiving reflection light of the directional light, a change unit that is capable of changing an angle at which the directional light is emitted, a deriving unit that derives an in-image irradiation position, which corresponds to an irradiation position of the directional light onto the subject which is used in measurement performed by the measurement unit, within a captured image acquired by imaging the subject by the imaging unit based on the angle and the distance measured by the measurement unit, and a control unit that controls the measurement unit to measure the distance, and controls the deriving unit to derive the in-image irradiation position based on the distance measured by the measurement unit and the angle changed by the change unit, until the in-image irradiation position falls in a default range within the captured image in a case where the in-image irradiation position is out of the default range.

Claims

1. A distance measurement device comprising:

an image sensor that images a subject;
a measurement unit that includes a light emitter and a light receiver and that measures a distance to the subject by the light emitter and the light receiver, the light emitter emitting directional light, which is light having directivity toward the subject, and the light receiver receiving reflection light of the directional light;
a change mechanism including a motor that independently moves the measurement unit with respect to the image sensor and is capable of changing an angle at which the directional light is emitted; and
a processor that is configured to function as:
a deriving unit that derives an in-image irradiation position, which corresponds to an irradiation position of the directional light onto the subject that is used in measurement performed by the measurement unit, within a captured image acquired by imaging the subject by the image sensor based on the angle and the distance measured by the measurement unit; and
a control unit that moves the in-image irradiation position within the captured image and controls the measurement unit to measure the distance, in a case in which the in-image irradiation position is out of a default range, the default range being a specific region in a display region of the captured image.

2. The distance measurement device according to claim 1,

wherein the control unit controls the measurement unit to measure the distance, controls the change mechanism to change the angle by driving a power source, and controls the deriving unit to derive the in-image irradiation position based on the distance measured by the measurement unit and the angle changed by the change mechanism, until the in-image irradiation position falls in the default range, in a case where the in-image irradiation position is out of the default range.

3. The distance measurement device according to claim 2,

wherein the control unit controls the power source to generate power for causing the change mechanism to change the angle, in a direction in which a distance between the in-image irradiation position and the default range decreases based on a positional relation between the latest in-image irradiation position and the default range.

4. The distance measurement device according to claim 1,

wherein
the change mechanism includes a rotation mechanism that changes the angle by manually rotating at least the light emitter of the measurement unit.

5. The distance measurement device according to claim 1,

wherein the control unit performs the control for a period during which a plurality of captured images acquired by continuously imaging the subject by the image sensor in time sequence are continuously displayed on a first display unit.

6. The distance measurement device according to claim 1,

wherein the control unit controls the measurement unit to intermittently measure the distance, in a case in which a dissimilarity between a distance used in the deriving of the in-image irradiation position performed in a previous stage by the deriving unit and a latest distance measured by the measurement unit is equal to or greater than a threshold value.

7. The distance measurement device according to claim 1,

wherein the control unit controls a second display unit to display the captured image, and causes the latest in-image irradiation position derived by the deriving unit to be displayed so as to be specified in a display region of the captured image.

8. The distance measurement device according to claim 1,

wherein the control unit controls a third display unit to display the captured image, and causes the default range to be displayed so as to be specified in a display region of the captured image.

9. The distance measurement device according to claim 1,

wherein the control unit controls a first notification unit to notify that the in-image irradiation position is within the default range in a case in which the in-image irradiation position is within the default range.

10. The distance measurement device according to claim 1,

wherein the control unit controls a second notification unit to notify that the in-image irradiation position is out of the default range in a case in which the in-image irradiation position is out of the default range.

11. The distance measurement device according to claim 1,

wherein the control unit moves the in-image irradiation position within the captured image by the change mechanism and controls the measurement unit to measure the distance until the in-image irradiation position falls in a default range within the captured image.

12. A distance measurement device comprising:

an image sensor that images a subject;
a measurement unit that includes a light emitter and a light receiver and that measures a distance to the subject by the light emitter and the light receiver, the light emitter emitting directional light, which is light having directivity toward the subject, and the light receiver receiving reflection light of the directional light;
a change mechanism including a motor that is capable of changing an angle at which the directional light is emitted; and
a processor that is configured to function as:
a deriving unit that derives an in-image irradiation position, which corresponds to an irradiation position of the directional light onto the subject that is used in measurement performed by the measurement unit, within a captured image acquired by imaging the subject by the image sensor based on the angle and the distance measured by the measurement unit; and
a control unit that controls the measurement unit to measure the distance, and controls the deriving unit to derive the in-image irradiation position based on the distance measured by the measurement unit and the angle changed by the change mechanism, until the in-image irradiation position falls in a default range within the captured image, in a case in which the in-image irradiation position is out of the default range,
a performing unit that performs at least one of focus adjustment or exposure adjustment on the subject,
wherein the control unit performs the control in a case in which an imaging preparation instruction, which causes the performing unit to start to perform at least one of the focus adjustment or the exposure adjustment before actual exposing is performed by the image sensor, is received.

13. A control method for distance measurement, comprising:

deriving an in-image irradiation position, which corresponds to an irradiation position, onto a subject, of directional light which is light having directivity, that is used in measurement performed by a processor that measures a distance to the subject by causing a light emitter to emit the directional light to the subject and a light receiver to receive reflection light of the directional light, within a captured image acquired by imaging the subject by an image sensor that images the subject, based on the distance measured by the processor, the image sensor and the processor being included in a distance measurement device;
changing an angle changed by a change mechanism including a motor that separately moves the processor with respect to the image sensor and is capable of changing the angle at which the directional light is emitted;
moving the in-image irradiation position within the captured image by the change mechanism, in a case where the in-image irradiation position is out of the default range, the default range being a specific region in a display region of the captured image; and
controlling the processor to measure the distance after the in-image irradiation position moves within the captured image.
References Cited
U.S. Patent Documents
5673082 September 30, 1997 Wells et al.
9377303 June 28, 2016 Giger
20020060784 May 23, 2002 Pack
20060290781 December 28, 2006 Hama
20090138233 May 28, 2009 Kludas
20130179119 July 11, 2013 Coddington
20160103209 April 14, 2016 Masuda et al.
Foreign Patent Documents
H06-147844 May 1994 JP
2007-010346 January 2007 JP
2014-232095 December 2014 JP
Other references
  • International Search Report issued in International Application No. PCT/JP2016/063582 dated Jul. 12, 2016.
  • Written Opinion of the ISA issued in International Application No. PCT/JP2016/063582 dated Jul. 12, 2016.
  • International Preliminary Report on Patentability issued in International Application No. PCT/JP2016/063582 dated Oct. 3, 2016.
Patent History
Patent number: 11047980
Type: Grant
Filed: Feb 26, 2018
Date of Patent: Jun 29, 2021
Patent Publication Number: 20180180737
Assignee: FUJIFILM CORPORATION (Tokyo)
Inventor: Tomonori Masuda (Saitama)
Primary Examiner: Mark Hellner
Application Number: 15/904,455
Classifications
Current U.S. Class: Instrument Condition Testing Or Indicating (356/6)
International Classification: G01C 3/08 (20060101); G01S 17/08 (20060101); G01S 7/497 (20060101); G01S 7/481 (20060101); G03B 13/20 (20210101); H04N 5/232 (20060101); G01S 17/86 (20200101);