Abstract: A focus detection device includes: an imaging unit having a first pixel and a second pixel, each of which receives light transmitted through an optical system and outputs a signal used for focus detection, and a third pixel which receives light transmitted through the optical system and outputs a signal used for image generation; an input unit to which information regarding the optical system is input; a selection unit that selects one of the first and second pixels based on the information input to the input unit; a readout unit that reads out, based on the selection result, the signal from the selected one of the first and second pixels at a timing different from the timing at which the signal from the third pixel is read out; and a focus detection unit that performs focus detection based on at least one of the signals of the first and second pixels read out by the readout unit.
Type: Application
Filed: March 16, 2023
Publication date: July 6, 2023
Applicant: NIKON CORPORATION
Inventors: Sinsuke SAMBONGI, Akira KINOSHITA, Yuki KITA
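A minimal sketch of the selection-and-readout flow described in the focus detection abstract above, assuming hypothetical names for the lens information, the selection threshold, and the sensor interface (none of these identifiers or values come from the patent):

```python
from dataclasses import dataclass


@dataclass
class LensInfo:
    """Hypothetical information about the mounted optical system."""
    exit_pupil_distance_mm: float


def select_focus_pixel_group(info: LensInfo) -> str:
    """Select the first or second focus-detection pixel group from the lens info.

    The 70 mm threshold is an illustrative assumption, not a value from the patent.
    """
    return "first" if info.exit_pupil_distance_mm < 70.0 else "second"


def read_frame(sensor, info: LensInfo):
    """Read the selected focus-detection pixels and the image pixels at separate timings."""
    group = select_focus_pixel_group(info)
    af_signals = sensor.read(group)       # focus-detection readout (e.g. early in the frame)
    image_signals = sensor.read("image")  # image-generation (third pixel) readout, later
    return af_signals, image_signals
```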
Abstract: A blur correction device includes a blur correction element that is movable to correct blurring of a subject image formed via an image-capturing optical system, a motion sensor that detects a motion of a device equipped with the blur correction device, an input unit to which information indicating a position on an image sensor that captures the subject image is input, and a drive unit that moves the blur correction element based on the position and the motion. When the input position differs, the direction in which the blur correction element is moved based on the motion also differs.
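As an illustration of the last sentence, a position-dependent correction direction can arise when rotational (roll) motion about the optical axis is compensated: the image shift at an off-center point on the sensor is perpendicular to the radius to that point, so the required correction direction changes with the chosen position. The roll interpretation is my assumption, not stated in the abstract; a small sketch under that assumption:

```python
def correction_vector(pos_xy, roll_rate_rad_s, dt):
    """Shift for the blur-correction element for a point at pos_xy (mm) on the sensor,
    given a roll rate (rad/s) about the optical axis over a time step dt (s).

    Under roll, the image at (x, y) moves along (-y, x); the correction opposes that,
    so its direction differs for different positions on the sensor.
    """
    x, y = pos_xy
    dtheta = roll_rate_rad_s * dt
    return (y * dtheta, -x * dtheta)  # opposite to the image motion


# The same motion calls for different correction directions at different positions:
print(correction_vector((10.0, 0.0), 0.02, 0.01))  # (0.0, -0.002)
print(correction_vector((0.0, 10.0), 0.02, 0.01))  # (0.002, 0.0)
```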
Abstract: A camera body to which one or more camera accessories, into which light from a subject enters, are detachably mountable includes: a body-side terminal; and a communication unit that communicates with at least one of the camera accessories, wherein the communication unit requests, via the body-side terminal, a discriminating signal from the camera accessory indicating whether or not another camera accessory capable of communicating with the camera body is mountable on a subject side of the camera accessory.
Abstract: A processing apparatus is equipped with: a first stage system that has a table on which a workpiece is placed and moves the workpiece held by the table; a beam irradiation system that includes a condensing optical system to emit beams; and a controller to control the first stage system and the beam irradiation system. Processing is performed on a target portion of the workpiece while the table and the beams from the condensing optical system are moved relative to each other, and at least one of an intensity distribution of the beams at a first plane on an exit surface side of the condensing optical system and an intensity distribution of the beams at a second plane, whose position in a direction of an optical axis of the condensing optical system is different from that of the first plane, can be changed.
Abstract: Provided are ITO particles satisfying a relationship expressed in Expression (1) given below. 16×S/P² ≧ 0.330 . . . (1) (In the expression, S indicates a particle area in a TEM photographed image, and P indicates a perimeter of the particle.)
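Expression (1) is a dimensionless shape metric: it equals 1 for a square and 4/π ≈ 1.27 for a circle, and drops for elongated or ragged outlines. A small sketch of how it could be evaluated once a particle's area and perimeter have been measured from a TEM image (the unit names are only placeholders; any consistent length unit works):

```python
import math


def shape_metric(area_nm2: float, perimeter_nm: float) -> float:
    """Compute 16*S/P^2 from a particle's area S and perimeter P (Expression (1))."""
    return 16.0 * area_nm2 / perimeter_nm ** 2


def satisfies_expression_1(area_nm2: float, perimeter_nm: float) -> bool:
    return shape_metric(area_nm2, perimeter_nm) >= 0.330


# Reference shapes: a circle of radius r and a square of side a.
r, a = 10.0, 10.0
print(shape_metric(math.pi * r**2, 2 * math.pi * r))  # ~1.273 (circle)
print(shape_metric(a**2, 4 * a))                      # 1.0 (square)
```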
Abstract: An image sensor includes: a first readout circuit that reads out a first signal, being generated by an electric charge resulting from a photoelectric conversion, to a first signal line; a first holding circuit that holds a voltage based on an electric current from a power supply circuit; and a first electric current source that supplies the first signal line with an electric current generated by the voltage held in the first holding circuit, wherein: the first holding circuit holds the voltage based on the electric current from the power supply circuit when the first signal is not read out to the first signal line by the first readout circuit.
Abstract: An electronic device includes: a display control unit that is configured to display at a display unit a first image generated by capturing an image with light having entered a first area of an image sensor and a second image generated by capturing an image with light having entered a second area of the image sensor different from the first area; and a selection unit that is configured to select either the first area or the second area as an area for image-capturing condition adjustment in reference to the first image and the second image displayed at the display unit.
Abstract: A display apparatus according to the present embodiment includes a storage section for storing, as a set, distance measurement data acquired by performing a distance measurement and position data indicating a position at which the distance measurement was performed, a position detecting section for detecting a current position, and a display section for displaying, from among the distance measurement data stored in the storage section, the distance measurement data stored as a set with position data corresponding to the detection result of the current position detected by the position detecting section. By displaying only the distance measurement data whose stored position data corresponds to the detected current position, only the distance measurement data necessary for the current position is presented to the user.
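A minimal sketch of the lookup described here, assuming each stored record pairs a distance measurement with the position where it was taken and that "corresponding to the current position" means lying within some radius of it (the 50 m radius and the record fields are illustrative assumptions):

```python
from dataclasses import dataclass
from math import hypot
from typing import List


@dataclass
class Record:
    x_m: float          # position where the measurement was performed
    y_m: float
    distance_m: float   # the stored distance-measurement result


def records_for_current_position(store: List[Record], cur_x: float, cur_y: float,
                                 radius_m: float = 50.0) -> List[Record]:
    """Return only the records whose stored position matches the current position."""
    return [r for r in store if hypot(r.x_m - cur_x, r.y_m - cur_y) <= radius_m]
```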
Abstract: An imaging element includes: an imaging unit in which a plurality of pixel groups, each including a plurality of pixels that output pixel signals according to incident light, are formed, and on which incident light corresponding to mutually different pieces of image information is incident; a control unit that controls, for each of the pixel groups, an accumulation period of the plurality of pixels included in the pixel group; and a readout unit that is provided to each of the pixel groups and reads out the pixel signals from the plurality of pixels included in the pixel group.
Abstract: An optical system (LS) includes lenses (L22, L23) satisfying the following conditional expressions: νdLZ < 35.0, and 0.702 < θgFLZ + (0.00316 × νdLZ), where νdLZ is the Abbe number of the lens with reference to the d-line and θgFLZ is a partial dispersion ratio of the lens defined by the expression θgFLZ = (ngLZ − nFLZ)/(nFLZ − nCLZ), where ngLZ, nFLZ, and nCLZ are the refractive indices of the lens with reference to the g-line, F-line, and C-line, respectively.
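A small numeric sketch of the two conditional expressions, using the standard Abbe-number definition νd = (nd − 1)/(nF − nC) together with the θgF definition given in the abstract; the refractive-index values below are illustrative assumptions, not values from the patent:

```python
def abbe_number(nd: float, nF: float, nC: float) -> float:
    """Standard Abbe number with reference to the d-line."""
    return (nd - 1.0) / (nF - nC)


def partial_dispersion_ratio(ng: float, nF: float, nC: float) -> float:
    """theta_gF = (ng - nF) / (nF - nC), as defined in the abstract."""
    return (ng - nF) / (nF - nC)


def satisfies_conditions(nd: float, ng: float, nF: float, nC: float) -> bool:
    nu = abbe_number(nd, nF, nC)
    theta = partial_dispersion_ratio(ng, nF, nC)
    return nu < 35.0 and 0.702 < theta + 0.00316 * nu


# Illustrative high-dispersion glass values (assumed, not from the patent):
print(satisfies_conditions(nd=1.80518, ng=1.84717, nF=1.82775, nC=1.79608))
```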
Abstract: A detection device includes: a first detector which irradiates a target object with light and detects light emitted from the target object; a first shape information generator which generates first shape information representing a first shape of the target object on the basis of a detection result of the first detector; and a second shape information generator which adds a second shape, which is based on information different from the detection result of the first detector, to the first shape, and which generates second shape information representing a shape including the first shape and the second shape.
Abstract: A processing apparatus includes: a beam irradiation apparatus that is configured to irradiate an object with an energy beam; and a beam deflection apparatus that is configured to change a propagating direction of the energy beam toward the beam irradiation apparatus, wherein when the energy beam propagating toward the beam irradiation apparatus from the beam deflection apparatus propagates in a first direction, the beam irradiation apparatus emits the energy beam in a second direction, and when the energy beam propagating toward the beam irradiation apparatus from the beam deflection apparatus propagates in a third direction that is different from the first direction, the beam irradiation apparatus emits the energy beam in a fourth direction that is different from the second direction.
Abstract: An image sensor includes: a first imaging region that captures an image of light entering through an optical system under a first imaging condition and generates a detection signal to perform focus detection of the optical system; and a second imaging region that captures an image of the light entering through the optical system under a second imaging condition other than the first imaging condition and generates an image signal.
Abstract: A first circuit layer including a first semiconductor substrate with a photoelectric conversion unit that photoelectrically converts incident light and generates a charge, and a first wiring layer with wiring that reads out a signal based upon the charge generated by the photoelectric conversion unit; a second circuit layer including a second wiring layer with wiring connected to the wiring of the first wiring layer, and a second semiconductor substrate with a through electrode connected to the wiring of the second wiring layer; a third circuit layer including a third semiconductor substrate with a through electrode connected to the through electrode of the second circuit layer, and a third wiring layer with wiring connected to the through electrode of the third semiconductor substrate; and a fourth circuit layer including a fourth wiring layer with wiring connected to the wiring of the third wiring layer, and a fourth semiconductor substrate connected to the wiring of the fourth wiring layer.
Abstract: An image capture apparatus includes: an acquisition unit configured to acquire image data of a subject; an identification unit configured to identify a color temperature of light from the subject on the basis of the image data acquired by the acquisition unit; an adjustment unit configured to adjust a white balance of the image data on the basis of the color temperature identified by the identification unit; and a suppression unit configured, if image data of a specific light-emitting body is included in the image data, to suppress a chroma of the image data that has been adjusted by the adjustment unit, on the basis of color information of the specific light-emitting body after the adjustment, by eliminating the color of the specific light-emitting body after the adjustment.
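A rough sketch of the processing order described here (identify color temperature, apply white-balance gains, then suppress the chroma of pixels matching a specific emitter's post-adjustment color); the gain values, tolerance, and suppression strength are purely illustrative assumptions:

```python
import numpy as np


def apply_white_balance(rgb: np.ndarray, gains: tuple) -> np.ndarray:
    """rgb: HxWx3 float image; gains: per-channel gains chosen from the identified color temperature."""
    return rgb * np.asarray(gains)


def suppress_emitter_chroma(rgb: np.ndarray, emitter_rgb: np.ndarray,
                            tol: float = 0.05, strength: float = 0.8) -> np.ndarray:
    """Reduce chroma only for pixels close to the (already white-balanced) emitter color."""
    out = rgb.copy()
    luma = rgb.mean(axis=2, keepdims=True)
    mask = np.linalg.norm(rgb - emitter_rgb, axis=2) < tol      # pixels of the specific emitter
    out[mask] = (luma + (rgb - luma) * (1.0 - strength))[mask]  # pull them toward gray
    return out
```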
Abstract: Laser radar systems include a pentaprism configured to scan a measurement beam with respect to a target surface. A focusing optical assembly includes a corner cube that is used to adjust measurement beam focus. Target distance is estimated based on heterodyne frequencies between a return beam and a local oscillator beam. The local oscillator beam is configured to propagate to and from the focusing optical assembly before mixing with the return beam. In some examples, heterodyne frequencies are calibrated with respect to target distance using a Fabry-Perot interferometer having mirrors fixed to a lithium aluminosilicate glass-ceramic tube.
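The distance-from-heterodyne-frequency step can be illustrated with the standard FMCW relation d = c·f_beat / (2·chirp slope). The FMCW form is my assumption about how the heterodyne frequency maps to range, and the calibration against the Fabry-Perot reference is reduced to a single scale factor here:

```python
C = 299_792_458.0  # speed of light, m/s


def range_from_beat(f_beat_hz: float, chirp_slope_hz_per_s: float,
                    calibration_scale: float = 1.0) -> float:
    """Estimate target distance from the measured heterodyne (beat) frequency.

    Assumes a linear FMCW chirp; calibration_scale stands in for the
    Fabry-Perot-based frequency calibration mentioned in the abstract.
    """
    return calibration_scale * C * f_beat_hz / (2.0 * chirp_slope_hz_per_s)


# Example: a 1 MHz beat frequency with a 100 THz/s chirp slope gives ~1.5 m.
print(range_from_beat(1.0e6, 1.0e14))
```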
Abstract: A detection device including: a detector that detects an object from one viewpoint; a reliability calculator that calculates reliability information on the object at the one viewpoint by using a detection result of the detector; and an information calculator that calculates shape information on the object at the one viewpoint by using the detection result of the detector and the reliability information, and calculates texture information on the object at the one viewpoint by using the detection result, wherein the information calculator generates model information on the object at the one viewpoint based on the shape information and the texture information.
Abstract: An optical imaging system includes a first lens system housed in a body of a mobile telecommunication device, the first lens system having a first optical axis, a first entrance pupil fixed in space in a reference plane associated with said body, and a first focal length; and an optical telescope providing diffraction-limited imaging within a spectral range from at least 486 nm to at least 656 nm. The optical imaging system is configured to image, when the optical telescope is inserted between the first lens system and an entrance pupil of a visual system of an eye (EPE), the EPE onto the first entrance pupil and vice versa with a substantially unit magnification.
Abstract: A backside illumination image sensor includes a semiconductor substrate with a plurality of photoelectric conversion elements and a read circuit formed on a front surface side of the semiconductor substrate, and captures an image by outputting, via the read circuit, electrical signals generated as incident light having reached a back surface side of the semiconductor substrate is received at the photoelectric conversion elements. The image sensor further includes: a light shielding film formed on a side where incident light enters the photoelectric conversion elements, with an opening formed therein in correspondence to each photoelectric conversion element; and an on-chip lens formed, in correspondence to each photoelectric conversion element, at a position set apart from the light shielding film by a predetermined distance. The light shielding film and an exit pupil plane of the image forming optical system are in a conjugate relation with each other with respect to the on-chip lens.
Abstract: A lens barrel includes a first lens holding frame that holds a first lens and has a first protruding portion, a first motor configured to move the first lens holding frame in an optical axis direction, a second lens holding frame that holds a second lens and has a second protruding portion, a second motor configured to move the second lens holding frame in the optical axis direction, and an outer barrel that has a straight groove engaging with the first protruding portion and the second protruding portion and extending in the optical axis direction, and is disposed further outward than the first lens holding frame and the second lens holding frame.