Abstract: The invention provides a safety diagnosis system for a structure, comprising a GNSS receiver installed on an upper floor of the structure; a control device having a storage unit that stores a program for preparing an absolute displacement curve of the structure based on an absolute coordinate measured by the GNSS receiver and a displacement of the absolute coordinate, calculating a maximum inter-layer displacement and a maximum inter-layer deformation angle for each floor based on the absolute displacement curve, and preparing an inter-layer deformation angle curve; a judging unit for diagnosing the safety of the structure based on the maximum inter-layer displacement and the maximum inter-layer deformation angle; and a display unit. The control device calculates the maximum inter-layer displacement, the maximum inter-layer deformation angle for each floor, and the inter-layer deformation angle curve based on the displacement of the absolute coordinate and the program, and makes the display unit display the diagnosis result.
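As an illustration of the calculation the stored program performs, the sketch below derives the inter-layer displacement and deformation angle per floor from horizontal floor displacements; the function name, input layout, and sample values are hypothetical, not taken from the patent.

```python
import numpy as np

def interlayer_drift(displacements, floor_heights):
    """Inter-layer displacement and deformation angle per floor.

    displacements: horizontal displacement of each floor (m), ground floor first.
    floor_heights: height of each storey (m), one per inter-layer interval.
    """
    d = np.asarray(displacements, dtype=float)
    h = np.asarray(floor_heights, dtype=float)
    inter_layer = np.diff(d)        # relative displacement between adjacent floors
    drift_angle = inter_layer / h   # inter-layer deformation angle (rad)
    return inter_layer, drift_angle

# hypothetical displacements at ground, 1st and 2nd floor; 3 m storey heights
dx, theta = interlayer_drift([0.0, 0.02, 0.05], [3.0, 3.0])
```

The maximum over `dx` and `theta` would then correspond to the maximum inter-layer displacement and deformation angle used in the diagnosis.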
Abstract: A reference signal having a known induced optical delay is used for phase stabilization of optical coherence tomography (OCT) interferograms, and for correcting sampling differences within OCT interferograms, in single mode and multimodal OCT systems. The reference signal can then be used to measure the time shift or sample clock period shifts induced in the interferogram signal by the OCT system. A corresponding OCT interferogram signal can then be corrected to remove the shift induced by the system based on the determination.
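The shift measurement and correction can be illustrated with a minimal cross-correlation sketch; the patent does not specify the estimator, so the function names and the integer-lag approach here are assumptions.

```python
import numpy as np

def estimate_shift(reference, measured):
    """Estimate the integer sample shift between a known reference
    interferogram and its measured copy via cross-correlation."""
    corr = np.correlate(measured, reference, mode="full")
    return int(np.argmax(corr)) - (len(reference) - 1)

def correct_shift(signal, shift):
    """Undo the estimated system-induced shift by rolling the signal back."""
    return np.roll(signal, -shift)

# hypothetical reference interferogram and a copy delayed by 3 samples
ref = np.sin(2 * np.pi * 0.05 * np.arange(200))
shifted = np.roll(ref, 3)
s = estimate_shift(ref, shifted)
```

A real system would likely refine this to sub-sample precision (e.g. by interpolating the correlation peak), but the integer-lag version shows the principle.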
April 18, 2017
Date of Patent: April 21, 2020
KABUSHIKI KAISHA TOPCON
Jonathan J. Liu, Zhenguo Wang, Jongsik Kim, Kinpui Chan
Abstract: An image acquiring device comprises a first camera 14 for acquiring video images consisting of frame images continuous in time series, a second camera 15, in a known relation with the first camera, for acquiring two or more optical spectral images of an object to be measured, and an image pickup control device 21. The image pickup control device is configured to extract two or more feature points from one of the frame images, to sequentially specify the feature points in the frame images continuous in time series, to perform image matching, based on the feature points, between the frame images corresponding to the two or more optical spectral images, and to synthesize the two or more optical spectral images according to the condition obtained by the image matching.
Abstract: An ophthalmic examination support system of an embodiment includes a server and clients. The server includes a medical information storage apparatus that stores medical information on each patient, and a management apparatus that manages the medical information. Each of the clients can communicate with the management apparatus. The system includes an association information storage unit and an examination condition obtaining unit. The association information storage unit stores, in advance, association information in which association between predetermined medical information items and examination conditions of an ophthalmic examination is recorded. The examination condition obtaining unit receives medical information retrieved from the medical information storage apparatus by the management apparatus based on patient information transmitted from one of the clients, and obtains an examination condition corresponding to at least part of the received medical information from the association information.
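A minimal sketch of the association lookup, assuming the association information is a simple mapping from medical-information items to examination conditions; all item names and condition fields below are hypothetical, not from the patent.

```python
# hypothetical association information: medical-information item -> examination condition
ASSOCIATION = {
    "diabetes": {"scan": "macular cube", "resolution": "high"},
    "glaucoma": {"scan": "optic disc circle", "resolution": "standard"},
}

def examination_conditions(medical_items):
    """Collect an examination condition for every retrieved medical
    information item that has an entry in the association information."""
    return {item: ASSOCIATION[item] for item in medical_items if item in ASSOCIATION}

# items retrieved from the medical information storage for one patient
cond = examination_conditions(["diabetes", "hypertension"])
```

Items with no recorded association (here "hypertension") simply contribute no condition, matching the "at least part of the received medical information" wording.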
Abstract: A controller included in a blood flow measurement apparatus controls a scanner to iteratively scan one or more cross sections of a blood vessel of interest. An image forming unit forms a phase image that represents the chronological change in phase difference in the one or more cross sections based on data acquired through the iterative scan. An image processor outputs a predetermined signal based on the chronological change in phase difference represented by the phase image. Upon receiving the predetermined signal, the controller controls the scanner to start a scan for acquiring blood flow information on the blood vessel of interest.
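The phase image is built from phase differences between repeated scans of the same cross section; below is a minimal sketch under the assumption of complex-valued A-scan data, with illustrative function and variable names not taken from the patent.

```python
import numpy as np

def phase_difference(a_scan_t0, a_scan_t1):
    """Phase difference between two repeated complex A-scans of the same
    cross section; in Doppler OCT this is proportional to the axial
    component of blood flow velocity."""
    return np.angle(a_scan_t1 * np.conj(a_scan_t0))

# toy example: motion between repeats adds a constant 0.2 rad phase
scan0 = np.ones(8, dtype=complex)
scan1 = np.exp(1j * 0.2) * scan0
dphi = phase_difference(scan0, scan1)
```

Stacking `dphi` over many repeats along a time axis yields the chronological phase-difference image the abstract describes.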
Abstract: An illumination system for projecting illumination light onto an eye. A left (right) light receiving system includes a left (right) objective lens and left (right) image sensor, and guides returning light of the illumination light to the left (right) image sensor via the left (right) objective lens. The objective optical axes of the left and right light receiving systems are disposed non-parallel to each other. A projection system includes a projection system objective lens, and projects light onto the eye via the projection system objective lens. An optical scanner is used for scanning the eye with the light from the projection system. A deflection member is disposed near the objective optical axes, in the optical path of the projection system between the optical scanner and the projection system objective lens, and deflects the optical path.
Abstract: A technique for efficiently calibrating a camera is provided. Reference laser scan data is obtained by scanning a building 131 by a laser scanner 115, which is fixed on a vehicle 100 and has known exterior orientation parameters, while the vehicle 100 travels. An image of the building 131 is photographed at a predetermined timing by an onboard camera 113. Reference point cloud position data, in which the reference laser scan data is described in a coordinate system defined on the vehicle 100 at the predetermined timing, is calculated based on the trajectory the vehicle 100 has traveled. Matching points are selected between feature points in the reference point cloud position data and in the image. Exterior orientation parameters of the camera 113 are calculated based on relative relationships between the reference point cloud position data and image coordinate values in the image of the matching points.
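Exterior orientation parameters of this kind are typically found by minimizing the reprojection error over the matching points; the sketch below shows only that error term under a simple pinhole model with an assumed focal length f. The function name and toy values are illustrative, not the patent's actual formulation.

```python
import numpy as np

def reprojection_error(R, t, points_3d, points_2d, f):
    """Mean reprojection error of matched points under a pinhole model.
    Minimizing this over the rotation R and translation t yields the
    camera's exterior orientation parameters."""
    cam = (R @ points_3d.T).T + t          # world -> camera coordinates
    proj = f * cam[:, :2] / cam[:, 2:3]    # pinhole projection
    return np.linalg.norm(proj - points_2d, axis=1).mean()

# toy check: at the true pose, the matched points reproject with zero error
pts3 = np.array([[0.0, 0.0, 2.0], [1.0, 1.0, 4.0], [-1.0, 0.5, 3.0]])
pts2 = 1.0 * pts3[:, :2] / pts3[:, 2:3]
err = reprojection_error(np.eye(3), np.zeros(3), pts3, pts2, 1.0)
```

In practice this minimization is solved with a PnP algorithm over the matching points between the reference point cloud and the image.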
Abstract: Provided is a method of processing image data and detecting a region of an image represented by the image data to be excluded from an analysis of the image. According to the method, image data captured by a medical modality is received. An evaluation of a portion of the image data representing a two-dimensional view of a subject appearing in the image is conducted to locate, in the two-dimensional view, the region to be excluded from the analysis of the image. A feature pertinent to the analysis is then detected in the remaining portion of the image, which lies outside the region to be excluded as located by the evaluation.
Abstract: Operations of a laser treatment apparatus are facilitated. A laser treatment apparatus of an embodiment includes an illumination system, an observation system, an irradiation system, an illumination-area changing part, an irradiation-condition setting part, and a controller. The illumination system illuminates an eye fundus. The observation system is used for observing the illuminated fundus. The irradiation system irradiates onto the fundus aiming light of a preset pattern and treatment light consisting of laser light of a pattern determined based on the preset pattern. The illumination-area changing part is used for changing an illumination area of the fundus by the illumination system. The irradiation-condition setting part sets an irradiation condition of the aiming light and/or the treatment light from the irradiation system. The controller controls the illumination-area changing part based on the set irradiation condition to change the illumination area.
Abstract: An optical coherence tomography (OCT) image composed of a plurality of A-scans of a structure is analyzed by defining, for each A-scan, a set of neighboring A-scans surrounding that A-scan. Following an optional de-noising step, the neighboring A-scans are aligned in the imaging direction, a matrix X is then formed from the aligned A-scans, and matrix completion is performed to obtain a reduced-speckle-noise image.
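As a rough illustration of the final step, a truncated SVD gives the simplest low-rank stand-in for matrix completion: aligned A-scans share structure across columns, so a low-rank approximation suppresses the uncorrelated speckle. The patent's actual completion algorithm may differ; function names and toy data below are hypothetical.

```python
import numpy as np

def low_rank_denoise(X, rank):
    """Low-rank approximation of the aligned A-scan matrix X via truncated
    SVD -- a simple stand-in for the matrix-completion step."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[rank:] = 0.0                  # keep only the dominant components
    return (U * s) @ Vt

# toy matrix of 16 aligned A-scans (columns) sharing a rank-1 depth profile
rng = np.random.default_rng(0)
clean = np.outer(np.linspace(1.0, 2.0, 32), np.ones(16))
noisy = clean + 0.01 * rng.standard_normal(clean.shape)
denoised = low_rank_denoise(noisy, rank=1)
```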
November 19, 2015
Date of Patent: November 19, 2019
AGENCY FOR SCIENCE, TECHNOLOGY AND RESEARCH, KABUSHIKI KAISHA TOPCON
Abstract: A position at which evaluation of a defect was performed can be easily identified. Relative positional relationships between three-dimensional coordinates of feature points of a photographed object and positions of a panoramic camera 111 are calculated based on a moving image photographed by the panoramic camera 111 while a vehicle 100 travels. The position of the object photographed by a hyperspectral camera 114 is calculated based on the relative positional relationships and exterior orientation parameters of the hyperspectral camera 114 with respect to the panoramic camera 111.
Abstract: The system includes a survey machine having an image-taking section, a section for measuring a distance to a target, and a section for measuring an angle; a pointing rod which is positioned on the measurement point X and includes a prism at a position offset by a fixed length L from the measurement point; and an inclination sheet having a mark. The three-dimensional position of the measurement point is measured by attaching the inclination sheet to the pointing rod, imaging the mark surface with the image-taking section, calculating the inclination angle of the inclination sheet with respect to the sighting direction from the survey machine by image analysis of the mark surface, and determining the three-dimensional position of the measurement point from the three-dimensional position of the prism, the inclination angle of the inclination sheet, and the fixed length.
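The final determination reduces to offsetting the measured prism position by the fixed length L along the rod direction recovered from the inclination angle; a minimal vector-arithmetic sketch with hypothetical names and values:

```python
import numpy as np

def measurement_point(prism_xyz, rod_direction, L):
    """Measurement point X located at fixed length L from the prism along
    the pointing-rod direction recovered from the inclination sheet."""
    d = np.asarray(rod_direction, dtype=float)
    d = d / np.linalg.norm(d)               # ensure unit direction
    return np.asarray(prism_xyz, dtype=float) + L * d

# toy example: prism 2 m above the point, rod pointing straight down
p = measurement_point([10.0, 5.0, 2.0], [0.0, 0.0, -1.0], 2.0)
```

In the general case the rod direction would be built from the calculated inclination angle rather than given directly, but the geometry is the same.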
Abstract: According to one embodiment, an ophthalmic microscope system includes an illumination system, a pair of light-receiving systems, and an irradiation system. The illumination system is configured to irradiate a subject's eye with illumination light. Each of the light-receiving systems includes a first objective lens and a first imaging device, and is configured to guide the illumination light returning from the subject's eye to the first imaging device through the first objective lens. The objective optical axes of the light-receiving systems are not parallel to each other. The irradiation system is configured to irradiate the subject's eye with light different from the illumination light from a direction different from the objective optical axes.
Abstract: The objects of an embodiment are to improve the reliability of visual function examination and to shorten the time required for the examination. A visual function examination apparatus of an embodiment includes an application optical system, a biological information detector, and an evaluation information generator. The application optical system includes an optical scanner disposed in an optical path of laser light output from a laser light source, and is configured to apply the laser light that has travelled via the optical scanner to a retina of a subject's eye. The biological information detector is configured to detect biological information representing a reaction of a subject to application of the laser light. The evaluation information generator is configured to generate evaluation information on visual function of the subject's eye based on the biological information detected.
Abstract: An ophthalmologic imaging apparatus includes a first optical system, a first driver, and a first focus controller. The first optical system includes a first focus lens and a diopter correction lens, and guides light from a subject's eye to a first light receiving element. The first focus lens is movable along the optical axis of a first optical path. The diopter correction lens is insertable into and removable from the first optical path. The first driver moves the first focus lens. The first focus controller executes mutually different focus control of the first driver in a removed state in which the diopter correction lens is removed from the first optical path and in an inserted state in which the diopter correction lens is inserted into the first optical path.
Abstract: An ophthalmic image display device has a display controller that displays, in a predetermined layout, a B-mode image, a blood vessel enhanced image representing the same cross section as the B-mode image, and one or more front images individually formed based on a three-dimensional data set acquired by performing optical coherence tomography on a subject's eye. Further, the display controller displays, over at least one of the one or more front images, a cross section position indicator that indicates the position of the cross section of the B-mode image. In addition, the display controller synchronously performs changing of the display position of the cross section position indicator and updating of the display of each of the B-mode image and the blood vessel enhanced image in accordance with an operation for moving the cross section position indicator performed using an operation unit.
Abstract: In an ophthalmic operation microscope, an illumination optical system illuminates a patient's eye with illumination light. An observation optical system is used for observing the patient's eye illuminated. An objective lens is disposed in an observation optical path. An interference optical system splits light from a light source into measurement light and reference light, and detects interference light generated from returning light of the measurement light from the patient's eye and the reference light. A first lens group is disposed between the light source and the patient's eye in an optical path of the measurement light. A second lens group is disposed between the first lens group and the patient's eye in the optical path of the measurement light. A deflection member is disposed between the first lens group and the second lens group in the optical path of the measurement light.
Abstract: A fundus analysis apparatus includes a storage, an area setting unit, and a morphological information generating unit. The storage is configured to store OCT information acquired by applying optical coherence tomography to the fundus of an eye. The area setting unit is configured to set a front area corresponding to a front surface of the lamina cribrosa and a rear area corresponding to a rear surface of the lamina cribrosa in the OCT information. The morphological information generating unit is configured to generate morphological information indicating the morphology of the lamina cribrosa based on at least the front area and the rear area.
Abstract: A carrying case has a flat top surface and a flat bottom surface aligned in a first direction (Z direction), and includes two top surface rib grooves on the top surface, extending in a second direction (X direction) perpendicular to the first direction and recessed in the first direction, and four interfering projections on the bottom surface, projecting in the first direction. Both ends of one of the top surface rib grooves, seen from the second direction, are inclined toward one side in a third direction (Y direction) perpendicular to the second direction, and both ends of the other top surface rib groove, seen from the second direction, are inclined toward the other side in the third direction. The respective interfering projections are disposed corresponding to the respective inclined portions such that the interfering projections touch the inclined portions from inside.
Abstract: A technique is provided for making multiple still images photographed by a traveling mobile body correspond to the route traveled during photographing. A survey data processing device includes an input unit 101, an image processing unit 102, and a synchronous processing unit 103. The input unit 101 is configured to receive image data of multiple still images photographed from a flying mobile body, and to receive flight data in which the flight route of the mobile body is measured. The image processing unit 102 is configured to estimate a flight route of the mobile body based on changes in positions of feature points included in the multiple still images on a screen. The synchronous processing unit 103 is configured to specify a matching relationship between the flight route of the flight data and the estimated flight route.
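The matching relationship can be illustrated as a search over time offsets between the two routes, assuming they are already expressed in the same coordinate frame and scale; the function name and toy data below are hypothetical, not from the patent.

```python
import numpy as np

def best_time_offset(flight_route, estimated_route, offsets):
    """Pick the sample offset that best aligns the measured flight route
    with the route estimated from the still images."""
    def cost(k):
        n = min(len(flight_route) - k, len(estimated_route))
        return np.linalg.norm(flight_route[k:k + n] - estimated_route[:n])
    return min(offsets, key=cost)

# toy check: the image-estimated route is a 5-sample excerpt starting at index 3
flight = np.stack([np.arange(10.0), np.zeros(10)], axis=1)
estimated = flight[3:8]
k = best_time_offset(flight, estimated, range(6))
```

A real implementation would also handle scale and drift between the two routes, but the offset search captures the synchronization idea.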