Multiple Cameras On Baseline (e.g., Range Finder, Etc.) Patents (Class 348/139)
  • Publication number: 20120327224
    Abstract: An information processing apparatus includes an imaging unit and is capable of setting arrangement of a structural member of a robot system which works based on an image captured by the imaging unit. The information processing apparatus includes an arrangement unit configured to arrange a virtual object corresponding to the structural member in a virtual space corresponding to a working space of the robot system, a first acquisition unit configured to acquire a virtual space image in the virtual space which corresponds to the captured image and in which the virtual object is arranged, and a second acquisition unit configured to acquire an evaluation value indicating adaptation of arrangement of the virtual object to the work of the robot system based on the virtual space image.
    Type: Application
    Filed: March 7, 2011
    Publication date: December 27, 2012
    Applicant: CANON KABUSHIKI KAISHA
    Inventors: Osamu Nomura, Masakazu Matsugu
  • Publication number: 20120320195
    Abstract: The present invention relates to a system and method for determining vehicle attitude and position from image data detected by sensors in a vehicle. The invention uses calculated differences between the locations of selected features in an image plane and the location of corresponding features in a terrain map to determine the attitude of the vehicle carrying the sensors with respect to a ground frame of reference.
    Type: Application
    Filed: August 23, 2012
    Publication date: December 20, 2012
    Inventors: Gene D. TENER, Andrew H. Hawkins, Mark A. Bovankovich, Louis N. Glaros
  • Patent number: 8335345
    Abstract: The path and/or position of an object is tracked using two or more cameras which run asynchronously, so there is no need to provide a common timing signal to each camera. Captured images are analyzed to detect a position of the object in the image. Equations of motion for the object are then solved based on the detected positions and a transformation which relates the detected positions to a desired coordinate system in which the path is to be described. The position of an object can also be determined from a position which meets a distance metric relative to lines of position from three or more images. The images can be enhanced to depict the path and/or position of the object as a graphical element. Further, statistics such as maximum object speed and distance traveled can be obtained. Applications include tracking the position of a game object at a sports event. (A least-squares sketch of the lines-of-position step follows this entry.)
    Type: Grant
    Filed: March 19, 2007
    Date of Patent: December 18, 2012
    Assignee: Sportvision, Inc.
    Inventors: Marvin S. White, Alina Alt
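    The following sketch illustrates one building block mentioned in the abstract above: finding the 3D point that best satisfies several lines of position in a least-squares sense. It is a minimal illustration, not the patent's implementation; the camera positions, ray directions, and NumPy formulation are assumptions for the example.

    ```python
    import numpy as np

    def closest_point_to_lines(origins, directions):
        """Least-squares 3D point closest to a set of lines of position.

        origins:    (N, 3) array, a point on each line (e.g., camera centers)
        directions: (N, 3) array, direction of each line (need not be unit length)
        """
        origins = np.asarray(origins, dtype=float)
        d = np.asarray(directions, dtype=float)
        d = d / np.linalg.norm(d, axis=1, keepdims=True)  # unit directions

        A = np.zeros((3, 3))
        b = np.zeros(3)
        for o_i, d_i in zip(origins, d):
            P = np.eye(3) - np.outer(d_i, d_i)  # projector orthogonal to this line
            A += P
            b += P @ o_i
        return np.linalg.solve(A, b)  # minimizes summed squared distances to the lines

    # Three hypothetical cameras observing a point near (1, 2, 3).
    cams = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
    target = np.array([1.0, 2.0, 3.0])
    print(closest_point_to_lines(cams, target - cams))  # ~[1. 2. 3.]
    ```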
  • Patent number: 8326092
    Abstract: The present invention relates to machine vision computing environments, and more specifically relates to a system and method for selectively accelerating the execution of image processing applications using a hybrid computing system. To this extent, a hybrid system is generally defined as one that is multi-platform, and potentially distributed via a network or other connection. The invention provides a machine vision system and method for executing image processing applications on a hybrid image processing system referred to herein as an image co-processor that comprises (among other things) a plurality of special purpose engines (SPEs) that work to process multiple images in an accelerated fashion.
    Type: Grant
    Filed: April 23, 2007
    Date of Patent: December 4, 2012
    Assignee: International Business Machines Corporation
    Inventors: William H. Chung, Moon J. Kim, James R. Moulic, Toshiyuki Sanuki
  • Patent number: 8326023
    Abstract: A memory of a photographing field angle calculation apparatus has stored therein the position of each point captured by an imaging system as three coordinate values in horizontal, vertical, and depth directions when a photography space is photographed by the imaging system. A usage pixel value extraction means selects a plurality of points located in end portions in the horizontal direction of a range captured by the imaging system from those stored in the memory based on the vertical coordinate value, and extracts the horizontal and depth coordinate values of each of the selected points. A field angle calculation means calculates a horizontal photographing field angle when the photography space is photographed using the extracted horizontal and depth coordinate values.
    Type: Grant
    Filed: August 8, 2008
    Date of Patent: December 4, 2012
    Assignee: FUJIFILM Corporation
    Inventor: Yoichi Sawachi
  • Patent number: 8310554
    Abstract: A system for tracking at least one object is disclosed. The system includes a plurality of communicatively connected visual sensing units configured to capture visual data related to the at least one object. The system also includes a manager component communicatively connected to the plurality of visual sensing units. The manager component is configured to assign one visual sensing unit to act as a visual sensing unit in a master mode and at least one visual sensing unit to act as a visual sensing unit in a slave mode. The manager component is further configured to transmit at least one control signal to the plurality of visual sensing units, and receive the visual data from the plurality of visual sensing units.
    Type: Grant
    Filed: September 20, 2006
    Date of Patent: November 13, 2012
    Assignee: SRI International
    Inventors: Manoj Aggarwal, Deepak Sethi, Supun Samarasekera, Vince Paragano
  • Patent number: 8233079
    Abstract: An environment recognition apparatus includes a light projector that intermittently projects a light pattern toward an object to be measured in an environmental space, in accordance with a duty factor of a pulse train defining one frame, and a camera that outputs a difference image between an image of the object taken at an exposure where the light pattern is projected and an image taken at an exposure where it is not; the object is recognized based on the difference image. The apparatus is further equipped with a timing controller that controls the projection timing by randomly varying the pulse repetition period of the pulse train within one frame, thereby effectively avoiding interference with other apparatuses while using a camera of ordinary sensitivity.
    Type: Grant
    Filed: January 15, 2008
    Date of Patent: July 31, 2012
    Assignee: Honda Motor Co., Ltd.
    Inventor: Chiaki Aoyama
  • Patent number: 8224024
    Abstract: The spatial location and azimuth of an object are computed from the locations, in a single camera image, of exactly two points on the object and information about an orientation of the object. One or more groups of four or more collinear markers are located in an image, and for each group, first and second outer markers are determined, the distances from each outer marker to the nearest marker in the same group are compared, and the outer marker with a closer nearest marker is identified as the first outer marker. Based on known distances between the outer markers and the marker nearest the first outer marker, an amount of perspective distortion of the group of markers in the image is estimated. Based on the perspective distortion, relative distances from each other point in the group to one of the outer markers are determined. Based on the relative distances, the group is identified.
    Type: Grant
    Filed: October 4, 2006
    Date of Patent: July 17, 2012
    Assignee: InterSense, LLC
    Inventors: Eric Foxlin, Leonid Naimark
  • Patent number: 8218004
    Abstract: A displacement sensing system is disclosed. Two image capturing devices are positioned at opposite ends of a coordinate axis of a planar area, respectively, for capturing images of the planar area and an object placed thereon. Four pre-established LUT databases and an interactive four-matrix lookup table process are used to determine the actual coordinates of the object in the planar area.
    Type: Grant
    Filed: October 22, 2009
    Date of Patent: July 10, 2012
    Assignee: Cycling & Health Tech Industry R&D Center
    Inventors: Chern-Sheng Lin, Chia-Tse Chen, Tzu-Chi Wei, Wei-Lung Chen, Chia-Chang Chang
  • Publication number: 20120162414
    Abstract: The present disclosure provides a global calibration method based on a rigid bar for a multi-sensor vision measurement system, comprising: step 1, executing the following procedure at least nine times: placing, in front of two vision sensors to be calibrated, a rigid bar fastened with two targets respectively corresponding to the vision sensors; capturing images of the respective targets with their corresponding vision sensors; extracting coordinates of feature points of the respective targets in their corresponding images; and computing 3D coordinates of each feature point of the respective targets under their corresponding vision sensor coordinate frames; and step 2, computing the transformation matrix between the two vision sensors under the constraint of the fixed position relationship between the two targets. The present disclosure also provides a global calibration apparatus based on a rigid bar for a multi-sensor vision measurement system. (A rigid-transform fitting sketch follows this entry.)
    Type: Application
    Filed: August 9, 2011
    Publication date: June 28, 2012
    Applicant: BEIHANG UNIVERSITY
    Inventors: Guangjun Zhang, Zhen Liu, Zhenzhong Wei, Junhua Sun, Meng Xie
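    A core step in this kind of multi-sensor global calibration is fitting a rigid transformation between 3D points expressed in two coordinate frames. The sketch below shows only that standard building block (a Kabsch/SVD fit over corresponding points); the publication's actual two-target, rigid-bar constraint formulation is more involved, and the point sets here are assumptions for illustration.

    ```python
    import numpy as np

    def rigid_transform(P, Q):
        """Best-fit R, t with Q[i] ≈ R @ P[i] + t (Kabsch / SVD method).

        P, Q: (N, 3) arrays of corresponding 3D points expressed in the
        coordinate frames of two different vision sensors.
        """
        P, Q = np.asarray(P, float), np.asarray(Q, float)
        cp, cq = P.mean(axis=0), Q.mean(axis=0)        # centroids
        H = (P - cp).T @ (Q - cq)                      # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T                             # proper rotation (det = +1)
        t = cq - R @ cp
        return R, t

    # Hypothetical check: rotate/translate some points and recover the transform.
    rng = np.random.default_rng(0)
    P = rng.normal(size=(10, 3))
    angle = np.deg2rad(30.0)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0,            0.0,           1.0]])
    t_true = np.array([0.5, -1.0, 2.0])
    Q = P @ R_true.T + t_true
    R_est, t_est = rigid_transform(P, Q)
    print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True
    ```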
  • Patent number: 8203607
    Abstract: A computer-implemented method for automatic image registration includes providing a previously aligned image pair, extracting a first number of samples of features from the previously aligned image pair, and extracting a second number of samples of features from an observed image pair. The method further includes determining a Euclidean minimum spanning tree of a union set of the samples of features from the previously aligned image pair and the observed image pair, determining a similarity measure of the observed image pair based on the Euclidean minimum spanning tree, estimating a gradient of the similarity measure, wherein a gradient estimate is used to update transformation parameters applied to register the observed image pair, and outputting a registration of the observed image pair, wherein the registration of the observed image pair is one of displayed and stored on a storage media.
    Type: Grant
    Filed: June 6, 2006
    Date of Patent: June 19, 2012
    Assignee: Siemens Medical Solutions USA, Inc.
    Inventors: Mert Rory Sabuncu, Christophe Chefd'hotel
  • Patent number: 8194126
    Abstract: An imaging system, including a transmission source providing pulse(s), and a gated sensor for receiving pulse reflections from objects located beyond a minimal range. The pulse and the gate timing are controlled for creating a sensitivity as a function of range, such that the amount of the energy received progressively increases with the range. Also an imaging method, including emitting pulse(s) to a target area, receiving reflections of pulses reflected from objects located beyond a minimal range, the receiving includes gating detection of the reflections, and progressively increasing the received energy of the reflections, by controlling the pulses and the timing of the gating.
    Type: Grant
    Filed: July 27, 2006
    Date of Patent: June 5, 2012
    Assignee: Elbit Systems Ltd.
    Inventors: Ofer David, Shamir Inbar
  • Patent number: 8180107
    Abstract: A method and system for coordinated tracking of objects are disclosed. A plurality of images is received from a plurality of nodes, each node comprising at least one image capturing device. At least one target in the plurality of images is identified to produce at least one local track for each of the plurality of nodes having the at least one target in its field of view. The local tracks are fused according to a multi-hypothesis tracking method to produce at least one fused track corresponding to the at least one target. At least one of the plurality of nodes is assigned to track the at least one target by minimizing at least one cost function, comprising a cost matrix, using a k-best algorithm. The at least one fused track is then sent to the node or nodes assigned to track the at least one target. (A simplified assignment sketch follows this entry.)
    Type: Grant
    Filed: February 11, 2010
    Date of Patent: May 15, 2012
    Assignee: SRI International
    Inventors: Christopher P. Broaddus, Thomas Germano, Nicholas Vandervalk, Shunguang Wu, Ajay Divakaran, Harpreet Singh Sawhney
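    As a rough illustration of assigning nodes to targets by minimizing a cost matrix, the sketch below solves a single best assignment with the Hungarian method via SciPy. This is a simplified stand-in for the k-best assignment the abstract refers to, and the cost values are purely hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Hypothetical cost matrix: cost[i, j] = cost of assigning node i to target j
    # (in practice a function of, e.g., distance to the target and node load).
    cost = np.array([
        [2.0, 9.0, 7.0],
        [6.0, 4.0, 3.0],
        [5.0, 8.0, 1.0],
    ])

    node_idx, target_idx = linear_sum_assignment(cost)  # minimum-cost 1-best assignment
    for n, t in zip(node_idx, target_idx):
        print(f"node {n} -> target {t} (cost {cost[n, t]:.1f})")
    print("total cost:", cost[node_idx, target_idx].sum())
    ```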
  • Patent number: 8159525
    Abstract: A multiport multispectral portable imaging system having at least two cameras with charge-coupled device sensors, a front lens unit, at least two rear lens units, a beamsplitter, and at least two bandpass filters is used to detect contaminants on food.
    Type: Grant
    Filed: June 9, 2009
    Date of Patent: April 17, 2012
    Assignee: The United States of America, as represented by the Secretary of Agriculture
    Inventors: Bosoon Park, Michio Kise, Kurt C. Lawrence, William Robert Windham
  • Publication number: 20120081542
    Abstract: The obstacle detecting system includes a first image acquiring unit which acquires first image information by selectively receiving a laser beam emitted from at least one laser source toward a road surface at a target distance; a second image acquiring unit which acquires an image of actual surroundings as second image information; an image recognizing unit which recognizes an image of an obstacle by performing 3-D image recognition signal processing on line information of the laser beam using the first image information, and recognizes a pattern of the obstacle by performing pattern recognition signal processing on the second image information; and a risk determining unit which determines a possibility of collision due to presence of the obstacle within the target distance by classifying the recognized obstacles according to whether or not the image-recognized obstacle is matched with the pattern-recognized obstacle.
    Type: Application
    Filed: July 8, 2011
    Publication date: April 5, 2012
    Applicants: ANDONG UNIVERSITY INDUSTRY-ACADEMIC COOPERATION FOUNDATION, Electronics and Telecommunications Research Institute
    Inventors: Jung Hee SUK, Chun Gi Lyuh, Ik Jae Chun, Wook Jin Chung, Jeong Hwan Lee, Jae Chang Shim, Tae Moon Roh
  • Publication number: 20120063637
    Abstract: An array of image sensors is arranged to cover a field of view for an image capture system. Each sensor has a field of view segment which is adjacent to the field of view segment covered by another image sensor. The adjacent field of view (FOV) segments share an overlap area. Each image sensor comprises sets of light sensitive elements which capture image data using a scanning technique which proceeds in a sequence providing for image sensors sharing overlap areas to be exposed in the overlap area during the same time period. At least two of the image sensors capture image data in opposite directions of traversal for an overlap area. This sequencing provides closer spatial and temporal relationships between the data captured in the overlap area by the different image sensors. The closer spatial and temporal relationships reduce artifact effects at the stitching boundaries, and improve the performance of image processing techniques applied to improve image quality.
    Type: Application
    Filed: September 15, 2010
    Publication date: March 15, 2012
    Applicant: MICROSOFT CORPORATION
    Inventor: John A. Tardif
  • Publication number: 20120050529
    Abstract: A portable wireless mobile device motion capture and analysis system and method are configured to display motion capture and analysis data on a mobile device. The system obtains data from motion capture elements and analyzes the data. It enables unique displays associated with the user, such as 3D overlays onto images of the user to visually depict the captured motion data. Ratings associated with the captured motion can also be displayed. Predicted ball flight path data can be calculated and displayed. Data shown on a timeline can also be displayed to show the relative peaks of velocity for various parts of the user's body. Based on the display of data, the user can determine the equipment that fits best and immediately purchase the equipment via the mobile device. Custom equipment may be ordered through an interface on the mobile device from a vendor that can assemble custom-built equipment to order and ship it.
    Type: Application
    Filed: August 26, 2010
    Publication date: March 1, 2012
    Inventor: Michael BENTLEY
  • Patent number: 8125525
    Abstract: An information processing apparatus includes: a reception unit that receives multiple captured images that are captured with a capture device in parallel from an external apparatus to which the capture device that captures an object is connected; a detection unit that detects a priority image that is included in the multiple captured images being received by the reception unit and is assigned priority information that specifies the priority image to be preferentially displayed on a display device; and a controller that controls the reception unit to stop receiving other captured images which are not detected by the detection unit and to receive the priority image detected by the detection unit and controls the display device to display the priority image on the display device when the priority image is detected by the detection unit.
    Type: Grant
    Filed: April 17, 2009
    Date of Patent: February 28, 2012
    Assignee: Fuji Xerox Co., Ltd.
    Inventor: Kiwame Tokai
  • Publication number: 20120033071
    Abstract: A position and orientation measurement apparatus comprises: a distance information obtaining unit adapted to obtain distance information of a target object captured by a capturing unit; a grayscale image obtaining unit adapted to obtain a grayscale image of the target object; a first position and orientation estimation unit adapted to estimate a position and orientation of the target object based on the information of a three-dimensional shape model and the distance information; a second position and orientation estimation unit adapted to estimate a position and orientation of the target object based on a geometric feature of the grayscale image and projection information obtained by projecting, on the grayscale image, the information of the three-dimensional shape model; and a determination unit adapted to determine whether a parameter of the capturing unit needs to be calibrated, based on both the first and second estimated values.
    Type: Application
    Filed: August 1, 2011
    Publication date: February 9, 2012
    Applicant: CANON KABUSHIKI KAISHA
    Inventors: Kazuhiko Kobayashi, Shinji Uchiyama
  • Patent number: 8089513
    Abstract: Provided is an information processing apparatus including an image acquisition unit for acquiring a real space image including an image of another apparatus, a coordinate system generation unit for generating a spatial coordinate system of the real space image acquired by the image acquisition unit, and a transmission unit for transmitting spatial information constituting the spatial coordinate system generated by the coordinate system generation unit to the other apparatus sharing the spatial coordinate system.
    Type: Grant
    Filed: November 2, 2010
    Date of Patent: January 3, 2012
    Assignee: Sony Corporation
    Inventors: Akira Miyashita, Kazuhiro Suzuki, Hiroyuki Ishige
  • Publication number: 20110316814
    Abstract: An optical distance determination device includes a first image sensor, an auxiliary image sensor, and a distance measuring module. The first image sensor captures a first image comprising image information of an object. The auxiliary image sensor captures an auxiliary image comprising image information of the object, and is separated from the first image sensor by a physical distance. The distance measuring module calculates a pixel distance between the image formation positions of the object in the first image and the auxiliary image, and calculates an approximate distance between the object and the first image sensor according to the pixel distance, the physical distance, and the focal lengths of the first image sensor and the auxiliary image sensor. (A pinhole-stereo sketch of this range calculation follows this entry.)
    Type: Application
    Filed: December 14, 2010
    Publication date: December 29, 2011
    Inventors: Ming-Tsan Kao, Tzu-Yu Chen
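    A minimal pinhole-stereo sketch of the range calculation described above, assuming both sensors share the same focal length and are aligned along the baseline; the function name and the numbers in the example are illustrative, not from the publication.

    ```python
    def approximate_distance(pixel_distance, baseline_m, focal_length_px):
        """Pinhole-stereo range estimate: Z ≈ f * B / d.

        pixel_distance:  disparity (pixels) between the object's image-formation
                         positions in the first and auxiliary images
        baseline_m:      physical distance between the two image sensors (meters)
        focal_length_px: focal length in pixels (assumed equal for both sensors)
        """
        if pixel_distance <= 0:
            raise ValueError("pixel distance must be positive for a finite range")
        return focal_length_px * baseline_m / pixel_distance

    # e.g. 40 px disparity, 6 cm baseline, 800 px focal length -> 1.2 m
    print(approximate_distance(40, 0.06, 800))
    ```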
  • Publication number: 20110288804
    Abstract: Disclosed are a sighting apparatus for a remote-control shooting system and a sighting alignment method using the same. In the remote-control shooting system, a firearm is installed on a firearm platform rotatable in the up, down, left and right directions, and an observation camera is installed to be position-adjustable at the firearm platform. The sighting apparatus includes: a sighting unit which is fastened to the firearm, with a zeroing unit precisely adjustable up, down, left and right therebetween, and which takes a first image as zeroed; and a controller which controls the firearm platform so that a sighting indicator (e.g., a line) of the first image taken by the sighting unit is aligned with a target, and aims at the target.
    Type: Application
    Filed: May 18, 2011
    Publication date: November 24, 2011
    Inventors: Dong Hee Lee, In Jung, Gyu Jung Choi
  • Publication number: 20110273562
    Abstract: A sports simulation system (100) comprises at least two imaging devices (128) capturing images of a projectile tracking region disposed in front of a display surface (124) from different vantages to detect a launched projectile traveling through the projectile tracking region towards the display surface; a projectile spin sensing unit (105) capturing images of a region at least partially overlapping with the projectile tracking region, each captured image comprising a projectile trail representing a travel path of the projectile when a projectile is present in the region during image capture; and at least one processing stage (104) receiving data from the imaging devices (128) and the projectile spin sensing unit (105) and determining the three-dimensional positions, velocity, acceleration and spin of a detected launched projectile traveling through the projectile tracking region, the three-dimensional positions, velocity, acceleration and spin being used by the at least one processing stage to calculate a trajectory of the projectile.
    Type: Application
    Filed: October 7, 2009
    Publication date: November 10, 2011
    Applicant: INTERACTIVE SPORTS TECHNOLOGIES INC.
    Inventors: Wayne Dawe, Zuqiang Zhao
  • Publication number: 20110234791
    Abstract: Provided is an information processing apparatus including an image acquisition unit for acquiring a real space image including an image of another apparatus, a coordinate system generation unit for generating a spatial coordinate system of the real space image acquired by the image acquisition unit, and a transmission unit for transmitting spatial information constituting the spatial coordinate system generated by the coordinate system generation unit to the other apparatus sharing the spatial coordinate system.
    Type: Application
    Filed: November 2, 2010
    Publication date: September 29, 2011
    Applicant: SONY CORPORATION
    Inventors: Akira Miyashita, Kazuhiro Suzuki, Hiroyuki Ishige
  • Publication number: 20110228082
    Abstract: A measuring system for a 3D object is disclosed. The system comprises a base, a horizontal scanning device disposed on the base, a first light emitting device, a second light emitting device, an image capture device, and a control device. The first light emitting device, the second light emitting device, and the image capture device are connected to the horizontal scanning device. The control device controls the horizontal scanning device to cause planar motion relative to the base. With the control of the control device, the first light emitting device and the second light emitting device alternate in projecting a light source onto a 3D object. The image capture device comprises a double-sided telecentric lens to capture a plurality of images of the 3D object when the light source is projected onto the 3D object.
    Type: Application
    Filed: October 4, 2010
    Publication date: September 22, 2011
    Inventors: Kuang-Pu Wen, Don Lin, Liang-Pin Yu
  • Patent number: 8022986
    Abstract: A weapon orientation measuring device in accordance with the disclosure includes a processor configured to receive first location information indicative of locations of a first point and a second point on a weapon, the first and second points being a known distance apart in a direction parallel to a pointing axis of the weapon, and to receive second location information indicative of the locations of the first and second points on the weapon. The processor is further configured to receive information indicative of a first earth orientation, and determine a second earth orientation corresponding to the weapon based on the first and second location information and the information indicative of the first earth orientation. The first location information represents location relative to a first sensor at a first location and the second location information represents location relative to a second sensor at a second location, and the first and second sensors are separated by a given distance.
    Type: Grant
    Filed: May 14, 2010
    Date of Patent: September 20, 2011
    Assignee: Cubic Corporation
    Inventor: Richard N. Jekel
  • Publication number: 20110211068
    Abstract: There is provided an image pickup apparatus for use in measuring a distance from an observer to a target. The image pickup apparatus includes a case and a plurality of cameras, configured to capture images of the target, that are fixed in the case. In the image pickup apparatus, the distance from the observer to the target is measured from the captured images of the target while altering the baseline lengths used for parallax computation, which are obtained by combining any two of the cameras. (A multi-baseline range sketch follows this entry.)
    Type: Application
    Filed: February 11, 2011
    Publication date: September 1, 2011
    Inventor: Soichiro YOKOTA
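    A hedged sketch of combining parallax estimates over every pair of cameras in the case, as described above. It assumes collinear cameras, a shared focal length, and a simple pinhole model, and combines the pairwise Z ≈ f * B / d estimates with a median; the apparatus's actual computation is not specified at this level in the abstract.

    ```python
    import itertools
    import statistics

    def multi_baseline_range(camera_positions_m, image_coords_px, focal_length_px):
        """Median of pairwise pinhole-stereo range estimates.

        camera_positions_m: positions of the cameras along the baseline (meters)
        image_coords_px:    horizontal image coordinate of the target seen by
                            each camera (pixels), in the same order
        focal_length_px:    common focal length in pixels (simplifying assumption)
        """
        estimates = []
        pairs = itertools.combinations(zip(camera_positions_m, image_coords_px), 2)
        for (x_i, u_i), (x_j, u_j) in pairs:
            baseline = abs(x_j - x_i)
            disparity = abs(u_j - u_i)
            if disparity > 0:
                estimates.append(focal_length_px * baseline / disparity)
        return statistics.median(estimates)

    # Three collinear cameras 5 cm apart viewing a target about 1.6 m away.
    print(multi_baseline_range([0.0, 0.05, 0.10], [250.0, 225.0, 200.0], 800.0))
    ```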
  • Patent number: 7990415
    Abstract: Provided is an image input device which includes a laser range finder and a camera, and is capable of automatically calibrating the laser range finder and the camera at an appropriate timing without using special equipment. The image input device includes the laser range finder, which measures distance information of an object by using invisible light, and the camera, which measures color information of the object. In order to detect a calibration error between the laser range finder and the camera, an invisible light filter which blocks visible light and transmits invisible light is automatically attached to a lens of the camera by a switching operation between two kinds of lenses. With the invisible light filter attached, the camera photographs, as an image, the pattern of invisible light projected onto the object from the laser range finder.
    Type: Grant
    Filed: April 4, 2006
    Date of Patent: August 2, 2011
    Assignee: Hitachi, Ltd.
    Inventors: Kosei Matsumoto, Toshio Moriya, Tsuyoshi Minakawa
  • Publication number: 20110141274
    Abstract: A depth detection method includes the following steps. First, first and second video data are shot. Next, the first and second video data are compared to obtain initial similarity data including r×c×d initial similarity elements, wherein r, c and d are natural numbers greater than 1. Then, an accumulation operation is performed, with each similarity element serving as a center, according to a reference mask to obtain an iteration parameter. Next, n times of iteration update operations are performed on the initial similarity data according to the iteration parameter to generate updated similarity data. Then, it is judged whether the updated similarity data satisfy a character verification condition. If yes, the updated similarity data is converted into depth distribution data.
    Type: Application
    Filed: July 23, 2010
    Publication date: June 16, 2011
    Applicant: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE
    Inventors: Chih-Pin Liao, Yao-Yang Tsai, Jay Huang, Ko-Shyang Wang
  • Publication number: 20110118973
    Abstract: An image processing method and system are provided. The image processing method for a moving camera comprises the following steps. An image of a road is captured by a first camera unit. When the image of an object shown in the image of the road is selected, the coordinate of that object image is captured. At least an aiming angle of a second camera unit is adjusted according to the coordinate so that the field of view of the second camera unit is aligned with the object. The image of the object is then captured by the second camera unit and enlarged.
    Type: Application
    Filed: July 8, 2010
    Publication date: May 19, 2011
    Applicant: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE
    Inventors: Ming-Yu SHIH, Lih-Guong Jang, Jen-Hui Chuang, Chin-Chung Nien, Hsiao-Hu Tan, Chen-Chin Cheng, Shyi-Shing Hsieh
  • Publication number: 20110096161
    Abstract: A displacement sensing system is disclosed. Two image capturing devices are positioned at opposite ends of a coordinate axis of a planar area, respectively, for capturing images of the planar area and an object placed thereon. Four pre-established LUT databases and an interactive four-matrix lookup table process are used to determine the actual coordinates of the object in the planar area.
    Type: Application
    Filed: October 22, 2009
    Publication date: April 28, 2011
    Inventors: Chern-Sheng Lin, Chia-Tse Chen, Tzu-Chi Wei, Wei-Lung Chen, Chia-Chang Chang
  • Publication number: 20110090333
    Abstract: An optical inspection system for inspecting a substrate is provided. The system includes an array of cameras configured to acquire a plurality of sets of images as the substrate and the array undergo relative motion with respect to each other. At least one focus actuator is operably coupled to each camera of the array of cameras to cause displacement of at least a portion of each camera that affects focus. A substrate range calculator is configured to receive at least portions of images from the array and to calculate range between the array of cameras and the substrate. A controller is coupled to the array of cameras and to the range calculator. The controller is configured to provide a control signal to each of the at least one focus actuator to adaptively focus each camera of the array during the relative motion.
    Type: Application
    Filed: November 4, 2010
    Publication date: April 21, 2011
    Inventors: Carl E. Haugan, Steven K. Case, Beverly Caruso, Timothy A. Skunes
  • Publication number: 20110058033
    Abstract: A precision motion platform carrying an imaging device under a large-field-coverage lens enables capture of high resolution imagery over the full field in an instantaneous telephoto mode and wide-angle coverage through temporal integration. The device permits automated tracking and scanning without movement of a camera body or lens. Coupled use of two or more devices enables automated range computation without the need for subsequent epipolar rectification. The imager motion enables sample integration for resolution enhancement. The control methods for imager positioning enable decreasing the blur caused by the motion of the moving imager and by the motion of an object's image that the imager is intended to capture.
    Type: Application
    Filed: November 16, 2010
    Publication date: March 10, 2011
    Inventors: Henry H. Baker, John I. Woodfill, Pierre St. Hilaire, Nicholas R. Kalayjian
  • Publication number: 20110058032
    Abstract: A face detection apparatus and method are provided. The apparatus may acquire a distance difference image through a stereo camera and create an object mask using the distance difference image to detect a face candidate area. The apparatus may also determine a size of a search window using the distance difference image and detect a facial area in the face candidate area. Accordingly, an operation speed for face detection can be improved.
    Type: Application
    Filed: August 3, 2010
    Publication date: March 10, 2011
    Applicant: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Dong-ryeol PARK, Yeon-ho KIM
  • Patent number: 7898569
    Abstract: An angled-axis machine vision system has a camera system angled with respect to an axis of the coordinate system of the environment. This eliminates the problem of using horizontal and vertical lines in the environment for distance calculations when those lines are parallel, or close to parallel, to the axis lying between the camera centers of the camera system. With the camera centers angled about the roll axis, horizontal and vertical lines in the environment appear as angled lines in images taken from the cameras, enabling more accurate distance calculations. With the angled-axis rotation it is still possible for lines in the environment to be parallel to the axis defined between the camera centers, but such instances are rare in real-world environments. The camera mount may rotate, in which case two sets of pictures are taken and compared for the number of lines parallel to the axis of the camera centers, and the set with the fewest parallel lines is used for distance calculations.
    Type: Grant
    Filed: February 15, 2007
    Date of Patent: March 1, 2011
    Assignee: Vision Robotics Corporation
    Inventors: Harvey Koselka, Bret Wallach
  • Publication number: 20100328457
    Abstract: An apparatus for acquiring distance information and images includes an image sensor unit including a left image sensor, a central image sensor, and a right image sensor, which are arranged on one board while being spaced apart from one another at a predetermined interval, and a distance information extraction unit that extracts distance information by comparing an image of a reference object acquired through the central image sensor with an image of the reference object acquired through the left image sensor and an image of the reference object acquired through the right image sensor.
    Type: Application
    Filed: June 24, 2010
    Publication date: December 30, 2010
    Applicant: SILICONFILE TECHNOLOGIES INC.
    Inventor: Byoung-Su Lee
  • Publication number: 20100328456
    Abstract: An apparatus, method and software construct an image of a scene by determining distance information to an object in a scene; and using the distance information to correct parallax error when combining at least two images of the object which were captured from different viewpoints. A single image of the scene from the combining is then output, with the object corrected for parallax error. In one embodiment the distance information is input from an autofocus mechanism of a multi-camera imaging system, and in another the distance information is derived from one of an object recognition algorithm or a scene analysis algorithm. The two (or more) images that are combined are captured preferably on different lenslet cameras of the same multi-camera imaging system, each of which sees the object in the scene from a different viewpoint.
    Type: Application
    Filed: June 30, 2009
    Publication date: December 30, 2010
    Inventor: Juha H. Alakarhu
  • Publication number: 20100295942
    Abstract: A weapon orientation measuring device in accordance with the disclosure includes a processor configured to receive first location information indicative of locations of a first point and a second point on a weapon, the first and second points being a known distance apart in a direction parallel to a pointing axis of the weapon, and to receive second location information indicative of the locations of the first and second points on the weapon. The processor is further configured to receive information indicative of a first earth orientation, and determine a second earth orientation corresponding to the weapon based on the first and second location information and the information indicative of the first earth orientation. The first location information represents location relative to a first sensor at a first location and the second location information represents location relative to a second sensor at a second location, and the first and second sensors are separated by a given distance.
    Type: Application
    Filed: May 14, 2010
    Publication date: November 25, 2010
    Applicant: Cubic Corporation
    Inventor: Richard N. Jekel
  • Patent number: 7804518
    Abstract: A method for enhancing underwater imaging affected by image degradation effects, the method comprising: acquiring at least one image of an underwater scene using an imaging device; determining information regarding distances of parts of the scene relative to the imaging device; and reconstructing an image of the underwater scene using a physics-based mathematical model, compensating image characteristics influenced by distance-dependent underwater degradation effects including veiling light, using the information on the distances of parts of the scene from the imaging device, and compensating distance-dependent underwater degradation effects relating to the distance of illumination sources from the scene.
    Type: Grant
    Filed: February 13, 2005
    Date of Patent: September 28, 2010
    Assignee: Technion Research and Development Foundation Ltd.
    Inventors: Yoav Schechner, Nir Karpel
  • Patent number: 7782499
    Abstract: An image scanning apparatus with a preview function includes a scanning unit that scans a plurality of documents to obtain scanned images of the plurality of documents; a display unit; and a control unit that controls the display unit to display the scanned images of the plurality of documents as preview images of the plurality of documents.
    Type: Grant
    Filed: April 26, 2006
    Date of Patent: August 24, 2010
    Assignee: Samsung Electronics Co., Ltd.
    Inventor: Hyung-sik Byun
  • Patent number: 7777781
    Abstract: Multiple motion sensor devices are included in an imaging apparatus. One or more of the motion sensor devices captures images and at least a portion of the sensor images are then processed to determine one or more motion vectors for the processed sensor images. At least one motion vector representing an amount of motion of the imager during image capture is generated when a predetermined portion of the motion vectors match.
    Type: Grant
    Filed: August 26, 2005
    Date of Patent: August 17, 2010
    Assignee: Avago Technologies ECBU IP (Singapore) Pte. Ltd.
    Inventors: Rene P. Helbing, William R. Trutna, Jr.
  • Publication number: 20100157048
    Abstract: A positioning system and a method thereof are provided. In the positioning method, first and second pose information of a moving device are obtained by a first positioning device and a second positioning device, respectively, where the first pose information corresponds to the second pose information. A plurality of first candidate pose information is generated within an error range of the first pose information. A plurality of second candidate pose information is then generated according to the first candidate pose information, respectively. The second candidate pose information having the smallest error with respect to the second pose information is selected for updating the pose information of the first positioning device and the parameter information of the second positioning device. Thereby, the pose information of the moving device is updated and the parameter information of the second positioning device is calibrated simultaneously.
    Type: Application
    Filed: February 17, 2009
    Publication date: June 24, 2010
    Applicant: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE
    Inventors: Hsiang-Wen Hsieh, Jwu-Sheng Hu, Shyh-Haur Su, Chin-Chia Wu
  • Publication number: 20100141759
    Abstract: A method and system obtains precise survey-grade position data of target points in zones where precise GPS data cannot be obtained, due to natural or man-made objects such as foliage and buildings. The system comprises position measurement components such as a GPS receiver, together with an inertial measurement unit (IMU) and an electronic distance meter (EDM) or an image capture device all mounted on a survey pole. The system and method obtains GPS data when outside the zone and uses the other position measurement systems, such as the IMU, inside the zone to traverse to a target point. The EDM or the image capture device may be used within or without the zone to obtain data to reduce accumulated position errors.
    Type: Application
    Filed: November 16, 2009
    Publication date: June 10, 2010
    Inventor: Bruno Scherzinger
  • Patent number: 7729516
    Abstract: An image size changing section obtains a size changed image by changing the size of one of two original images captured by a pair of cameras. If an edge of an object has many oblique components, the edge is difficult to detect as a vertical edge but, when the image is horizontally reduced, an oblique edge becomes close to a vertical edge. For this reason, feature end points are extracted with reliability by a feature extracting section and, thereby, an object is recognized and the distance to the object is determined reliably.
    Type: Grant
    Filed: August 16, 2007
    Date of Patent: June 1, 2010
    Assignee: Fujitsu Ten Limited
    Inventors: Nobukazu Shima, Akihiro Oota, Kenji Oka
  • Publication number: 20100103258
    Abstract: A method for determining a relative position of a first camera with respect to a second camera comprises the following steps: determining at least first, second and third positions of respective reference points with respect to the first camera; determining at least first, second and third distances of said respective reference points with respect to the second camera; and calculating the relative position of the second camera with respect to the first camera using at least the first to third positions and the first to third distances. (A trilateration sketch follows this entry.)
    Type: Application
    Filed: March 17, 2008
    Publication date: April 29, 2010
    Applicant: NXP, B.V.
    Inventors: Ivan Moise, Richard P. Kleihorst
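    A minimal trilateration sketch of the calculation described above: given at least three reference-point positions known in the first camera's frame and the corresponding distances measured from the second camera, recover the second camera's position. The SciPy least-squares formulation and the sample coordinates are assumptions; with exactly three points there are two mirror-image solutions, so the initial guess (or a fourth point) selects the correct one.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def locate_second_camera(ref_points, distances, x0):
        """Trilaterate the second camera's position in the first camera's frame.

        ref_points: (N, 3) reference-point positions in camera-1 coordinates
        distances:  (N,) distances from camera 2 to those reference points
        x0:         initial guess for camera 2's position
        """
        P = np.asarray(ref_points, float)
        r = np.asarray(distances, float)
        residual = lambda x: np.linalg.norm(P - x, axis=1) - r
        return least_squares(residual, np.asarray(x0, float)).x

    # Hypothetical data: camera 2 actually sits at (1, 2, 0.5) in camera-1's frame.
    pts = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 0.0], [0.0, 3.0, 0.0]])
    true_cam2 = np.array([1.0, 2.0, 0.5])
    d = np.linalg.norm(pts - true_cam2, axis=1)
    print(locate_second_camera(pts, d, x0=(0.5, 0.5, 0.5)))  # ≈ [1. 2. 0.5]
    ```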
  • Publication number: 20100103259
    Abstract: An object distance deriving device comprises: a compound-eye imaging device having imaging units with optical lenses randomly arranged for the respective imaging units; and a distance calculation unit to calculate an object distance using images captured by the compound-eye imaging device. The distance calculation unit: sets temporary distances z (S1); calculates an imaging process matrix [Hz] according to a temporary distance z (S2); estimates a high-resolution image by super-resolution processing using the imaging process matrix [Hz] (S3); uses the estimated high-resolution image to calculate an evaluation value distribution E for evaluating the temporary distance z (S4); repeats steps S2 to S4 for all temporary distances z (S5); and determines, as an object distance, one temporary distance z giving a minimum evaluation value in the evaluation value distributions E. This makes it possible to accurately derive the object distance even if the baseline length of the compound-eye imaging device is limited.
    Type: Application
    Filed: October 20, 2009
    Publication date: April 29, 2010
    Applicants: Funai Electric Co., Ltd., Osaka University
    Inventors: Jun TANIDA, Ryoichi Horisaki, Yoshizumi Nakao, Takashi Toyoda, Yasuo Masaki
  • Patent number: 7705887
    Abstract: A terminal device 1 including a wireless communication unit 11 is carried by a user who will become a subject, and a wireless communication unit 21 for performing directional data communication in an imaging direction is provided in a camera 2. When the terminal device 1 and the camera 2 become able to communicate data with each other, the subject user carrying the terminal device 1 is photographed, and the image data obtained in this way are transmitted to the terminal device 1. The terminal device 1 displays the image data. If necessary, the subject user issues an instruction to print the image data and transmits information on the instruction to print to the camera 2. The camera 2 assigns the information on the instruction to print to the image data, and transmits the information on the instruction to print and the image data to an image server 4 and further to a printer 5.
    Type: Grant
    Filed: December 5, 2003
    Date of Patent: April 27, 2010
    Assignee: Fujifilm Corporation
    Inventor: Hisayoshi Tsubaki
  • Publication number: 20100076683
    Abstract: A car and ship blind spot-free collision avoidance system includes a plurality of image acquiring units and detecting units externally spaced along the car or the ship, an image processing unit, at least one image display unit, and a power supply unit, all being electrically connected to a control unit. The power supply unit supplies power needed by the system to operate normally. The control unit can determine the car or the ship position and moving direction via gear position of a shift lever and turning direction of a steering wheel, and enable real-time images shot by the image acquiring units corresponding to that position or direction to be processed by the image processing unit and then synchronously displayed on a split screen of the image display unit, so as to eliminate blind spots in driving the car or steering the ship and minimize collision and other accidents.
    Type: Application
    Filed: September 25, 2008
    Publication date: March 25, 2010
    Applicant: TECH-CAST MFG CORP.
    Inventor: RONG-HER CHOU
  • Patent number: 7657121
    Abstract: A method for simultaneously capturing images of multiple areas is applied to an image processing device. The method includes scanning a document and obtaining a preview image of the document; providing N scan windows for selecting N to-be-scanned areas from the preview image, wherein N is an integer; setting N scan resolutions respectively for the N scan windows; scanning the document; and outputting N scan images corresponding to the N scan windows according to N frequencies of a timing signal corresponding to the N scan resolutions.
    Type: Grant
    Filed: August 18, 2006
    Date of Patent: February 2, 2010
    Assignee: Avision Inc.
    Inventor: Shing-Chia Chen
  • Patent number: 7629994
    Abstract: A quantum nanodot processing system comprises: at least one image capture camera configured to capture scenes including actors and/or objects in a visible band; and at least one marker capture camera configured to capture motions of the actors and/or objects to which at least one quantum nanodot (QD) marker is applied, wherein the at least one marker capture camera is tuned to capture narrowband IR signals emitted by the at least one QD marker.
    Type: Grant
    Filed: July 11, 2007
    Date of Patent: December 8, 2009
    Assignees: Sony Corporation, Sony Pictures Entertainment Inc.
    Inventor: Bruce Dobrin