Robotics Patents (Class 382/153)
  • Patent number: 8879822
    Abstract: A robot control system includes a processing unit which performs visual servoing based on a reference image and a picked-up image, a robot control unit which controls a robot based on a control signal, and a storage unit which stores the reference image and a marker. The storage unit stores, as the reference image, a reference image with marker in which the marker is set in an area of a workpiece or a hand of the robot. The processing unit generates, based on the picked-up image, a picked-up image with marker in which the marker is set in an area of the workpiece or the hand of the robot, performs visual servoing based on the reference image with marker and the picked-up image with marker, generates the control signal, and outputs the control signal to the robot control unit.
    Type: Grant
    Filed: May 15, 2012
    Date of Patent: November 4, 2014
    Assignee: Seiko Epson Corporation
    Inventor: Shigeyuki Matsumoto
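Visual servoing of the kind this abstract describes closes a control loop on image-space error between a reference image and a picked-up image. As a generic illustration only (function names and the proportional control law are mine, not the patented marker-based method), a minimal servo step on a single marker might look like:

```python
import numpy as np

def visual_servo_step(marker_ref, marker_obs, gain=0.5):
    """One proportional image-based visual-servoing update.

    marker_ref: (x, y) pixel position of the marker in the reference image.
    marker_obs: (x, y) pixel position of the marker in the picked-up image.
    Returns an image-space velocity command that drives the observed
    marker toward its reference position.
    """
    error = np.asarray(marker_obs, dtype=float) - np.asarray(marker_ref, dtype=float)
    return -gain * error  # control signal handed to the robot control unit

# Marker observed 10 px right and 4 px below its reference position:
cmd = visual_servo_step((320, 240), (330, 244))
# cmd == array([-5., -2.]): move left and up to shrink the error
```

In the patent the marker is set in an area of the workpiece or hand in both images; here that detection is abstracted away into the two pixel coordinates.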
  • Patent number: 8873832
    Abstract: The present invention relates to a slip detection apparatus and method for a mobile robot, which not only use a plurality of rotation detection sensors to detect a lateral slip angle and lateral slip direction, but also analyze the amount of change in an image and detect the degree to which an image input unit is blocked in order to determine the quality of an input image, and detect the occurrence of a frontal slip, so as to precisely identify the type, direction, and rotation angle of the slip and, on that basis, to enable the mobile robot to move away from and avoid slip regions and to re-establish its precise position.
    Type: Grant
    Filed: October 30, 2009
    Date of Patent: October 28, 2014
    Assignee: Yujin Robot Co., Ltd.
    Inventors: Kyung Chul Shin, Seong Ju Park, Hee Kong Lee, Jae Young Lee, Hyung O Kim, James Stonier Daniel
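The core signal for lateral-slip detection is disagreement between the heading implied by wheel odometry and an independent rotation sensor. A toy version of that comparison (the function name, threshold, and return convention are illustrative assumptions, not the patented method):

```python
def detect_lateral_slip(odom_heading_deg, gyro_heading_deg, threshold_deg=5.0):
    """Flag a lateral slip when the heading from wheel odometry disagrees
    with an independent rotation sensor (e.g. a gyro) by more than the
    threshold. Returns (slipped, slip_angle_deg, slip_direction)."""
    # Signed smallest angular difference, wrapped into [-180, 180):
    diff = (gyro_heading_deg - odom_heading_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= threshold_deg:
        return (False, 0.0, "none")
    return (True, abs(diff), "left" if diff > 0 else "right")
```

The patent additionally fuses image-change analysis and blockage detection of the image input unit; those cues would gate whether the visual estimate is trusted at all.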
  • Patent number: 8873831
    Abstract: A walking robot and a simultaneous localization and mapping method thereof in which odometry data acquired during movement of the walking robot are applied to image-based SLAM technology so as to improve accuracy and convergence of localization of the walking robot. The simultaneous localization and mapping method includes acquiring image data of a space about which the walking robot walks and rotational angle data of rotary joints relating to walking of the walking robot, calculating odometry data using kinematic data of respective links constituting the walking robot and the rotational angle data, and localizing the walking robot and mapping the space about which the walking robot walks using the image data and the odometry data.
    Type: Grant
    Filed: December 15, 2011
    Date of Patent: October 28, 2014
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Sung Hwan Ahn, Kyung Shik Roh, Suk June Yoon, Seung Yong Hyung
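For a walking robot, odometry comes from forward kinematics of the leg links and measured joint angles rather than from wheel encoders. A minimal planar two-link example of that kinematic calculation (illustrative only; the patent covers the full link chain and its fusion with image-based SLAM):

```python
import math

def foot_position(l1, l2, q1, q2):
    """Planar forward kinematics of a two-link leg: hip angle q1 and
    knee angle q2 (radians) with link lengths l1, l2 give the foot
    position relative to the hip. Accumulating such link-by-link
    kinematic data over a gait yields the odometry term."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y
```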
  • Publication number: 20140314306
    Abstract: Provided are a robot for managing a structure, and a method of controlling the robot. The robot for maintaining and repairing the structure measures a luminance value by capturing an image of the structure, or measures depth information of the structure by using a laser sensor or stereo vision, determines a protruding portion or depressed portion of the structure by using the measured luminance value or the measured depth information. Also, the robot removes the determined protruding portion and fills the determined depressed portion by using a combination hardener. Accordingly, protrusion, depression, and crack of a wall caused by deterioration or poor construction of the structure may be automatically found and repaired so as to efficiently manage the structure.
    Type: Application
    Filed: October 10, 2013
    Publication date: October 23, 2014
    Applicant: Daegu Gyeongbuk Institute of Science and Technology
    Inventors: Seung Yeol LEE, Jeon Il MOON
  • Patent number: 8867819
    Abstract: Mechanisms are provided for determining the physical location of a physical asset in a physical area. A plurality of physical assets are controlled to cause each physical asset to output a visual output pattern on visual output elements of the physical asset. An image of a target physical asset is captured that has the current state of the visual output elements. An identification of the target physical asset is determined based on the current state of the visual output elements. A physical location of the target physical asset is determined based on a physical location of the image capture device when the image was captured. Location data identifying the determined physical location of the target physical asset is stored in an asset database in association with configuration information for the physical asset.
    Type: Grant
    Filed: January 28, 2013
    Date of Patent: October 21, 2014
    Assignee: International Business Machines Corporation
    Inventors: Robert J. Calio, Jonathan H. Connell, II, Michael J. Frissora, Canturk Isci, Jeffrey O. Kephart, Jonathan Lenchner, Suzanne K. McIntosh, Iqbal I. Mohomed, John C. Nelson, James W. Thoensen
  • Patent number: 8861785
    Abstract: An information processing device, including: a three-dimensional information generating section for obtaining position and attitude of a moving camera or three-dimensional positions of feature points by successively receiving captured images from different viewpoints, and updating status data using observation information which includes tracking information of the feature points, the status data including three-dimensional positions of the feature points within the images and position and attitude information of the camera; and a submap generating section for generating submaps by dividing an area for which the three-dimensional position is to be calculated.
    Type: Grant
    Filed: August 12, 2010
    Date of Patent: October 14, 2014
    Assignee: Sony Corporation
    Inventors: Kenichiro Oi, Masaki Fukuchi
  • Patent number: 8855406
    Abstract: A system and method are disclosed for estimating camera motion of a visual input scene using points and lines detected in the visual input scene. The system includes a camera server comprising a stereo pair of calibrated cameras, a feature processing module, a trifocal motion estimation module and an optional adjustment module. The stereo pair of the calibrated cameras and the corresponding stereo pair of cameras after camera motion form a first and a second trifocal tensor. The feature processing module is configured to detect points and lines in the visual input data comprising a plurality of image frames. The feature processing module is further configured to find point correspondences between detected points and line correspondences between detected lines in different views. The trifocal motion estimation module is configured to estimate the camera motion using the detected points and lines associated with the first and the second trifocal tensor.
    Type: Grant
    Filed: August 26, 2011
    Date of Patent: October 7, 2014
    Assignee: Honda Motor Co., Ltd.
    Inventors: Jongwoo Lim, Vivek Pradeep
  • Patent number: 8855819
    Abstract: A SLAM of a robot is provided. The position of a robot and the position of feature data may be estimated by acquiring an image of the robot's surroundings, extracting feature data from the image, and matching the extracted feature data with registered feature data. Furthermore, measurement update is performed in a camera coordinate system and an appropriate assumption is added upon coordinate conversion, thereby reducing non-linear components and thus improving the SLAM performance.
    Type: Grant
    Filed: July 22, 2009
    Date of Patent: October 7, 2014
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Ki-wan Choi, Hyoung-ki Lee, Ji-young Park
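The "measurement update" this abstract refers to is the correction step of an extended Kalman filter. A generic sketch of that step (standard EKF algebra; the patent's specific camera-coordinate formulation and its linearizing assumption are not reproduced here):

```python
import numpy as np

def ekf_update(x, P, z, h_x, H, R):
    """Standard EKF measurement update.

    x, P : state estimate and covariance (robot pose + feature positions)
    z    : actual measurement, e.g. an observed feature location
    h_x  : predicted measurement h(x)
    H    : Jacobian of h at x -- the linearization where a coordinate
           conversion can introduce the non-linear components the
           patent seeks to reduce
    R    : measurement noise covariance
    """
    y = z - h_x                          # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```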
  • Patent number: 8855404
    Abstract: Methods and systems for inspecting a workpiece are provided. The method includes storing model data associated with the workpiece in an inspection system, determining a relative position of a depth sensing device relative to the workpiece, and calibrating a pose view for the inspection system relative to the model based on the position of the depth sensing device relative to the workpiece. The method further includes measuring actual depth distance data of at least one pixel of the depth sensing device relative to the workpiece and determining, based on the actual depth distance data, if the workpiece satisfies predetermined inspection criteria.
    Type: Grant
    Filed: August 27, 2012
    Date of Patent: October 7, 2014
    Assignee: The Boeing Company
    Inventors: Joseph D. Doyle, Paul R. Davies
  • Publication number: 20140294286
    Abstract: A three-dimensional measurement method three-dimensionally restores an edge having a cross angle close to parallel to an epipolar line. Edges e2L, e3L and e4L on the same plane of a work are selected, and among these edges, at least two edges e2L and e4L residing in a predetermined angle range with reference to the cross angle of 90° crossing the epipolar line are three-dimensionally restored by a stereo method. Then, a three-dimensional plane P1 including these three-dimensionally restored e2L and e4L is found, and the edge e3L residing beyond a predetermined angle range with reference to the cross angle of 90° crossing the epipolar line is projected to this three-dimensional plane P1, thus three-dimensionally restoring the edge e3L.
    Type: Application
    Filed: November 29, 2012
    Publication date: October 2, 2014
    Inventor: Hiroshi Kitajima
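The selection criterion in this abstract is the cross angle between an edge and the epipolar line: edges near 90° triangulate well by the stereo method, while edges nearly parallel to the epipolar line are instead projected onto the fitted 3D plane. A sketch of that angle test (function names and the tolerance are my assumptions):

```python
import math

def cross_angle_deg(edge_dir, epipolar_dir):
    """Acute angle in degrees between an edge direction and the epipolar
    line direction (0 = parallel, 90 = perpendicular)."""
    dot = edge_dir[0] * epipolar_dir[0] + edge_dir[1] * epipolar_dir[1]
    cos_a = abs(dot) / (math.hypot(*edge_dir) * math.hypot(*epipolar_dir))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

def restorable_by_stereo(edge_dir, epipolar_dir, tol_deg=30.0):
    """Keep an edge for direct stereo restoration only if its cross angle
    lies within tol_deg of 90 degrees; the rest are handled by projection
    onto the three-dimensional plane."""
    return cross_angle_deg(edge_dir, epipolar_dir) >= 90.0 - tol_deg
```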
  • Patent number: 8849036
    Abstract: The present invention relates to a map generating and updating method for mobile robot position recognition, whereby position recognition error can be minimized by registering landmarks extracted during map generation along with landmarks extracted on the basis of the probable error in inferred landmarks, calculating the accuracy of landmarks pre-registered during map generation, and adjusting the level of landmarks of low accuracy or removing landmarks which have been registered erroneously.
    Type: Grant
    Filed: October 30, 2009
    Date of Patent: September 30, 2014
    Assignee: Yujin Robot Co., Ltd.
    Inventors: Kyung Chul Shin, Seong Ju Park, Hee Kong Lee, Jae Young Lee, Hyung O Kim, James Stonier Daniel
  • Patent number: 8842906
    Abstract: A method of generating three dimensional body data of a subject is described. The method includes capturing one or more images of the subject using a digital imaging device and generating three dimensional body data of the subject based on the one or more images.
    Type: Grant
    Filed: August 8, 2012
    Date of Patent: September 23, 2014
    Assignee: Poikos Limited
    Inventors: Eleanor Watson, David Evans
  • Patent number: 8831353
    Abstract: A color determination unit determines color information of a light-emitting body of an input device. A transmitter unit communicates the determined color information to the input device. A recording unit records a history of the color information determined by the color determination unit. A color candidate determination unit determines one or more candidates of emitted color of the light-emitting body, using the color information recorded in the recording unit. An acknowledging unit acknowledges from the user a command to determine a candidate of emitted light, and the color determination unit determines the color information of the light-emitting body accordingly.
    Type: Grant
    Filed: June 10, 2011
    Date of Patent: September 9, 2014
    Assignees: Sony Corporation, Sony Computer Entertainment Inc.
    Inventor: Yoshio Miyazaki
  • Patent number: 8831872
    Abstract: An apparatus and method for estimating a location of a mobile body and generating a map of a mobile body environment using an upper image of the mobile body environment, and a computer readable recording medium storing a computer program for controlling the apparatus. The apparatus includes: a landmark generating unit observing corner points from the upper image obtained by photographing in an upper vertical direction in the mobile body environment and respectively generating landmarks from the observed corner points; and a location and map operation unit estimating the location of the mobile body and generating the map of the mobile body environment from the landmarks.
    Type: Grant
    Filed: January 13, 2006
    Date of Patent: September 9, 2014
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Sungi Hong, Hyoungki Lee, Hyeon Myeong
  • Patent number: 8824775
    Abstract: Disclosed herein are a feature point used for image-based localization and map building of a robot, and a method of extracting and matching an image patch of a three-dimensional (3D) image, which is used as the feature point. It is possible to extract the image patch converted into the reference image using the position information of the robot and the 3D position information of the feature point. Also, it is possible to obtain the 3D surface information from the brightness values of the image patches and to obtain the match value with the minimum error by a 3D surface matching method that matches, through the ICP algorithm, the 3D surface information of the image patches converted into the reference image.
    Type: Grant
    Filed: December 11, 2009
    Date of Patent: September 2, 2014
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Suk June Yoon, Kyung Shik Roh, Woong Kwon, Seung Yong Hyung, Hyun Kyu Kim, Sung Hwan Ahn
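The ICP algorithm named in the abstract alternates between matching correspondences and solving a rigid best-fit between the matched 3D point sets. The inner solve, for known correspondences, is the standard Kabsch/SVD step shown below (a generic sketch of that step, not the patented image-patch pipeline):

```python
import numpy as np

def best_fit_transform(A, B):
    """Rigid transform (R, t) minimizing ||R @ a + t - b|| over matched
    3D point sets A, B (rows are corresponding points). ICP repeats this
    solve after re-matching correspondences each iteration."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)            # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:             # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cB - R @ cA
    return R, t
```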
  • Patent number: 8824776
    Abstract: A collision detection system includes a processing section, a drawing section, and a depth buffer. Depth information of an object is set to the depth buffer as depth map information. The drawing section performs a first drawing process of performing a depth test, and drawing a primitive surface on a reverse side when viewed from a predetermined viewpoint out of primitive surfaces constituting a collision detection target object with reference to the depth buffer. Further, the drawing section performs a second drawing process of drawing the primitive surface on the reverse side when viewed from a predetermined viewpoint out of the primitive surfaces constituting the collision detection target object without performing the depth test. The processing section determines whether or not the collision detection target object collides with the object on the target side based on the result of the first drawing process and the second drawing process.
    Type: Grant
    Filed: April 18, 2012
    Date of Patent: September 2, 2014
    Assignee: Seiko Epson Corporation
    Inventor: Mitsuhiro Inazumi
  • Patent number: 8824777
    Abstract: There is provided a method of post-correction of a 3D feature point-based direct teaching trajectory, which improves direct teaching performance by extracting shape-based feature points based on curvature and velocity and improving a direct teaching trajectory correction algorithm using the shape-based feature points. Particularly, there is provided a method of post-correction of a 3D feature point-based direct teaching trajectory, which makes it possible to extract and post-correct a 3D (i.e., spatial) trajectory, as well as a 2D (i.e., planar) trajectory, with higher accuracy.
    Type: Grant
    Filed: July 6, 2012
    Date of Patent: September 2, 2014
    Assignee: Korea Institute of Machinery & Materials
    Inventors: Tae Yong Choi, Chan-Hun Park, Hyun Min Do, Jin-Ho Kyung
  • Patent number: 8805057
    Abstract: A structured light pattern including a set of patterns in a sequence is generated by initializing a base pattern. The base pattern includes a sequence of colored stripes such that each subsequence of the colored stripes is unique for a particular size of the subsequence. The base pattern is shifted hierarchically, spatially and temporally a predetermined number of times to generate the set of patterns, wherein each pattern is different spatially and temporally. A unique location of each pixel in a set of images acquired of a scene is determined, while projecting the set of patterns onto the scene, wherein there is one image for each pattern.
    Type: Grant
    Filed: July 31, 2012
    Date of Patent: August 12, 2014
    Assignee: Mitsubishi Electric Research Laboratories, Inc.
    Inventors: Yuichi Taguchi, Amit Agrawal, Oncel Tuzel
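The base pattern's property, that every subsequence of a given size is unique, is exactly what a De Bruijn sequence provides, and De Bruijn colorings are the textbook construction for stripe-coded structured light. A standard generator is sketched below (the patent does not necessarily use this exact construction; this only illustrates the uniqueness property):

```python
def de_bruijn(k, n):
    """De Bruijn sequence over symbols 0..k-1 with window length n:
    every length-n subsequence of the cyclic sequence occurs exactly
    once, so any window of n stripes identifies its position uniquely."""
    a = [0] * (k * n)
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

colors = de_bruijn(3, 2)   # 9 stripes from 3 colors, unique length-2 windows
```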
  • Patent number: 8798794
    Abstract: An object is highly precisely moved by an industrial robot to an end position by the following steps, which are repeated until the end position is reached within a specified tolerance: Recording a three-dimensional image by means of a 3-D image recording device. Determining the present position of the object in the spatial coordinate system from the position of the 3-D image recording device, the angular orientation of the 3-D image recording device detected by an angle measuring unit, the three-dimensional image, and the knowledge of features on the object. Calculating the position difference between the present position of the object and the end position. Calculating a new target position of the industrial robot while taking into consideration the compensation value from the present position of the industrial robot and a value linked to the position difference. Moving the industrial robot to the new target position.
    Type: Grant
    Filed: May 26, 2010
    Date of Patent: August 5, 2014
    Assignee: Leica Geosystems AG
    Inventors: Bernd Walser, Bernhard Metzler, Beat Aebischer, Knut Siercks, Bo Pettersson
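The repeated measure/compare/move cycle in this abstract can be sketched as a simple closed loop. The callables and tolerance below are illustrative placeholders: in the patent the measurement comes from the calibrated 3-D image recording device and the move command incorporates a compensation value, both of which are abstracted away here.

```python
def move_to_end_position(measure_pose, command_move, end_position,
                         tol=1e-3, max_iters=50):
    """Repeat the measure / compare / move cycle until the measured
    object position is within tolerance of the end position.

    measure_pose() returns the current object position;
    command_move(delta) commands the robot to move by the computed
    position difference.
    """
    for _ in range(max_iters):
        pose = measure_pose()
        diff = [e - p for e, p in zip(end_position, pose)]
        if max(abs(d) for d in diff) <= tol:
            return True          # end position reached within tolerance
        command_move(diff)       # new target from the position difference
    return False
```

Because each cycle re-measures the object rather than trusting the commanded motion, the loop converges even when the robot under- or over-shoots each commanded move.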
  • Publication number: 20140212025
    Abstract: A registration system and method includes a configurable device (104) having one or more moveable features (122) such that movement of the moveable features can be determined relative to a reference to define a specific configuration of the configurable device. An imaging system (110) has a display on which the configurable device is viewable. A processing device (112) is configured to register the configurable device with a coordinate system of the imaging system based on the specific configuration of the configurable device.
    Type: Application
    Filed: September 5, 2012
    Publication date: July 31, 2014
    Applicant: KONINKLIJKE PHILIPS ELECTRONICS N.V.
    Inventors: Paul Thienphrapa, Bharat Ramachandran, Aleksandra Popovic
  • Patent number: 8792709
    Abstract: Systems and methods for transprojection of geometry data acquired by a coordinate measuring machine (CMM). The CMM acquires geometry data corresponding to 3D coordinate measurements collected by a measuring probe that are transformed into scaled 2D data that is transprojected upon various digital object image views captured by a camera. The transprojection process can utilize stored image and coordinate information or perform live transprojection viewing capabilities in both still image and video modes.
    Type: Grant
    Filed: June 21, 2012
    Date of Patent: July 29, 2014
    Assignee: Hexagon Metrology, Inc.
    Inventors: Sandeep Pulla, Homer Eaton
  • Patent number: 8787614
    Abstract: A system building a map while an image sensor is moving, the system including the image sensor configured to capture images while the image sensor moves relative to one or more different locations, a sub-map building unit configured to recognize a relative location for at least the image sensor of the system using the captured images, build up a sub-map, and if a condition for stopping a building of the sub-map is met, store the sub-map which has been so far built up, an operation determining unit configured to determine whether the condition for stopping the building of the sub-map is met, an image group storing unit configured to store an image group including images that are newly captured from the image sensor after the storing of the sub-map when the condition for the stopping of the building of the sub-map is satisfied, and an overall map building unit configured to build an overall map based on the built sub-map and the stored image group when a current relative location for at least the image sensor of the system is …
    Type: Grant
    Filed: May 3, 2011
    Date of Patent: July 22, 2014
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Hyun-Do Choi, Woo-Yeon Jeong
  • Patent number: 8781254
    Abstract: A method of data processing is provided for estimating a position of an object in an image from a position of a reference object in a reference image. The method includes learning the position of the reference object in the reference image and its relation to a set of reference landmarks in the reference image, accessing the image, accessing the relation between the position of the reference object and the set of the reference landmarks, identifying a set of landmarks in the image corresponding to the set of the reference landmarks, and applying the relation to the set of landmarks in the image for estimating the position of the object in the image.
    Type: Grant
    Filed: November 26, 2007
    Date of Patent: July 15, 2014
    Assignee: Koninklijke Philips N.V.
    Inventors: Stewart Young, Daniel Bystrov, Thomas Netsch, Michael Kaus, Vladimir Pekar
  • Patent number: 8781164
    Abstract: Disclosed are a mobile robot and a method of controlling the same. The mobile robot is capable of reducing position recognition error and performing precise position recognition, even under changes of external illumination, through geometric constraints, when recognizing its position using a low-quality camera (e.g., a camera having a low resolution). Furthermore, feature points may be extracted from images detected using a low-quality camera, and the feature points may be robustly matched with each other even under changes of external illumination, through geometric constraints imposed by the feature lines. This may enhance the performance of the conventional camera-based position recognition method, which is susceptible to illumination changes, and improve the efficiency of a system.
    Type: Grant
    Filed: August 26, 2011
    Date of Patent: July 15, 2014
    Assignee: LG Electronics Inc.
    Inventors: Seongsu Lee, Sangik Na, Yiebin Kim, Seungmin Baek
  • Patent number: 8768514
    Abstract: An image taking system including: (a) a lighting device capable of changing a light emission time to various time length values; (b) an image taking device configured to take an image of a subject portion while light is being emitted by the lighting device; (c) a subject-portion moving device configured to move the subject portion relative to the image taking device, and capable of changing a movement velocity of the subject portion relative to the image taking device, to various velocity values; and (d) a control device configured, during movement of the subject portion by the subject-portion moving device, to cause the lighting device to emit the light for one of the time length values as the light emission time and to cause the image taking device to take the image, and is configured to control the movement velocity, such that an amount of the movement of the subject portion for the above-described one of the time length values is not larger than a predetermined movement amount.
    Type: Grant
    Filed: April 26, 2010
    Date of Patent: July 1, 2014
    Assignee: Fuji Machine Mfg. Co., Ltd.
    Inventor: Kazumi Hoshikawa
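The control constraint in this abstract is simple arithmetic: the subject's movement during the light-emission time must not exceed a predetermined amount, which bounds the permissible velocity for a chosen emission time. A sketch (units and function names are my assumptions):

```python
def movement_during_exposure(velocity_mm_s, emission_time_s):
    """Distance the subject portion moves while the light is emitted."""
    return velocity_mm_s * emission_time_s

def velocity_limit_mm_s(emission_time_s, max_movement_mm):
    """Largest movement velocity for which the movement during the chosen
    light-emission time does not exceed the predetermined amount."""
    return max_movement_mm / emission_time_s
```

For example, with a 2 ms emission time and a 0.2 mm allowed blur, the subject may move at up to 100 mm/s relative to the image taking device.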
  • Patent number: 8755591
    Abstract: A method of classifying and collecting feature information of an area according to a robot's moving path, a robot controlled by area features, and a method and apparatus for composing a user interface using the area features are disclosed. The robot includes a plurality of sensor modules to collect feature information of a predetermined area along a moving path of the robot, and an analyzer to analyze the collected feature information of the predetermined area according to a predetermined reference range and to classify the collected feature information into a plurality of groups.
    Type: Grant
    Filed: June 4, 2013
    Date of Patent: June 17, 2014
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Seung-Nyung Chung, Hyun-jeong Lee, Hyun-jin Kim, Hyeon Myeong
  • Patent number: 8755590
    Abstract: The technology disclosed relates to scanning of large flat substrates for reading and writing images. Examples are flat panel displays, PCB's and photovoltaic panels. Reading and writing is to be understood in a broad sense: reading may mean microscopy, inspection, metrology, spectroscopy, interferometry, scatterometry, etc. of a large workpiece, and writing may mean exposing a photoresist, annealing by optical heating, ablating, or creating any other change to the surface by an optical beam. In particular, we disclose a technology that uses a rotating or swinging arm that describes an arc across a workpiece as it scans, instead of following a traditional straight-line motion.
    Type: Grant
    Filed: May 13, 2013
    Date of Patent: June 17, 2014
    Assignee: Micronic Laser Systems AB
    Inventors: Torbjorn Sandstrom, Sten Lindau
  • Publication number: 20140161345
    Abstract: Methods and robots for adjusting object detection parameters, object recognition parameters, or both object detection parameters and object recognition parameters are disclosed. Methods include receiving image data, automatically recognizing an object with an object recognition module based on the image data, determining whether a pose estimation error has occurred, and adjusting at least one object recognition parameter when the pose estimation error has occurred. Methods include receiving image data and automatically detecting a candidate object with an object detection module based on the image data, recognizing an object with an object recognition module based on the detected candidate object, determining whether an object recognition error has occurred, and adjusting the at least one object detection parameter when the object recognition error has occurred.
    Type: Application
    Filed: December 6, 2012
    Publication date: June 12, 2014
    Applicant: Toyota Motor Engineering & Manufacturing North America, Inc.
    Inventor: Toyota Motor Engineering & Manufacturing North America, Inc.
  • Patent number: 8744215
    Abstract: Methods and systems to improve operator control of mobile robots are disclosed. The invention comprises in various embodiments the aggregation of multiple image feeds to improve operator situational awareness and the dynamic selection of command reference frames to improve operator intuitive control. The disclosed methods and systems reduce operator workload, reduce task completion times, and extend the capabilities of mobile manipulation systems.
    Type: Grant
    Filed: June 27, 2011
    Date of Patent: June 3, 2014
    Assignee: American Android Corp.
    Inventors: David A. Handelman, Haldun Komsuoglu, Gordon H. Franken
  • Patent number: 8744665
    Abstract: The present invention relates to a control method for the localization and navigation of a mobile robot and a mobile robot using the same. More specifically, the localization and navigation of a mobile robot are controlled using inertial sensors and images, wherein local direction descriptors are employed, the driving mode of the mobile robot is changed according to its conditions, and errors in localization may be minimized.
    Type: Grant
    Filed: July 28, 2009
    Date of Patent: June 3, 2014
    Assignee: Yujin Robot Co., Ltd.
    Inventors: Kyung Chul Shin, Seong Ju Park, Hee Kong Lee, Jae Young Lee, Hyung O Kim
  • Patent number: 8731276
    Abstract: A motion space presentation device includes: a work area generation unit configured to generate a three-dimensional region in which the movable robot operates; an image capture unit configured to capture a real image; a position and posture detection unit configured to detect an image capture position and an image capture direction of the image capture unit; and an overlay unit configured to selectively superimpose either an image of a segment approximation model of the movable robot as viewed in the image capture direction from the image capture position, or an image of the three-dimensional region as viewed in the image capture direction from the image capture position, on the real image captured by the image capture unit, according to the difficulty in recognizing each image.
    Type: Grant
    Filed: August 30, 2011
    Date of Patent: May 20, 2014
    Assignee: Panasonic Corporation
    Inventors: Kenji Mizutani, Taichi Sato
  • Patent number: 8724884
    Abstract: A controller is provided for maneuvering an interrogation plane relative to a reference surface. The interrogation plane intersects the reference surface and is associated with a pole about which the interrogation plane is rotatable. The pole has an adjustable angle of inclination relative to the reference surface. The controller has a base, a handle, and an arm extending from a connection with the base to a connection with the handle. The arm has hingable joints which hinge to allow the handle to be translated relative to the base in arbitrary directions across a user surface. The controller further has position sensors which measure the translation of the handle relative to the base on the user surface. The translation provides a corresponding translation of the interrogation plane relative to the reference surface. The handle is rotatable about a rotation axis and is tiltable about a tilt axis to allow the handle to be angularly moved relative to the user surface.
    Type: Grant
    Filed: January 27, 2011
    Date of Patent: May 13, 2014
    Assignee: Cambridge Enterprise Limited
    Inventors: David J. Lomas, Martin J. Graves, Daniel Peterson Godfrey, David Seymour Warwick
  • Patent number: 8705842
    Abstract: A robot cleaner and a method for controlling the same are provided. A region to be cleaned may be divided into a plurality of sectors based on detection data collected by a detecting device, and a partial map for each sector may be generated. A full map of the cleaning region may then be generated based on a position of a partial map with respect to each sector, and a topological relationship between the partial maps. Based on the full map, the robot cleaner may recognize its position, allowing the entire region to be completely cleaned, and allowing the robot cleaner to rapidly move to sectors that have not yet been cleaned.
    Type: Grant
    Filed: October 26, 2011
    Date of Patent: April 22, 2014
    Assignee: LG Electronics Inc.
    Inventors: Tae-Kyeong Lee, Seongsu Lee, Seungmin Baek, Sangik Na, Se-Young Oh, Sanghoon Baek, Kwangro Joo, Jeongsuk Yoon, Yiebin Kim
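Composing the full map from per-sector partial maps amounts to placing each partial map at its position within a common frame. A dict-based toy version of that composition (data layout and names are illustrative; the patent additionally uses the topological relationship between partial maps to find the offsets):

```python
def merge_partial_maps(partials):
    """Compose a full grid map from per-sector partial maps.

    partials: list of ((offset_x, offset_y), cells) pairs, where cells
    maps local (x, y) cell coordinates to an occupancy value and the
    offset places the sector's partial map within the full map.
    """
    full = {}
    for (ox, oy), cells in partials:
        for (x, y), value in cells.items():
            full[(x + ox, y + oy)] = value
    return full
```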
  • Patent number: 8706301
    Abstract: Methods of and a system for providing force information for a robotic surgical system. The method includes storing first kinematic position information and first actual position information for a first position of an end effector; moving the end effector via the robotic surgical system from the first position to a second position; storing second kinematic position information and second actual position information for the second position; and providing force information regarding force applied to the end effector at the second position utilizing the first actual position information, the second actual position information, the first kinematic position information, and the second kinematic position information. Visual force feedback is also provided via superimposing an estimated position of an end effector without force over an image of the actual position of the end effector. Similarly, tissue elasticity visual displays may be shown.
    Type: Grant
    Filed: January 8, 2013
    Date of Patent: April 22, 2014
    Assignee: Intuitive Surgical Operations, Inc.
    Inventors: Wenyi Zhao, Tao Zhao, David Q. Larkin
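The core idea above, inferring force from the discrepancy between kinematically predicted and actually measured end-effector positions, can be sketched under a simple linear-spring assumption. The stiffness constant and function shape below are illustrative assumptions, not the patented method's model:

```python
import numpy as np

def estimate_force(kin1, act1, kin2, act2, stiffness=100.0):
    """Estimate the force at the second pose from the change in deflection
    between the kinematically predicted and actually measured positions,
    under a linear-spring assumption (stiffness in N/m, hypothetical)."""
    d1 = np.asarray(act1, float) - np.asarray(kin1, float)   # deflection at pose 1
    d2 = np.asarray(act2, float) - np.asarray(kin2, float)   # deflection at pose 2
    return stiffness * (d2 - d1)                             # force vector, newtons

# 1 mm of extra deflection along x at the second pose -> 0.1 N with k = 100 N/m
f = estimate_force([0, 0, 0], [0, 0, 0], [0.01, 0, 0], [0.011, 0, 0])
```

The same deflection (actual minus kinematic position) also drives the visual feedback the abstract mentions: superimposing the force-free predicted pose over the observed pose shows the deflection directly.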
  • Patent number: 8693729
    Abstract: The present invention creates and stores target representations in several coordinate representations based on biologically inspired models of the human vision system. By using biologically inspired target representations, a computer can be programmed for robot control without using kinematics to relate a target position in camera (eye) coordinates to a target position in body or head coordinates. The robot sensors and appendages are open-loop controlled to focus on the target. In addition, the invention herein teaches a scenario and method to learn the mappings between coordinate representations using existing machine learning techniques such as Locally Weighted Projection Regression.
    Type: Grant
    Filed: August 14, 2012
    Date of Patent: April 8, 2014
    Assignee: HRL Laboratories, LLC
    Inventors: Paul Alex Dow, Deepak Khosla, David J Huber
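The mapping between coordinate representations that this abstract proposes to learn can be approximated with plain locally weighted linear regression, a simplified stand-in for the Locally Weighted Projection Regression the patent cites (no incremental projection directions; bandwidth and design-matrix choices are assumptions):

```python
import numpy as np

def lwr_predict(X, Y, x_query, bandwidth=0.5):
    """Locally weighted linear regression: predict the mapping at x_query
    from training pairs X -> Y, weighting samples by a Gaussian kernel
    centered on the query (a simplified stand-in for LWPR)."""
    X = np.asarray(X, float)
    Y = np.asarray(Y, float)
    w = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2 * bandwidth ** 2))
    Xa = np.hstack([X, np.ones((len(X), 1))])        # affine design matrix
    W = np.diag(w)
    # weighted least squares: (Xa' W Xa) beta = Xa' W Y
    beta, *_ = np.linalg.lstsq(Xa.T @ W @ Xa, Xa.T @ W @ Y, rcond=None)
    return np.append(x_query, 1.0) @ beta
```

In the patent's scenario, X would hold eye-coordinate target positions and Y the corresponding head or body coordinates gathered while the robot fixates targets; here the mapping is learned from data rather than derived from kinematics.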
  • Patent number: 8687900
    Abstract: Disclosed are an image processing apparatus, method, and computer-readable medium for a robot that efficiently manage moving images acquired by the robot and strengthen security to prevent image leakage. To enable restoration of an object region within an original image photographed by the robot, a low-resolution image is generated, image information is generated from object password information and the low-resolution image, and the image information is transmitted to a server over a network. The server detects the object password information and the low-resolution image from the image information and restores a high-resolution object image using the object password information and the low-resolution image.
    Type: Grant
    Filed: September 7, 2010
    Date of Patent: April 1, 2014
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Won Jun Hwang, Woo Sup Han
  • Publication number: 20140064601
    Abstract: Vision based tracking of a mobile device is used to remotely control a robot. For example, images captured by a mobile device, e.g., in a video stream, are used for vision based tracking of the pose of the mobile device with respect to the imaged environment. Changes in the pose of the mobile device, i.e., the trajectory of the mobile device, are determined and converted to a desired motion of a robot that is remote from the mobile device. The robot is then controlled to move with the desired motion. The trajectory of the mobile device is converted to the desired motion of the robot using a transformation generated by inverting a hand-eye calibration transformation.
    Type: Application
    Filed: September 5, 2012
    Publication date: March 6, 2014
    Applicant: QUALCOMM Incorporated
    Inventors: Mahesh Ramachandran, Christopher Brunner, Arvind Ramanandan, Abhishek Tyagi, Murali Ramaswamy Chari
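The final step of this abstract, converting the device trajectory into robot motion via an inverted hand-eye calibration transform, amounts to conjugating a pose change with the calibration transform. The sketch below assumes 4x4 homogeneous matrices and is an illustration of that algebra, not the patented pipeline:

```python
import numpy as np

def device_to_robot_motion(delta_device, hand_eye):
    """Map a pose change of the tracked mobile device into a desired robot
    motion by conjugating with the hand-eye calibration transform and its
    inverse. All arguments are 4x4 homogeneous transforms."""
    return np.linalg.inv(hand_eye) @ delta_device @ hand_eye

# With an identity calibration, device motion maps straight to robot motion.
move_x = np.eye(4)
move_x[0, 3] = 0.1                                  # translate 10 cm along x
robot_motion = device_to_robot_motion(move_x, np.eye(4))
```

With a non-identity calibration (e.g. the device camera rotated relative to the robot frame), the same conjugation re-expresses the motion in the robot's frame, which is exactly why the hand-eye transform must be known or calibrated first.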
  • Patent number: 8666141
    Abstract: A robot system includes a robot having a movable section, an image capture unit provided on the movable section, an output unit that allows the image capture unit to capture a target object and a reference mark and outputs a captured image in which the reference mark is imaged as a locus image, an extraction unit that extracts the locus image from the captured image, an image acquisition unit that performs image transformation on the basis of the extracted locus image by using the point spread function so as to acquire an image after the transformation from the captured image, a computation unit that computes a position of the target object on the basis of the acquired image, and a control unit that controls the robot so as to move the movable section toward the target object in accordance with the computed position.
    Type: Grant
    Filed: October 15, 2012
    Date of Patent: March 4, 2014
    Assignee: Seiko Epson Corporation
    Inventor: Mitsuhiro Inazumi
  • Patent number: 8660365
    Abstract: Systems and methods for processing extracted plane features are provided. In one embodiment, a method for processing extracted plane features includes: estimating an area of each plane of a plurality of planes extracted from data collected by an imaging sensor; generating a list of detected planes including the area of each plane; filtering the list of detected planes to produce a list of candidates for merger, discarding any plane that does not satisfy an actual-points-received criterion; applying a primary merge algorithm to the list of candidates for merger that iteratively produces a list of merged planes by testing hypothetical merged planes against merging criteria, the hypothetical merged planes each comprising a plane from the list of merged planes and a plane from the list of candidates for merger; and outputting a final list of planes.
    Type: Grant
    Filed: July 29, 2010
    Date of Patent: February 25, 2014
    Assignee: Honeywell International Inc.
    Inventors: Jan Lukas, Sara Susca, Ondrej Kotaba
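The filter-then-merge pipeline in this abstract can be sketched as a greedy loop. The point-count threshold, angle/offset tolerances, and area-weighted merge rule below are illustrative assumptions standing in for the patent's "actual points received" and merging criteria:

```python
import numpy as np

def merge_planes(planes, min_points=50, angle_tol_deg=5.0, dist_tol=0.05):
    """Greedy plane-merging sketch: drop planes failing a points-received
    criterion, then iteratively fold each candidate into an existing merged
    plane when their unit normals and offsets agree within tolerances."""
    candidates = [p for p in planes if p["n_points"] >= min_points]
    merged = []
    cos_tol = np.cos(np.radians(angle_tol_deg))
    for p in candidates:
        for m in merged:
            if (abs(np.dot(m["normal"], p["normal"])) >= cos_tol
                    and abs(m["offset"] - p["offset"]) <= dist_tol):
                # hypothetical merge rule: area-weighted average of parameters
                w = m["area"] + p["area"]
                n = (m["area"] * np.asarray(m["normal"])
                     + p["area"] * np.asarray(p["normal"])) / w
                m["normal"] = n / np.linalg.norm(n)
                m["offset"] = (m["area"] * m["offset"] + p["area"] * p["offset"]) / w
                m["area"] = w
                m["n_points"] += p["n_points"]
                break
        else:
            merged.append(dict(p))
    return merged
```

The patent tests hypothetical merged planes against its criteria before committing; this sketch collapses that test into the tolerance check guarding each merge.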
  • Patent number: 8660697
    Abstract: A shape detection system includes a distance image sensor that detects an image of a plurality of detection objects and distances to the detection objects, the detection objects being randomly arranged in a container, a sensor controller that detects a position and an orientation of each of the detection objects in the container on the basis of the result of the detection performed by the distance image sensor and a preset algorithm, and a user controller that selects the algorithm to be used by the sensor controller and sets the algorithm for the sensor controller.
    Type: Grant
    Filed: June 18, 2010
    Date of Patent: February 25, 2014
    Assignee: Kabushiki Kaisha Yaskawa Denki
    Inventors: Hiroyuki Handa, Takeshi Arie, Yuji Ichimaru
  • Patent number: 8655094
    Abstract: A photogrammetry system and method provide for determining the relative position between two objects. The system utilizes one or more imaging devices, such as high speed cameras, that are mounted on a first body, and three or more photogrammetry targets of a known location on a second body. The system and method can be utilized with cameras having fish-eye, hyperbolic, omnidirectional, or other lenses. The system and method do not require overlapping fields-of-view if two or more cameras are utilized. The system and method derive relative orientation by equally weighting information from an arbitrary number of heterogeneous cameras, all with non-overlapping fields-of-view. Furthermore, the system can make the measurements with arbitrary wide-angle lenses on the cameras.
    Type: Grant
    Filed: May 11, 2011
    Date of Patent: February 18, 2014
    Assignee: The United States of America as represented by the Administrator of the National Aeronautics and Space Administration
    Inventors: Samuel A Miller, Kurt Severance
  • Patent number: 8649557
    Abstract: Disclosed herein are a computer-readable medium and method of a mobile platform detecting and tracking dynamic objects in an environment having the dynamic objects. The mobile platform acquires a three-dimensional (3D) image using a time-of-flight (TOF) sensor, removes a floor plane from the acquired 3D image using a random sample consensus (RANSAC) algorithm, and individually separates objects from the 3D image. Movement of the respective separated objects is estimated using a joint probability data association filter (JPDAF).
    Type: Grant
    Filed: August 16, 2010
    Date of Patent: February 11, 2014
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Seung Yong Hyung, Sung Hwan Ahn, Kyung Shik Roh, Suk June Yoon
  • Publication number: 20140037146
    Abstract: A structured light pattern including a set of patterns in a sequence is generated by initializing a base pattern. The base pattern includes a sequence of colored stripes such that each subsequence of the colored stripes is unique for a particular size of the subsequence. The base pattern is shifted hierarchically, spatially and temporally a predetermined number of times to generate the set of patterns, wherein each pattern is different spatially and temporally. A unique location of each pixel in a set of images acquired of a scene is determined, while projecting the set of patterns onto the scene, wherein there is one image for each pattern.
    Type: Application
    Filed: July 31, 2012
    Publication date: February 6, 2014
    Inventors: Yuichi Taguchi, Amit Agrawal, Oncel Tuzel
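The defining property of the base pattern, that every fixed-size subsequence of colored stripes is unique, can be checked and (greedily) constructed in a few lines. The greedy construction below is a generic de Bruijn-style sketch, not the patented hierarchical shifting scheme:

```python
def has_unique_windows(stripes, k):
    """Check the base-pattern property: every length-k subsequence of
    colored stripes occurs at most once."""
    windows = [tuple(stripes[i:i + k]) for i in range(len(stripes) - k + 1)]
    return len(windows) == len(set(windows))

def stripe_pattern(colors, k):
    """Greedy construction (illustrative only) of a stripe sequence over
    the given colors in which every length-k window is unique: extend the
    sequence with any color whose resulting window is unseen."""
    seq = [colors[0]] * k
    seen = {tuple(seq)}
    extended = True
    while extended:
        extended = False
        for c in colors:
            window = tuple(seq[-(k - 1):] + [c])
            if window not in seen:
                seq.append(c)
                seen.add(window)
                extended = True
                break
    return seq
```

Because each appended stripe creates a window that was never seen before, uniqueness holds by construction; a decoder can then identify a pixel's stripe index from any k consecutive observed colors.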
  • Publication number: 20140016856
    Abstract: Given an image and an aligned depth map of an object, the invention predicts the 3D location, 3D orientation, and opening width or area of contact for an end-of-arm tooling (EOAT) without requiring a physical model.
    Type: Application
    Filed: May 13, 2013
    Publication date: January 16, 2014
    Applicant: Cornell University
    Inventors: Yun Jiang, John R. Amend, JR., Hod Lipson, Ashutosh Saxena, Stephen Moseson
  • Patent number: 8630489
    Abstract: A system described herein includes a receiver component that receives a first image and a symmetry signature generator component that generates a first global symmetry signature for the image, wherein the global symmetry signature is representative of symmetry existent in the first image. The system also includes a comparer component that compares the first global symmetry signature with a second global symmetry signature that corresponds to a second image, wherein the second global symmetry signature is representative of symmetry existent in the second image. The system additionally includes an output component that outputs an indication of similarity between the first image and the second image based at least in part upon the comparison undertaken by the comparer component.
    Type: Grant
    Filed: May 5, 2009
    Date of Patent: January 14, 2014
    Assignee: Microsoft Corporation
    Inventor: Georgios Chrysanthakopoulos
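One simple way to realize a "global symmetry signature" like the one this abstract describes is to correlate an image with its mirror and pool the result into a fixed-length profile. The mirror-difference measure and bin count below are illustrative assumptions, not the patented signature:

```python
import numpy as np

def symmetry_signature(img, n_bins=16):
    """Illustrative global symmetry signature: per-row agreement between
    the image and its horizontal mirror (0 = perfectly symmetric row),
    pooled into n_bins so images of different heights are comparable."""
    img = np.asarray(img, float)
    row_sim = -np.mean(np.abs(img - img[:, ::-1]), axis=1)
    idx = np.linspace(0, len(row_sim), n_bins + 1).astype(int)
    return np.array([row_sim[a:b].mean() if b > a else 0.0
                     for a, b in zip(idx[:-1], idx[1:])])

def signature_similarity(sig_a, sig_b):
    """Compare two signatures; larger (closer to 0) = more similar."""
    return -np.linalg.norm(sig_a - sig_b)
```

As in the abstract, two images are compared via their signatures rather than pixel-by-pixel, so the comparison is cheap and tolerant of appearance changes that preserve the symmetry structure.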
  • Patent number: 8630478
    Abstract: Disclosed are methods and apparatus for automatic optoelectronic detection and inspection of objects, based on capturing digital images of a two-dimensional field of view in which an object to be detected or inspected may be located, analyzing the images, and making and reporting decisions on the status of the object. Decisions are based on evidence obtained from a plurality of images for which the object is located in the field of view, generally corresponding to a plurality of viewing perspectives. Evidence that an object is located in the field of view is used for detection, and evidence that the object satisfies appropriate inspection criteria is used for inspection. Methods and apparatus are disclosed for capturing and analyzing images at high speed so that multiple viewing perspectives can be obtained for objects in continuous motion.
    Type: Grant
    Filed: September 20, 2012
    Date of Patent: January 14, 2014
    Assignee: Cognex Technology and Investment Corporation
    Inventor: William M. Silver
  • Patent number: 8625880
    Abstract: An apparatus and method are disclosed for setting up a vision system having a camera and a vision processor cooperative with the camera. The apparatus includes a gesture recognizer, a key recognizer, a breakout box having at least two signaling elements, and a setup control unit that is cooperative with the gesture recognizer, the key recognizer, and the breakout box. The combination of using a key and a gesture set as described herein is substantially superior to known user interfaces for setting up a previously engineered vision system in terms of low cost, convenience, ease of use, simplicity, and speed.
    Type: Grant
    Filed: May 17, 2011
    Date of Patent: January 7, 2014
    Assignee: Cognex Technology and Investment Corporation
    Inventors: William Silver, Robert Shillman, Aaron Wallack
  • Patent number: 8605983
    Abstract: A non-contact measurement apparatus and method. A probe is provided for mounting on a coordinate positioning apparatus, comprising at least one imaging device for capturing an image of an object to be measured. Also provided is an image analyzer configured to analyze at least one first image of an object obtained by the probe from a first perspective and at least one second image of the object obtained by the probe from a second perspective so as to identify at least one target feature on the object to be measured. The image analyzer is further configured to obtain topographical data regarding a surface of the object via analysis of an image, obtained by the probe, of the object on which an optical pattern is projected.
    Type: Grant
    Filed: August 15, 2008
    Date of Patent: December 10, 2013
    Assignee: Renishaw PLC
    Inventors: Nicholas John Weston, Yvonne Ruth Huddart
  • Patent number: 8588979
    Abstract: A mechanical robot can have a GPS receiver for localization, to enable it to navigate and/or perform location-specific functions. Also, the robot can be caused to ambulate in a location, taking pictures of guests and/or sounding an alarm if an unknown person is imaged by a camera on the robot. Further, the robot can be given a voice message for a recipient, and then ambulate around until, using face or voice recognition, it recognizes the intended recipient and delivers the message, e.g., aurally using a speaker.
    Type: Grant
    Filed: February 15, 2005
    Date of Patent: November 19, 2013
    Assignees: Sony Corporation, Sony Electronics Inc.
    Inventors: John David Decuir, Todd Gen Kozuki, Otis James Gates, Satoshi Orii
  • Patent number: RE45122
    Abstract: A non-contact passive ranging system wherein a first imager on a platform is focused on a first object and a second imager on the platform is also focused on the first object. The optical path from the first object to the first imager is configured to be shorter than the optical path from the object to the second imager. Processing circuitry is responsive to an output of the first imager and an output of the second imager as relative motion is provided between the platform and the first object and is configured to calculate the distance from the platform to the object.
    Type: Grant
    Filed: August 15, 2013
    Date of Patent: September 9, 2014
    Assignee: The Charles Stark Draper Laboratory, Inc.
    Inventors: Nicholas Zervoglos, Paul DeBitetto, Scott Rasmussen
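The ranging principle above, two imagers sharing a target but with deliberately different optical path lengths, can be reduced to similar triangles under a pinhole model: the nearer path sees the object slightly larger, and the size ratio reveals the range. This sketch ignores the relative-motion processing the patent describes and is only the underlying geometry:

```python
def passive_range(size_near, size_far, path_offset):
    """Range from the apparent-size ratio seen by two imagers whose optical
    paths to the object differ by path_offset (pinhole model sketch):
        size_near / size_far = (D + path_offset) / D
    =>  D = path_offset / (ratio - 1), with D the distance on the near path."""
    ratio = size_near / size_far
    return path_offset / (ratio - 1.0)

# Object at 2.0 m on the near path, far path 0.5 m longer; apparent size
# scales as 1/distance, so the ratio is 2.5/2.0 = 1.25.
d = passive_range(1 / 2.0, 1 / 2.5, 0.5)
```

In practice the two apparent sizes would be estimated from image features as the platform moves, which is where the patent's processing circuitry comes in.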