Optical Patents (Class 901/47)
  • Patent number: 7952728
    Abstract: Disclosed is a robot-controlled optical measurement array (1) comprising an optical sensor (2) that is fastened to a spacer (3). Reference marks (22) are provided on the spacer (3) and/or on a sensor (2) housing (2′). Said optical measurement array (1) is calibrated by means of an auxiliary device (13) that is placed on the optical measurement array (1) and is provided with a sensor target (16) which is disposed on the auxiliary device so as to lie within one measurement space (17) of the optical sensor (2) when the optical measurement array (1) and the auxiliary device (13) are in the assembled state. In order to calibrate the optical measurement array (1), measured values of the sensor target (16) are generated with the aid of the sensors (2), said measured values being used for calculating the three-dimensional position of the sensor coordinate system (10) in relation to the sensor target (16).
    Type: Grant
    Filed: May 3, 2005
    Date of Patent: May 31, 2011
    Assignee: KUKA Roboter GmbH
    Inventors: Thomas Ibach, Bernhard Laubel, Matej Leskovar, Holger Linnenbaum, Martin Paskuda
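The calculation named at the end of the abstract above — the three-dimensional position of the sensor coordinate system relative to the sensor target — reduces, in its simplest form, to rigid point-set registration. A minimal Python/NumPy sketch of that step, assuming corresponding 3-D points are available in both the target frame and the sensor frame (the auxiliary device and reference marks themselves are not modelled):

```python
# Minimal sketch: estimate the rigid transform (R, t) that maps points known in the
# sensor-target frame to the same points measured in the sensor coordinate system.
# This is the standard Kabsch/SVD registration step, not the patented procedure.
import numpy as np

def estimate_sensor_to_target(points_target: np.ndarray, points_sensor: np.ndarray):
    """points_target, points_sensor: (N, 3) corresponding 3-D points."""
    c_t = points_target.mean(axis=0)
    c_s = points_sensor.mean(axis=0)
    H = (points_target - c_t).T @ (points_sensor - c_s)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = c_s - R @ c_t
    return R, t  # pose of the target frame expressed in the sensor frame
```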
  • Publication number: 20110107865
    Abstract: The invention provides a dairy animal treatment system with a controllable robot arm which is configured for positioning an animal treatment device, and with an object recognition device which comprises a controllable light source, a first 3D-sensor and a signal processing device for processing the supplied signals, wherein the object recognition device comprises a second 3D-sensor which is positioned at a distance from the first 3D-sensor, in particular at a horizontal distance. The respective central lines preferably make an angle unequal to 0° with each other. The system according to the invention has advantages including less mutual concealment of objects in the respective image fields and a strongly increased total angle of view.
    Type: Application
    Filed: January 18, 2011
    Publication date: May 12, 2011
    Applicant: LELY PATENT N.V.
    Inventors: Karel VAN DEN BERG, Hélèna Geralda Maria VIJVERBERG
  • Publication number: 20110098859
    Abstract: A robot system includes a robot. A robot control device is configured to control an operation of the robot, and includes a workpiece shape memory configured to store a shape of workpieces. A shape sensor is configured to detect shape information about the workpieces. A target workpiece detector is configured to detect a graspable workpiece based on the shape information detected by the shape sensor. A grasping information memory is configured to store a grasping position indicating which portion of the graspable workpiece is to be grasped by the robot. A grasping operation controller is configured to control the robot to grasp the graspable workpiece detected by the target workpiece detector and to pick the grasped workpiece. A disturbing operation controller is configured to control, if no graspable workpiece is detected by the target workpiece detector, the robot to perform a workpiece disturbing operation.
    Type: Application
    Filed: October 25, 2010
    Publication date: April 28, 2011
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Toshimitsu IRIE, Yukio Hashiguchi, Shinji Murai
  • Patent number: 7933687
    Abstract: A moving object capable of recognizing an image and a system for directing the moving object are disclosed. The system for directing the moving object includes: a charging device on which a docking direction pattern is printed, such that a central point of the docking direction pattern for directing the moving object to a docking location and a power-supply terminal are arranged in a straight line; and a moving object for driving wheel operations to allow the central point of the docking direction pattern captured by a camera to be identical with a central point of an image captured by the camera, and moving to the docking location. The moving-object directing system drives a wheel to track a central point of the docking direction pattern captured by a camera, such that it can quickly and correctly move to the docking location of the charging device.
    Type: Grant
    Filed: July 20, 2006
    Date of Patent: April 26, 2011
    Assignee: LG Electronics Inc.
    Inventors: Oh-hyun Baek, Ho-seon Rew
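The tracking behaviour described above — driving the wheels so that the captured centre of the docking pattern coincides with the image centre — can be illustrated with a simple proportional steering law. A minimal sketch, assuming the pattern centre has already been detected in the image; the gain and speed values are placeholders, not values from the patent:

```python
# Minimal sketch: steer so the horizontal pixel error between the docking-pattern
# centre and the image centre goes to zero while the robot creeps toward the dock.
def docking_velocity(pattern_center_x: float, image_width: int = 320,
                     k_turn: float = 0.005, forward_speed: float = 0.10):
    """Return (linear m/s, angular rad/s) commands from the detected pattern centre."""
    error = pattern_center_x - image_width / 2.0   # >0: pattern lies right of image centre
    return forward_speed, -k_turn * error
```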
  • Publication number: 20110093237
    Abstract: A device for handling a substantially circular wafer is provided. The device includes an interior accessible through a plurality of entrances, and a plurality of sensors consisting of two sensors for each one of the plurality of entrances, each sensor capable of detecting a presence of the substantially circular wafer, at a predetermined location within the interior, wherein the plurality of sensors are arranged so that at least two of the plurality of sensors detect the wafer for any position of the wafer entirely within the interior, wherein a first one of the two sensors is positioned to detect the wafer when the wafer has passed entirely into the interior through one of the plurality of entrances, and a second one of the two sensors is positioned immediately outside a diameter of the wafer when the wafer has passed entirely into the interior through one of the plurality of entrances.
    Type: Application
    Filed: December 23, 2010
    Publication date: April 21, 2011
    Applicant: BROOKS AUTOMATION, INC.
    Inventors: Christopher C. Kiley, Peter van der Meulen, Forrest T. Buzan, Paul E. Fogel
  • Publication number: 20110087360
    Abstract: An industrial robot is used to assemble a part to a predetermined location on a randomly moving workpiece. The workpiece may be an automobile on an assembly line and the part may be a wheel (a tire mounted on a rim) to be assembled on one of the wheel hubs of the automobile. The robot has mounted on it a camera, a force sensor and a gripper to grip the part. After the robot grips the part, signals from both the force sensor and vision are used by a computing device to move the robot to a position where the robot can assemble the part to the predetermined location on the workpiece. The computing device can be the robot controller or a separate device such as a PC that is connected to the controller.
    Type: Application
    Filed: March 30, 2009
    Publication date: April 14, 2011
    Inventors: Heping Chen, George Zhang, Thomas A. Fuhlbrigge
  • Publication number: 20110077800
    Abstract: Methods and systems are provided for triaging a plurality of targets with a robotic vehicle while the robotic vehicle remains at a first location. The robotic vehicle is in operable communication with a remote command station and includes a processor that is coupled to a first imager. The first imager generates separate images of each one of the plurality of targets while the robotic vehicle remains at the first location. The processor receives target data identifying the plurality of targets from the remote command station, acquires an image of each one of the plurality of targets with the first imager while the robotic vehicle remains at the first location, and transmits each generated image to the remote command station.
    Type: Application
    Filed: July 14, 2009
    Publication date: March 31, 2011
    Applicant: RAYTHEON COMPANY
    Inventors: Eric M. Palmer, James N. Head, Robin Aileen Yingst, Aladin A. Sayed
  • Publication number: 20110074923
    Abstract: Disclosed herein is a system which transmits an image using a lossless compression method in a robot to provide a service over a network and a method thereof. The network-based robot separates an image acquired by a stereo camera into various image formats and transmits the various image formats to a service server. The service server synthesizes the separated image formats to suit an image request such as face recognition, object recognition, navigation or monitoring to restore and provide an original image. When the network-based robot transmits the separated image formats to the server, the original image is transmitted using the lossless method to improve the performance of the server. Even when a service using a new image is added, separated images are transmitted with respect to the image format requested by this service to more flexibly cope with the service using the new image. Since channels are separated in order to receive lossless data, network gain is obtained.
    Type: Application
    Filed: September 7, 2010
    Publication date: March 31, 2011
    Applicant: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Byung Kwon Choi, Woo Sup Han, Tae Sin Ha
  • Patent number: 7916931
    Abstract: An apparatus, method, and medium for dividing regions by using feature points and a mobile robot cleaner using the same are provided. A method includes forming a grid map by using a plurality of grid points that are obtained by detecting distances of a mobile robot from obstacles; extracting feature points from the grid map; extracting candidate pairs of feature points, which are in the range of a region division element, from the feature points; extracting a final pair of feature points, which satisfies the requirements of the region division element, from the candidate pairs of feature points; forming a critical line by connecting the final pair of feature points; and forming a final region in accordance with the size relationship between regions formed of a closed curve which connects the critical line and the grid map.
    Type: Grant
    Filed: July 5, 2007
    Date of Patent: March 29, 2011
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Su-jinn Lee, Hyeon Myeong, Yong-beom Lee, Seok-won Bang
  • Publication number: 20110071675
    Abstract: A robotic system includes a humanoid robot with robotic joints each moveable using an actuator(s), and a distributed controller for controlling the movement of each of the robotic joints. The controller includes a visual perception module (VPM) for visually identifying and tracking an object in the field of view of the robot under threshold lighting conditions. The VPM includes optical devices for collecting an image of the object, a positional extraction device, and a host machine having an algorithm for processing the image and positional information. The algorithm visually identifies and tracks the object, and automatically adapts an exposure time of the optical devices to prevent feature data loss of the image under the threshold lighting conditions. A method of identifying and tracking the object includes collecting the image, extracting positional information of the object, and automatically adapting the exposure time to thereby prevent feature data loss of the image.
    Type: Application
    Filed: September 22, 2009
    Publication date: March 24, 2011
    Applicants: GM GLOBAL TECHNOLOGY OPERATIONS, INC., The U.S.A. As Represented by the Administrator of the National Aeronautics and Space Administration, HRL Laboratories, LLC
    Inventors: James W. Wells, Neil David Mc Kay, Suhas E. Chelian, Douglas Martin Linn, Charles W. Wampler, II, Lyndon Bridgwater
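The exposure adaptation mentioned in the abstract can be illustrated by a simple clipping-based controller: if too many pixels saturate or go black under the threshold lighting, the exposure time is stepped down or up before the next frame. A minimal sketch with illustrative thresholds, not the patented algorithm:

```python
# Minimal sketch: adjust the exposure time until the fraction of clipped pixels stays
# below a budget, so that feature data are not lost in saturated or black regions.
import numpy as np

def adapt_exposure(gray: np.ndarray, exposure_ms: float,
                   clip_budget: float = 0.01, step: float = 1.25,
                   min_ms: float = 0.1, max_ms: float = 100.0) -> float:
    """gray: 8-bit image from the last frame; returns the exposure time for the next frame."""
    saturated = np.mean(gray >= 250)
    underexposed = np.mean(gray <= 5)
    if saturated > clip_budget:
        exposure_ms /= step          # too bright: shorten exposure
    elif underexposed > clip_budget:
        exposure_ms *= step          # too dark: lengthen exposure
    return float(np.clip(exposure_ms, min_ms, max_ms))
```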
  • Publication number: 20110070960
    Abstract: The present invention provides a method and apparatus for accurately positioning a robotic pool-playing device. The system comprises a computer controlled robotic positioning device, such as a gantry robot, that can position a cue over the pool table and place a shot. A global camera is mounted on the ceiling looking down at the table, and the acquired images are transmitted to the computer for analysis to determine the identity and locations of the balls within the table coordinate reference frame. The computer also automatically determines which ball to strike. An aspect of the invention is the use of a local camera, mounted on or near the robotic end-effector in a fixed relationship with the cue, to improve the positioning error of the robotic device prior to placing a shot.
    Type: Application
    Filed: October 1, 2010
    Publication date: March 24, 2011
    Inventor: Michael GREENSPAN
  • Patent number: 7909551
    Abstract: An apparatus and method for shaving the inside of barrels. The invention relates to reconditioning used wine barrels by shaving the inside surface to a predetermined depth, ready for re-crozering, toasting, and re-use. Conventional shaving methods typically involve routing the internal surface by hand, but this technique is problematic in that it is a very slow process, the quality of the wood is often adversely affected, and there is no way of ensuring that the surface will be shaved to the same depth across the entire surface. As a result, the internal dimensions of the barrel no longer reflect, in relative terms, the original barrel surface. The invention includes a scanning device to scan the internal dimensions of the barrel, and a cutting device to shave the internal surface to a predetermined depth relative to the scanned internal dimensions and where the scanning device and the cutting device may be supported by different robotic arms.
    Type: Grant
    Filed: July 6, 2010
    Date of Patent: March 22, 2011
    Assignee: Southern Cross Cooperage Pty Ltd
    Inventor: Breck Waterman
  • Publication number: 20110063417
    Abstract: A method and system to enable a computer to estimate calibration parameters autonomously so that accurate stereopsis can be performed. The present invention automatically calibrates two or more cameras with unknown parameters with respect to a robot or robotic appendage (e.g., articulated robot arm) with a light source that can be turned on and off at one end. A pair of cameras (e.g., digital cameras) are positioned and aimed so as to give stereoptic coverage of the robot's workspace. The procedure determines the positions and orientations of the pair of cameras with respect to a robot (i.e., exterior orientations) and the focal lengths, optical centers, and distortion coefficients of each camera (i.e., intrinsic parameters) automatically from a set of known positions of the robot arm, and a set of images from the right and left cameras of the robot arm in each position as the light is turned on and off.
    Type: Application
    Filed: July 19, 2010
    Publication date: March 17, 2011
    Inventors: RICHARD ALAN PETERS, II, Shawn Atkins, Dae Jin Kim, Aditya Nawab
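The estimation step described above can be illustrated with a direct linear transform: given the known 3-D positions of the arm-mounted light (robot base frame) and the corresponding 2-D detections in one camera, that camera's projection matrix follows from a linear least-squares solve, from which the exterior orientation and intrinsics can then be factored. A minimal sketch that ignores lens distortion:

```python
# Minimal sketch: recover one camera's 3x4 projection matrix from >= 6 known robot-arm
# light positions and their pixel detections using the direct linear transform (DLT).
import numpy as np

def dlt_projection_matrix(world_pts: np.ndarray, image_pts: np.ndarray) -> np.ndarray:
    """world_pts: (N, 3) points in the robot base frame; image_pts: (N, 2) pixels; N >= 6."""
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(rows, dtype=float)
    _, _, Vt = np.linalg.svd(A)          # null-space solution of A p = 0
    P = Vt[-1].reshape(3, 4)
    return P / P[-1, -1]
```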
  • Patent number: 7907796
    Abstract: A method for aligning a tape cartridge accessor with cartridges in cells of a tape cartridge magazine is provided. IR illumination is applied to an object expected to include a desired physical feature. Specular reflections are received from the illuminated object to create an image of the object. Dynamic image thresholding is applied to the image to select an optimum gray scale level of the image expected to include the desired physical feature. Bounding boxes are used to identify the location of the desired physical feature in the thresholded image.
    Type: Grant
    Filed: September 30, 2008
    Date of Patent: March 15, 2011
    Assignee: International Business Machines Corporation
    Inventor: Lyle Joseph Chamberlain
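A rough illustration of the thresholding-plus-bounding-box localization: Otsu's method stands in here for the patent's dynamic image thresholding, and OpenCV contours supply the bounding boxes. A minimal sketch, not the patented procedure:

```python
# Minimal sketch: automatically threshold the IR image, then return bounding boxes of
# the resulting blobs as candidate locations of the desired physical feature.
import cv2
import numpy as np

def find_feature_boxes(ir_image: np.ndarray, min_area: int = 50):
    """ir_image: 8-bit grayscale image of the illuminated object."""
    _, binary = cv2.threshold(ir_image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
    return boxes  # list of (x, y, w, h) candidate feature locations
```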
  • Publication number: 20110050841
    Abstract: A tele-presence system that includes a portable robot face coupled to a remote station. The robot face includes a robot monitor, a robot camera, a robot speaker and a robot microphone. The remote station includes a station monitor, a station camera, a station speaker and a station microphone. The portable robot face can be attached to a platform mounted to the ceiling of an ambulance. The portable robot face can be used by a physician at the remote station to provide remote medical consultation. When the patient is moved from the ambulance, the portable robot face can be detached from the platform and moved with the patient.
    Type: Application
    Filed: August 26, 2009
    Publication date: March 3, 2011
    Inventors: Yulun Wang, Charles S. Jordan, Marco Pinter, Daniel Steven Sanchez, Kevin Hanrahan
  • Publication number: 20110047109
    Abstract: A mobile brain-based device (BBD) includes a mobile platform with sensors and effectors, which is guided by a simulated nervous system that is an analogue of the cerebellar areas of the brain used for predictive motor control to determine interaction with a real-world environment. The simulated nervous system has neural areas including precerebellum nuclei (PN), Purkinje cells (PC), deep cerebellar nuclei (DCN) and an inferior olive (IO) for predicting turn and velocity control of the BBD during movement in a real-world environment. The BBD undergoes training and testing, and the simulated nervous system learns and performs control functions, based on a delayed eligibility trace learning rule.
    Type: Application
    Filed: October 28, 2010
    Publication date: February 24, 2011
    Applicant: NEUROSCIENCES RESEARCH FOUNDATION, INC.
    Inventors: Jeffrey L. McKinstry, Gerald M. Edelman, Jeffrey L. Krichmar
  • Publication number: 20110046785
    Abstract: Method and device for the removal of a part of a crop, such as a leaf (14). To this end, the crop is approached from a low position with vision techniques and the stem (12) and the parts protruding therefrom are observed from beneath. Based upon the number of images observed, an arm is controlled and moved towards the relevant stalk (13). This movement is primarily parallel to the stalk and is performed from a low proximity position. When the stalk (13) is approached, the stalk (13) is positioned within an opening between two rotating parts (3). The stalk is grasped by way of rotation and the stalk is moved in respect of the arm so that the cutting point of the stalk is manipulated towards the arm. The stalk (13) is subsequently cut through and the leaf is disposed of.
    Type: Application
    Filed: April 14, 2009
    Publication date: February 24, 2011
    Inventor: Ronald Zeelen
  • Publication number: 20110033254
    Abstract: An apparatus and method for performing manufacturing operations using position sensing for robotic arms that efficiently and accurately finds the location of a workpiece or features on a workpiece, with minimal need for adjustments.
    Type: Application
    Filed: August 6, 2009
    Publication date: February 10, 2011
    Applicant: KMT ROBOTIC SOLUTIONS, INC.
    Inventor: Charles A. Abrams
  • Publication number: 20110035054
    Abstract: The invention is a system that is integrated with an existing robotic system in order to extend its observation, surveillance, and navigational capabilities. The system comprises: a sensor module comprising imaging and other types of sensors that is attached to the robotic device of the robotic system and a system control station comprising a communication link to the robot control station of the existing robotic system. Both the system control station and the sensor module comprise processing units that are configured to work in complete harmony. These processing units are each supplied with software that enables using information supplied by the sensors and other components in the sensor module to provide the robotic systems with many advanced capabilities that could not be achieved prior to attachment of the sensor module to the robot.
    Type: Application
    Filed: August 7, 2008
    Publication date: February 10, 2011
    Applicants: WAVE GROUP LTD., O.D.F. OPTRONICS LTD.
    Inventors: Ehud Gal, Gennadiy Berinsky, Yosi Wolf
  • Publication number: 20110029131
    Abstract: A measurement apparatus for determining a position of a tool center point (31) of a tool (30), which is attached to a tool attachment surface (32) of a robot (1), with respect to the tool attachment surface (32) includes: a camera (4) attached to the arm tip portion of the robot (1); a touch-up point (an origin of Σm) disposed in a working space of the robot; a measurement section (11a) for measuring the position of the touch-up point by using the robot and the camera; a first storage section (12a) for storing the measured position of the touch-up point; a second storage section (12b) for storing a position of the robot (1) when the tool center point is aligned with the touch-up point by moving the robot; and a calculation section (11b) for calculating the position of the tool center point with respect to the tool attachment surface of the robot by using the stored positions of the touch-up point and the robot.
    Type: Application
    Filed: July 16, 2010
    Publication date: February 3, 2011
    Applicant: FANUC LTD
    Inventors: Kazunori BAN, Katsutoshi Takizawa, Gang Shen
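Once the touch-up point has been measured in the robot base frame and the flange pose at the moment of alignment has been stored, the tool centre point expressed in the tool-attachment frame follows from a single frame change. A minimal sketch of that last calculation (the camera-based measurement of the touch-up point is not modelled):

```python
# Minimal sketch: express the touch-up point, known in the robot base frame, in the
# flange (tool-attachment) frame recorded when the TCP was aligned with that point.
import numpy as np

def tcp_in_flange_frame(touch_up_base: np.ndarray,
                        flange_rotation: np.ndarray,
                        flange_position: np.ndarray) -> np.ndarray:
    """touch_up_base: (3,) point in base frame; flange_rotation: (3, 3); flange_position: (3,)."""
    return flange_rotation.T @ (touch_up_base - flange_position)
```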
  • Publication number: 20110029132
    Abstract: A system for calibrating a robotic tool includes a housing including an aperture for receiving the robotic tool, an image generating device disposed in the housing and positioned to generate an image of the robotic tool received through the aperture of the housing, wherein the image generating device generates an image signal representing the image of the robotic tool, a light source disposed in the housing to backlight the robotic tool received through the aperture of the housing, and a processor responsive to the image signal for calculating and monitoring a configuration of the robotic tool.
    Type: Application
    Filed: August 2, 2010
    Publication date: February 3, 2011
    Inventors: Thomas Nemmers, Donald E. Jenkins, Terry L. Tupper
  • Publication number: 20110022231
    Abstract: Automated apparatuses and related methods for scanning, spraying, pruning, and harvesting crops from plant canopies. The apparatuses include a support structure comprising a frame, a central vertical shaft, and at least one module support member capable of rotating around a plant canopy. The support member supports a plurality of movable arms, each arm having at least one detector for probing the plant canopy. Embodiments further comprise applicators and/or manipulators for spraying, pruning, and harvesting crops from within the plant canopy. The methods of the present invention include causing the moveable arms attached to the support structure to be extended into the plant canopy, searching for crops, and transmitting and/or storing the search data. Embodiments further comprise detaching crops from the plant canopy and transporting them to a receptacle, applying a controlled amount of material within the plant canopy, or pruning inside of the plant canopy.
    Type: Application
    Filed: July 26, 2010
    Publication date: January 27, 2011
    Inventors: Jeffrey WALKER, Lev DRUBETSKY
  • Publication number: 20110022217
    Abstract: The object is to provide a work mounting system which has improved usability and can be miniaturized. A work mounting system (1) is used to mount a sunroof member (3) on the inner panel (2A) of a body (2). The work mounting system (1) comprises a conveying robot (4) for holding and conveying the sunroof member (3), a mounting robot (5) with a nut runner for tightening bolts and a CCD camera, and a controller (6) for controlling the conveying robot (4) and the mounting robot (5).
    Type: Application
    Filed: February 25, 2009
    Publication date: January 27, 2011
    Applicant: HONDA MOTOR CO. LTD.
    Inventor: Kenichi Asamizu
  • Patent number: 7874386
    Abstract: A hybrid mobile robot includes a base link and a second link. The base link has a drive system and is adapted to function as a traction device and a turret. The second link is attached to the base link at a first joint. The second link has a drive system and is adapted to function as a traction device and to be deployed for manipulation. In another embodiment an invertible robot includes at least one base link and a second link. In another embodiment a mobile robot includes a chassis and a track drive pulley system including a tension and suspension mechanism. In another embodiment a mobile robot includes a wireless communication system.
    Type: Grant
    Filed: October 31, 2007
    Date of Patent: January 25, 2011
    Assignee: Pinhas Ben-Tzvi
    Inventors: Pinhas Ben-Tzvi, Andrew A. Goldenberg, Jean W. Zu
  • Publication number: 20110001813
    Abstract: Provided is a gesture recognition apparatus. The gesture recognition apparatus includes a human detection unit, a gesture region setting unit, an arm detection unit and a gesture determination unit. The human detection unit detects a face region of a user from an input image. The gesture region setting unit sets a gesture region, in which a gesture of the user's arm occurs, with respect to the detected face region. The arm detection unit detects an arm region of the user in the gesture region. The gesture determination unit analyzes a position, moving directionality and shape information of the arm region in the gesture region to determine a target gesture of the user. Such a gesture recognition apparatus may be used as a useful means for a human-robot interaction in a long distance where a robot has difficulty in recognizing a user's voice.
    Type: Application
    Filed: December 15, 2009
    Publication date: January 6, 2011
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Do Hyung KIM, Jae Yeon Lee, Woo Han Yun, Su Young Chi, Ho Sub Yoon, Hye Jin Kim, Young Woo Yoon
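The gesture-region step can be illustrated by deriving an arm search window from the detected face box. A minimal sketch; the scale factors are illustrative assumptions, not values from the application:

```python
# Minimal sketch: set a gesture (arm-search) region relative to the detected face box,
# clipped to the image bounds.
def gesture_region(face_box, image_size, side_scale: float = 3.0, down_scale: float = 2.5):
    """face_box: (x, y, w, h) of the detected face; image_size: (width, height)."""
    x, y, w, h = face_box
    img_w, img_h = image_size
    left = max(0, int(x - side_scale * w))
    right = min(img_w, int(x + w + side_scale * w))
    top = y
    bottom = min(img_h, int(y + down_scale * h))
    return left, top, right, bottom   # arm gestures are searched only inside this window
```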
  • Publication number: 20110004343
    Abstract: A position control method for controlling a position of a movable portion, includes: performing control of allowing the movable portion to approach a predetermined position by moving the movable portion; and performing control of moving the movable portion to the predetermined position by moving the movable portion and detecting a relative position of the movable portion with respect to the predetermined position by using an imaging unit.
    Type: Application
    Filed: July 1, 2010
    Publication date: January 6, 2011
    Applicant: SEIKO EPSON CORPORATION
    Inventor: Izumi IIDA
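The two-stage scheme in the abstract — a coarse move toward the target followed by an image-based correction of the remaining offset — can be sketched as a short servo loop. The stage and camera interfaces below are hypothetical placeholders, not an API from the application:

```python
# Minimal sketch: coarse approach to the taught position, then iterative fine correction
# using the relative offset measured by the imaging unit.
def position_with_vision(stage, camera, target_xy, tolerance: float = 0.01,
                         gain: float = 0.5, max_iters: int = 20) -> bool:
    stage.move_to(target_xy)                        # coarse approach (placeholder interface)
    for _ in range(max_iters):
        dx, dy = camera.measure_offset(target_xy)   # offset seen in the image (placeholder)
        if (dx * dx + dy * dy) ** 0.5 <= tolerance:
            return True                             # movable portion is at the target
        stage.move_by(gain * dx, gain * dy)         # fine correction step
    return False
```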
  • Publication number: 20110004341
    Abstract: A robot using less storage and computational resources to embody panoramic attention. The robot includes a panoramic attention module with multiple levels that are hierarchically structured to process different levels of information. The top-level of the panoramic attention module receives information about entities detected from the environment of the robot and maps the entities to a panoramic map maintained by the robot. By mapping and storing high-level entity information instead of low-level sensory information in the panoramic map, the amount of storage and computation resources for panoramic attention can be reduced significantly. Further, the mapping and storing of high-level entity information in the panoramic map also facilitates consistent and logical processing of different conceptual levels of information.
    Type: Application
    Filed: June 18, 2010
    Publication date: January 6, 2011
    Applicant: HONDA MOTOR CO., LTD.
    Inventors: Ravi Kiran Sarvadevabhatla, Victor Ng-Thow-Hing
  • Publication number: 20100332016
    Abstract: A system and method for trimming flash from the body of a workpiece. The system uses a laser system and a vision system to determine quickly a cut line for removing flash from the workpiece. The method includes the steps of projecting a line of light onto the workpiece that crosses the flash and the body and determining the profile of the line in an image obtained by a vision system. The system and method are capable of dynamically trimming while determining where to cut the flash from the body of the workpiece.
    Type: Application
    Filed: June 24, 2010
    Publication date: December 30, 2010
    Inventors: Charles A. Abrams, Jerry Kuhn, Mark W. Handelsman
  • Publication number: 20100329832
    Abstract: A housing including a base and cover has an open space that is open to the atmosphere and a closed space that is closed to the atmosphere. The open space contains a laser having a low operating temperature. The closed space contains a heat generating element and the like, which have higher operating temperatures than the laser. The heat generating element is in close contact with the base, which also serves as a heatsink, so that the heat generating element is cooled. Most parts of a motor, which is a heat generating member, are disposed in the closed space. A fan is diagonally disposed with respect to side surfaces of the heat generating element, which has a rectangular shape and flat side surfaces, so as to efficiently blow air toward the heat generating element disposed in the closed space.
    Type: Application
    Filed: September 2, 2010
    Publication date: December 30, 2010
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Mikio OSHIMA, Mitsuhiro Matsuzaki, Hiroyuki Maezawa
  • Publication number: 20100324735
    Abstract: The present invention relates to a method and a device for the machining of an object using a tool, in which the tool (2) or the object (18) is guided using a handling apparatus, which has multiple movement axes for the coarse positioning of the tool (2) or object (18), which form a kinematic chain. In the method, an additional actuator (3), which has a higher positioning precision in at least one dimension or axis than the other movement axes, is inserted between a terminal link (1) of the kinematic chain and the tool (2) or object (18). A relative movement of the tool (2) or terminal link (1) of the kinematic chain to the object (18) is detected using at least one sensor (5) and a deviation from a target movement path is compensated for using the additional actuator (3). The method and the associated device allow the use of robots or other handling apparatuses having lower path precision for applications which require a high precision during the guiding of the tool.
    Type: Application
    Filed: October 18, 2007
    Publication date: December 23, 2010
    Applicants: FRAUNHOFER-GESELLSCHAFT ZUR FOERDERUNG DER ANGEWANDTEN FORSCHUNG E.V., RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN
    Inventors: Boris Regaard, Stefan Kaierle
  • Publication number: 20100324736
    Abstract: A robot cleaner system is described including a docking station to form a docking area within a predetermined angle range of a front side thereof, to form docking guide areas which do not overlap each other on the left and right sides of the docking area, and to transmit a docking guide signal such that the docking guide areas are distinguished as a first docking guide area and a second docking guide area according to an arrival distance of the docking guide signal. The robot cleaner system also includes a robot cleaner to move to the docking area along a boundary between the first docking guide area and the second docking guide area when the docking guide signal is sensed and to move along the docking area so as to perform docking when reaching the docking area.
    Type: Application
    Filed: June 15, 2010
    Publication date: December 23, 2010
    Applicant: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Kyung Hwan Yoo, Jae Man Joo, Dong Won Kim, Jun Hwa Lee, Jun Pyo Hong, Woo Ram Chung, Jae Young Jung, Hwi Chan Jang, Jang Youn Ko, Jeong Gon Song, Sam Jong Jeung
  • Patent number: 7856291
    Abstract: A cleaning robot may be provided having a case and a sensor assembly. The sensor assembly may include a sensor hole having a first opening provided at an outer surface of the case and a second opening provided inwardly of the first opening, with respect to a center of the case. Additionally, the sensor assembly may include a sensor element configured to receive a signal and the sensor element may be provided inwardly of the first opening.
    Type: Grant
    Filed: September 17, 2007
    Date of Patent: December 21, 2010
    Assignee: LG Electronics Inc.
    Inventors: Yong Gyu Jung, Hyeong Shin Jeon
  • Publication number: 20100316334
    Abstract: This invention discloses highly scalable and modular automated optical cross connect switch devices which exhibit low loss and scalability to high port counts. In particular, a device for the programmable interconnection of large numbers of optical fibers (100's-1000's) is provided, whereby a two-dimensional array of fiber optic connections is mapped in an ordered and rule-based fashion into a one-dimensional array with tensioned fiber optic circuit elements tracing substantially straight lines therebetween. Fiber optic elements are terminated in a stacked arrangement of flexible fiber optic circuit elements with a capacity to retain excess fiber lengths while maintaining an adequate bend radius. The combination of these elements partitions the switch volume into multiple independent, non-interfering zones, which retain their independence for arbitrary and unlimited numbers of reconfigurations.
    Type: Application
    Filed: August 21, 2008
    Publication date: December 16, 2010
    Inventor: Anthony Kewitsch
  • Publication number: 20100312386
    Abstract: Functionality is described for probabilistically determining the location of an agent within an environment. The functionality performs this task using a topological representation of the environment provided by a directed graph. Nodes in the directed graph represent locations in the environment, while edges represent transition paths between the locations. The functionality also provides a mechanism by which the agent can navigate in the environment based on its probabilistic assessment of location. Such a mechanism can use a high-level control module and a low-level control module. The high-level control module determines an action for the agent to take by considering a plurality of votes associated with different locations in the directed graph. The low-level control module allows the agent to navigate along a selected edge when the high-level control module votes for a navigation action.
    Type: Application
    Filed: June 4, 2009
    Publication date: December 9, 2010
    Applicant: Microsoft Corporation
    Inventors: Georgios Chrysanthakopoulos, Guy Shani
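The vote-based action selection described above can be illustrated compactly: every node the agent might occupy votes for the edge it would traverse, weighted by the agent's belief of being at that node, and the highest-scoring edge wins. A minimal sketch that leaves the probabilistic belief update itself out:

```python
# Minimal sketch: choose the next edge in the directed graph by belief-weighted voting.
from collections import defaultdict

def choose_edge(belief: dict, preferred_edge: dict) -> str:
    """belief: node -> probability of being there; preferred_edge: node -> edge that node votes for."""
    votes = defaultdict(float)
    for node, p in belief.items():
        votes[preferred_edge[node]] += p
    return max(votes, key=votes.get)

# Example: the agent is probably at 'hall', whose vote to traverse 'hall->lab' dominates.
belief = {'hall': 0.6, 'office': 0.3, 'lab': 0.1}
preferred_edge = {'hall': 'hall->lab', 'office': 'office->hall', 'lab': 'stay'}
print(choose_edge(belief, preferred_edge))   # -> 'hall->lab'
```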
  • Publication number: 20100305755
    Abstract: The invention relates to a vision-based attention system, comprising: at least one vision sensor, at least one image processing module processing an output signal of the vision sensor in order to generate at least one two-dimensional feature map, a dorsal attention subsystem generating a first saliency map on the basis of the at least one feature map, the saliency map indicating a first focus of attention for the driver assistance system, a ventral attention subsystem, independent of the dorsal attention subsystem, for generating a second saliency map on the basis of at least one feature map, which can be the same as the one used for the dorsal attention system or a different one, the second saliency map indicating unexpected visual stimuli.
    Type: Application
    Filed: April 20, 2010
    Publication date: December 2, 2010
    Applicant: HONDA RESEARCH INSTITUTE EUROPE GMBH
    Inventor: Martin HERACLES
  • Publication number: 20100305747
    Abstract: An improved method and apparatus for extracting and handling samples for S/TEM analysis. Preferred embodiments of the present invention make use of a micromanipulator and a hollow microprobe using vacuum pressure to adhere the microprobe tip to the sample. By applying a small vacuum pressure to the lamella through the microprobe tip, the lamella can be held more securely and its placement controlled more accurately than by using electrostatic force alone. By using a probe having a beveled tip and which can also be rotated around its long axis, the extracted sample can be placed down flat on a sample holder.
    Type: Application
    Filed: October 20, 2007
    Publication date: December 2, 2010
    Applicant: FEI COMPANY
    Inventors: Enrique Agorio, James Edgar Hudson, Gerhard Daniel, Michael Tanguay, Jason Arjavac
  • Publication number: 20100298978
    Abstract: Provided is a manipulator with at least one camera capable of observing an end effector from a direction suitable for work. A rotating portion rotatable coaxially with the end effector is provided to a link adjacent to a link located at a manipulator tip end. At least one camera for recognizing a work piece as an object is arranged on the rotating portion through a camera platform. An actuator for controlling a rotation angle of the rotating portion is driven according to a rotation angle of the link located at the manipulator tip end, and thus the camera is arranged in a direction perpendicular to a plane where the end effector can move when the end effector performs a grip work. In an assembly work, the rotating portion is rotated such that the camera is arranged in a direction parallel to the plane where the end effector can move.
    Type: Application
    Filed: May 12, 2010
    Publication date: November 25, 2010
    Applicant: CANON KABUSHIKI KAISHA
    Inventor: Kota Tani
  • Patent number: 7840369
    Abstract: An apparatus correcting a bias of a gyroscope that is mounted on a mobile robot and that measures an angular velocity of the mobile robot. The apparatus includes: at least one encoder respectively measuring a traveling velocity of a respective at least one wheel of the mobile robot; a modeling unit calculating an angular velocity of the mobile robot by using the measured traveling velocity; a bias presuming unit determining a confidence range by using difference values between the calculated angular velocity and the measured angular velocity, and calculating a presumed bias by using a value in a confidence range among the difference values; and a bias removing unit removing the presumed bias from the measured angular velocity.
    Type: Grant
    Filed: July 5, 2007
    Date of Patent: November 23, 2010
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Hyoung-ki Lee, Ki-wan Choi, Dong-yoon Kim, Seok-won Bang
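The bias estimation in the abstract can be sketched directly: compare the gyro rate with the rate implied by the wheel encoders, keep only the differences that fall inside a confidence range, and treat their mean as the bias to subtract from later gyro readings. A minimal sketch for a differential-drive robot, with an illustrative confidence threshold:

```python
# Minimal sketch: gyro bias = mean of (gyro rate - encoder-derived rate) over samples
# whose difference lies inside a confidence range.
import numpy as np

def encoder_yaw_rate(v_left: float, v_right: float, wheel_base: float) -> float:
    """Angular velocity (rad/s) from the differential-drive kinematic model."""
    return (v_right - v_left) / wheel_base

def estimate_gyro_bias(gyro_rates, encoder_rates, confidence: float = 0.02) -> float:
    """Mean of gyro-minus-encoder differences within +/- confidence rad/s (0.0 if none qualify)."""
    diffs = np.asarray(gyro_rates, dtype=float) - np.asarray(encoder_rates, dtype=float)
    trusted = diffs[np.abs(diffs) <= confidence]
    return float(trusted.mean()) if trusted.size else 0.0

# corrected_rate = measured_gyro_rate - estimate_gyro_bias(gyro_rates, encoder_rates)
```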
  • Patent number: 7840309
    Abstract: When a new desired gait of a robot is generated, it is determined, on the assumption that the trajectory of an acting force between the robot and an object at a predetermined time point in the future changes to a trajectory different from a desired trajectory, whether a predetermined dynamical restrictive condition can be satisfied when a desired gait after the predetermined time point is generated. If the condition cannot be satisfied, then a moving schedule for the object is corrected, the desired trajectory or the like of the acting force between the robot and the object is re-determined, and a new desired gait is generated using the re-determined desired trajectory. With this arrangement, the gait of the robot to cause the robot to perform an operation for moving the object is generated such that the stability of the posture of the robot can be secured even if an acting force between the robot and the object in the future deviates from a desired value.
    Type: Grant
    Filed: September 28, 2005
    Date of Patent: November 23, 2010
    Assignee: Honda Motor Co., Ltd.
    Inventor: Tadaaki Hasegawa
  • Publication number: 20100286827
    Abstract: The invention relates to a method for processing video signals from a video sensor, in order to extract 3d shape information about objects represented in the video signals, the method comprising the following steps: providing a memory in which objects are stored in a 3d shape space, the shape space being an abstract feature space encoding the objects' 3d shape properties, and mapping a 2d video signal representation of an object in the shape space, the coordinates of the object in the shape space indicating the object's 3d shape.
    Type: Application
    Filed: May 5, 2010
    Publication date: November 11, 2010
    Applicant: HONDA RESEARCH INSTITUTE EUROPE GMBH
    Inventors: Mathias FRANZIUS, Heiko WERSING
  • Publication number: 20100281822
    Abstract: A load smart system for continuously loading preformed pouches into a fill-seal machine is provided. A plurality of pouches are disposed within a pouch delivery device, each of the pouches having an upper edge and an indicia. The fill-seal machine includes a rotating turret having a plurality of radially extending grippers. The rotating turret rotates the plurality of gripper pairs between a loading station, an opening station, a filling station, a sealing station, and an unloading station. A robotic transfer device is positioned between the pouch delivery device and the loading station of the fill-seal machine. The robotic transfer device includes an optical sensor positioned on a gripper member. During operation, the optical sensor scans the indicia to determine the pouch characteristics and a controlling station controls the gripper member to deposit the pouch within the gripper pairs at the loading station a predetermined distance from the upper edge of the pouch.
    Type: Application
    Filed: July 21, 2010
    Publication date: November 11, 2010
    Applicant: Pouch Pac Innovations, LLC
    Inventor: R. Charles Murray
  • Patent number: 7831337
    Abstract: The present invention provides a method and apparatus for accurately positioning a robotic pool-playing device. The system comprises a computer controlled robotic positioning device, such as a gantry robot, that can position a cue over the pool table and place a shot. A global camera is mounted on the ceiling looking down at the table, and the acquired images are transmitted to the computer for analysis to determine the identity and locations of the balls within the table coordinate reference frame. The computer also automatically determines which ball to strike. An aspect of the invention is the use of a local camera, mounted on or near the robotic end-effector in a fixed relationship with the cue, to improve the positioning error of the robotic device prior to placing a shot.
    Type: Grant
    Filed: September 23, 2005
    Date of Patent: November 9, 2010
    Inventor: Michael Greenspan
  • Publication number: 20100274389
    Abstract: The invention relates to a medical device, a medical work station, and a method for registering an object (P). The medical device comprises a navigation system (17, 18) and a robot (R) having several axes of rotation (1-6). The navigation system (17, 18) comprises a detection device (18) for detecting prominent points on an object (P) or markers (M) placed on the object (P) as well as a processing device (17) for determining the position of the object (P) on the basis of the prominent points or markers (M) detected by means of the detection device (18). The detection device (18) of the navigation system is mounted on the robot (R).
    Type: Application
    Filed: November 18, 2008
    Publication date: October 28, 2010
    Applicant: KUKA ROBOTER GMBH
    Inventors: Tobias Ortmaier, Dirk Jacob, Thomas Neff
  • Publication number: 20100271540
    Abstract: Various embodiments of an eyeball device for use in robots are provided. In one embodiment, the eyeball device has an eyeball, a main part having a camera, a frame having first and second sections, first and second actuators, a first connecting member and a second connecting member. The first section is horizontally pivotally coupled to the main part relative to the main part, while the second section vertically pivotally supports the eyeball. The first and second actuators produce rotary motions. The first connecting member is coupled to the first section and the first actuator to connect the rotary motion of the first actuator to the first section. The second connecting member is coupled to the eyeball and the second actuator to transmit the rotary motion of the second actuator to the eyeball. The eyeball horizontally pivots through the first connecting member and vertically pivots through the second connecting member.
    Type: Application
    Filed: April 26, 2010
    Publication date: October 28, 2010
    Inventors: Kyung Geune Oh, Seung-Jong Kim, Myoung Soo Jang, Chan Yul Jung
  • Publication number: 20100274387
    Abstract: A robotic mapping method includes scanning a robot across a surface to be mapped. Locations of a plurality of points on the surface are sensed during the scanning. A first of the sensed point locations is selected. A preceding subset of the sensed point locations is determined. The preceding subset is disposed before the first sensed point location along a path of the scanning. A following subset of the sensed point locations is determined. The following subset is disposed after the first sensed point location along the path of the scanning. The first sensed point location is represented in a map of the surface by an adjusted first sensed point location. The adjusted first sensed point location is closer to each of the preceding and following subsets of the sensed point locations than is the first sensed point location.
    Type: Application
    Filed: April 24, 2009
    Publication date: October 28, 2010
    Applicant: Robert Bosch GmbH
    Inventor: Benjamin Pitzer
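The adjustment described above can be illustrated by pulling each scanned point toward the centroids of a window of points before and after it along the scan path, which smooths sensor noise while roughly preserving the surface shape. A minimal sketch; the window size and blend weight are illustrative assumptions:

```python
# Minimal sketch: replace each sensed point with a blend of itself and the centroids of
# the preceding and following subsets along the scan path.
import numpy as np

def smooth_scan(points: np.ndarray, window: int = 5, weight: float = 0.5) -> np.ndarray:
    """points: (N, 2) or (N, 3) sensed locations ordered along the scan path."""
    adjusted = points.astype(float).copy()
    for i in range(window, len(points) - window):
        preceding = points[i - window:i].mean(axis=0)          # subset before the point
        following = points[i + 1:i + 1 + window].mean(axis=0)  # subset after the point
        adjusted[i] = (1.0 - weight) * points[i] + weight * 0.5 * (preceding + following)
    return adjusted
```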
  • Publication number: 20100272347
    Abstract: A method for performing DA (Dynamic Alignment) beam calibration in a plasma processing system is provided. The method includes acquiring a positional difference using an optical imaging approach. The optical imaging approach comprises positioning the wafer on the end effector, taking a still image of the wafer on the end effector, processing the still image to ascertain the center of the wafer and an end effector-defined center defined by the end effector, and determining the positional difference between the center of the wafer and the end effector-defined center. The method also includes centering the wafer with respect to the end effector by compensating for the positional difference between the wafer and the end effector with robot movement compensation, and moving the wafer and the end effector through DA beams associated with a plasma processing module.
    Type: Application
    Filed: December 19, 2008
    Publication date: October 28, 2010
    Inventors: Matt Rodnick, Christine Allen-Blanchette
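The positional-difference step can be sketched as a circle fit: fit a circle to wafer-edge pixels extracted from the still image and compare its centre with the end-effector-defined centre. A minimal sketch using a least-squares (Kåsa) circle fit; edge extraction and the robot compensation move are not modelled:

```python
# Minimal sketch: least-squares circle fit to wafer-edge pixels, then the offset between
# the fitted wafer centre and the end-effector-defined centre.
import numpy as np

def fit_circle(edge_xy: np.ndarray):
    """edge_xy: (N, 2) wafer-edge pixel coordinates. Returns (cx, cy, radius)."""
    x, y = edge_xy[:, 0], edge_xy[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x * x + y * y
    cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
    return cx, cy, float(np.sqrt(c + cx * cx + cy * cy))

def wafer_offset(edge_xy: np.ndarray, effector_center_xy) -> np.ndarray:
    """Positional difference between the wafer centre and the end-effector-defined centre."""
    cx, cy, _ = fit_circle(edge_xy)
    return np.array([cx - effector_center_xy[0], cy - effector_center_xy[1]])
```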
  • Publication number: 20100268383
    Abstract: A remote control station that accesses one of at least two different robots that each have at least one unique robot feature. The remote control station receives information that identifies the robot feature of the accessed robot. The remote station displays a display user interface that includes at least one field that corresponds to the robot feature of the accessed robot. The robot may have a laser pointer and/or a projector.
    Type: Application
    Filed: April 17, 2009
    Publication date: October 21, 2010
    Inventors: Yulun Wang, Marco Pinter, Kevin Hanrahan, Daniel Steven Sanchez, Charles S. Jordan, David Bjorn Roe, James Rosenthal, Derek Walters
  • Patent number: 7818091
    Abstract: A process and a device are provided for determining the pose as the entirety of the position and the orientation of an image reception device. The process is characterized in that the pose of the image reception device is determined with the use of at least one measuring device that is part of a robot. The device is characterized by a robot with an integrated measuring device that is part of the robot for determining the pose of the image reception device.
    Type: Grant
    Filed: September 28, 2004
    Date of Patent: October 19, 2010
    Assignee: Kuka Roboter GmbH
    Inventors: Arif Kazi, Rainer Bischoff
  • Publication number: 20100245558
    Abstract: A component manipulating method includes recognizing, computing, and manipulating. The recognizing is a process in which a position and an attitude of a measured object are recognized by taking an image of a light spot group of the measured object with a camera, the measured object having the light spot group including a plurality of light spots, based on a light image expressing light spots constituting the light spot group on an image taken with the camera. The computing is a process in which a position and an attitude of the component are computed based on the position and the attitude of the recognized measured object and also on geometric arrangement positions of the measured object and the component. The manipulating is a process in which a robot being used to perform operations on the component is manipulated based on the computed position and the attitude.
    Type: Application
    Filed: August 17, 2009
    Publication date: September 30, 2010
    Inventors: Naoki Koike, Taketoshi Furuki
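The two computations named in the abstract — recognising the measured object's pose from the imaged light-spot group, then deriving the component's pose from the known geometric arrangement — can be sketched with a standard PnP solve followed by a transform chain. A minimal sketch assuming the spot geometry on the object and the camera intrinsics are known:

```python
# Minimal sketch: (1) pose of the measured object from its light spots via solvePnP;
# (2) component pose via the fixed object-to-component transform.
import cv2
import numpy as np

def spot_group_pose(spot_model_3d, spot_image_2d, camera_matrix, dist_coeffs):
    """spot_model_3d: (N, 3) spot positions on the object; spot_image_2d: (N, 2) pixel detections."""
    ok, rvec, tvec = cv2.solvePnP(np.asarray(spot_model_3d, dtype=float),
                                  np.asarray(spot_image_2d, dtype=float),
                                  camera_matrix, dist_coeffs)
    R, _ = cv2.Rodrigues(rvec)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, tvec.ravel()
    return ok, T                               # camera-frame pose of the measured object

def component_pose(T_cam_object: np.ndarray, T_object_component: np.ndarray) -> np.ndarray:
    """Chain the recognised object pose with the known object-to-component arrangement."""
    return T_cam_object @ T_object_component
```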
  • Patent number: 7801645
    Abstract: A robot uses an infrared sensor including an infrared light source which produces pulses of infrared light. Optics focus reflections of the infrared light pulses from different portions of the environment of the robot to different detectors in a 2D array of detectors. The detectors produce an indication of the distance to the closest object in an associated portion of the environment. The robot can use the indications to determine features in the environment. The robot can be controlled to avoid these features.
    Type: Grant
    Filed: March 11, 2004
    Date of Patent: September 21, 2010
    Assignee: Sharper Image Acquisition LLC
    Inventors: Charles E. Taylor, Andrew J. Parker, Shek Fai Lau, Eric C. Blair, Andrew Heninger, Eric Ng, Patricia I. Brenner