Optical Patents (Class 901/47)
  • Publication number: 20090061099
    Abstract: A robotic spray system is provided for accurately spraying mold release onto a green tire of any size or shape. The system analyzes individual green tires using an integrated vision system. The system controls the robotic spray position, the fan, fluid, atomizing air, and tire rotation speed for optimal spray coverage on both the inside and outside of green tires. The system includes a conveyor, an overhead camera mounted over an infeed station, and a second camera located perpendicular to the green tire's tread and several feet away from the center of the tire. Pictures of the green tire in the station are used to estimate the center and radius of the tire and to locate the angle of the bar code with respect to the center of the tire. Reference points are provided from the camera images and robot positions are calculated to control the spraying.
    Type: Application
    Filed: September 4, 2007
    Publication date: March 5, 2009
    Inventor: Todd E. Hendricks, Sr.
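    The abstract does not say how the tire's center and radius are estimated from the overhead image; the sketch below shows one common approach, an algebraic least-squares circle fit (Kasa method) over edge points sampled from the tire outline, assuming Python with numpy (the edge-sampling step is omitted):

        import numpy as np

        def fit_circle(points):
            """Algebraic least-squares circle fit (Kasa method).

            points: (N, 2) array of (x, y) edge points sampled from the tire
            outline in the overhead image.  Returns (cx, cy, r).
            """
            x, y = points[:, 0], points[:, 1]
            # Circle: x^2 + y^2 + a*x + b*y + c = 0, solved for (a, b, c) in the
            # least-squares sense; then cx = -a/2, cy = -b/2, r^2 = cx^2 + cy^2 - c.
            A = np.column_stack([x, y, np.ones_like(x)])
            rhs = -(x ** 2 + y ** 2)
            a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
            cx, cy = -a / 2.0, -b / 2.0
            r = np.sqrt(cx ** 2 + cy ** 2 - c)
            return cx, cy, r

        # Synthetic check: points on a circle of radius 250 px centered at (400, 300).
        theta = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
        pts = np.column_stack([400 + 250 * np.cos(theta), 300 + 250 * np.sin(theta)])
        print(fit_circle(pts))  # approximately (400.0, 300.0, 250.0)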
  • Publication number: 20090062960
    Abstract: Described herein is a method and system for performing calibrations on robotic components. In one embodiment, a method for performing robotic calibrations includes manually calibrating a center of a robot blade aligned with respect to a target. The method further includes recording a first positional value of the center of the robot blade aligned with respect to a camera. The method further includes automatically determining a second positional value of the center of the robot blade aligned with respect to the camera. The method further includes automatically recalibrating the robot blade based on an offset between the second positional value and the first positional value exceeding a tolerance offset from the first positional value.
    Type: Application
    Filed: August 25, 2008
    Publication date: March 5, 2009
    Inventors: Sekar Krishnasamy, Vijay Sakhare, Mordechai Leska, Donald Foldenauer, Rinat Shimshi
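    A minimal sketch of the tolerance-gated recalibration described above, in Python; the tolerance value, coordinate units, and function names are hypothetical:

        import math

        TOLERANCE_MM = 0.05  # hypothetical tolerance offset

        def maybe_recalibrate(manual_xy, measured_xy, apply_offset):
            """Recalibrate only when the automatically measured blade center
            drifts from the manually taught value by more than the tolerance."""
            dx = measured_xy[0] - manual_xy[0]
            dy = measured_xy[1] - manual_xy[1]
            if math.hypot(dx, dy) > TOLERANCE_MM:
                apply_offset((dx, dy))  # shift the taught center by the offset
                return True
            return False

        # A drift of about 0.08 mm exceeds the assumed 0.05 mm tolerance.
        print(maybe_recalibrate((10.00, 5.00), (10.06, 5.05), lambda d: None))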
  • Publication number: 20090055020
    Abstract: Provided is an apparatus, method, and medium for allowing a mobile robot to simultaneously perform a cleaning process and a map-creating process. The apparatus includes a feature-map-creating unit creating a feature map for recognizing the position of a mobile robot; a path-map-creating unit creating a path map including a plurality of cells, each having information about whether an obstacle exists and path information, on the basis of information on the pose of the mobile robot that is obtained from the feature map; and a motion-control unit moving the mobile robot on the basis of the information about whether the obstacle exists and the path information.
    Type: Application
    Filed: June 27, 2008
    Publication date: February 26, 2009
    Applicant: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Woo-yeon Jeong, Su-Jin Lee, Hyeon Myeong, Seok-Won Bang
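    The abstract does not specify what each path-map cell stores beyond obstacle and path information; a minimal Python sketch of such a grid follows, with an assumed cell size and a hypothetical "cleaned" flag standing in for the path information:

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class Cell:
            obstacle: bool = False  # whether an obstacle has been observed here
            cleaned: bool = False   # hypothetical path information: already swept?

        @dataclass
        class PathMap:
            width: int                   # cells per row
            height: int                  # number of rows
            resolution_m: float = 0.05   # cell edge length in metres (assumed)
            cells: List[List[Cell]] = field(default_factory=list)

            def __post_init__(self):
                self.cells = [[Cell() for _ in range(self.width)]
                              for _ in range(self.height)]

            def cell_at(self, x_m, y_m):
                """Map a metric pose taken from the feature map to a grid cell."""
                return self.cells[int(y_m / self.resolution_m)][int(x_m / self.resolution_m)]

        pm = PathMap(width=200, height=200)
        pm.cell_at(1.0, 2.5).obstacle = True
        print(pm.cell_at(1.0, 2.5))  # Cell(obstacle=True, cleaned=False)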
  • Publication number: 20090055024
    Abstract: A robotic arm and control system includes a robotic arm which moves in response to one or more command signals. One or more “active” fiducials are located on the arm, each of which emits its own light. A 3D camera having an associated field-of-view is positioned such that at least one fiducial and a target object to be manipulated are in the FOV. To determine their spatial positions, the arm fiducials are activated and the target object is preferably illuminated with a scanning laser; the camera produces output signals which vary with the spatial locations of the fiducials and target object. A controller receives the output signals and uses the spatial position information as feedback to continuously guide the arm towards the target object. Multiple active fiducials may be employed, each having respective characteristics with which they can be differentiated.
    Type: Application
    Filed: November 13, 2007
    Publication date: February 26, 2009
    Inventor: Robert L. Kay
  • Publication number: 20090055023
    Abstract: A remote controlled robot system that includes a robot and a remote controlled station. The robot includes a camera and a printer coupled to a mobile platform. The remote control station may display one or more graphical user interfaces with data fields. The graphical user interfaces allow a user to enter information into the data fields. The information is then transmitted to the robot and printed by the robot printer. The information may include a medical prescription and the name of the patient. Providing a robot printer allows the user to directly provide a medical prescription while remotely observing and interacting with the patient.
    Type: Application
    Filed: August 23, 2007
    Publication date: February 26, 2009
    Inventors: Derek Walters, Marco Pinter, Jonathan Southard, Charles S. Jordan
  • Publication number: 20090052740
    Abstract: A moving object detecting device measures a congestion degree of a space and utilizes the congestion degree for tracking. In performing the tracking, the direction measured by a laser range sensor is heavily weighted when the congestion degree is low. When the congestion degree is high, sensor fusion is performed by heavily weighting the direction measured by image processing on a captured image to obtain an estimated direction of the moving object, and the distance along that estimated direction is then obtained by the laser range sensor.
    Type: Application
    Filed: August 22, 2008
    Publication date: February 26, 2009
    Applicant: KABUSHIKI KAISHA TOSHIBA
    Inventor: Takafumi Sonoura
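    A minimal Python sketch of the congestion-weighted fusion described above; the linear weighting and the unit-vector angle averaging are assumptions, since the abstract only states which measurement is favored at each congestion level:

        import math

        def fuse_bearing(laser_bearing, vision_bearing, congestion):
            """Fuse two bearing estimates (radians) to the tracked moving object.

            congestion in [0, 1]: low values favor the laser range sensor,
            high values favor the image-processing result.  Angles are
            averaged via unit vectors to avoid wrap-around problems.
            """
            w_img = congestion
            w_laser = 1.0 - congestion
            x = w_laser * math.cos(laser_bearing) + w_img * math.cos(vision_bearing)
            y = w_laser * math.sin(laser_bearing) + w_img * math.sin(vision_bearing)
            return math.atan2(y, x)

        # Sparse scene: the fused direction stays close to the laser bearing.
        print(math.degrees(fuse_bearing(math.radians(10), math.radians(30), 0.1)))
        # Crowded scene: the fused direction moves toward the vision bearing; the
        # distance is then read from the laser scan along this fused direction.
        print(math.degrees(fuse_bearing(math.radians(10), math.radians(30), 0.9)))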
  • Publication number: 20090055019
    Abstract: An interactive system for interacting with a sentient being. The system includes a robotic companion of which the sentient being may be a user and an entity which employs the robot as a participant in an activity involving the user. The robotic companion responds to inputs from an environment that includes the user during the activity. The robotic companion is capable of social and affective behavior either under control of the entity or in response to the environment. The entity may provide an interface by which an operator may control the robotic companion. Example applications for the interactive system include as a system for communicating with patients that have difficulties communicating verbally, a system for teaching remotely-located students or students with communication difficulties, a system for facilitating social interaction between a remotely-located relative and a child, and systems in which the user and the robot interact with an entity such as a smart book.
    Type: Application
    Filed: May 8, 2008
    Publication date: February 26, 2009
    Applicant: Massachusetts Institute of Technology
    Inventors: Walter Dan Stiehl, Cynthia Breazeal, Jun Ki Lee, Allan Z. Maymin, Heather Knight, Robert L. Toscano, Iris M. Cheung
  • Publication number: 20090035107
    Abstract: A tire rotating robot used to rotate the tires of a vehicle. The robot can remove two tires sequentially without requiring a human to manually lift the tires. The robot for rotating tires comprises a mobile base, a body connected to the base, a pivotally mounted two-position rotating beam connected to the body, two arm guide assemblies displaced within a channel of the beam, a motor that powers the robot, and an interface that controls the robot.
    Type: Application
    Filed: August 2, 2007
    Publication date: February 5, 2009
    Inventors: Marlene Duran, Michael Lopez
  • Publication number: 20090030551
    Abstract: A method and a system for operating a mobile robot comprise a range finder for collecting range data of one or more objects in an environment around the robot. A discriminator identifies uniquely identifiable ones of the objects as navigation landmarks. A data storage device stores a reference map of the navigation landmarks based on the collected range data. A data processor establishes a list or sequence of way points for the robot to visit. Each way point is defined with reference to one or more landmarks. A reader reads an optical message at or near one or more way points. A task manager manages a task based on the read optical message.
    Type: Application
    Filed: October 9, 2007
    Publication date: January 29, 2009
    Inventors: Thomas Kent Hein, Karl-Heinz O. Mertins, Daniel W. Mairet
  • Publication number: 20090028542
    Abstract: An outer body of ball shell type has an opening. A camera is located in the outer body and receives an image from outside of the outer body through the opening. A camera support unit is located in the outer body and rotationally supports the camera along a first axis and a second axis mutually crossed at a center of the outer body. A first camera actuator is located in the outer body and rotationally actuates the camera around the first axis. A second camera actuator is located in the outer body and rotationally actuates the camera around the second axis.
    Type: Application
    Filed: September 25, 2008
    Publication date: January 29, 2009
    Applicant: KABUSHIKI KAISHA TOSHIBA
    Inventors: Hideichi Nakamoto, Junko Hirokawa, Takashi Ichikawa, Hideki Ito, Hideki Ogawa, Nobutaka Kikuiri
  • Publication number: 20090024251
    Abstract: A method and apparatus for estimating the pose of a mobile robot using a particle filter is provided. The apparatus includes an odometer which detects a variation in the pose of a mobile robot, a feature-processing module which extracts at least one feature from an upward image captured by the mobile robot, and a particle filter module which determines current poses and weights of a plurality of particles by applying the mobile robot pose variation detected by the odometer and the feature extracted by the feature-processing module to previous poses and weights of the particles.
    Type: Application
    Filed: June 5, 2008
    Publication date: January 22, 2009
    Applicant: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Hyeon Myeong, Woo-Yeon Jeong, Seok-Won Bang
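    A minimal Python sketch of the particle-filter update described above, an odometry-driven motion update followed by feature-based reweighting; the noise magnitudes and the toy likelihood are assumptions:

        import math
        import random

        random.seed(0)

        class Particle:
            def __init__(self, x, y, theta, weight):
                self.x, self.y, self.theta, self.weight = x, y, theta, weight

        def motion_update(particles, d_trans, d_rot, trans_noise=0.02, rot_noise=0.01):
            """Apply the odometry-measured pose variation to every particle,
            perturbed by sampled noise."""
            for p in particles:
                p.theta += d_rot + random.gauss(0.0, rot_noise)
                p.x += (d_trans + random.gauss(0.0, trans_noise)) * math.cos(p.theta)
                p.y += (d_trans + random.gauss(0.0, trans_noise)) * math.sin(p.theta)

        def measurement_update(particles, feature_likelihood):
            """Reweight particles by how well the features from the upward image
            match each particle's pose, then normalise the weights."""
            for p in particles:
                p.weight *= feature_likelihood(p)
            total = sum(p.weight for p in particles) or 1.0
            for p in particles:
                p.weight /= total

        # Toy likelihood: a ceiling feature is expected near (1.0, 2.0).
        def toy_likelihood(p):
            return math.exp(-((p.x - 1.0) ** 2 + (p.y - 2.0) ** 2))

        particles = [Particle(random.uniform(0, 2), random.uniform(1, 3), 0.0, 1.0)
                     for _ in range(100)]
        motion_update(particles, d_trans=0.1, d_rot=0.0)
        measurement_update(particles, toy_likelihood)
        best = max(particles, key=lambda p: p.weight)
        print(round(best.x, 2), round(best.y, 2))  # the highest-weight pose estimate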
  • Patent number: 7474781
    Abstract: An image based bar-code reading and robotic registration apparatus and method for use in automated tape library systems is disclosed. An imager is positioned on a picker assembly with its own illumination source and appropriate optics to filter out ambient light. The imager connects to a microprocessor in its immediate vicinity. All image acquisition and processing are done by these components. To ensure operation independent of illumination variations, the image processing code developed for this invention automatically adapts to dynamic lighting situations. The tape cartridge cells are used as fiducials to allow tape cartridge registration without fiducial markings. The use of the tape cartridge cells as fiducials maximizes storage capability.
    Type: Grant
    Filed: September 20, 2001
    Date of Patent: January 6, 2009
    Assignee: International Business Machines Corporation
    Inventor: Lyle Joseph Chamberlain
  • Publication number: 20080316306
    Abstract: A robot system comprises one or more tools (5, 15), a camera (9) and a light source (13), which can be moved independently of the camera (9) in order to illuminate the field of vision of the camera from different directions.
    Type: Application
    Filed: January 13, 2005
    Publication date: December 25, 2008
    Inventors: Hardy Burkle, Uwe Haas, Claus Lorcher
  • Publication number: 20080312771
    Abstract: A method for controlling a robot having at least one visual sensor. A target for a motion of the robot is defined. A motion control signal adapted for the robot reaching the target is calculated. A collision avoidance control signal is calculated based on the closest points of segments of the robot and a virtual object between the visual sensor and the target. The motion control signal and the collision avoidance control signal are weighted and combined. The weight of the motion control signal is higher when the calculated collision risk is lower. The motion of the robot is controlled according to the combined signal so that no segment of the robot enters the space defined by the virtual object.
    Type: Application
    Filed: March 21, 2008
    Publication date: December 18, 2008
    Applicant: HONDA RESEARCH INSTITUTE EUROPE GMBH
    Inventor: Hisashi Sugiura
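    A minimal Python sketch of the weighted combination described above; the specific blending function and the clipping of the risk value are assumptions, since the abstract only requires the motion command's weight to rise as the calculated collision risk falls:

        import numpy as np

        def blend_commands(motion_cmd, avoidance_cmd, collision_risk):
            """Combine the target-reaching command with the collision-avoidance
            command; low risk lets the motion command dominate."""
            risk = float(np.clip(collision_risk, 0.0, 1.0))
            return (1.0 - risk) * np.asarray(motion_cmd) + risk * np.asarray(avoidance_cmd)

        # Low risk: the blended joint-velocity command follows the motion controller.
        print(blend_commands([0.2, 0.1, 0.0], [-0.3, 0.0, 0.1], collision_risk=0.1))
        # High risk (a robot segment nears the virtual object): avoidance dominates.
        print(blend_commands([0.2, 0.1, 0.0], [-0.3, 0.0, 0.1], collision_risk=0.9))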
  • Publication number: 20080310705
    Abstract: A robot capable of performing appropriate movement control while reducing the arithmetic processing required to recognize the shape of a floor. During movement, the robot sets predetermined landing positions for the steps of its legs on a present assumed floor, which is the floor represented by the floor shape information used for the current motion control of the robot. An image projection area is set in the vicinity of each predetermined landing position and is projected onto each image captured by cameras mounted on the robot. Shape parameters representing the shape of each actual floor partial area, which forms part of the actual floor and whose image is captured in the corresponding partial image area, are estimated based on the partial image area generated by projecting the set image projection area onto the images captured by the cameras.
    Type: Application
    Filed: March 27, 2008
    Publication date: December 18, 2008
    Applicants: HONDA MOTOR CO., LTD., TOKYO INSTITUTE OF TECHNOLOGY
    Inventors: Minami Asatani, Masatoshi Okutomi, Shigeki Sugimoto
  • Publication number: 20080297590
    Abstract: A robotic vision and vision control system includes a stereoscopic camera mounted on a robot and a remote three-dimensional display system that provides the operator of the robot with a good sense of depth or distance for more precisely maneuvering the robot. The stereoscopic display system can further include an eye-tracking system that monitors the operator's eyes to determine the direction in which the operator's eyes are looking and points the camera in that direction.
    Type: Application
    Filed: May 31, 2007
    Publication date: December 4, 2008
    Inventors: Fred Barber, Dan Hess
  • Publication number: 20080300723
    Abstract: A teaching position correcting device which can easily correct, with high precision, teaching positions after shifting at least one of a robot and an object worked by the robot. Calibration is carried out using a vision sensor (i.e., a CCD camera) mounted on a work tool. The vision sensor measures three-dimensional positions of at least three reference marks, not aligned in a straight line, on the object. The vision sensor is optionally detached from the work tool, and at least one of the robot and the object is shifted. After the shifting, calibration (which can be omitted when the vision sensor is not detached) and measurement of the three-dimensional positions of the reference marks are carried out again. A change in the relative positional relationship between the robot and the object is obtained using the results of measuring the three-dimensional positions of the reference marks before and after the shifting.
    Type: Application
    Filed: July 31, 2008
    Publication date: December 4, 2008
    Inventors: Kazunori Ban, Katsutoshi Takizawa
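    The abstract does not prescribe how the change in the relative positional relationship is computed from the mark measurements; a common choice is a rigid-transform (Kabsch/SVD) fit between the two sets of measured mark positions, sketched here in Python with numpy:

        import numpy as np

        def rigid_transform(marks_before, marks_after):
            """Estimate rotation R and translation t with marks_after ~ R @ marks_before + t
            from the reference-mark positions measured before and after the shift."""
            P = np.asarray(marks_before, float)
            Q = np.asarray(marks_after, float)
            cp, cq = P.mean(axis=0), Q.mean(axis=0)
            H = (P - cp).T @ (Q - cq)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:   # guard against a reflection solution
                Vt[-1] *= -1
                R = Vt.T @ U.T
            t = cq - R @ cp
            return R, t

        def correct_point(p, R, t):
            """Apply the same transform to a taught position to correct it."""
            return R @ np.asarray(p, float) + t

        before = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]
        after = [[1, 2, 0], [1, 3, 0], [0, 2, 0]]   # shifted and rotated 90 deg about z
        R, t = rigid_transform(before, after)
        print(np.round(correct_point([0.5, 0.5, 0.0], R, t), 3))  # [0.5 2.5 0. ]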
  • Publication number: 20080288108
    Abstract: The invention relates to an objective, such as a projection objective for semiconductor microlithography. The objective can include an optical element that is adjustable by a manipulator unit with an actuator and a sensor. The manipulator unit can be driven by a control system via a data bus. The manipulator unit can have a decentralized control subsystem arranged in the region of the manipulator unit. The control subsystem can be connected to the control system via the data bus.
    Type: Application
    Filed: June 19, 2008
    Publication date: November 20, 2008
    Applicant: CARL ZEISS SMT AG
    Inventor: Torsten Gross
  • Publication number: 20080281470
    Abstract: An autonomous coverage robot detection system includes an emitter configured to emit a directed beam, a detector configured to detect the directed beam and a controller configured to direct the robot in response to a signal detected by the detector. In some examples, the detection system detects a stasis condition of the robot. In some examples, the detection system detects a wall and can follow the wall in response to the detected signal.
    Type: Application
    Filed: May 9, 2008
    Publication date: November 13, 2008
    Inventors: Duane L. Gilbert, JR., Marcus R. Williams, Andrea M. Okerholm, Elaine H. Kristant, Sheila A. Longo, Daniel E. Kee, Marc D. Strauss
  • Publication number: 20080276408
    Abstract: A surface treatment robot includes a chassis having forward and rear ends and a drive system carried by the chassis. The drive system includes right and left driven wheels and is configured to maneuver the robot over a cleaning surface. The robot includes a vacuum assembly, a collection volume, a supply volume, an applicator, and a wetting element, each carried by the chassis. The wetting element engages the cleaning surface to distribute a cleaning liquid applied to the surface by the applicator. The wetting element distributes the cleaning liquid along at least a portion of the cleaning surface when the robot is driven in a forward direction. The wetting element is arranged substantially forward of a transverse axis defined by the right and left driven wheels, and the wetting element slidably supports at least about ten percent of the mass of the robot above the cleaning surface.
    Type: Application
    Filed: May 9, 2008
    Publication date: November 13, 2008
    Inventors: Duane L. Gilbert, JR., Marcus R. Williams, Andrea M. Okerholm, Elaine H. Kristant, Sheila A. Longo, Daniel E. Kee, Marc D. Strauss
  • Publication number: 20080272138
    Abstract: Described herein are embodiments of systems and methods for providing an automated medication handling system that can, among other things, single-dose package medications, store and dispense medications in a pharmacy, transport medications to a nursing unit or other remote location, store them at that remote location, and load them into a portable unit carried by a nurse, who may dispense the medication at a bedside.
    Type: Application
    Filed: October 12, 2007
    Publication date: November 6, 2008
    Applicant: CARDINAL HEALTH 303, INC.
    Inventors: Graham Ross, Mark Corey Yturralde
  • Publication number: 20080251351
    Abstract: A system for handling vehicle frames. The system includes a conveyor assembly including a loading end and an unloading end, and the conveyor assembly carries a plurality of vehicle frames thereon. The system also includes a frame transfer assembly disposed at the unloading end of the conveyor assembly. The frame transfer assembly may be adapted to grasp a vehicle frame from the conveyor assembly and transfer the vehicle frame from the conveyor assembly. A vehicle frame inversion system is also provided that includes a first robotic inversion device that may include a first clamping system, and a second robotic inversion device that may include a second clamping system. The first and second clamping systems are adapted to grasp a vehicle frame, and the first and second robotic inversion devices are adapted to lift and invert the vehicle frame.
    Type: Application
    Filed: April 11, 2007
    Publication date: October 16, 2008
    Inventors: James G. Powers, Allan C. McNear, Ronald M. Laux, Robert A. Florian, Paramjit S. Girn
  • Publication number: 20080252248
    Abstract: A device and a method for tool center point calibration of an industrial robot. The device is intended to calibrate an industrial robot with respect to a tool mounted on the robot. The device includes a camera designed to take a plurality of images of at least part of the robot tool for a plurality of different tool orientations, an image-processing unit designed to determine the positions of the robot tool in the orientations based on the images, a calculation module adapted to calculate the position of the center point of the robot tool, based on the determined positions, and a control module adapted to calculate the corrective movements of the robot.
    Type: Application
    Filed: January 24, 2006
    Publication date: October 16, 2008
    Applicant: ABB AB
    Inventors: Ivan Lundberg, Niklas Durinder, Torgny Brogardh
  • Publication number: 20080255703
    Abstract: A robotic system that includes a remote controlled robot. The robot may include a camera, a monitor and a holonomic platform all attached to a robot housing. The robot may be controlled by a remote control station that also has a camera and a monitor. The remote control station may be linked to a base station that is wirelessly coupled to the robot. The cameras and monitors allow a care giver at the remote location to monitor and care for a patient through the robot. The holonomic platform allows the robot to move about a home or facility to locate and/or follow a patient.
    Type: Application
    Filed: April 9, 2008
    Publication date: October 16, 2008
    Inventors: Yulun Wang, Keith Phillip Laby, Charles S. Jordan, Steven Edward Butner, Jonathan Southard
  • Publication number: 20080249663
    Abstract: A robot control information generator generates control information for operating a robot equipped with a camera and a hand to grasp an object based on a two-dimensional code on the object. The two-dimensional code includes position-identifying patterns and an information pattern; the position within the two-dimensional code of each of the position-identifying patterns is specified beforehand, and the information pattern is generated by encoding information. The robot control information generator comprises an image input unit, a pattern detection unit, a position/posture calculation unit, a decoding device, and a control information-generating unit which generates the control information based on the decoded information decoded by the decoding device and the position/posture information calculated by the position/posture calculation unit.
    Type: Application
    Filed: June 6, 2008
    Publication date: October 9, 2008
    Applicant: Honda Motor Co., Ltd.
    Inventor: Chiaki Aoyama
  • Publication number: 20080249732
    Abstract: Provided are a system, method and medium calibrating a gyrosensor of a mobile robot. The system includes a camera to obtain image data of a fixed environment, a rotation angle calculation unit to calculate a plurality of angular velocities of a mobile robot based on an analysis of the image data, a gyrosensor to output a plurality of pieces of raw data according to rotation inertia of the mobile robot and a scale factor calculation unit to calculate a scale factor that indicates the relationship between the pieces of raw data and the angular velocities.
    Type: Application
    Filed: December 6, 2007
    Publication date: October 9, 2008
    Applicant: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Hyoung-Ki Lee, Ki-Wan Choi, Ji-Young Park, Seok-Won Bang, Sung-Kyung Hong
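    A minimal Python sketch of one way to compute such a scale factor, a single-parameter least-squares fit between the raw gyro readings and the vision-derived angular velocities; the least-squares formulation and the numbers are assumptions:

        import numpy as np

        def estimate_scale_factor(gyro_raw, vision_rate):
            """Least-squares k such that vision_rate ~ k * gyro_raw, where gyro_raw
            are raw gyro readings and vision_rate the angular velocities computed
            from image analysis of the fixed environment over the same motion."""
            g = np.asarray(gyro_raw, float)
            v = np.asarray(vision_rate, float)
            return float(np.dot(g, v) / np.dot(g, g))

        # Toy data: the true scale factor is 0.0175 rad/s per count, plus noise.
        rng = np.random.default_rng(0)
        raw = rng.uniform(-2000, 2000, size=200)
        omega = 0.0175 * raw + rng.normal(0.0, 0.01, size=200)
        print(round(estimate_scale_factor(raw, omega), 4))  # approximately 0.0175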
  • Patent number: 7433759
    Abstract: A method for calibrating a controller of a robotic arm in a microelectronics manufacturing apparatus that includes storing a default position for an edge detector, moving a blade on the robotic arm based on the default position of the edge detector such that at least three edge points on the blade pass through and are detected by the edge detector, generating a plurality of arm position measurements from an arm position sensor by measuring a position with the arm position sensor of the robotic arm at each position of the robotic arm at which an edge point of the blade is detected by the edge detector, and determining at least one of an actual position of the edge detector and an offset for measurements of the arm position sensor based on the plurality of arm position measurements.
    Type: Grant
    Filed: July 23, 2004
    Date of Patent: October 7, 2008
    Assignee: Applied Materials, Inc.
    Inventor: Roy C. Nangoy
  • Publication number: 20080240511
    Abstract: An apparatus for picking up objects including a robot for picking up an object, at least one part of the object having a curved shape, having a storing means for storing a gray gradient distribution model of the object, a recognizing means for recognizing a gray image of the object, a gradient extracting means for extracting a gray gradient distribution from the gray image recognized by the recognizing means, an object detecting means for detecting a position or position posture of the object in the gray image in accordance with the gray gradient distribution extracted by the gradient extracting means and the gray gradient distribution model stored by the storing means, a detection information converting means for converting information of the position or position posture detected by the object detecting means into information of position or position posture in a coordinate system regarding the robot; and a robot moving means for moving the robot to the position or position posture converted by the detection information converting means.
    Type: Application
    Filed: March 28, 2008
    Publication date: October 2, 2008
    Applicant: FANUC LTD
    Inventors: Kazunori Ban, Ichiro Kanno, Hidetoshi Kumiya, Shouta Takizawa
  • Patent number: 7430455
    Abstract: A robot obstacle detection system including a robot housing which navigates with respect to a surface and a sensor subsystem aimed at the surface for detecting the surface. The sensor subsystem includes an emitter which emits a signal having a field of emission and a photon detector having a field of view which intersects the field of emission at a region. The subsystem detects the presence of an object proximate the mobile robot and determines a value of a signal corresponding to the object. It compares the value to a predetermined value, moves the mobile robot in response to the comparison, and updates the predetermined value upon the occurrence of an event.
    Type: Grant
    Filed: August 6, 2007
    Date of Patent: September 30, 2008
    Assignee: iRobot Corporation
    Inventors: Christopher Casey, Matthew Cross, Daniel Ozick, Joseph L. Jones
  • Publication number: 20080229859
    Abstract: An eye module is provided. The eye module includes a casing, an eyeball and a first driving element. The eyeball having a surface is disposed in the casing. The first driving element leans against the surface and drives the eyeball to rotate by a first friction force generated by rotating the first driving element.
    Type: Application
    Filed: February 20, 2008
    Publication date: September 25, 2008
    Applicant: Qisda Corporation
    Inventors: Lin Hsiao, Wai William Wang, Fung-Hsu Wu, Chen Peng, Chung-Cheng Chou, Bow-Yi Jang, Ta-Yuan Lee
  • Publication number: 20080231866
    Abstract: A number of wafer center finding methods and systems are disclosed herein that improve upon existing techniques used in semiconductor manufacturing.
    Type: Application
    Filed: February 15, 2008
    Publication date: September 25, 2008
    Inventor: Paul E. Fogel
  • Publication number: 20080228320
    Abstract: To provide a robot whose freedom of design is not limited, which has a simple structure, and which further reduces the load on an actuator of the neck part, the present invention provides a robot at least including a head part, a body part, and a neck link which connects the head part and the body part, wherein a surrounding-object distance measurement means is provided adjacent to the neck link and in an upper portion of the body part between the head part and the body part, and a distance scanning field of the surrounding-object distance measurement means is provided in parallel with a horizontal plane.
    Type: Application
    Filed: January 23, 2008
    Publication date: September 18, 2008
    Inventors: Azusa Amino, Junichi Tamamoto, Ryosuke Nakamura
  • Publication number: 20080221734
    Abstract: The present invention relates to a categorical color perception system which automatically judges a categorical color and aims to judge the categorical color name correctly under various ambient lights. A test color measured in an experiment is input to an input layer portion corresponding to test color components (101), the illumination light components from the experiment are input to an input layer portion corresponding to illumination light components (102), and connection weights are obtained by learning with the backpropagation method so as to output the categorical color judged by an examinee.
    Type: Application
    Filed: January 23, 2006
    Publication date: September 11, 2008
    Applicant: NATIONAL UNIVERSITY CORPORATION YOKOHAMA NATIONAL UNIVERSITY
    Inventors: Tomoharu Nagao, Noriko Yata, Keiji Uchikawa
  • Patent number: 7424341
    Abstract: A robot system efficiently performing an operation of moving a robot close to and/or away from a target point, such as a teaching operation. A camera of a visual sensor is mounted to the robot such that a distal end of an end effector is seen within the field of view of the camera, and the end effector's distal end and a target position on an object are specified on a monitor screen. When an approach key is depressed, the target position is detected in an image, and its difference from the position of the end effector's distal end is calculated. Whether the difference is within an allowable range is checked. Depending on the result, an amount of robot motion is calculated, and the robot is operated. The processing is repeated until the depressed approach key is released. When a retreat key is depressed, the robot moves away from the object. The robot may also be caused to stop using a distance sensor.
    Type: Grant
    Filed: May 26, 2004
    Date of Patent: September 9, 2008
    Assignee: Fanuc Ltd
    Inventors: Atsushi Watanabe, Jun Mizuno
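    A minimal Python sketch of the repeat-until-released approach loop described above; all callables, the proportional gain, and the pixel tolerance are hypothetical:

        def approach(detect_target_px, detect_tool_tip_px, move_robot_xy,
                     key_pressed, gain=0.002, tol_px=3.0):
            """While the approach key is held: detect the target and the end
            effector tip in the image, stop if their pixel difference is within
            the allowable range, otherwise command a small proportional motion."""
            while key_pressed():
                tx, ty = detect_target_px()
                ex, ey = detect_tool_tip_px()
                dx, dy = tx - ex, ty - ey
                if (dx * dx + dy * dy) ** 0.5 <= tol_px:
                    break
                move_robot_xy(gain * dx, gain * dy)

        class FakeRobot:
            """1-px-per-unit stand-in used only to exercise the loop."""
            def __init__(self):
                self.tip = [100.0, 80.0]
            def move(self, dx, dy):
                self.tip[0] += dx
                self.tip[1] += dy

        robot = FakeRobot()
        presses = iter([True] * 50)
        approach(lambda: (320.0, 240.0), lambda: tuple(robot.tip), robot.move,
                 lambda: next(presses, False), gain=0.5)
        print([round(v) for v in robot.tip])  # within tol_px of [320, 240]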
  • Patent number: 7424342
    Abstract: A computer in a transport system includes: a shooting part for shooting a first calibration tray by controlling a camera; a tray position computing part for computing a tray position of the first calibration tray within the captured image which the shooting part shot; a hand position acquisition part for acquiring a hand position indicative of the position of the hand robot when the hand robot installs, onto the first calibration tray, a first transported article used for calibration; a calibration part for computing calibration data based on the tray position and the hand position; and a transported article installing part which, when the mobile robot reaches a predetermined arrival area, controls the hand robot, based on the calibration data, so as to install a second transported article onto a second tray with which the mobile robot is provided.
    Type: Grant
    Filed: December 28, 2006
    Date of Patent: September 9, 2008
    Assignee: Hitachi, Ltd.
    Inventors: Fumiko Beniyama, Toshio Moriya, Nobutaka Kimura, Kosei Matsumoto
  • Publication number: 20080215184
    Abstract: A home intelligent service robot for recognizing a user and following the motion of the user, and a method thereof, are provided. The home intelligent service robot includes a driver, a vision processor, and a robot controller. The driver moves the intelligent service robot according to an input moving instruction. The vision processor captures images through at least two cameras in response to a capturing instruction for following a target object, minimizes the information amount of the captured images, and discriminates the objects in the images into the target object and obstacles. The robot controller provides the capturing instruction for following the target object in the direction from which instruction information is collected to the vision processor when the instruction information is collected from outside, and controls the intelligent service robot to follow the target object and move while avoiding obstacles based on the discrimination information from the vision processor.
    Type: Application
    Filed: October 31, 2007
    Publication date: September 4, 2008
    Inventors: Seung Min Choi, Ji Ho Chang, Jae Il Cho, Dae Hwan Hwang
  • Publication number: 20080215185
    Abstract: An unmanned robotic vehicle is capable of sensing an environment at a location remote from the immediate area of the vehicle frame. The unmanned robotic vehicle includes a retractable appendage with a sensing element. The sensing element can include a camera, chemical sensor, optical sensor, force sensor, or the like.
    Type: Application
    Filed: November 13, 2007
    Publication date: September 4, 2008
    Inventors: Stephen C. Jacobsen, Ralph W. Pensel, Marc Olivier
  • Patent number: 7420664
    Abstract: An apparatus and method for the remote analysis and identification of unknown compounds. A robotic arm positions a sensor on a surface. The sensor unit has a monitoring mechanism to monitor separation between the sensor unit and the surface when placed in contact with the surface to maintain the separation substantially constant. An illumination source illuminates the region of interest to produce scattered photons from an unknown compound. The scattered photons are collected by an optical system and delivered to a spectroscopic detector for analysis and identification. An algorithm is applied to the data generated by the spectroscopic detector to identify the unknown compound.
    Type: Grant
    Filed: July 12, 2006
    Date of Patent: September 2, 2008
    Assignee: ChemImage Corporation
    Inventors: Patrick J. Treado, Charles W. Gardner, Jr.
  • Publication number: 20080202202
    Abstract: A tactile sensor of an embodiment of the present invention includes: a sensing section having an elastic member at a portion which contacts a measurement target; an image acquiring section for acquiring as image information the state of a contact surface of the measurement target and the elastic member, before and after application of an external force tangential to the contact surface; a deformation analyzing section for analyzing deformation information of the contact surface, based on the image information acquired by the image acquiring section; an external force detecting section for detecting the external force applied tangential to the contact surface; and an estimating section for estimating a slippage margin between the measurement target and the elastic member, based on (I) the deformation information of the contact surface, which information is acquired by the deformation analyzing section, (II) the external force detected by the external force detecting section, and (III) an object constant of the elastic member.
    Type: Application
    Filed: June 16, 2005
    Publication date: August 28, 2008
    Inventors: Jun Ueda, Yutaka Ishida, Tsukasa Ogasawara
  • Publication number: 20080201017
    Abstract: A robotic system that includes a remote controlled robot. The robot may include a camera, a monitor and a holonomic platform all attached to a robot housing. The robot may be controlled by a remote control station that also has a camera and a monitor. The remote control station may be linked to a base station that is wirelessly coupled to the robot. The cameras and monitors allow a care giver at the remote location to monitor and care for a patient through the robot. The holonomic platform allows the robot to move about a home or facility to locate and/or follow a patient.
    Type: Application
    Filed: April 16, 2008
    Publication date: August 21, 2008
    Inventors: Yulun Wang, Keith Phillip Laby, Charles S. Jordan, Steven Edward Butner, Jonathan Southard
  • Publication number: 20080201016
    Abstract: A robot has a controllable arm which carries an instrument or tool. The robot is provided with a camera to obtain an image of a work piece, including images of markers and an indicator present on the work piece. The robot processes the images to determine the position of the markers within a spatial frame of reference. The robot is controlled to effect predetermined movements of the instrument or tool relative to the work piece. The processor is further configured to determine the position of the indicator and to respond to movement of the indicator within the spatial frame of reference of the robot when the markers are concealed to determine a new position of the indicator and thus the new position of the work piece. Subsequently, the robot is controlled, relative to the new position of the work piece, to effect predetermined movements relative to the work piece.
    Type: Application
    Filed: July 6, 2006
    Publication date: August 21, 2008
    Applicant: PROSURGICS LIMITED
    Inventor: Patrick Armstrong Finlay
  • Patent number: 7412863
    Abstract: Regarding predetermined positioning criteria (M1, M2), ((G1, G2), (N1 or N2), (K1, K2)), there is provided image processing means (40B) for obtaining, by image processing, measured values (CD1, CD2) ((GD1, GD2), (ND1 or ND2), (KD1, KD2)) and reference values (CR1, CR2) ((GR1, GR2), (NR1 or NR2), (KR1, KR2)), and for moving a work (W) in such a manner that the measured values (CD1, CD2) ((GD1, GD2), (ND1 or ND2), (KD1, KD2)) and the reference values (CR1, CR2) ((GR1, GR2), (NR1 or NR2), (KR1, KR2)) coincide with each other, thereby positioning the work (W) at a predetermined position.
    Type: Grant
    Filed: June 18, 2002
    Date of Patent: August 19, 2008
    Assignee: Amada Co., Ltd.
    Inventors: Ichio Akami, Koichi Ishibashi, Teruyuki Kubota, Tetsuaki Kato, Jun Sato, Tetsuya Takahashi
  • Publication number: 20080151233
    Abstract: A method for optically determining whether a region of a surface is clean or contaminated. The method finds applicability in connection with cleaning robots, for example in pig house cleaning.
    Type: Application
    Filed: December 29, 2005
    Publication date: June 26, 2008
    Applicant: DANMARKS TEKNISKE UNIVERSITET
    Inventors: Mogens Blanke, Ian David Braithwaite
  • Patent number: 7389156
    Abstract: An autonomous floor cleaning robot includes a transport drive and control system arranged for autonomous movement of the robot over a floor for performing cleaning operations. The robot chassis carries a first cleaning zone comprising cleaning elements arranged to suction loose particulates up from the cleaning surface and a second cleaning zone comprising cleaning elements arranged to apply a cleaning fluid onto the surface and to thereafter collect the cleaning fluid up from the surface after it has been used to clean the surface. The robot chassis carries a supply of cleaning fluid and a waste container for storing waste materials collected up from the cleaning surface.
    Type: Grant
    Filed: August 19, 2005
    Date of Patent: June 17, 2008
    Assignee: iRobot Corporation
    Inventors: Andrew Ziegler, Duane Gilbert, Christopher John Morse, Scott Pratt, Paul Sandin, Nancy Dussault, Andrew Jones
  • Publication number: 20080140256
    Abstract: There is provided a robot apparatus which can improve work safety while ensuring high working efficiency, and a control method for the robot apparatus. The robot apparatus includes a moving body detection unit which generates a predicted moving range for a movable unit region by predicting a range within which the movable unit region moves using a plurality of images sequentially sensed by an image sensing unit and attempts to detect a moving body in the predicted moving range, and a control unit which, if a moving body is detected by the moving body detection unit, changes the operation of a movable unit.
    Type: Application
    Filed: September 4, 2007
    Publication date: June 12, 2008
    Applicant: Kabushiki Kaisha Toshiba
    Inventor: Manabu Nishiyama
  • Patent number: 7386364
    Abstract: A legged mobile robot gives up a normal walking motion and starts a tumbling motion when an excessively high external force or external moment is applied thereto and a behavior plan of a foot part thereof is disabled. At this time, the variation amount ΔS/Δt of the area S of a support polygon of the body per time t is minimized and the support polygon when the body drops onto a floor is maximized to distribute an impact which acts upon the body from the floor when the body drops onto the floor to the whole body to suppress the damage to the body to the minimum. Further, the legged mobile robot autonomously restores a standing up posture from an on-floor posture thereof such as a supine posture or a prone posture.
    Type: Grant
    Filed: March 17, 2003
    Date of Patent: June 10, 2008
    Assignees: Sony Corporation; Jinichi Yamaguchi
    Inventors: Tatsuo Mikami, Jinichi Yamaguchi, Atsushi Miyamoto
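    A minimal Python sketch of the quantity being minimized, the change per unit time of the support-polygon area S, using the shoelace formula over the ground-contact points; the finite-difference form and the example contact points are assumptions:

        def polygon_area(vertices):
            """Shoelace formula for the area S of the support polygon whose
            vertices are the ground-contact points (x, y), taken in order."""
            s = 0.0
            n = len(vertices)
            for i in range(n):
                x1, y1 = vertices[i]
                x2, y2 = vertices[(i + 1) % n]
                s += x1 * y2 - x2 * y1
            return abs(s) / 2.0

        def area_rate(prev_vertices, vertices, dt):
            """Finite-difference estimate of the area change per control period."""
            return (polygon_area(vertices) - polygon_area(prev_vertices)) / dt

        feet_t0 = [(0.0, 0.0), (0.3, 0.0), (0.3, 0.2), (0.0, 0.2)]  # both feet flat
        feet_t1 = [(0.0, 0.0), (0.3, 0.0), (0.3, 0.1), (0.0, 0.1)]  # support shrinking
        print(round(polygon_area(feet_t0), 3),
              round(area_rate(feet_t0, feet_t1, dt=0.05), 3))  # 0.06 -0.6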
  • Publication number: 20080133058
    Abstract: A robot or the like capable of checking by itself whether an object is properly grasped by a hand or not is provided. It is determined whether or not the position and posture of a handle (object) determined based on an image obtained by a camera (external information) and the position and posture of the handle in the case where the handle is assumed to be properly grasped, which are determined from the posture of a robot based on an output or the like of a rotary encoder (internal information), agree with each other. In response to determination that the external information and the internal information agree with each other, it is determined whether the handle is properly grasped or not based on a force detected by a six-axis force sensor provided on each hand.
    Type: Application
    Filed: November 30, 2007
    Publication date: June 5, 2008
    Applicant: HONDA MOTOR CO., LTD.
    Inventor: Nobuyuki Ohno
  • Publication number: 20080133055
    Abstract: A gait generator determines a desired motional trajectory and a desired object reaction force trajectory of an object 120 for a predetermined period after the current time by using an object dynamic model while supplying, to the object dynamic model, a model manipulated variable (estimated disturbance force) for bringing a behavior of the object 120 on the object dynamic model close to an actual behavior, and provisionally generates a gait of a robot 1 for a predetermined period by using the aforesaid determined trajectories. Based on the gait and an object desired motion trajectory, a geometric restrictive condition, such as interference between the robot 1 and the object 120, is checked, and a moving plan for the object 120 or a gait parameter (predicted landing position/posture or the like) of the robot 1 is corrected as appropriate according to a result of the check, so as to generate a gait of the robot 1.
    Type: Application
    Filed: July 28, 2005
    Publication date: June 5, 2008
    Applicant: HONDA MOTOR CO., LTD.
    Inventor: Tadaaki Hasegawa
  • Patent number: 7383100
    Abstract: An extensible task engine framework for humanoid robots. Robot instructions are stored as tasks and skills. Tasks are designed so that they can be executed by a variety of robots with differing configurations. A task can refer to zero or more skills. A skill can be designed for a particular configuration of robot. A task can be transferred from robot to robot. When executed on a particular robot, the task makes calls to one or more skills that can take advantage of the capabilities of that robot.
    Type: Grant
    Filed: September 27, 2006
    Date of Patent: June 3, 2008
    Assignee: Honda Motor Co., Ltd.
    Inventors: Victor Ng-Thow-Hing, Evan Drumwright
  • Patent number: 7365512
    Abstract: A moving-object directing system for directing a moving object to a target location includes: a moving-object directing unit and a moving object. The moving-object directing unit includes: a directing signal transmitter for generating a first moving-object directing signal, sequentially transmitting the first moving-object directing signal 360° in all directions, generating a second moving-object directing signal corresponding to directing information for directing the moving object to the target location upon receiving a signal indicating successful reception of the first moving-object directing signal, and transmitting the second moving-object directing signal, and a first RF (Radio Frequency) communication unit for receiving a signal indicating successful reception of the moving-object directing signal.
    Type: Grant
    Filed: July 20, 2006
    Date of Patent: April 29, 2008
    Assignee: LG Electronics Inc.
    Inventor: Young-gie Kim