Optical Patents (Class 901/47)
  • Publication number: 20130274923
    Abstract: Methods and apparatus for actively aligning a first optical element, such as a lens, to a second optical element, such as an image sensor, use continuous scans, even absent a synchronization signal from one of the optical elements. During a scan, timed position information about the scanned optical element is collected, and then a relationship between position of the scanned optical element and time is estimated, such as by fitting a curve to a set of position-time pairs. This relationship can then be used to estimate locations of the scanned optical element at times when image data or other alignment quality-indicating data samples are acquired. From this alignment quality versus location data, an optimum alignment position can be determined, and the scanned optical element can then be positioned at the determined alignment position.
    Type: Application
    Filed: April 12, 2013
    Publication date: October 17, 2013
    Applicant: Automation Engineering, Inc.
    Inventor: Andre By
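The position-time estimation in this abstract can be sketched numerically: fit a curve to timed position samples collected during the scan, use it to estimate where the element was at each moment a quality-indicating sample was acquired, and pick the best location. All names, values, and the linear-scan setup below are illustrative, not from the patent.

```python
import numpy as np

def fit_position_vs_time(times, positions, degree=2):
    """Fit a polynomial relating scanned-element position to time;
    stands in for the abstract's position-time relationship."""
    return np.poly1d(np.polyfit(times, positions, degree))

def best_alignment(sample_times, quality, pos_of_t):
    """Estimate where the element was at each quality-sample time,
    then return the location with the best alignment quality."""
    locations = pos_of_t(np.asarray(sample_times))
    return float(locations[int(np.argmax(quality))])

# Simulated continuous scan: the element moves linearly from 0 to 10 um.
t = np.linspace(0.0, 1.0, 11)
p = 10.0 * t
pos_of_t = fit_position_vs_time(t, p, degree=1)

# Quality samples (e.g. image sharpness) taken at arbitrary times;
# sharpness peaks when the lens sits near 6 um.
sample_t = np.array([0.1, 0.3, 0.5, 0.6, 0.9])
quality = -(pos_of_t(sample_t) - 6.0) ** 2
best = best_alignment(sample_t, quality, pos_of_t)  # approximately 6.0
```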
  • Publication number: 20130258100
    Abstract: A position and a posture of a virtual face are searched for so that a total cost E, the sum of a first cost E1 and a second cost E2, approaches a smallest or minimum value. The first cost E1(e1) corresponds to the sum of the elastic energy, with a first deviation e1(s) as the deformation amount, of a virtual spring group whose spring coefficient is the value of a first coefficient function w1(e1(s)) in a target region of a standard image. The second cost E2(e1, e2) corresponds to the sum of the elastic energy, with a second deviation e2 as the deformation amount, of a virtual spring group whose spring coefficient is the value of a second coefficient function w2(e2(s)) of each pixel s included in the target region of the standard image.
    Type: Application
    Filed: February 27, 2013
    Publication date: October 3, 2013
    Applicant: HONDA MOTOR CO., LTD.
    Inventor: Minami Asatani
  • Patent number: 8532819
    Abstract: Provided is a manipulator with at least one camera capable of observing an end effector from a direction suitable for work. A rotating portion rotatable coaxially with the end effector is provided on a link adjacent to the link located at the manipulator tip end. At least one camera for recognizing a work piece as an object is arranged on the rotating portion through a camera platform. An actuator for controlling a rotation angle of the rotating portion is driven according to a rotation angle of the link located at the manipulator tip end; thus, when the end effector performs gripping work, the camera is arranged in a direction perpendicular to the plane in which the end effector can move. In assembly work, the rotating portion is rotated such that the camera is arranged in a direction parallel to the plane in which the end effector can move.
    Type: Grant
    Filed: August 21, 2012
    Date of Patent: September 10, 2013
    Assignee: Canon Kabushiki Kaisha
    Inventor: Kota Tani
  • Patent number: 8521330
    Abstract: Disclosed are a map building apparatus and method using distance measurement. According to an aspect, a grid map is created by building a first map and a second map using the characteristics of different characteristic areas, based on the distance-voltage characteristics of a distance measurement sensor, and combining the first map with the second map. Since the map of the peripheral environment is created using plural areas of the distance-voltage characteristics, a more accurate map may be created.
    Type: Grant
    Filed: December 29, 2009
    Date of Patent: August 27, 2013
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Jun-ho Park, Woo-yeon Jeong
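A minimal sketch of the map-combination step, assuming each partial map marks unobserved cells as unknown; the averaging rule for cells known in both maps is an illustrative choice, since the abstract does not specify the fusion rule.

```python
import numpy as np

UNKNOWN = -1.0  # cell value for "not observed in this map"

def combine_maps(map_a, map_b):
    """Merge two partial occupancy grids, each built from a different
    region of the sensor's distance-voltage characteristic. Cells known
    in only one map are copied; cells known in both are averaged (a
    simple fusion rule assumed for illustration)."""
    a = np.asarray(map_a, dtype=float)
    b = np.asarray(map_b, dtype=float)
    out = np.full_like(a, UNKNOWN)
    known_a, known_b = a != UNKNOWN, b != UNKNOWN
    both = known_a & known_b
    out[known_a & ~both] = a[known_a & ~both]
    out[known_b & ~both] = b[known_b & ~both]
    out[both] = 0.5 * (a[both] + b[both])
    return out

near = np.array([[0.9, UNKNOWN], [0.1, 0.5]])  # accurate close to the robot
far = np.array([[UNKNOWN, 0.2], [0.3, 0.5]])   # accurate farther away
grid = combine_maps(near, far)
```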
  • Publication number: 20130218341
    Abstract: An embodiment of the invention provides a control method of a cleaning robot with a non-omnidirectional light detector. The method includes the steps of: detecting a light beam via the non-omnidirectional light detector; stopping the cleaning robot and spinning the non-omnidirectional light detector when the non-omnidirectional light detector detects the light beam; stopping the spinning of the non-omnidirectional light detector and estimating a first spin angle when the non-omnidirectional light detector does not detect the light beam; and adjusting a moving direction of the cleaning robot according to the first spin angle.
    Type: Application
    Filed: February 15, 2013
    Publication date: August 22, 2013
    Applicant: MICRO-STAR INTERNATIONAL COMPANY LIMITED
    Inventor: Micro-Star International Company Limited
  • Publication number: 20130218324
    Abstract: An assembling device that uses neither a positioning device for the articles to be assembled nor a feeder mechanism or pneumatic mechanism for the assembly bolt. The article assembling device includes a first robot which takes out and aligns a bolt by using a detection result of a first camera; a second robot which grips a first article by using a detection result of a second camera and conveys the first article to a position where the first article can be attached to a second article; and a bolt holding and fastening section which fastens the articles by using the aligned bolt.
    Type: Application
    Filed: December 13, 2012
    Publication date: August 22, 2013
    Applicant: FANUC CORPORATION
    Inventor: FANUC CORPORATION
  • Patent number: 8515578
    Abstract: A navigational control system for altering movement activity of a robotic device operating in a defined working area, comprising a transmitting subsystem integrated in combination with the robotic device, the transmitting subsystem comprising means for emitting a number of directed beams, each directed beam having a predetermined emission pattern, and a receiving subsystem functioning as a base station that includes a navigation control algorithm that defines a predetermined triggering event for the navigational control system and a set of detection units positioned within the defined working area in a known spaced-apart relationship, the set of detection units being configured and operative to detect one or more of the directed beams emitted by the transmitting system; and wherein the receiving subsystem is configured and operative to process the one or more detected directed beams under the control of the navigational control algorithm to determine whether the predetermined triggering event has occurred, an
    Type: Grant
    Filed: December 13, 2010
    Date of Patent: August 20, 2013
    Assignee: iRobot Corporation
    Inventors: Mark J. Chiappetta, Joseph L. Jones
  • Patent number: 8515580
    Abstract: Described herein are technologies pertaining to autonomously docking a mobile robot at a docking station for purposes of recharging batteries of the mobile robot. The mobile robot uses vision-based navigation and a known map of the environment to navigate toward the docking station. Once sufficiently proximate to the docking station, the mobile robot captures infrared images of the docking station, and granularly aligns itself with the docking station based upon the captured infrared images of the docking station. As the robot continues to drive towards the docking station, the robot monitors infrared sensors for infrared beams emitted from the docking station. If the infrared sensors receive the infrared beams, the robot continues to drive forward until the robot successfully docks with the docking station.
    Type: Grant
    Filed: June 17, 2011
    Date of Patent: August 20, 2013
    Assignee: Microsoft Corporation
    Inventors: Trevor Taylor, Michael Wyrzykowski, Glen C. Larsen, Mike M. Paull
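The docking sequence in this abstract reads as a small state machine: vision-based navigation toward the dock, infrared alignment once sufficiently close, then forward drive while the dock's IR beams remain visible. A toy version, with an invented switchover threshold:

```python
def docking_step(state, dist_to_dock, ir_beams_seen):
    """One decision step of the docking policy: vision-based navigation
    while far away, infrared alignment when close, then forward drive
    as long as the dock's IR beams are received. The 0.5 m switchover
    distance is an invented threshold, not from the patent."""
    if state == "navigate":
        return "align" if dist_to_dock < 0.5 else "navigate"
    if state == "align":
        return "drive" if ir_beams_seen else "align"
    if state == "drive":
        return "docked" if dist_to_dock <= 0.0 else "drive"
    return state
```

Driving the machine through one successful approach visits navigate, align, drive, and docked in order.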
  • Publication number: 20130211594
    Abstract: A system for controlling a human-controlled proxy robot surrogate is presented. The system includes a plurality of motion capture sensors for monitoring and capturing all movements of a human handler, including each change in joint angle, body posture, or position; the motion capture sensors are similar in operation to sensors utilized in motion picture animation, suitably modified to track critical handler movements in near real time. Also presented is a plurality of controls attached to the proxy robot surrogate that relays the monitored and captured movements of the human handler as “follow me” data to the proxy robot surrogate, the plurality of controls being configured such that the proxy robot surrogate emulates the movements of the human handler.
    Type: Application
    Filed: August 24, 2012
    Publication date: August 15, 2013
    Inventor: Kenneth Dean Stephens, JR.
  • Publication number: 20130211587
    Abstract: A system and method of space exploration with human-controlled proxy robot surrogates is disclosed. The method includes: training the human-controlled proxy robot surrogates using human handlers; controlling the human-controlled proxy robot surrogates using the human handlers; and deploying a plurality of human-controlled proxy robot surrogates for extraterrestrial missions and for missions on Earth, the Moon, and near-Earth locations. Each of the human-controlled proxy robot surrogates is in communication with one of the human handlers, and each one of the plurality of proxy robot surrogates is paired with one of the plurality of human handlers. The human-controlled proxy robot surrogates further comprise an artificial intelligence (AI), which includes learned behavior.
    Type: Application
    Filed: May 23, 2012
    Publication date: August 15, 2013
    Inventor: Kenneth Dean Stephens, JR.
  • Patent number: 8509941
    Abstract: The present invention relates to a method and a device for the machining of an object using a tool, in which the tool (2) or the object (18) is guided using a handling apparatus, which has multiple movement axes for the coarse positioning of the tool (2) or object (18), which form a kinematic chain. In the method, an additional actuator (3), which has a higher positioning precision in at least one dimension or axis than the other movement axes, is inserted between a terminal link (1) of the kinematic chain and the tool (2) or object (18). A relative movement of the tool (2) or terminal link (1) of the kinematic chain to the object (18) is detected using at least one sensor (5) and a deviation from a target movement path is compensated for using the additional actuator (3). The method and the associated device allow the use of robots or other handling apparatuses having lower path precision for applications which require a high precision during the guiding of the tool.
    Type: Grant
    Filed: October 18, 2007
    Date of Patent: August 13, 2013
    Assignees: Fraunhofer-Gesellschaft zur Foerderung der angewandten Forschung e.V., Rheinisch-Westfaelische Technische Hochschule Aachen
    Inventors: Boris Regaard, Stefan Kaierle
  • Patent number: 8509949
    Abstract: The inventive concept of the metrology system (the system) actively determines the 6 Degree of Freedom (6-DOF) pose of a motion device such as, but not limited to, an industrial robot employing an end of arm tool (EOAT). A concept of the system includes using laser pointing devices without any inherent ranging capability in conjunction with the EOAT-mounted targets to actively determine the pose of the EOAT at distinct work positions of at least one motion device.
    Type: Grant
    Filed: March 23, 2009
    Date of Patent: August 13, 2013
    Assignee: Variation Reduction Solutions, Inc.
    Inventors: Brett Alan Bordyn, Myles Daniel Markey, Michael John Kleeman
  • Publication number: 20130204436
    Abstract: An apparatus for controlling a robot capable of controlling the motion of the arm of the robot, and a control method thereof, the apparatus including an image obtaining unit configured to obtain a three-dimensional image of a user, a driving unit configured to drive an arm of the robot that is composed of a plurality of segments, and a control unit configured to generate a user model that corresponds to a motion of the joint of the user based on the three-dimensional image, to generate a target model having a length of the segment that varies based on the user model, and to allow the arm of the robot to be driven based on the target model.
    Type: Application
    Filed: February 1, 2013
    Publication date: August 8, 2013
    Applicant: SAMSUNG ELECTRONICS CO., LTD
    Inventor: SAMSUNG ELECTRONICS CO., LTD
  • Patent number: 8494678
    Abstract: A process for working a contour on at least one workpiece using a robot includes positioning the workpiece relative to the robot; acquiring an actual position of the workpiece; acquiring a real course of the contour on the workpiece at predefined points using at least one sensor; and actuating the robot according to individual vectors so as to correct a robot motion during the working of the contour.
    Type: Grant
    Filed: June 14, 2008
    Date of Patent: July 23, 2013
    Assignee: ABB AG
    Inventors: Stefan Quandt, Andreas Hoffmann, Joerg Reger
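The per-point correction can be sketched as follows: the real contour points measured by the sensor are compared with the nominal (programmed) points at the predefined locations, and the resulting individual vectors shift the robot's motion. Points and offsets below are illustrative.

```python
def correction_vectors(nominal, measured):
    """Individual correction vectors at the predefined contour points:
    sensor-measured point minus nominal (programmed) point."""
    return [(mx - nx, my - ny)
            for (nx, ny), (mx, my) in zip(nominal, measured)]

def corrected_path(nominal, vectors):
    """Actuate along the nominal path shifted by each point's vector."""
    return [(nx + dx, ny + dy)
            for (nx, ny), (dx, dy) in zip(nominal, vectors)]

nominal = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
measured = [(0.1, 0.0), (1.0, 0.2), (2.0, -0.1)]  # real course from the sensor
path = corrected_path(nominal, correction_vectors(nominal, measured))
```

By construction the corrected path lands on the measured contour; a real controller would interpolate between the predefined points.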
  • Publication number: 20130182903
    Abstract: A robot apparatus includes a reference-model storing unit configured to store a reference model of an object, a feature-value-table storing unit configured to store a feature value table that associates position data and orientation data of the reference model and a feature value, a photographed-image acquiring unit configured to capture a photographed image of the object, a detecting unit configured to calculate a photographed image feature value from the photographed image, and a driving control unit configured to control a robot main body on the basis of the position data and the orientation data to change the position and the orientation of a gripping unit.
    Type: Application
    Filed: January 17, 2013
    Publication date: July 18, 2013
    Applicant: SEIKO EPSON CORPORATION
    Inventor: SEIKO EPSON CORPORATION
  • Publication number: 20130184849
    Abstract: Methods for automated handling for forming a device and automated handling systems for forming a device are presented. One of the methods includes providing a production area with a plurality of destinations and a transport system which includes transport and load/unload (U/L) units in the production area. The transport units include automated guided vehicles (AGVs) with a storage compartment for holding at least one carrier containing production material for forming the device and U/L units include AGVs with a robotic system for handling carriers. A transfer of a selected carrier from a first destination to a second destination is determined. A request to the transport system is issued to effect the transfer of the selected carrier, which includes using a selected U/L unit, a selected transport unit, or a combination of selected U/L and transport units.
    Type: Application
    Filed: January 13, 2012
    Publication date: July 18, 2013
    Applicant: GLOBALFOUNDRIES Singapore Pte. Ltd.
    Inventor: Chew Foo CHAN
  • Patent number: 8483881
    Abstract: A multi-function robotic device may have utility in various applications. In accordance with one aspect, a multi-function robotic device may be selectively configurable to perform a desired function in accordance with the capabilities of a selectively removable functional cartridge operably coupled with a robot body. Localization and mapping techniques may employ partial maps associated with portions of an operating environment, data compression, or both.
    Type: Grant
    Filed: September 1, 2006
    Date of Patent: July 9, 2013
    Assignee: Neato Robotics, Inc.
    Inventors: Vladimir Ermakov, Mark Woodward, Joe Augenbraun
  • Publication number: 20130173043
    Abstract: A robotic apparatus for machining tenons on turbine buckets of a steam turbine machine is disclosed. The robotic apparatus includes a machining device having a spindle head. A robotic arm is coupled to the machining device and a base member is coupled to the robotic arm. The base member is mounted independently of the machine element. A vision system is provided for locating the tenon on the turbine bucket. A control system is coupled to the vision system, the machining device and the robotic apparatus. The control system is configured to control movement of the robotic apparatus and the machining device based upon vision system data and spatial information about the tenon and the turbine bucket.
    Type: Application
    Filed: January 4, 2012
    Publication date: July 4, 2013
    Applicant: GENERAL ELECTRIC COMPANY
    Inventors: Todd George KUDAS, Brian David ALBIN
  • Publication number: 20130166070
    Abstract: Methods of and a system for providing force information for a robotic surgical system. The method includes storing first kinematic position information and first actual position information for a first position of an end effector; moving the end effector via the robotic surgical system from the first position to a second position; storing second kinematic position information and second actual position information for the second position; and providing force information regarding force applied to the end effector at the second position utilizing the first actual position information, the second actual position information, the first kinematic position information, and the second kinematic position information. Visual force feedback is also provided via superimposing an estimated position of an end effector without force over an image of the actual position of the end effector. Similarly, tissue elasticity visual displays may be shown.
    Type: Application
    Filed: January 8, 2013
    Publication date: June 27, 2013
    Applicant: Intuitive Surgical Operations, Inc.
    Inventor: Intuitive Surgical Operations, Inc.
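The force estimate described here rests on comparing kinematic (commanded) and actual positions at two poses: extra deflection at the second pose, scaled by an instrument stiffness, indicates applied force. A one-dimensional sketch with an assumed stiffness constant:

```python
def estimated_force(kin1, act1, kin2, act2, stiffness):
    """Force applied at the second position, estimated from the change
    in deflection (actual minus kinematic position) between the two
    stored poses, scaled by an assumed instrument stiffness in N/mm."""
    return stiffness * ((act2 - kin2) - (act1 - kin1))

# No load at pose 1; 0.5 mm of extra deflection at pose 2 with a 2 N/mm shaft.
force = estimated_force(10.0, 10.0, 12.0, 12.5, stiffness=2.0)  # 1.0 N
```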
  • Publication number: 20130163853
    Abstract: A method for estimating a location of a device uses a color image and a depth image. The method includes matching the color image to the depth image, generating a 3D reference image based on the matching, generating a 3D object image based on the matching, extracting a 2D reference feature point from the reference image, extracting a 2D reference feature point from the object image, matching the extracted reference feature point from the reference image to the extracted reference feature point from the object image, extracting a 3D feature point from the object image using the matched 2D reference feature point, and estimating the location of the device based on the extracted 3D feature point.
    Type: Application
    Filed: December 21, 2012
    Publication date: June 27, 2013
    Applicant: Samsung Electronics Co., Ltd.
    Inventor: Samsung Electronics Co., Ltd.
  • Publication number: 20130166061
    Abstract: An object gripping apparatus comprises an imaging unit configured to capture an image of a plurality of workpieces; a workpiece state estimation unit configured to estimate positions and orientations of the plurality of workpieces from the captured image; a pickup-target workpiece selection unit configured to select a pickup-target workpiece from among the plurality of workpieces based on a result of the estimation of the workpiece state estimation unit; a workpiece pickup unit configured to grip and pick up the pickup-target workpiece in accordance with an operation path associated with a position of the pickup-target workpiece; and a path setting unit configured to determine an evacuation path along which the workpiece pickup unit that has gripped the pickup-target workpiece evacuates to the outside of an imaging range of the imaging unit based on an estimated moving time required for the evacuation.
    Type: Application
    Filed: December 10, 2012
    Publication date: June 27, 2013
    Applicant: CANON KABUSHIKI KAISHA
    Inventor: CANON KABUSHIKI KAISHA
  • Publication number: 20130158709
    Abstract: A system for a work cell having a carrier that moves a product along an assembly line includes an assembly robot, sensor, and controller. An arm of the robot moves on the platform adjacent to the carrier. The sensor measures a changing position of the carrier and encodes the changing position as a position signal. The controller receives the position signal and calculates a lag value of the robot with respect to the carrier using the position signal. The controller detects a requested e-stop of the carrier when the arm and product are in mutual contact, and selectively transmits a speed signal to the robot to cause a calibrated deceleration of the platform before executing the e-stop event. This occurs only when the calculated tracking position lag value is above a calibrated threshold. A method is also disclosed for using the above system in the work cell.
    Type: Application
    Filed: December 14, 2011
    Publication date: June 20, 2013
    Applicant: GM GLOBAL TECHNOLOGY OPERATIONS LLC
    Inventors: Jianying Shi, David Groll, Peter W. Tavora
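The lag test and calibrated deceleration can be sketched in a few lines; the threshold, ramp rate, and time step are illustrative stand-ins for the patent's calibrated values.

```python
def lag_exceeds_threshold(robot_pos, carrier_pos, threshold):
    """Tracking-position lag of the robot platform relative to the
    moving carrier, compared against a calibrated threshold."""
    return abs(carrier_pos - robot_pos) > threshold

def decelerate(speed, rate, dt):
    """One step of the calibrated deceleration ramp applied before the
    e-stop executes; rate and time step are illustrative values."""
    return max(0.0, speed - rate * dt)

# Lag above threshold: ramp the platform down before honoring the e-stop.
braking = lag_exceeds_threshold(1.0, 1.35, threshold=0.2)  # True
speed = 0.5
while speed > 0.0:
    speed = decelerate(speed, rate=1.0, dt=0.1)
```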
  • Publication number: 20130155226
    Abstract: Disclosed are an object tracking system and an object tracking method using a robot. The present invention provides a system and method capable of continuously tracking an image-based object (person) without missing it, even when the object deviates from the viewing angle of the camera.
    Type: Application
    Filed: July 10, 2012
    Publication date: June 20, 2013
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Dae Ha LEE, Min Su JANG, Seo Hyun JEON, Young Jo CHO, Jae Hong KIM
  • Publication number: 20130158707
    Abstract: A method of generating a behavior of a robot includes measuring input data associated with a plurality of user responses, applying an algorithm to the input data of the plurality of user responses to generate a plurality of user character classes, storing the plurality of user character classes in a database, classifying an individual user into a selected one of the plurality of user character classes by generating user preference data, selecting a robot behavior based on the selected user character class, and controlling the actions of the robot in accordance with the selected robot behavior during a user-robot interaction session. The selected user character class and the user preference data are based at least in part on input data associated with the individual user.
    Type: Application
    Filed: December 14, 2011
    Publication date: June 20, 2013
    Applicant: Toyota Motor Engineering & Manufacturing North America, Inc.
    Inventors: Haeyeon Lee, Yasuhiro Ota, Cynthia Breazeal, Jun Ki Lee
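The classify-then-select step can be illustrated with a nearest-centroid rule standing in for the patent's clustering algorithm; the class centroids, user data, and behavior names below are invented for the example.

```python
import numpy as np

def classify_user(response_vec, centroids):
    """Nearest-centroid assignment of a user's input data to one of the
    stored character classes (stands in for the patent's clustering and
    database lookup; the rule and all data here are illustrative)."""
    dists = [np.linalg.norm(np.asarray(response_vec) - np.asarray(c))
             for c in centroids]
    return int(np.argmin(dists))

# Two invented character classes with matching robot behaviors.
centroids = [[0.8, 0.2], [0.1, 0.9]]
behaviors = ["playful greeting", "quiet approach"]
idx = classify_user([0.7, 0.3], centroids)
chosen = behaviors[idx]
```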
  • Publication number: 20130158748
    Abstract: A control device for a mobile robot, the robot comprising a camera and a communication unit, comprises a display unit displaying an image corresponding to the image taken by the camera and transmitted by the communication unit, and a user interface, wherein the user interface is configured to allow a user to control a position of a pointer on the image displayed by the display unit, and the user interface comprises a selection unit allowing the user to select a position of the pointer on the image displayed by the display unit, the control device further comprising a computation unit and a communication system, the computation unit being configured to compute displacement commands and send them through the communication system to the robot, said displacement commands being computed to make the robot move to a physical position corresponding to the pointer's position selected by the user on the image.
    Type: Application
    Filed: September 5, 2011
    Publication date: June 20, 2013
    Applicant: ALDEBARAN ROBOTICS
    Inventor: Jean-Christophe Baillie
  • Publication number: 20130158711
    Abstract: An acoustic pretouch sensor or proximity sensor (110) includes a cavity (104) with a first microphone (106) disposed therein, and optionally a second microphone (108) disposed outside of the cavity. A processing system (110) receives the signals generated by the first microphone and analyzes the spectrum to produce a result representing the resonant frequency of the cavity. The processing system may optionally subtract the second microphone signal spectrum from the first to automatically compensate for changes in ambient noise. The processing system uses the resonant frequency to estimate the distance from the cavity opening to a surface (90). For example, the pretouch sensors may be incorporated into a stand alone device (100), into a robotic end effector (204), or into a device such as a phone (300).
    Type: Application
    Filed: October 29, 2012
    Publication date: June 20, 2013
    Applicant: University of Washington through its Center for Commercialization
    Inventor: University of Washington through its Center for Commercialization
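The distance estimate follows from the cavity's resonant frequency. As a toy model (assumed here, the patent does not state its calibration), treat the cavity plus the gap to the surface as a closed-open quarter-wave resonator, then invert the frequency reading:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def resonant_frequency(cavity_len, gap):
    """Toy closed-open (quarter-wave) model: the cavity plus the gap to
    the surface resonates at c / (4 * (L + d))."""
    return SPEED_OF_SOUND / (4.0 * (cavity_len + gap))

def estimate_gap(freq, cavity_len):
    """Invert the model: observed resonant frequency -> distance from
    the cavity opening to the surface."""
    return SPEED_OF_SOUND / (4.0 * freq) - cavity_len

f = resonant_frequency(cavity_len=0.02, gap=0.005)  # 2 cm cavity, 5 mm gap
d = estimate_gap(f, cavity_len=0.02)                # recovers ~0.005 m
```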
  • Publication number: 20130158710
    Abstract: A taking out device capable of correcting a posture of an article to be taken out and taking out the article, while considering interference between a robot hand and a container for containing the article. Since the article is inclined to the left side, the hand approaches and contacts the article from the left side. Then, the hand pushes to the right side while claws of the hand engage a hole portion of the article in order to correct the posture of the article such that the positional relationship between the article and the hand represents a reference position/posture. In this way, the hand is positioned at a second position/posture in which the posture of the article relative to the claws allows the article to be taken out.
    Type: Application
    Filed: October 26, 2012
    Publication date: June 20, 2013
    Applicant: FANUC CORPORATION
    Inventor: FANUC CORPORATION
  • Publication number: 20130147944
    Abstract: A vision-guided alignment system to align a plurality of components includes a robotic gripper configured to move one component relative to another component and a camera coupled to a processor that generates an image of the components. A simulated robotic work cell generated by the processor calculates initial calibration positions that define the movement of the robotic gripper such that position errors between the actual position of the robotic gripper and the calibration positions are compensated by a camera space manipulation based control algorithm executed by the processor to control the robotic gripper to move one component into alignment with another component based on the image of the components.
    Type: Application
    Filed: August 25, 2011
    Publication date: June 13, 2013
    Applicant: ABB RESEARCH LTD
    Inventors: Biao Zhang, Jianjun Wang, Gregory F. Rossano, Thomas A. Fuhlbrigge
  • Publication number: 20130138248
    Abstract: Systems and methods are provided for controlling a multiple degree-of-freedom system. Plural stimuli are provided to a user, and steady state visual evoked response potential (SSVEP) signals are obtained from the user. The SSVEP signals are processed to generate a system command. Component commands are generated based on the system command, the plurality of components commands causing the multiple degree-of-freedom system to implement the system command.
    Type: Application
    Filed: November 30, 2011
    Publication date: May 30, 2013
    Applicant: HONEYWELL INTERNATIONAL INC.
    Inventors: Santosh Mathan, Kevin J. Conner, Deniz Erdogmus
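The SSVEP decoding step can be sketched with an FFT: each stimulus flickers at a known frequency, and the system command is the index of the stimulus whose frequency dominates the measured signal's spectrum. The sampling rate and stimulus frequencies are illustrative.

```python
import numpy as np

def ssvep_command(signal, fs, stim_freqs):
    """Return the index of the stimulus whose flicker frequency carries
    the most power in the measured signal's spectrum; that index serves
    as the system command."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    return int(np.argmax([power[np.argmin(np.abs(freqs - f))]
                          for f in stim_freqs]))

fs = 256
t = np.arange(fs) / fs                 # one second of data
signal = np.sin(2 * np.pi * 15.0 * t)  # user attends the 15 Hz stimulus
cmd = ssvep_command(signal, fs, stim_freqs=[10.0, 12.0, 15.0])  # 2
```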
  • Publication number: 20130138247
    Abstract: Vector Field SLAM is a method for localizing a mobile robot in an unknown environment from continuous signals such as WiFi or active beacons. Disclosed is a technique for localizing a robot in relatively large and/or disparate areas, achieved by using and managing more signal sources to cover the larger area. One feature analyzes the complexity of Vector Field SLAM with respect to area size and number of signals, and then describes an approximation that decouples the localization map in order to keep memory and run-time requirements low. A tracking method for re-localizing the robot in areas already mapped is also disclosed. This allows the robot to resume after it has been paused or kidnapped, such as when picked up and moved by a user. Embodiments of the invention can comprise commercial low-cost products, including robots for the autonomous cleaning of floors.
    Type: Application
    Filed: November 9, 2012
    Publication date: May 30, 2013
    Inventors: Jens-Steffen Gutmann, Philip Fong, Mario E. Munich
  • Publication number: 20130138246
    Abstract: Vector Field SLAM is a method for localizing a mobile robot in an unknown environment from continuous signals such as WiFi or active beacons. Disclosed is a technique for localizing a robot in relatively large and/or disparate areas, achieved by using and managing more signal sources to cover the larger area. One feature analyzes the complexity of Vector Field SLAM with respect to area size and number of signals, and then describes an approximation that decouples the localization map in order to keep memory and run-time requirements low. A tracking method for re-localizing the robot in areas already mapped is also disclosed. This allows the robot to resume after it has been paused or kidnapped, such as when picked up and moved by a user. Embodiments of the invention can comprise commercial low-cost products, including robots for the autonomous cleaning of floors.
    Type: Application
    Filed: November 9, 2012
    Publication date: May 30, 2013
    Inventors: Jens-Steffen Gutmann, Dhiraj Goel, Mario E. Munich
  • Publication number: 20130128035
    Abstract: An analytical laboratory system and method for processing samples is disclosed. A sample container is transported from an input area to a distribution area by a gripper comprising a means for inspecting a tube. An image is captured of the sample container. The image is analyzed to determine a sample container identification. A liquid level of the sample in the sample container is determined. A scheduling system determines a priority for processing the sample container based on the sample container identification. The sample container is transported from the distribution area to a subsequent processing module by the gripper.
    Type: Application
    Filed: November 7, 2012
    Publication date: May 23, 2013
    Applicant: Beckman Coulter, Inc.
    Inventor: Beckman Coulter, Inc.
  • Publication number: 20130123986
    Abstract: A handheld tool is disclosed which may be used to transfer a plurality of plant tissue explants from a first container to a second container. The handheld tool may include a disposable tip member which couples the plurality of plant tissue explants through use of negative pressure. An automated system which transfers a plurality of plant tissue explants from a first container to a second container is also disclosed. The automated system may include a first presentment system which moves the first container to a region, a second presentment system which moves the second container to the region, and a robot system that transfers the plurality of plant tissue explants from the first container to the second container.
    Type: Application
    Filed: December 31, 2012
    Publication date: May 16, 2013
    Applicant: Dow Agrosciences LLC
    Inventor: Tonya Lynne Strange Moynahan
  • Publication number: 20130123987
    Abstract: A robotic system includes: a detection unit that detects at least one of a voice, light and an image of a content outputted by a content output device; a decision unit that assesses information detected by the detection unit on the basis of reference data so as to assess the content outputted by the content output device; and a control unit that controls a behavior or a state of the robotic system on the basis of the assessment made by the decision unit.
    Type: Application
    Filed: January 8, 2013
    Publication date: May 16, 2013
    Applicant: PANASONIC CORPORATION
  • Publication number: 20130116825
    Abstract: Disclosed are a robot cleaner and a method for controlling the same. A plurality of images are detected through an image detecting unit such as an upper camera, and two or more feature points are extracted from the plurality of images. Then, a feature point set is created from the extracted feature points, and the feature points in the set are matched with each other. This allows the robot cleaner to precisely recognize its position. Furthermore, it allows the robot cleaner to perform a cleaning operation or a running operation by combining the precisely recognized position with a map.
    Type: Application
    Filed: July 5, 2011
    Publication date: May 9, 2013
    Inventors: Yiebin Kim, Seungmin Baek, Yoojin Choi, Vadim Lutsiv, Victor Redkov, Alexey Potapov, Seongsoo Lee
  • Publication number: 20130116826
    Abstract: Disclosed are a robot cleaner and a method for controlling the same. The robot cleaner is capable of recognizing a position thereof by extracting one or more feature points having 2D coordinates information with respect to each of a plurality of images, by matching the feature points with each other, and then by creating a matching point having 3D coordinates information. Matching points having 3D coordinates information are created to recognize a position of the robot cleaner, and the recognized position is verified based on a moving distance measured by using a sensor. This allows the position of the robot cleaner to be precisely recognized, and allows the robot cleaner to perform a cleaning operation or a running operation by combining the precisely recognized position with a map.
    Type: Application
    Filed: July 5, 2011
    Publication date: May 9, 2013
    Inventors: Yiebin Kim, Seungmin Baek, Yoojin Choi, Vadim Lutsiv, Victor Redkov, Alexey Potapov, Seongsoo Lee
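Feature-point matching of the kind the two robot-cleaner abstracts above describe is commonly implemented as nearest-neighbor search over descriptor vectors with a ratio test. A minimal sketch under that assumption (the descriptors, the Euclidean metric, and the 0.8 threshold are illustrative; the patents do not specify a particular matcher):

```python
import numpy as np

def match_features(desc_a, desc_b, ratio=0.8):
    """Match rows of desc_a to rows of desc_b by Euclidean distance,
    keeping a match only when the best distance is clearly smaller
    than the second best (a ratio test). Returns (i, j) index pairs.
    """
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)  # distance to every candidate
        order = np.argsort(dists)
        if len(order) > 1 and dists[order[0]] < ratio * dists[order[1]]:
            matches.append((i, int(order[0])))
    return matches
```

Matched pairs like these are what a localization pipeline would then feed into triangulation or pose estimation against a map.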
  • Publication number: 20130110281
    Abstract: An automated system for transporting items between variable endpoints includes a guidance system for identifying the endpoints and at least one autonomous mobile robot interacting with the guidance system for automatically moving items between the endpoints. The at least one robot is configured to (a) collect an item to be transported at a source endpoint, (b) travel to a destination endpoint utilizing the guidance system to locate the destination endpoint, (c) deliver the item to the destination endpoint, and (d) repeat (a) through (c) for a given set of items. The guidance system is dynamically reconfigurable to identify new endpoints.
    Type: Application
    Filed: October 31, 2011
    Publication date: May 2, 2013
    Applicant: Harvest Automation, Inc.
    Inventors: Joseph L. Jones, Clara Vu, Paul E. Sandin, Charles M. Grinnell
  • Publication number: 20130103179
    Abstract: A robot system includes transport means which transports an object, first detecting means which detects a three-dimensional shape of the object transported on a transport path by the transport means, a robot which performs a predetermined task on the object transported on the transport path by the transport means, means which generates an operation command to the robot, and means which corrects the operation command based on a detection result by the first detecting means.
    Type: Application
    Filed: February 13, 2012
    Publication date: April 25, 2013
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventor: Tetsuya MIYOSHI
  • Publication number: 20130094656
    Abstract: A method for automatic audio volume control on a robot is presented. The robot can deliver its audio output at a comfortable and intelligible level according to the user's distance and the background noise intensity in the user's environment. The user's distance is estimated by using a camera with known focal length and resolution, by using a stereo camera with known focal length and distance between lenses, or by using an electronic ranging device. Background noise intensity is measured using a microphone and digital signal processing techniques. The audio output volume is adjusted considering the effect of signal attenuation over the user's distance and the effect of background noise. The audio output volume adjustment mechanism can be closed-loop, based on the measured signal-to-noise ratio of the acoustic echo of the audio output.
    Type: Application
    Filed: October 16, 2011
    Publication date: April 18, 2013
    Inventor: Hei Tao Fung
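The adjustment the abstract above describes can be approximated with a simple free-field model: sound pressure falls off roughly 20·log10(d) dB over distance d, so the output level is raised to keep a target signal-to-noise ratio above the measured background noise. A minimal sketch under those assumptions (the target SNR, reference level, and clamping limits are invented for illustration, not taken from the patent):

```python
import math

def output_level_db(distance_m, noise_db_spl, target_snr_db=15.0,
                    min_db=40.0, max_db=85.0):
    """Estimate the speaker output level (dB SPL at 1 m) needed so the
    audio arrives at the listener target_snr_db above background noise.

    Attenuation over distance is modeled as 20*log10(d), i.e. free-field
    spherical spreading; distance is floored to avoid log of ~0.
    """
    attenuation_db = 20.0 * math.log10(max(distance_m, 0.1))
    needed_at_source = noise_db_spl + target_snr_db + attenuation_db
    # Clamp to the speaker's comfortable/physical output range.
    return min(max(needed_at_source, min_db), max_db)
```

A closed-loop variant, as the abstract suggests, would iterate this using the SNR measured from the acoustic echo rather than an open-loop distance model.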
  • Publication number: 20130085605
    Abstract: A robot system includes a container, a disposed-state detector, and a robot arm. The container is configured to accommodate a plurality of to-be-held objects and includes a reticulated portion. The disposed-state detector is configured to detect disposed states of the plurality of respective to-be-held objects disposed in the container. The robot arm includes a holder configured to hold a to-be-held object among the plurality of to-be-held objects based on the disposed states of the plurality of respective to-be-held objects detected by the disposed-state detector.
    Type: Application
    Filed: October 4, 2012
    Publication date: April 4, 2013
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
  • Publication number: 20130085595
    Abstract: A device is provided having a robotic arm for handling a wafer, the robotic arm including one or more encoders that provide encoder data identifying a position of one or more components of the robotic arm. The device also includes a processor adapted to apply an extended Kalman filter to the encoder data to estimate a position of the wafer.
    Type: Application
    Filed: September 14, 2012
    Publication date: April 4, 2013
    Inventors: Christopher C. Kiley, Peter van der Meulen, Forrest T. Buzan, Paul E. Fogel
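The abstract above names an extended Kalman filter over encoder data. As a minimal illustration, here is one predict/update cycle of a plain linear Kalman filter for a single joint with a constant-velocity model; an EKF generalizes this by linearizing a nonlinear arm kinematic model at each step. The time step and noise values are invented for the example:

```python
import numpy as np

def kalman_step(x, P, z, dt=0.01, q=1e-4, r=1e-2):
    """One predict/update cycle for a 1D constant-velocity model.
    x: state [position, velocity]; P: 2x2 covariance; z: encoder reading.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition
    H = np.array([[1.0, 0.0]])             # the encoder observes position only
    Q = q * np.eye(2)                      # process noise covariance
    R = np.array([[r]])                    # measurement noise covariance
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - (H @ x)[0]                     # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T / S[0, 0]                  # Kalman gain (2x1)
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

Fed a stream of encoder readings, the filter smooths position and recovers an unmeasured velocity estimate, which is the kind of state a wafer-position estimator would build on.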
  • Publication number: 20130076893
    Abstract: An obstacle sensor includes a line light irradiating unit including a light-emitting unit, a light-emitting driving unit to drive the light-emitting unit, and a first conical mirror, an apex of which is disposed towards the light-emitting unit in a light irradiation direction of the light-emitting unit and which converts light emitted from the light-emitting unit into line light irradiated in all directions, and a reflected light receiving unit including a second conical mirror to condense light that is irradiated from the first conical mirror and is then reflected from an obstacle, a lens that is spaced from the apex of the second conical mirror by a predetermined distance and transmits the reflected light, an imaging unit to image the reflected light that passes through the lens, an image processing unit, and an obstacle sensing control unit.
    Type: Application
    Filed: September 14, 2012
    Publication date: March 28, 2013
    Applicant: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Yeon Kyu JEONG, Shin KIM, Jeong Hun KIM, Jong Owan KIM, Sang Sik YOON, Dong Hun LEE, Jea Yun SO
  • Publication number: 20130078385
    Abstract: A modular tire spraying system includes a downdraft spray booth for receiving a tire, a fluid delivery system disposed in the spray booth, a robot for transporting the tire to the spray booth, and a platform on which each of the spray booth, the fluid delivery system, and the robot is disposed. The fluid delivery system includes at least one spray gun for delivering a coating to the tire.
    Type: Application
    Filed: September 26, 2011
    Publication date: March 28, 2013
    Inventor: Todd E. Hendricks, SR.
  • Publication number: 20130076902
    Abstract: A robotic charging station for charging a battery of an electric vehicle includes a base plate, a riser coupled with the base plate and extending substantially transverse to the base plate, and a robotic arm. The robotic arm extends from the riser and supports an end effector. The end effector includes a plurality of electrical contacts configured to couple with a receptacle disposed on the electric vehicle. The robotic arm is configured to move the end effector in three degrees of motion.
    Type: Application
    Filed: May 31, 2012
    Publication date: March 28, 2013
    Applicants: Universite Laval, GM GLOBAL TECHNOLOGY OPERATIONS LLC
    Inventors: Dalong Gao, Neil David Mc Kay, Matthew J. Reiland, Simon Foucault, Marc-Antoine Lacasse, Thierry Laliberte, Boris Mayer-St-Onge, Alexandre Lecours, Clement Gosselin, David E. Milburn, Linda Y. Harkenrider
  • Publication number: 20130073087
    Abstract: A method for controlling a robotic apparatus to produce desirable photographic results. The method includes, with a motor controller, first operating a robotics assembly to animate the robotic apparatus and, then, detecting an upcoming image capture. The method further includes, with the motor controller in response to the detecting of the upcoming image capture, second operating the robotics assembly to pose the robotic apparatus for the upcoming image capture. In some embodiments, the detecting includes a sensor mounted on the robotic apparatus sensing a pre-flash of light from a red-eye effect reduction mechanism of a camera. In other cases, the detecting includes a sensor mounted on the robotic apparatus sensing a range finder signal from a range finder of a camera. The posing may include opening eyes, moving a mouth into a smile, or otherwise striking a pose that is held temporarily to facilitate image capture with a camera.
    Type: Application
    Filed: September 20, 2011
    Publication date: March 21, 2013
    Applicant: DISNEY ENTERPRISES, INC.
    Inventors: Holger Irmler, Asa Kalama
  • Publication number: 20130073089
    Abstract: A robot system includes: an imaging unit including an imaging device and a distance measuring part; and a robot to which the imaging unit is attached. The imaging device preliminarily images a workpiece. The robot preliminarily moves the imaging unit based on the result of the preliminary imaging. The distance measuring part measures the distance to the workpiece. The robot actually moves the imaging unit based on the result of the measurement. The imaging device actually images the workpiece.
    Type: Application
    Filed: September 4, 2012
    Publication date: March 21, 2013
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventor: Yoshimitsu NAKAHARA
  • Patent number: 8401700
    Abstract: The lower arm assembly for a humanoid robot includes an arm support having a first side and a second side, a plurality of wrist actuators mounted to the first side of the arm support, a plurality of finger actuators mounted to the second side of the arm support and a plurality of electronics also located on the first side of the arm support.
    Type: Grant
    Filed: September 22, 2009
    Date of Patent: March 19, 2013
    Assignees: GM Global Technology Operations LLC, The United States of America as represented by the Administrator of the National Aeronautics and Space Administration
    Inventors: Chris A. Ihrke, Lyndon Bridgwater, Myron A. Diftler, David M. Reich, Scott R. Askew
  • Patent number: 8396597
    Abstract: The different illustrative embodiments provide an apparatus that includes a computer system, a number of structured light generators, and a number of mobile robotic devices. The computer system is configured to generate a path plan. The number of structured light generators is configured to project the path plan. The number of mobile robotic devices is configured to detect and follow the path plan.
    Type: Grant
    Filed: August 18, 2009
    Date of Patent: March 12, 2013
    Assignee: Deere & Company
    Inventor: Noel Wayne Anderson
  • Patent number: 8392021
    Abstract: An autonomous floor cleaning robot includes a transport drive and control system arranged for autonomous movement of the robot over a floor for performing cleaning operations. The robot chassis carries a first cleaning zone comprising cleaning elements arranged to suction loose particulates up from the cleaning surface and a second cleaning zone comprising cleaning elements arranged to apply a cleaning fluid onto the surface and to thereafter collect the cleaning fluid up from the surface after it has been used to clean the surface. The robot chassis carries a supply of cleaning fluid and a waste container for storing waste materials collected up from the cleaning surface.
    Type: Grant
    Filed: August 19, 2005
    Date of Patent: March 5, 2013
    Assignee: iRobot Corporation
    Inventors: Stefanos Konandreas, Andrew Ziegler, Christopher John Morse
  • Patent number: 8392023
    Abstract: A robotic system includes a robot for moving a payload in response to a calculated input force. Sensors in respective sensor housings are connected to a handle, each sensor including a light emitter and receiver. The sensors measure a light beam received by a respective receiver. A controller calculates the calculated input force using received light. Each sensor housing modifies an interruption of the light beam in a sensor when the actual input force is applied, and the controller controls the robot using the calculated input force. A method of controlling the robot includes emitting the light beam, flexing a portion of the sensor housing(s) using the actual input force to interrupt the light beam, and using a host machine to calculate the calculated input force as a function of the portion of the light beam received by the light receiver. The robot is controlled using the calculated input force.
    Type: Grant
    Filed: November 30, 2009
    Date of Patent: March 5, 2013
    Assignees: GM Global Technology Operations LLC, Universite Laval
    Inventors: Vincent Duchaine, Noemie Paradis, Thierry Laliberte, Boris Mayer-St-Onge, Clement Gosselin, Dalong Gao