Optical Patents (Class 901/47)
  • Publication number: 20140180479
    Abstract: Systems and methods are disclosed for automatically or semi-automatically depositing retail items into a bag using a robotic arm. Embodiments of the present disclosure comprise a camera, an image processor, and a robotic arm control module to analyze and attempt to identify an item. Human intervention may be utilized to assist in item identification and/or robotic arm control. A human operator may visually identify the item and/or remotely control the robotic arm from a remote control station.
    Type: Application
    Filed: December 20, 2012
    Publication date: June 26, 2014
    Applicant: Wal-Mart Stores, Inc.
    Inventors: Stuart Argue, Anthony Emile Marcar
  • Patent number: 8761932
    Abstract: An automation equipment control system comprises a general purpose computer with a general purpose operating system in electronic communication with a real-time computer subsystem. The general purpose computer includes a program execution module to selectively start and stop processing of a program of equipment instructions and to generate a plurality of move commands. The real-time computer subsystem includes a move command data buffer for storing the plurality of move commands, and a move module linked to the data buffer for sequentially processing the moves and calculating a required position for a mechanical joint. The real-time computer subsystem also includes a dynamic control algorithm in software communication with the move module to repeatedly calculate a required actuator activation signal from a joint position feedback signal.
    Type: Grant
    Filed: June 18, 2013
    Date of Patent: June 24, 2014
    Assignee: C.H.I. Development Mgmt. Ltd. XXIV, LLC
    Inventor: John R. Lapham
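    The split described in 8761932 (a general-purpose layer that buffers move commands, and a real-time layer that interpolates the required joint position and closes the loop on feedback) can be illustrated with a minimal single-joint sketch; the deque buffer, linear interpolation, PD gains, and unit-inertia joint model below are illustrative assumptions, not details taken from the patent.
```python
from collections import deque

# Illustrative move commands queued by the general-purpose layer:
# (target joint angle in rad, move duration in s).
MOVES = deque([(0.5, 1.0), (1.2, 0.8), (0.0, 1.5)])

DT = 0.001          # real-time tick, 1 kHz (assumed)
KP, KD = 40.0, 2.0  # illustrative PD gains

def run(moves, q0=0.0):
    """Sequentially process the buffered moves; each tick compute the required
    joint position and an actuator signal from the position feedback."""
    q, qd = q0, 0.0                 # simulated joint state (position, velocity)
    start = q
    for target, duration in moves:
        steps = int(duration / DT)
        for i in range(1, steps + 1):
            q_req = start + (target - start) * i / steps   # required position
            err = q_req - q                                # feedback error
            u = KP * err - KD * qd                         # actuator activation signal
            qd += u * DT                                   # toy unit-inertia joint model
            q += qd * DT
        start = target
    return q

if __name__ == "__main__":
    print("final joint position (rad):", round(run(MOVES), 3))
```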
  • Publication number: 20140172166
    Abstract: The treatment device for hemiplegia comprises a robot which is put on the hemiplegic side of the body of a subject; a motion measurement unit for measuring the motion of the healthy side of the body of the subject; and a control unit which is connected with the robot and the motion measurement unit, wherein the control unit is configured to receive the healthy side's motion measured by the motion measurement unit and to control the robot, whereby the hemiplegic side having the robot put thereon moves in accordance with the motion of the healthy side of the body.
    Type: Application
    Filed: August 29, 2013
    Publication date: June 19, 2014
    Applicant: SNU R&DB Foundation
    Inventors: Sung Wan Kim, Sun Gun Chung, Hee Chan Kim, Jae Won Beom, Hyung Seok Nam, Chi Won Lee
  • Publication number: 20140158052
    Abstract: A system includes a milking box and a robotic attacher. The milking box has a stall to accommodate a dairy livestock. The robotic attacher extends under the dairy livestock and comprises a nozzle. The robotic attacher is operable to rotate such that, during a first operation, the nozzle is positioned generally on the bottom of the robotic attacher, and during a second operation, the nozzle is positioned generally on the top of the robotic attacher.
    Type: Application
    Filed: February 12, 2014
    Publication date: June 12, 2014
    Applicant: TECHNOLOGIES HOLDINGS CORP.
    Inventors: Henk Hofman, Peter Willem van der Sluis, Ype Groensma
  • Publication number: 20140158063
    Abstract: A method comprises determining a tangent to the rear of an udder of a dairy livestock, and determining a tangent to the bottom of the udder of the dairy livestock. The method continues by determining a position relative to the intersection of the two tangents, and extending a robot arm to the determined position.
    Type: Application
    Filed: February 17, 2014
    Publication date: June 12, 2014
    Applicant: TECHNOLOGIES HOLDINGS CORP
    Inventors: Henk Hofman, Peter Willem van der Sluis, Ype Groensma
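    In a side-view plane, the geometric step in 20140158063 reduces to intersecting two lines and offsetting from the resulting corner; the sketch below assumes 2-D point-plus-direction tangents and an arbitrary approach offset, purely for illustration.
```python
import numpy as np

def line_intersection(p1, d1, p2, d2):
    """Intersect two 2-D lines given as point + direction: p1 + t*d1 = p2 + s*d2."""
    A = np.column_stack([np.asarray(d1, float), -np.asarray(d2, float)])
    t, _ = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t * np.asarray(d1, float)

# Illustrative tangents in a side view (x toward the rear, z = height, metres):
rear_tangent   = ((1.20, 0.00), (0.05, 1.00))   # nearly vertical line along the rear of the udder
bottom_tangent = ((0.00, 0.45), (1.00, 0.02))   # nearly horizontal line along the bottom

corner = line_intersection(*rear_tangent, *bottom_tangent)
offset = np.array([-0.08, 0.03])                # assumed approach offset relative to the corner
arm_target = corner + offset
print("tangent intersection:", corner.round(3), "arm target:", arm_target.round(3))
```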
  • Publication number: 20140156125
    Abstract: An autonomous electronic apparatus and a navigation method thereof are provided. The navigation method includes the following steps. Firstly, a calling signal from a target is received through a wireless sensor network. A position relationship between the target and the autonomous electronic apparatus is analyzed to generate a first speed. Next, an image set is captured and an image relationship between the image set and the target is analyzed to generate a second speed. Afterwards, a weighting value related to the position relationship is calculated. Finally, a moving speed is calculated according to the weighting value, the first speed and the second speed, and a moving status of the autonomous electronic apparatus moving toward the target is controlled via the moving speed.
    Type: Application
    Filed: March 31, 2013
    Publication date: June 5, 2014
    Applicant: National Chiao Tung University
    Inventors: Kai-Tai Song, Shang-Chun Hung
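    A minimal sketch of the speed-blending step in 20140156125: the wireless-derived and vision-derived speed commands are combined with a distance-dependent weight. The linear weighting schedule and the distance thresholds are assumed here for illustration.
```python
def blended_speed(v_wireless, v_vision, distance, d_near=1.0, d_far=8.0):
    """Combine the wireless-derived (first) and vision-derived (second) speed commands
    with a weight that depends on the estimated distance to the calling target."""
    # Weight ramps from 0 (far away: trust the wireless fix) to 1 (close: trust vision).
    w = min(1.0, max(0.0, (d_far - distance) / (d_far - d_near)))
    vx = w * v_vision[0] + (1.0 - w) * v_wireless[0]
    vy = w * v_vision[1] + (1.0 - w) * v_wireless[1]
    return vx, vy, w

vx, vy, w = blended_speed(v_wireless=(0.40, 0.10), v_vision=(0.25, 0.05), distance=3.0)
print(f"weight={w:.2f}, commanded speed=({vx:.2f}, {vy:.2f}) m/s")
```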
  • Publication number: 20140156072
    Abstract: A measurement apparatus for determining a position of a tool center point (31) of a tool (30), which is attached to a tool attachment surface (32) of a robot (1), with respect to the tool attachment surface (32) includes: a camera (4) attached to the arm tip portion of the robot (1); a touch-up point (an origin of ?m) disposed in a working space of the robot; a measurement section (11a) for measuring the position of the touch-up point by using the robot and the camera; a first storage section (12a) for storing the measured position of the touch-up point; a second storage section (12b) for storing a position of the robot (1) when the tool center point is aligned with the touch-up point by moving the robot; and a calculation section (11b) for calculating the position of the tool center point with respect to the tool attachment surface of the robot by using the stored positions of the touch-up point and the robot.
    Type: Application
    Filed: February 5, 2014
    Publication date: June 5, 2014
    Applicant: FANUC LTD
    Inventors: Kazunori BAN, Katsutoshi TAKIZAWA, Gang SHEN
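    The calculation in 20140156072 amounts to expressing the stored touch-up point in the stored flange (tool attachment surface) frame; a sketch using homogeneous transforms follows, with all numeric values invented for illustration.
```python
import numpy as np

def tcp_in_flange_frame(T_base_flange, p_touchup_base):
    """Given the stored flange pose (4x4 homogeneous transform, base frame) at the moment
    the tool centre point is aligned with the touch-up point, and the stored touch-up
    point position in the base frame, return the TCP offset in the flange frame."""
    p = np.append(np.asarray(p_touchup_base, float), 1.0)   # homogeneous point
    return (np.linalg.inv(T_base_flange) @ p)[:3]

# Illustrative numbers (not from the patent): flange rotated 90 deg about Z and translated.
T = np.array([[0.0, -1.0, 0.0, 0.50],
              [1.0,  0.0, 0.0, 0.20],
              [0.0,  0.0, 1.0, 0.80],
              [0.0,  0.0, 0.0, 1.00]])
p_touch = [0.45, 0.35, 0.62]            # measured touch-up point, base frame
print("TCP offset in flange frame:", tcp_in_flange_frame(T, p_touch).round(3))
```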
  • Publication number: 20140156069
    Abstract: A robotic system that includes a mobile robot linked to a plurality of remote stations. One of the remote stations includes an arbitrator that controls access to the robot. Each remote station may be assigned a priority that is used by the arbitrator to determine which station has access to the robot. The arbitrator may include notification and call back mechanisms for sending messages relating to an access request and a granting of access for a remote station.
    Type: Application
    Filed: February 7, 2014
    Publication date: June 5, 2014
    Applicant: Intouch Technologies, Inc.
    Inventors: Yulun Wang, Charles S. Jordan, Keith Phillip Laby, Jonathan Southard
  • Publication number: 20140148951
    Abstract: A manipulator device has an arm portion and a hand portion. The hand portion includes one or more finger portions that manipulate a target object. Each finger portion includes a slip sensor and multiple contact sensors, with at least one contact sensor at a position proximate to the slip sensor and at least another contact sensor at a position remote from the slip sensor. When the contact sensors at the positions remote from the slip sensor detect contact of the target object and the contact sensors arranged at the positions proximate to the slip sensor do not detect contact, the finger portion is moved by a distance corresponding to the distance between the contact sensors detecting contact of the target object and the contact sensors arranged at the positions proximate to the slip sensor, such that the detecting position of the slip sensor is coincident with the position of the target object.
    Type: Application
    Filed: April 18, 2012
    Publication date: May 29, 2014
    Inventor: Makoto Saen
  • Publication number: 20140147016
    Abstract: A method for determining a spray position for a spray tool includes accessing an image signal generated by a camera, the image signal corresponding to at least an udder of a dairy livestock. The method further includes processing the accessed image signal to determine a tangent to the rear and a tangent to the bottom of the udder of the dairy livestock. The method concludes by determining a spray position from which a spray tool may apply disinfectant to the teats of the dairy livestock, wherein the spray position is a position relative to the intersection of the two tangents.
    Type: Application
    Filed: January 31, 2014
    Publication date: May 29, 2014
    Applicant: Technologies Holdings Corp.
    Inventors: Henk Hofman, Peter Willem van der Sluis, Ype Groensma
  • Patent number: 8731720
    Abstract: Systems and methods are described for providing visual telepresence to an operator of a remotely controlled robot. The robot includes both video cameras and pose sensors. The system can also comprise a head-tracking sensor to monitor the orientation of the operator's head. These signals can be used to aim the video cameras. The controller receives both the video signals and the pose sensor signals from the robot, and optionally receives head-tracking signals from the head-tracking sensor. The controller stitches together the various video signals to form a composite video signal that maps to a robot view. The controller renders an image to a display from that portion of the composite video signal that maps to an operator view. The relationship of the operator view to the robot view is varied according to the signals from the pose sensors and the head-tracking sensor.
    Type: Grant
    Filed: June 10, 2013
    Date of Patent: May 20, 2014
    Assignee: Anybots 2.0, Inc.
    Inventors: Trevor Blackwell, Daniel Casner, Scott Wiley
  • Publication number: 20140135989
    Abstract: An industrial robot system includes an end effector connectable to a robot arm, a drive assembly, and a controller. The end effector includes a distal housing, a spindle assembly rotatable about a rotational axis, a drill bit rotatable about the rotational axis, and a sensor assembly. The sensor assembly includes a first light source, a second light source, and a photosensitive array. The first light source produces a first fan of light which is projected as a first line of light on the object surface. The second light source produces a second fan of light, which is projected as a second line of light on the object surface. The photosensitive array detects a first reflection line corresponding to the first line of light and a second reflection line corresponding to the second line of light.
    Type: Application
    Filed: November 9, 2012
    Publication date: May 15, 2014
    Applicants: ZAGAR INC., RECOGNITION ROBOTICS, INC.
    Inventors: Simon Melikian, Jeremy Hughes, Brian Zagar
  • Publication number: 20140132741
    Abstract: A multiple camera video system and methods for operating such a system. The system may include a plurality of cameras located around a stadium, athletic playing field or other location. The cameras are remotely controlled in a master-slave configuration. A camera operator at a master pan head selects one of the plurality of cameras as the current master camera and utilizes the master pan head to adjust the telemetry and zoom of the master camera to follow the target object. The telemetry and zoom parameters of the master camera are then used to calculate corresponding telemetry, zoom and/or other parameters for each of the plurality of slave cameras. Video captured by each of the cameras is stored for the production of replay video feeds or for archiving. The replays may be capable of “spinning” through the video feeds of adjacent cameras in order for the viewer to get the sensation of revolving around the target object. The multiple camera video system also includes methods for calibrating the system.
    Type: Application
    Filed: January 16, 2014
    Publication date: May 15, 2014
    Inventors: Kenneth Joseph Aagaard, Larry Barbatsoulis, Frank Trizano, Craig Matthew Farrell
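    One common way to realize the master-to-slave telemetry calculation in 20140132741 is to intersect the master camera's viewing ray with the playing-field plane and then aim each slave camera at that 3-D point; the flat-field assumption, the pan/tilt conventions, and the camera positions below are assumptions, and the publication's own calibration procedure is not reproduced.
```python
import numpy as np

def ray_direction(pan, tilt):
    """Unit viewing direction from pan (about the vertical axis, from +x) and tilt (elevation)."""
    return np.array([np.cos(tilt) * np.cos(pan),
                     np.cos(tilt) * np.sin(pan),
                     np.sin(tilt)])

def target_on_field(cam_pos, pan, tilt):
    """Intersect the master camera's viewing ray with the field plane z = 0."""
    d = ray_direction(pan, tilt)
    t = -cam_pos[2] / d[2]                       # valid while the ray points downward
    return cam_pos + t * d

def pan_tilt_toward(cam_pos, target):
    """Pan/tilt a slave camera needs in order to aim at the same target point."""
    v = target - cam_pos
    return np.arctan2(v[1], v[0]), np.arctan2(v[2], np.hypot(v[0], v[1]))

master_pos = np.array([0.0, -40.0, 15.0])        # illustrative camera positions (metres)
slave_pos  = np.array([60.0, -40.0, 15.0])
target = target_on_field(master_pos, pan=np.deg2rad(35.0), tilt=np.deg2rad(-20.0))
pan, tilt = pan_tilt_toward(slave_pos, target)
print("target:", target.round(2),
      "slave pan/tilt (deg):", round(float(np.degrees(pan)), 1), round(float(np.degrees(tilt)), 1))
```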
  • Patent number: 8718821
    Abstract: A navigational control system for altering movement activity of a robotic device operating in a defined working area, comprising a transmitting subsystem integrated in combination with the robotic device, the transmitting subsystem comprising a mechanical sweeping transmitter laser integrated in combination with a high point of a housing infrastructure of the robotic device so that none of the structural features of the robotic device interfere with sweeping of the transmitting element of the mechanical sweeping transmitter laser.
    Type: Grant
    Filed: August 14, 2013
    Date of Patent: May 6, 2014
    Assignee: iRobot Corporation
    Inventors: Mark J. Chiappetta, Joseph L. Jones
  • Publication number: 20140114482
    Abstract: Devices, systems, and methods for inspecting and objectively analyzing the condition of a roof are presented. A vehicle adapted for traversing and inspecting an irregular terrain includes a chassis having a bottom surface that defines a higher ground clearance at an intermediate location, thereby keeping the center of mass low when crossing roof peaks. In another embodiment, the drive tracks include partially collapsible treads made of resilient foam. A system for inspecting a roof includes a lift system and a remote computer for analyzing data. Vehicles and systems may gather and analyze data, and generate revenue by providing data, analysis, and reports for a fee to interested parties.
    Type: Application
    Filed: December 30, 2013
    Publication date: April 24, 2014
    Applicant: TOBOR TECHNOLOGY, LLC
    Inventors: MICHAEL D. SLAWINSKI, DENNIS L. GUTHRIE
  • Publication number: 20140114525
    Abstract: The present invention discloses a system for confining the movement of a robot such that certain areas are temporarily or permanently excluded from its working territory. The system uses light-absorbing, black-out paper stripes capable of completely absorbing light, including infrared. The system also uses a mobile robot equipped with infrared emitters and infrared-reflection detectors that communicate the existence of infrared reflection, or the lack of it, to the robot's control unit. The control unit controls the wheel drivers and ensures that the robot continues travelling only as long as reflections of the infrared signals are detected from the surface toward which it is headed. In this system, therefore, the black-out paper stripes act as a fence that the robot cannot pass: they prevent the reflection of infrared light from the spots where they are placed and force the robot to stop and change direction when it encounters a black-out stripe.
    Type: Application
    Filed: October 20, 2012
    Publication date: April 24, 2014
    Inventor: Ali Ebrahimi Afrouzi
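    The control rule in 20140114525 can be stated as a loop: keep driving while the forward-facing emitter/detector pair sees a reflection, and stop and turn when it does not. The simulated sensor and the 90-degree turn below are arbitrary illustrative choices.
```python
import random

def ir_reflection_detected():
    """Stand-in for the forward-facing emitter/detector pair: returns False when the
    surface ahead is a light-absorbing black-out stripe (simulated randomly here)."""
    return random.random() > 0.2

def drive(steps=20):
    heading = 0
    for _ in range(steps):
        if ir_reflection_detected():
            print(f"reflection detected -> keep driving, heading {heading} deg")
        else:
            heading = (heading + 90) % 360       # assumed avoidance behaviour: stop and turn
            print(f"no reflection (stripe ahead) -> stop, turn to {heading} deg")

if __name__ == "__main__":
    drive()
```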
  • Patent number: 8698889
    Abstract: A metrology system has an elongate stationary camera pixel array facing a workpiece transit path of a robot, with a field of view corresponding to a workpiece diameter and extending transverse to the transit path portion, and a stationary elongate light emitting array generally parallel to the pixel array. An image control processor causes the camera to capture successive image frames while the robot is moving the workpiece through the transit path.
    Type: Grant
    Filed: February 17, 2010
    Date of Patent: April 15, 2014
    Assignee: Applied Materials, Inc.
    Inventors: Abraham Ravid, Todd Egan, Karen Lingel, Mitchell DiSanto, Hari Kishore Ambal, Edward Budiarto
  • Patent number: 8693729
    Abstract: The present invention creates and stores target representations in several coordinate representations based on biologically inspired models of the human vision system. By using biologically inspired target representations a computer can be programmed for robot control without using kinematics to relate a target position in camera eyes to a target position in body or head coordinates. The robot sensors and appendages are open loop controlled to focus on the target. In addition, the invention herein teaches a scenario and method to learn the mappings between coordinate representations using existing machine learning techniques such as Locally Weighted Projection Regression.
    Type: Grant
    Filed: August 14, 2012
    Date of Patent: April 8, 2014
    Assignee: HRL Laboratories, LLC
    Inventors: Paul Alex Dow, Deepak Khosla, David J Huber
  • Publication number: 20140092268
    Abstract: A setting of a video camera is remotely controlled. Video from a video camera is displayed to a user using a video display. At least one eye of the user is imaged as the user is observing the video display, a change in an image of at least one eye of the user is measured over time, and an eye/head activity variable is calculated from the measured change in the image using an eyetracker. The eye/head activity variable is translated into a camera control setting, and an actuator connected to the video camera is instructed to apply the camera control setting to the video camera using a processor.
    Type: Application
    Filed: December 11, 2013
    Publication date: April 3, 2014
    Applicant: LC TECHNOLOGIES, INC.
    Inventor: Dixon Cleveland
  • Patent number: 8686679
    Abstract: A method of confining a robot in a work space includes providing a portable barrier signal transmitting device including a primary emitter emitting a confinement beam primarily along an axis defining a directed barrier. A mobile robot including a detector, a drive motor and a control unit controlling the drive motor is caused to avoid the directed barrier upon detection by the detector on the robot. The detector on the robot has an omnidirectional field of view parallel to the plane of movement of the robot. The detector receives confinement light beams substantially in a plane at the height of the field of view while blocking or rejecting confinement light beams substantially above or substantially below the plane at the height of the field of view.
    Type: Grant
    Filed: December 14, 2012
    Date of Patent: April 1, 2014
    Assignee: iRobot Corporation
    Inventors: Joseph L. Jones, Philip R. Mass
  • Patent number: 8688261
    Abstract: Disclosed are a transport apparatus that holds and transports an object on a predetermined transport track using a transport portion provided at the leading end of an arm and is capable of acquiring the teaching information of a transport position using a normal transport operation, a position teaching method, and a sensor jig. A transmissive sensor (32) is provided in a sensor jig (30) such that the projection segments of an optical axis (41) and an optical axis (42) on a projection plane intersect with each other and neither the projection segment of the optical axis (41) nor the projection segment of the optical axis (42) is aligned with the X-direction or the Y-direction. During a position teaching operation, the sensor jig (30) is provided so as to be held by a wafer transport portion (24), thereby detecting target members (51, 52).
    Type: Grant
    Filed: May 19, 2009
    Date of Patent: April 1, 2014
    Assignee: Rorze Corporation
    Inventor: Kenji Hirota
  • Publication number: 20140088410
    Abstract: Systems and methods that utilize optical sensors and non-optical sensors to determine the position and/or orientation of objects. A navigation system includes an optical sensor for receiving optical signals from markers and a non-optical sensor, such as a gyroscope, for generating non-optical data. A navigation computer determines positions and/or orientations of objects based on optical and non-optical data.
    Type: Application
    Filed: September 24, 2013
    Publication date: March 27, 2014
    Inventor: Chunwu Wu
  • Patent number: 8682482
    Abstract: The working support robot system of the present invention includes: a robot arm (11); a measuring unit (12) for measuring the worker's position; a work progress estimation unit (13) for estimating the work progress based on data input from the measuring unit (12) while referring to data on work procedure, and for selecting objects necessary for the next task when the work is found to have advanced to the next procedure; and an arm motion planning unit (14) for planning the trajectory of the robot arm (11) to control the robot arm (11) based on the work progress estimated by the work progress estimation unit (13) and selected objects. The working support robot system can deliver objects such as tools and parts to the worker according to the work to be performed by the worker.
    Type: Grant
    Filed: May 21, 2010
    Date of Patent: March 25, 2014
    Assignees: Toyota Motor East Japan, Inc., Tohoku University
    Inventors: Kazuhiro Kosuge, Yusuke Sugahara, Jun Kinugawa, Yuta Kawaai, Akiyoshi Ito, Yoichi Matsui, Shinji Kawabe
  • Publication number: 20140075754
    Abstract: A method for manufacturing an aircraft component according to one embodiment of this disclosure includes providing a machining system including a controller, at least one sensor, and a tool for machining. The method further includes providing an aircraft component, and machining the aircraft component with the tool based on feedback from the at least one sensor.
    Type: Application
    Filed: December 3, 2012
    Publication date: March 20, 2014
    Applicant: UNITED TECHNOLOGIES CORPORATION
    Inventors: Alan C. Barron, Mark F. Zelesky, Charles A. Blizzard, Gregory A. Gilbert, James Masloski, Allan J. Brockett, Jeffrey P. Smith, Bartolomeo Palmieri, Aleah J. Edwards
  • Patent number: 8676379
    Abstract: A control device (1) for a robot arm (8) which, if a person approach detection unit (3) detects approach of a person, performs control according to an impact between the robot arm (8) and the person. The control performed according to the impact is performed through an impact countermeasure motion control unit (4) and by setting individual mechanical impedances for respective joint portions of the robot arm (8) based on a movement of the person detected by a human movement detection unit (2).
    Type: Grant
    Filed: June 28, 2007
    Date of Patent: March 18, 2014
    Assignee: Panasonic Corporation
    Inventor: Yasunao Okazaki
  • Publication number: 20140067126
    Abstract: There is provided an information processing apparatus. An image including a target object is acquired. A coarse position and orientation of the target object is acquired. Information of a plurality of models which indicate a shape of the target object with different accuracies is held. A geometrical feature of the target object in the acquired image is associated with a geometrical feature indicated by at least one of the plurality of models placed at the coarse position and orientation. A position and orientation of the target object is estimated based on the result of the association.
    Type: Application
    Filed: August 21, 2013
    Publication date: March 6, 2014
    Applicant: CANON KABUSHIKI KAISHA
    Inventors: Daisuke Watanabe, Kazuhiko Kobayashi
  • Patent number: 8662215
    Abstract: Configurations are provided for vehicular robots or other vehicles to provide shifting of their centers of gravity for enhanced obstacle navigation. Various head and neck morphologies are provided to allow positioning for various poses such as a stowed pose, observation poses, and inspection poses. Neck extension and actuator module designs are provided to implement various head and neck morphologies. Robot control network circuitry is also provided.
    Type: Grant
    Filed: December 3, 2012
    Date of Patent: March 4, 2014
    Assignee: iRobot Corporation
    Inventors: Timothy R. Ohm, Michael Bassett
  • Patent number: 8666552
    Abstract: Method and device for the removal of a part of a crop, such as a leaf (14). To this end, the crop is approached from a low position with vision techniques and the stem (12) and the parts protruding therefrom are observed from beneath. Based upon the number of images observed, an arm is controlled and moved towards the relevant stalk (13). This movement is primarily parallel to the stalk and is performed from a low proximity position. When the stalk (13) is approached, the stalk (13) is positioned within an opening between two rotating parts (3). The stalk is grasped by way of rotation and the stalk is moved in respect of the arm so that the cutting point of the stalk is manipulated towards the arm. The stalk (13) is subsequently cut through and the leaf is disposed of.
    Type: Grant
    Filed: April 14, 2009
    Date of Patent: March 4, 2014
    Inventor: Ronald Zeelen
  • Patent number: 8666551
    Abstract: A semiconductor wafer manufacturing apparatus is equipped with a diagnostic module for diagnosing integrity of a transfer robot. The diagnostic module is attached to one side of a semiconductor wafer transfer chamber provided with the transfer robot, which side is also used for the purpose of maintenance, for example. One or more sensors are installed in the diagnostic module so that when the transfer robot is inserted into the diagnostic module, the position or shape of each end effector of the transfer robot is detected and compared against a pre-registered normal condition, thereby diagnosing the integrity of the end effector of the transfer robot, while performing wafer processing.
    Type: Grant
    Filed: December 22, 2008
    Date of Patent: March 4, 2014
    Assignee: ASM Japan K.K.
    Inventors: Masahiro Takizawa, Teruhide Nishino
  • Publication number: 20140055597
    Abstract: A remote attaching and detaching device is provided, which includes a camera module to substitute for a worker's vision in an environment where the worker's direct access to the in-cell area is extremely limited or impossible. In a remote attaching and detaching device including a camera module for monitoring the operating environment of a slave arm of a remote robot working in a hazardous environment, the remote attaching and detaching device includes a fixing module fixed in proximity to a gripper provided on the slave arm, and the camera module is attachable to and detachable from the fixing module and provides visual in-cell information about the surroundings of the gripper.
    Type: Application
    Filed: August 26, 2013
    Publication date: February 27, 2014
    Applicants: KOREA HYDRO AND NUCLEAR POWER CO., LTD, KOREA ATOMIC ENERGY RESEARCH INSTITUTE
    Inventors: Kiho KIM, Jong Kwang Lee, Seung-Nam Yu, Byung Suk Park, Il-Je Cho, Hansoo Lee
  • Publication number: 20140058564
    Abstract: Methods of and a system for providing a visual representation of force information in a robotic surgical system. A real position of a surgical end effector is determined. A projected position of the surgical end effector if no force were applied against the end effector is also determined. Images representing the real and projected positions are output superimposed on a display. The offset between the two images provides a visual indication of a force applied to the end effector or to the kinematic chain that supports the end effector. In addition, tissue deformation information is determined and displayed.
    Type: Application
    Filed: October 30, 2013
    Publication date: February 27, 2014
    Applicant: Intuitive Surgical Operations, Inc.
    Inventors: Wenyi Zhao, Tao Zhao, David Q. Larkin
  • Patent number: 8660693
    Abstract: Provided is a technique that enables a robot to be remotely controlled (by a server) and enables a robot component to access an external component (a component of a server), in order to enable cooperation of heterogeneous robots operating on the basis of different component models. A component integration apparatus for collaboration of heterogeneous robots according to an embodiment of the present invention comprises: a standard interface unit that provides a common standard interface for controlling components that control the individual functions of the robot; an adapter component that transmits commands to enable external components to call the components through the standard interface unit; and a proxy component that transmits commands to enable the components to call the external components through the standard interface unit.
    Type: Grant
    Filed: December 15, 2010
    Date of Patent: February 25, 2014
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Young-Ho Suh, Kang-Woo Lee, Hyun Kim
  • Publication number: 20140049616
    Abstract: The present invention determines the dimensions and volume of an object by using a novel 3-D camera that measures the distance to every reflective point in its field of view with a single pulse of light. The distance is computed by the time of flight of the pulse to each camera pixel. The accuracy of the measurement is augmented by capture of the laser pulse shape in each camera pixel. The camera can be used on an assembly line to develop quality control data for manufactured objects or on a moving or stationary system that weighs as well as dimensions the objects. The device can also ascertain the minimum size of a box required to enclose an object.
    Type: Application
    Filed: October 28, 2013
    Publication date: February 20, 2014
    Inventor: Roger STETTNER
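    The two calculations named in 20140049616, per-pixel range from pulse time of flight and dimensioning an enclosing box, are sketched below; the axis-aligned bounding box is a simplification, and the pulse-shape refinement mentioned in the abstract is not modeled.
```python
import numpy as np

C = 299_792_458.0                     # speed of light, m/s

def tof_to_distance(round_trip_seconds):
    """Per-pixel range from the pulse time of flight (out and back)."""
    return C * round_trip_seconds / 2.0

def box_dimensions(points):
    """Smallest axis-aligned box enclosing the measured points, plus its volume.
    (A tighter box would require searching over orientations as well.)"""
    pts = np.asarray(points, float)
    dims = pts.max(axis=0) - pts.min(axis=0)
    return dims, float(np.prod(dims))

# Illustrative: a few measured surface points of an object on a conveyor (metres).
cloud = [[0.02, 0.01, 0.00], [0.38, 0.02, 0.00], [0.40, 0.27, 0.00],
         [0.03, 0.29, 0.00], [0.21, 0.15, 0.18]]
print("range for a 20 ns round trip:", round(tof_to_distance(20e-9), 3), "m")
dims, vol = box_dimensions(cloud)
print("box dims (m):", dims.round(2), "volume (m^3):", round(vol, 4))
```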
  • Patent number: 8651046
    Abstract: A method and apparatus for applying sealant. The apparatus may comprise a sealant flow control system. The sealant flow control system may be configured to engage a nozzle of a sealant container to reduce a flow of sealant from the nozzle.
    Type: Grant
    Filed: July 23, 2010
    Date of Patent: February 18, 2014
    Assignee: The Boeing Company
    Inventors: Angelica Davancens, Branko Sarh
  • Publication number: 20140041578
    Abstract: A modular tire spraying system includes a downdraft spray booth for receiving a tire, a fluid delivery system disposed in the spray booth, a robot for transporting the tire to the spray booth, and a platform on which each of the spray booth, the fluid delivery system, and the robot is disposed. The fluid delivery system includes at least one spray gun for delivering a coating to the tire.
    Type: Application
    Filed: October 23, 2013
    Publication date: February 13, 2014
    Inventor: Todd E. Hendricks, SR.
  • Publication number: 20140046341
    Abstract: A camera system that can be utilized in robotic surgery is presented. In particular, a method of setting a light level in a camera in a robotic system includes determining a location of at least one instrument end effector within a field-of-view of the camera; determining a region-of-interest in the field-of-view based on the location of the at least one instrument tip; gathering luminance statistics in the region-of-interest; computing a luminance value from the luminance statistics; and adjusting an exposure in response to a comparison of the luminance value with a target luminance value.
    Type: Application
    Filed: August 7, 2013
    Publication date: February 13, 2014
    Inventor: Jeffrey DiCarlo
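    A sketch of the exposure loop in 20140046341: gather luminance statistics inside a region of interest around the instrument tip and nudge the exposure toward a target value. The proportional update in log-exposure space and all numeric values are assumptions.
```python
import numpy as np

def adjust_exposure(image, roi, exposure, target_luma=0.45, gain=0.5):
    """Compute the mean luminance inside the region of interest and adjust the
    exposure in proportion to its log-ratio against the target luminance."""
    x0, y0, x1, y1 = roi
    luma = float(np.mean(image[y0:y1, x0:x1]))            # luminance statistic in the ROI
    error = np.log(target_luma + 1e-6) - np.log(luma + 1e-6)
    return exposure * float(np.exp(gain * error)), luma

rng = np.random.default_rng(0)
frame = rng.uniform(0.05, 0.25, size=(480, 640))           # dim synthetic frame, values in [0, 1]
roi_around_tip = (300, 200, 420, 320)                       # assumed ROI derived from the tip location
new_exposure, measured = adjust_exposure(frame, roi_around_tip, exposure=8.0)
print(f"ROI mean luminance {measured:.3f} -> exposure 8.0 adjusted to {new_exposure:.1f} (arbitrary units)")
```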
  • Publication number: 20140037146
    Abstract: A structured light pattern including a set of patterns in a sequence is generated by initializing a base pattern. The base pattern includes a sequence of colored stripes such that each subsequence of the colored stripes is unique for a particular size of the subsequence. The base pattern is shifted hierarchically, spatially and temporally a predetermined number of times to generate the set of patterns, wherein each pattern is different spatially and temporally. A unique location of each pixel in a set of images acquired of a scene is determined, while projecting the set of patterns onto the scene, wherein there is one image for each pattern.
    Type: Application
    Filed: July 31, 2012
    Publication date: February 6, 2014
    Inventors: Yuichi Taguchi, Amit Agrawal, Oncel Tuzel
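    The base pattern in 20140037146 relies on every length-k subsequence of coloured stripes being unique; a depth-first generator for such a sequence is sketched below (the hierarchical spatial/temporal shifting described in the abstract is not reproduced).
```python
def unique_window_sequence(colors, k, length):
    """Build a colour-stripe sequence in which every length-k subsequence occurs only once,
    by depth-first search with backtracking. With c colours, at most c**k + k - 1 stripes
    can satisfy the property."""
    def extend(seq, used):
        if len(seq) == length:
            return seq
        for c in colors:
            window = tuple(seq[-(k - 1):] + [c]) if k > 1 else (c,)
            complete = len(seq) >= k - 1          # a full window ends at the new stripe
            if complete and window in used:
                continue
            if complete:
                used.add(window)
            result = extend(seq + [c], used)
            if result:
                return result
            if complete:
                used.discard(window)
        return None

    return extend([], set())

pattern = unique_window_sequence(["R", "G", "B"], k=3, length=20)
print("".join(pattern))
```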
  • Publication number: 20140039680
    Abstract: A mobile robot that includes a robot body, a drive system having one or more wheels supporting the robot body to maneuver the robot across a floor surface, and a riser having a proximal end and a distal end. The proximal end of the riser disposed on the robot body. The robot also includes a head disposed on the distal end of the riser. The head includes a display and a camera disposed adjacent the display.
    Type: Application
    Filed: October 2, 2013
    Publication date: February 6, 2014
    Applicant: iRobot Corporation
    Inventors: Colin Angle, Clara Vu, Matthew Cross, Tony L. Campbell
  • Publication number: 20140039679
    Abstract: An article take-out apparatus including: acquiring a reference container image including an open end face of a container by an imaging operation of a camera, setting an image search region corresponding to a storage space of the container based on the reference container image, setting a reference plane including the open end face of the container, calculating a search region corresponding to the image search region based on calibration data of the camera stored in advance, converting the search region to a converted search region, taking out 3D points included in the converted search region by projecting a plurality of 3D points measured by a 3D measuring device on the reference plane, and recognizing positions of articles inside the container using the 3D points.
    Type: Application
    Filed: July 22, 2013
    Publication date: February 6, 2014
    Applicant: FANUC CORPORATION
    Inventor: Toshiyuki ANDO
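    The projection-and-filter step in 20140039679 can be sketched as orthogonal projection of the measured 3-D points onto the reference plane, followed by a test against a search region; the rectangular region and all coordinates below are illustrative, and the camera-calibration conversion is not modeled.
```python
import numpy as np

def project_to_plane(points, p0, n):
    """Orthogonally project 3-D points onto the reference plane through p0 with normal n."""
    pts = np.asarray(points, float)
    n = np.asarray(n, float) / np.linalg.norm(n)
    d = (pts - p0) @ n
    return pts - np.outer(d, n)

def inside_rectangular_region(points_on_plane, p0, u, v, half_u, half_v):
    """Keep projected points whose in-plane coordinates fall inside a rectangular
    search region centred at p0 (u and v are orthonormal in-plane axes)."""
    rel = np.asarray(points_on_plane, float) - p0
    return (np.abs(rel @ u) <= half_u) & (np.abs(rel @ v) <= half_v)

# Illustrative numbers: a horizontal container opening, 0.4 m x 0.3 m, at height 0.5 m.
p0 = np.array([0.0, 0.0, 0.5])
n  = np.array([0.0, 0.0, 1.0])
u, v = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
measured = [[0.10, 0.05, 0.32], [0.35, 0.00, 0.28], [0.05, -0.10, 0.41]]
proj = project_to_plane(measured, p0, n)
mask = inside_rectangular_region(proj, p0, u, v, half_u=0.20, half_v=0.15)
print("points inside the converted search region:", [m for m, keep in zip(measured, mask) if keep])
```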
  • Publication number: 20130345874
    Abstract: Robots may manipulate objects based on sensor input about the objects and/or the environment in conjunction with data structures representing primitive tasks and, in some embodiments, objects and/or locations associated therewith. The data structures may be created by instantiating respective prototypes during training by a human trainer.
    Type: Application
    Filed: September 17, 2012
    Publication date: December 26, 2013
    Applicant: Rethink Robotics, Inc.
    Inventors: Bruce Blumberg, Rodney Brooks, Christopher J. Buehler, Noelle Dye, Gerry Ens, Natan Linder, Michael Siracusa, Michael Sussman, Matthew M. Williamson
  • Publication number: 20130346348
    Abstract: Via intuitive interactions with a user, robots may be trained to perform tasks such as visually detecting and identifying physical objects and/or manipulating objects. In some embodiments, training is facilitated by the robot's simulation of task-execution using augmented-reality techniques.
    Type: Application
    Filed: September 17, 2012
    Publication date: December 26, 2013
    Applicant: Rethink Robotics, Inc.
    Inventors: Christopher J. Buehler, Michael Siracusa
  • Publication number: 20130345872
    Abstract: In accordance with various embodiments, a user interface embedded into a robot facilitates robot training via direct and intuitive physical interactions.
    Type: Application
    Filed: September 17, 2012
    Publication date: December 26, 2013
    Applicant: Rethink Robotics, Inc.
    Inventors: Rodney Brooks, Bruce Blumberg, Noelle Dye, Paula Long
  • Publication number: 20130345870
    Abstract: Via intuitive interactions with a user, robots may be trained to perform tasks such as visually detecting and identifying physical objects and/or manipulating objects. In some embodiments, training is facilitated by the robot's simulation of task-execution using augmented-reality techniques.
    Type: Application
    Filed: September 17, 2012
    Publication date: December 26, 2013
    Applicant: Rethink Robotics, Inc.
    Inventors: Christopher J. Buehler, Michael R. Siracusa
  • Publication number: 20130345875
    Abstract: Robots may manipulate objects based on sensor input about the objects and/or the environment in conjunction with data structures representing primitive tasks and, in some embodiments, objects and/or locations associated therewith. The data structures may be created by instantiating respective prototypes during training by a human trainer.
    Type: Application
    Filed: September 17, 2012
    Publication date: December 26, 2013
    Applicant: Rethink Robotics, Inc.
    Inventors: Rodney Brooks, Christopher J. Buehler, Matthew DiCicco, Gerry Ens, Albert Huang, Michael Siracusa, Matthew M. Williamson
  • Publication number: 20130332018
    Abstract: A road guidance system for the blind. A plurality of RFID tags 10 are buried inside a pedestrian pavement 100 with information of said pavement 100 contained therein, along the pavement. A guide robot 20 has an RFID reader 21 for checking signals of said RFID tags 10 from said pavement 100 by passing therethrough, and an obstruction sensor 22 for sensing an obstruction. Wheels 23 are positioned on both sides of the guide robot. A controller 30 checks signals from said RFID reader 21 and said obstruction sensor 22 inside said guide robot 20, converts the checked information into voice signals and outputs the voice signals, and then controls the driving of said guide robot 20. A stick 40 is electrically connected to enable communication with said guide robot 20, thereby enabling safe loading and unloading.
    Type: Application
    Filed: January 16, 2012
    Publication date: December 12, 2013
    Inventor: Ji Hun Kim
  • Publication number: 20130331987
    Abstract: The invention is related to methods and apparatus that use a visual sensor and dead reckoning sensors to process Simultaneous Localization and Mapping (SLAM). These techniques can be used in robot navigation. Advantageously, such visual techniques can be used to autonomously generate and update a map. Unlike with laser rangefinders, the visual techniques are economically practical in a wide range of applications and can be used in relatively dynamic environments, such as environments in which people move. One embodiment further advantageously uses multiple particles to maintain multiple hypotheses with respect to localization and mapping. Further advantageously, one embodiment maintains the particles in a relatively computationally-efficient manner, thereby permitting the SLAM processes to be performed in software using relatively inexpensive microprocessor-based computer systems.
    Type: Application
    Filed: August 12, 2013
    Publication date: December 12, 2013
    Applicant: iRobot Corporation
    Inventors: L. Niklas Karlsson, Paolo Pirjanian, Luis Filipe Domingues Goncalves, Enrico Di Bernardo
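    A stripped-down stand-in for the multi-particle idea in 20130331987: a range-only particle filter that predicts every pose hypothesis from dead reckoning, re-weights the hypotheses against a measurement, and resamples. Full SLAM would also estimate the map (here a single landmark is assumed known) and would use visual features rather than a bare range value.
```python
import numpy as np

rng = np.random.default_rng(1)
N = 500                                          # number of particles (pose hypotheses)
landmark = np.array([2.0, 1.0])                  # assumed known; SLAM would estimate this too

particles = rng.normal([0.0, 0.0], 0.1, size=(N, 2))   # initial (x, y) hypotheses
weights = np.full(N, 1.0 / N)

def step(particles, weights, odom, measured_range, odom_sigma=0.05, meas_sigma=0.10):
    """One predict/update/resample cycle: dead reckoning moves every hypothesis, the
    measurement re-weights them, and multinomial resampling concentrates particles
    where the data agree, keeping the filter computationally cheap."""
    particles = particles + odom + rng.normal(0.0, odom_sigma, particles.shape)
    predicted = np.linalg.norm(particles - landmark, axis=1)
    weights = weights * np.exp(-0.5 * ((predicted - measured_range) / meas_sigma) ** 2)
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

true_pose = np.array([0.0, 0.0])
for _ in range(5):
    true_pose = true_pose + np.array([0.2, 0.0])
    z = np.linalg.norm(true_pose - landmark) + rng.normal(0.0, 0.05)
    particles, weights = step(particles, weights, odom=np.array([0.2, 0.0]), measured_range=z)
print("estimated pose:", particles.mean(axis=0).round(2), "true pose:", true_pose)
```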
  • Publication number: 20130325169
    Abstract: The present invention relates to an apparatus for transferring a glove (100) from a conveyor (200) characterised by: a camera (10); a pick-up assembly (20) comprising a pair of inner grippers (21) and two pairs of outer grippers (22); a pair of sensors (30); a robotic arm (40) mounted with the pick-up assembly (20); a processor.
    Type: Application
    Filed: May 20, 2013
    Publication date: December 5, 2013
    Applicant: Pentamaster Engineering SDN BHD
    Inventor: Choon Bin Chuah
  • Publication number: 20130325181
    Abstract: The systems and methods are directed to mechanical arms and manipulators, and more particularly, to optical distance sensors used for approach, grasping and manipulation. The system may include a manipulator having an arm and a multi-fingered end-effector coupled to the distal end of the arm. The end-effector may include an optical proximity sensor configured to detect the distance to an object prior to contact with the object. The end-effector may include an optical proximity sensor configured to detect a measurement of force applied to the object by the manipulator after contact with the object. The measurement of force may be a range of force measurements including a minimum, a maximum and a measurement between the minimum and the maximum.
    Type: Application
    Filed: May 31, 2012
    Publication date: December 5, 2013
    Applicant: Toyota Motor Engineering & Manufacturing North America, Inc.
    Inventor: Douglas A. Moore
  • Publication number: 20130325179
    Abstract: A system and method for adjusting the position and orientation of a feed arm associated with a wafer handling robot. In one embodiment, the system includes a positioning plate detachably carried by the feed arm and insertable therewith into a wafer carrier. The positioning plate includes graphic alignment indicia. An alignment apparatus is provided configured for insertion into wafer-holding slots in the wafer carrier. The apparatus includes at least one digital image sensor. With the positioning plate and alignment apparatus located in the wafer carrier, an image of the alignment indicia is displayed on a video monitor by the image sensor for comparison to a reference mark superimposed on the monitor for determining the relative position and orientation of the feed arm with respect to the wafer carrier. Some embodiments of the apparatus further include a distance detection device to measure the distance to the plate.
    Type: Application
    Filed: June 1, 2012
    Publication date: December 5, 2013
    Applicant: TAIWAN SEMICONDUCTOR MANUFACTURING CO., LTD.
    Inventors: Wen-Huang LIAO, Hsien-Mao HUANG
  • Patent number: 8593451
    Abstract: A method of generating a 3D complex octree map. A plurality of points, each having 3D location information, are detected from a range image. A space containing the detected plurality of points is represented using grids. If the points in a grid form a plane, the grid is not subdivided and planar information about the plane is stored. A space not forming a plane is subdivided, thereby enhancing the storage efficiency.
    Type: Grant
    Filed: February 8, 2011
    Date of Patent: November 26, 2013
    Assignees: Samsung Electronics Co., Ltd., Korea University Research and Business Foundation
    Inventors: Yeon-Ho Kim, Joon-Kee Cho, Jung-Hyun Han, Yonh-hyun Jo
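    The grid test in 8593451 can be sketched as a recursive subdivision in which a cell whose points fit a plane (checked here with an SVD fit and a residual tolerance) stores the plane instead of being subdivided; the tolerance, the depth cap, and the synthetic data are illustrative assumptions.
```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through the points: returns (centroid, unit normal, max residual)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    residual = float(np.abs((points - centroid) @ normal).max())
    return centroid, normal, residual

def build(points, origin, size, tol=0.01, max_depth=6, depth=0):
    """Store planar information for cells whose points are coplanar within tol;
    otherwise subdivide the cell into eight octants and recurse. Empty cells are dropped."""
    if len(points) == 0:
        return None
    if len(points) >= 3:
        centroid, normal, residual = fit_plane(points)
        if residual <= tol or depth >= max_depth:
            return {"leaf": True, "centroid": centroid, "normal": normal}
    else:
        return {"leaf": True, "points": points}
    half = size / 2.0
    children = {}
    for i in range(8):
        offset = np.array([(i >> 0) & 1, (i >> 1) & 1, (i >> 2) & 1]) * half
        lo = origin + offset
        mask = np.all((points >= lo) & (points < lo + half), axis=1)
        child = build(points[mask], lo, half, tol, max_depth, depth + 1)
        if child is not None:
            children[i] = child
    return {"leaf": False, "children": children}

rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 1.0, size=(400, 2))
floor = np.column_stack([xy, np.full(400, 0.2)])                 # coplanar points (z = 0.2)
bump = rng.uniform([0.4, 0.4, 0.4], [0.6, 0.6, 0.6], (50, 3))    # a non-planar cluster
tree = build(np.vstack([floor, bump]), origin=np.zeros(3), size=1.0)
print("root subdivided:", not tree["leaf"])
```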