Optical Patents (Class 901/47)
  • Patent number: 8082064
    Abstract: A robotic arm and control system includes a robotic arm which moves in response to one or more command signals. One or more “active” fiducials are located on the arm, each of which emits its own light. A 3D camera having an associated field-of-view is positioned such that at least one fiducial and a target object to be manipulated are in the FOV. To determine their spatial positions, the arm fiducials are activated and the target object is preferably illuminated with a scanning laser; the camera produces output signals which vary with the spatial locations of the fiducials and target object. A controller receives the output signals and uses the spatial position information as feedback to continuously guide the arm towards the target object. Multiple active fiducials may be employed, each having respective characteristics with which they can be differentiated.
    Type: Grant
    Filed: November 13, 2007
    Date of Patent: December 20, 2011
    Assignee: Elite Engineering Corporation
    Inventor: Robert L. Kay
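    A minimal sketch of the closed-loop guidance idea in the abstract above (not the patented implementation): the callables locate_fiducial, locate_target, and move_arm are hypothetical stand-ins for the 3D-camera output and the arm command interface, and the simple proportional step is an assumption.

```python
import numpy as np

def guide_arm_to_target(locate_fiducial, locate_target, move_arm,
                        gain=0.5, tolerance=1e-3, max_steps=1000):
    """Use fiducial/target positions from the camera as feedback to steer the arm.

    locate_fiducial, locate_target: callables returning 3-D positions (camera frame).
    move_arm: callable accepting a 3-D displacement command for the arm tip.
    """
    for _ in range(max_steps):
        fiducial = np.asarray(locate_fiducial(), dtype=float)
        target = np.asarray(locate_target(), dtype=float)
        error = target - fiducial               # spatial error used as feedback
        if np.linalg.norm(error) < tolerance:   # arm has reached the target
            return True
        move_arm(gain * error)                  # nudge the arm toward the target
    return False
```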
  • Patent number: 8079432
    Abstract: Configurations are provided for vehicular robots or other vehicles to provide shifting of their centers of gravity for enhanced obstacle navigation. Various head and neck morphologies are provided to allow positioning for various poses such as a stowed pose, observation poses, and inspection poses. Neck extension and actuator module designs are provided to implement various head and neck morphologies. Robot control network circuitry is also provided.
    Type: Grant
    Filed: January 5, 2010
    Date of Patent: December 20, 2011
    Assignee: iRobot Corporation
    Inventors: Timothy R. Ohm, Michael Bassett
  • Publication number: 20110301745
    Abstract: A system and associated method are provided for sorting parts. The system includes a conveyor system for receiving and circulating a plurality of randomly presented parts, a sorting buffer for accumulating selected parts from the plurality of randomly presented parts in an assigned buffer location, and a sequencing system for sequencing the accumulated selected parts.
    Type: Application
    Filed: August 12, 2011
    Publication date: December 8, 2011
    Inventors: James Clete Culp, Stanley E. Sankaran
  • Publication number: 20110299966
    Abstract: A retrofittable lifting apparatus for use on an articulated robotically controlled device is disclosed. The lifting apparatus includes a structure for mating the apparatus to the robotic device and a mechanism for the remotely controlled lifting of an ordnance or other object. Actuation of the lifting mechanism may be provided by existing sources of motion on the robotic device, or by additional sources of motion attached to the robotic device.
    Type: Application
    Filed: August 15, 2011
    Publication date: December 8, 2011
    Inventors: Grinnell More, Tyson Sawyer
  • Publication number: 20110301758
    Abstract: [OBJECT] To provide a method for controlling a robot arm that restrains vibration of the arm when the operating method is switched from teaching-playback control to feedback control. [SOLUTION] The robot arm is operated by a control method comprising the following steps, and vibration of the robot arm at the time of the control change is restrained by using a non-contact impedance control method: a step of moving the robot arm along a course decided beforehand, performed under teaching-playback control according to the instructions of a program stored in the control section of a control unit; and a step of recognizing the presence or absence of a workpiece by a workpiece recognition means provided on the arm.
    Type: Application
    Filed: October 7, 2009
    Publication date: December 8, 2011
    Applicant: HONDA MOTOR CO., LTD.
    Inventors: Ryo Nakajima, Gentoku Fujii
  • Publication number: 20110301759
    Abstract: A robot system that includes a robot and a remote station. The remote station may be a personal computer coupled to the robot through a broadband network. A user at the remote station may receive both video and audio from a camera and microphone of the robot, respectively. The remote station may include a display user interface that has a variety of viewable fields and selectable buttons.
    Type: Application
    Filed: August 1, 2011
    Publication date: December 8, 2011
    Inventors: Yulun Wang, Charles S. Jordan, Jonathan Southard, Marco Pinter
  • Publication number: 20110288964
    Abstract: In exemplary implementations of this invention, an input/output device (“Bulb”) is attached to an articulated, actuated robotic arm. The robotic arm can move the Bulb by translating it along three axes and by rotating it about the arm's base. In addition, the I/O device can rotate about its own vertical axis. The Bulb comprises at least a pico-projector, two cameras, a depth sensor and an onboard computer. The onboard computer controls actuators in the robotic arm and Bulb that move the Bulb. It also processes visual data captured by the cameras in order to recognize objects or events, and to respond to them. This response may include changing the position of the Bulb or changing the parameters of an image projected by the pico-projector.
    Type: Application
    Filed: May 24, 2011
    Publication date: November 24, 2011
    Applicant: MASSACHUSETTS INSTITUTE OF TECHNOLOGY
    Inventors: Natan Linder, Patricia Maes
  • Publication number: 20110288667
    Abstract: Provided is an industrial robot system which enables a reduction in an installation/adjustment period, and an increase in a no-error continuous operation period, and includes an action planning section (4) for temporary halts, an error-inducing-task restraining section (5), a section (6) for teaching task, an operation mastering section (7), a hand library (8), an optimum-task-operation generating section (11), a specific task library (9), an error-recovery-task teaching section (12), an error recovery library (10), a finger-eye-camera measurement section (32), a three-dimensional recognition section (33), a controller (30), a manipulator (31), and a universal hand contained in a manipulation device group (34).
    Type: Application
    Filed: February 10, 2010
    Publication date: November 24, 2011
    Applicants: KYOTO UNIVERSITY, MITSUBISHI ELECTRIC CORPORATION
    Inventors: Akio Noda, Haruhisa Okuda, Kenichi Tanaka, Tetsuo Sawaragi, Hiroshi Matsuhisa, Yasuyoshi Yokokouji, Hideo Utsuno, Masaharu Komori, Hajime Mizuyama, Hiroaki Nakanishi, Yukio Horiguchi, Takehisa Kouda, Kazuhiro Izui
  • Publication number: 20110286800
    Abstract: A method and apparatus for deployment and retrieval of ocean bottom seismic receivers. In one embodiment, the apparatus comprises a carrier containing a plurality of receivers attached to a remotely operated vehicle (ROV). The carrier comprises a frame in which is mounted a structure for seating and releasing said receivers. The structure may comprise a movable carousel or a movable conveyor or fixed parallel rails or a barrel. In the case of the barrel, the receivers are axially stacked therein. The structure is disposed to deliver said receivers to a discharge port on said frame, where the receivers are removable from said carrier. The apparatus includes a discharge mechanism for removing said receivers from said carrier.
    Type: Application
    Filed: August 1, 2011
    Publication date: November 24, 2011
    Applicant: Fairfield Industries Incorporated
    Inventors: James N. Thompson, Clifford H. Ray, Glenn D. Fisseler, Roger L. Fyffe
  • Publication number: 20110288684
    Abstract: A robot system includes a mobile robot having a controller executing a control system for controlling operation of the robot, a cloud computing service in communication with the controller of the robot, and a remote computing device in communication with the cloud computing service. The remote computing device communicates with the robot through the cloud computing service.
    Type: Application
    Filed: February 22, 2011
    Publication date: November 24, 2011
    Applicant: iRobot Corporation
    Inventors: Tim S. Farlow, Michael Rosenstein, Michael Halloran, Chikyung Won, Steven V. Shamlian, Mark Chiappetta
  • Publication number: 20110288417
    Abstract: A robot system that can move about two or more patient beds. The robot includes a monitor and an infrared camera that are coupled to a mobile platform. The robot also includes a controller that is programmed to autonomously move the mobile platform from one patient to another patient and process images captured by the infrared camera to determine if one or more of the patients needs assistance. By way of example, the robot can determine whether a patient is out of a bed, or in a position wherein they may fall out of the bed. The robot may be coupled to a remote station that allows an operator to move the robot and conduct a videoconference with the patient. The image captured by the robot's infrared camera can be utilized to analyze the patient's blood flow. The robot can also be utilized to perform neurological analysis.
    Type: Application
    Filed: May 19, 2011
    Publication date: November 24, 2011
    Applicant: INTOUCH TECHNOLOGIES, INC.
    Inventors: Marco Pinter, Fuji Lai, Yulun Wang, H. Neal Reynolds
  • Publication number: 20110282483
    Abstract: Automated positioning and alignment methods and systems for aircraft structures use anthropomorphous robots with six degrees of freedom to carry the aero structure parts during the positioning and alignment. The parts and structures (if any) supporting the parts are treated as robot tools.
    Type: Application
    Filed: November 12, 2010
    Publication date: November 17, 2011
    Applicants: ITA - Instituto Tecnologico de Aeronautica, EMBRAER - Empresa Brasileira de Aeronautica SA
    Inventors: Marcos Leandro Simonetti, Luis Gonzaga Trabasso
  • Publication number: 20110282492
    Abstract: A method of controlling a robot system includes the steps of providing a tool supported by a moveable mechanism of the robot system, providing a workpiece supported by a holder, generating an image of the workpiece, extracting a data from the image, the data relating to a feature of the workpiece, generating a continuous three-dimensional path along the workpiece using data extracted from the image, and moving the tool along the path.
    Type: Application
    Filed: February 3, 2010
    Publication date: November 17, 2011
    Inventors: Ken Krause, Bruce E. Coldren, Edward Roney, Steven Prehn, Michael M. Sharpe, Claude Dinsmoor
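    As a rough illustration of the last two steps above (generating a continuous three-dimensional path from extracted feature points and moving the tool along it), here is a hedged sketch; the ordered 3-D feature points and the move_tool_to callable are assumptions, and the patent does not specify linear resampling.

```python
import numpy as np

def path_from_feature_points(points_3d, step=1.0):
    """Resample ordered 3-D feature points into a path with roughly uniform spacing."""
    pts = np.asarray(points_3d, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)   # segment lengths
    s = np.concatenate(([0.0], np.cumsum(seg)))          # arc length at each point
    samples = np.arange(0.0, s[-1], step)                # evenly spaced arc lengths
    return np.column_stack([np.interp(samples, s, pts[:, i]) for i in range(3)])

def move_tool_along(path, move_tool_to):
    """Step the tool through the resampled waypoints one at a time."""
    for waypoint in path:
        move_tool_to(waypoint)
```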
  • Publication number: 20110276179
    Abstract: An imaging platform system that provides integrated navigation capabilities for surgical guidance. The system can include two robotic arm systems, one robotic arm system holding an imaging source, and the other holding an imaging sensor. These robotic arm systems are able to move and provide three-dimensional tomographic scans, static radiographic images, and dynamic fluoroscopic image sequences. A third robotic arm system can be included in the imaging platform system as a surgeon guided tool-holder to accurately implement an image-guided surgical plan. The robotic systems can manipulate imaging and surgical components into and out of the operative field as needed, enhancing the choreography between a surgical team and assistive technology. A handle can be included as part of a manual positioning control subsystem. The handle can be mounted to an imaging robotic system above and/or below an operating table, and also can be mounted to a tool-holding robotic system.
    Type: Application
    Filed: October 14, 2009
    Publication date: November 10, 2011
    Applicant: University of Florida Research Foundation, Inc.
    Inventors: Scott Arthur Banks, Frank J. Bova
  • Patent number: 8055382
    Abstract: Disclosed is a method, apparatus, and medium for estimating a pose of a moving robot using a particle filter.
    Type: Grant
    Filed: September 14, 2007
    Date of Patent: November 8, 2011
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Hyeon Myeong, Seok-won Bang, Dong-geon Kong, Su-jinn Lee
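    The abstract says only that a particle filter is used; the sketch below shows a generic predict/update/resample cycle for a planar (x, y, theta) pose, with placeholder Gaussian motion and measurement models that are not taken from the patent.

```python
import numpy as np

def particle_filter_step(particles, weights, control, measurement,
                         motion_noise=0.05, meas_noise=0.1):
    """One generic predict/update/resample cycle for (x, y, theta) pose particles.

    particles: (N, 3) pose hypotheses; weights: (N,) importance weights.
    control:   (dx, dy, dtheta) odometry increment since the last step.
    measurement: observed (x, y) position from some external sensor.
    """
    n = len(particles)
    # Predict: apply the odometry increment with added motion noise.
    particles = particles + np.asarray(control) \
        + np.random.normal(0.0, motion_noise, particles.shape)
    # Update: reweight particles by how well they explain the measurement.
    d = np.linalg.norm(particles[:, :2] - np.asarray(measurement), axis=1)
    weights = weights * np.exp(-0.5 * (d / meas_noise) ** 2)
    total = weights.sum()
    weights = np.full(n, 1.0 / n) if total == 0.0 else weights / total
    # Resample: draw particles in proportion to their weights.
    idx = np.random.choice(n, size=n, p=weights)
    return particles[idx], np.full(n, 1.0 / n)
```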
  • Patent number: 8050797
    Abstract: An automation equipment control system comprises a general purpose computer with a general purpose operating system in electronic communication with a real-time computer subsystem. The general purpose computer includes a program execution module to selectively start and stop processing of a program of equipment instructions and to generate a plurality of move commands. The real-time computer subsystem includes a move command data buffer for storing the plurality of move commands, a move module linked to the data buffer for sequentially processing the moves and calculating a required position for a mechanical joint. The real-time computer subsystem also includes a dynamic control algorithm in software communication with the move module to repeatedly calculate a required actuator activation signal from a joint position feedback signal.
    Type: Grant
    Filed: July 26, 2005
    Date of Patent: November 1, 2011
    Assignee: C.H.I. Development Mgnt. Ltd. XXIV, LLC
    Inventor: John R. Lapham
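    A toy sketch of the split the abstract describes: a general-purpose side pushes move commands into a buffer while a "real-time" side drains the buffer and closes a feedback loop on a single simulated joint. The class, gains, and first-order plant model are illustrative assumptions, not the patented design.

```python
import queue
import threading
import time

class RealTimeSubsystem:
    """Move-command buffer plus a simple control loop on one simulated joint."""

    def __init__(self, kp=10.0):
        self.moves = queue.Queue()      # move command data buffer
        self.kp = kp                    # gain of the dynamic control algorithm
        self.joint_position = 0.0       # joint position feedback (simulated)

    def enqueue_move(self, target):     # called by the program execution module
        self.moves.put(target)

    def run(self, cycle=0.001, steps_per_move=500):
        while True:
            target = self.moves.get()   # next required joint position
            if target is None:          # sentinel: stop processing
                return
            for _ in range(steps_per_move):
                error = target - self.joint_position
                actuator_signal = self.kp * error                # control output
                self.joint_position += actuator_signal * cycle   # plant stand-in
                time.sleep(cycle)

if __name__ == "__main__":
    rt = RealTimeSubsystem()
    worker = threading.Thread(target=rt.run)
    worker.start()
    for target in (0.5, 1.0, 0.25):     # general-purpose side streams moves
        rt.enqueue_move(target)
    rt.enqueue_move(None)
    worker.join()
    print(round(rt.joint_position, 3))  # ends near the last commanded position
```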
  • Publication number: 20110264266
    Abstract: A robot safety system configured to protect humans in the vicinity of a working robot (1, 11, 21, 31) against harmful impacts by said robot (1, 11, 21, 31), said safety system comprising a sensor system (3, 13, 23) and a safety controller (4, 14, 24) configured to establish an impact risk profile of the robot (1, 11, 21, 31) and deliver an operating signal to a robot controller (2, 12, 22) based on said impact risk profile, wherein the safety controller (4, 14, 24) is configured to establish the impact risk profile based on stored data and input signals, and that the stored data and input signals comprise stored impact data, stored data related to the path of the robot (1, 11, 21, 31), and signals from the sensor system of events in the vicinity of the robot (1, 11, 21, 31), such as a detected human (P1, P11, P21, P22, P31, P32) in the vicinity of the robot (1, 11, 21, 31).
    Type: Application
    Filed: December 3, 2008
    Publication date: October 27, 2011
    Applicant: ABB RESEARCH LTD.
    Inventor: Soenke Kock
  • Publication number: 20110261198
    Abstract: In a data transmission method, a data transfer module 40 for transmitting an image frame of a predetermined image format and an audio frame of a predetermined audio format is used. The data transmission method includes: a step of acquiring image data captured by a pair of cameras 8, together with a time stamp indicating an imaging time of the image data, and generating an image frame; a step of acquiring relevant data acquired at the imaging time, together with the time stamp indicating the imaging time; and a step of acquiring the image frame and the relevant data that have the common time stamp, and transmitting, using the data transfer module 40, a relevant frame together with the image frame converted in the image format, the relevant frame being composed of the relevant data stored in an audio data storage area in the audio format.
    Type: Application
    Filed: April 18, 2011
    Publication date: October 27, 2011
    Applicant: HONDA MOTOR CO., LTD.
    Inventor: Yoko SAITO
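    The core of the method is pairing an image frame with relevant data that carries the same time stamp before handing both to the transfer module; a minimal sketch of that pairing step follows (the tuple layout and function name are assumptions).

```python
def pair_frames_by_timestamp(image_frames, relevant_data):
    """Match image frames and relevant-data records that share a time stamp.

    image_frames:  iterable of (timestamp, image_bytes)
    relevant_data: iterable of (timestamp, data_bytes)
    Returns (image_frame, relevant_frame) pairs; the relevant frame would be
    packed into the audio-format slot of the transfer module.
    """
    data_by_time = {ts: payload for ts, payload in relevant_data}
    pairs = []
    for ts, image in image_frames:
        if ts in data_by_time:                     # common time stamp found
            pairs.append(((ts, image), (ts, data_by_time[ts])))
    return pairs
```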
  • Publication number: 20110251717
    Abstract: The method for picking up workpieces includes: storing a representative partial shape, at least one holding position in the representative partial shape, and a preference rank for each of the holding positions; obtaining image information by sensing the accumulated workpieces; recognizing exposed portions of the workpieces by performing edge detection on the image information; selecting at least one of the exposed portions including the representative partial shape as a selected portion; detecting at least one coinciding portion that coincides with the representative partial shape in the selected portions; determining an optimal holding position from at least one of the holding positions included in the coinciding portion as a determined holding position, based on the preference rank; and transmitting a holding command to hold the determined holding position.
    Type: Application
    Filed: September 3, 2009
    Publication date: October 13, 2011
    Applicant: HONDA MOTOR CO., LTD.
    Inventor: Makoto Furukawa
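    The selection step above (choosing the optimal holding position among coinciding portions according to the stored preference rank) could look roughly like the sketch below; the dictionary layout is an assumption and a lower rank number is taken to mean a more preferred grip.

```python
def choose_holding_position(coinciding_portions):
    """Pick the best-ranked holding position among detected coinciding portions.

    coinciding_portions: list of dicts such as
        {"portion_id": 3, "holding_positions": [(rank, (x, y, z)), ...]}
    """
    best = None
    for portion in coinciding_portions:
        for rank, position in portion["holding_positions"]:
            if best is None or rank < best[0]:     # keep the most preferred rank
                best = (rank, position, portion["portion_id"])
    if best is None:
        raise ValueError("no holding position found")
    rank, position, portion_id = best
    return {"portion": portion_id, "position": position, "rank": rank}
```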
  • Patent number: 8035687
    Abstract: An image processing apparatus includes a unit that reads, from first objects, information items for determining program procedures used to detect the first objects by image processing; a unit that selects, from the program procedures, program procedures corresponding to the information items and determines an order of execution of the selected program procedures; a unit that acquires an initial image including images corresponding to the first objects, the initial image being used for executing a first program procedure of a first order of execution included in the selected program procedures; a unit that detects, using the initial image, at least one of the first objects by executing the first program procedure corresponding to that object; and a unit that generates a post-removal image obtained by removing, from the initial image, image data corresponding to one of the images, which is in a first area corresponding to at least one detected object.
    Type: Grant
    Filed: July 22, 2008
    Date of Patent: October 11, 2011
    Assignee: Kabushiki Kaisha Toshiba
    Inventor: Manabu Nishiyama
  • Publication number: 20110245973
    Abstract: A robotic system that includes a robot and a remote station. The remote station can generate control commands that are transmitted to the robot through a broadband network. The control commands can be interpreted by the robot to induce action such as robot movement or focusing a robot camera. The robot can generate reporting commands that are transmitted to the remote station through the broadband network. The reporting commands can provide positional feedback or system reports on the robot.
    Type: Application
    Filed: June 18, 2010
    Publication date: October 6, 2011
    Inventors: Yulun Wang, Charles S. Jordan, Marco Pinter, Jonathan Southard
  • Publication number: 20110243701
    Abstract: An optical position detecting device includes a plurality of light source sections which emits detection light, a light detection section which receives the detection light reflected by a target object located in an emitting space of the detection light, a light source driving section which turns on some light source sections among the plurality of light source sections in a first period and turns on, in a second period, light source sections different from the light source sections turned on in the first period, and a position detecting section which detects the position of the target object on the basis of a light-receiving result of the light detection section in the first period and the second period. Each of the light source sections includes a plurality of light-emitting elements arrayed in a direction intersecting the direction of the optical axis of the detection light.
    Type: Application
    Filed: April 1, 2011
    Publication date: October 6, 2011
    Applicant: SEIKO EPSON CORPORATION
    Inventor: Kanechika KIYOSE
  • Publication number: 20110243702
    Abstract: In an optical position detecting device, a position detecting section detects the position of a target object on the basis of a result obtained by receiving detection light, which is emitted from a light source section and reflected by the target object, using a light detection section. As seen from an emitting direction of the detection light, the light detection section is located inside a region surrounded by a closed circuit passing through a plurality of the light source sections or inside a region pinched by the plurality of light source sections. The plurality of light source sections has a first light-emitting element, and a second light-emitting element located closer to the light detection section side than the first light-emitting element. The light source driving section alternately turns on the first light-emitting element and the second light-emitting element.
    Type: Application
    Filed: April 1, 2011
    Publication date: October 6, 2011
    Applicant: SEIKO EPSON CORPORATION
    Inventors: Daisuke NAKANISHI, Kanechika KIYOSE
  • Publication number: 20110245974
    Abstract: There is provided a robot device including an instruction acquisition unit that acquires an order for encouraging a robot device to establish joint attention on a target from a user, a position/posture estimation unit that estimates a position and posture of an optical indication device, which is operated by the user to indicate the target by irradiation of a beam, in response to acquisition of the order, and a target specifying unit that specifies a direction of the target indicated by irradiation of the beam based on an estimation result of the position and posture and specifies the target on an environment map representing a surrounding environment based on a specifying result of the direction.
    Type: Application
    Filed: March 9, 2011
    Publication date: October 6, 2011
    Applicant: SONY CORPORATION
    Inventors: Kenta KAWAMOTO, Satoru Shimizu, Toshimitsu Tsuboi, Yasuhiro Suto
  • Publication number: 20110234758
    Abstract: There is provided a robot device including an irradiation unit that irradiates pattern light to an external environment, an imaging unit that acquires an image by imaging the external environment, an external environment recognition unit that recognizes the external environment, an irradiation determining unit that controls the irradiation unit to be turned on when it is determined that irradiation of the pattern light is necessary based on an acquisition status of the image, and a light-off determining unit that controls the irradiation unit to be turned off when it is determined that irradiation of the pattern light is unnecessary or that irradiation of the pattern light is necessary to be forcibly stopped, based on the external environment.
    Type: Application
    Filed: March 23, 2011
    Publication date: September 29, 2011
    Applicant: Sony Corporation
    Inventors: Toshimitsu TSUBOI, Kenta Kawamoto, Yasunori Kawanami, Atsushi Miyamoto
  • Publication number: 20110224687
    Abstract: A surgical instrument is provided, including: at least one articulatable arm having a distal end, a proximal end, and at least one joint region disposed between the distal and proximal ends; an optical fiber bend sensor provided in the at least one joint region of the at least one articulatable arm; a detection system coupled to the optical fiber bend sensor, said detection system comprising a light source and a light detector for detecting light reflected by or transmitted through the optical fiber bend sensor to determine a position of at least one joint region of the at least one articulatable arm based on the detected light reflected by or transmitted through the optical fiber bend sensor; and a control system comprising a servo controller for effectuating movement of the arm.
    Type: Application
    Filed: March 16, 2011
    Publication date: September 15, 2011
    Applicant: Intuitive Surgical Operations, Inc.
    Inventors: David Q. Larkin, David C. Shafer
  • Publication number: 20110222995
    Abstract: A workpiece in a container is held by a robot based on a result of detection of shape information in the container by a shape sensor, a holding condition of the workpiece held by the robot is inspected by an inspection device, and the workpiece is transferred to a subsequent step by the robot when the inspection device has determined that the holding condition of the workpiece is acceptable. When the inspection device has determined that the holding condition of the workpiece is unacceptable, the held workpiece is placed on a temporary placement table, the shape of the workpiece is again detected by detecting the workpiece using the shape sensor, and the workpiece is held and transferred to the subsequent step by the robot based on a result of the detection.
    Type: Application
    Filed: January 6, 2011
    Publication date: September 15, 2011
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Toshimitsu Irie, Shinji Murai
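    The workflow in the abstract above (grip, inspect, and retry via the temporary placement table when the grip is unacceptable) can be summarized as the following sketch, where every argument is a hypothetical callable standing in for the sensor, robot, or inspection device.

```python
def pick_and_transfer(detect_shape, hold, inspect, place_on_table, transfer):
    """Pick a workpiece, verify the grip, and retry once via the temporary table.

    `inspect` is assumed to return True when the holding condition is acceptable.
    """
    shape = detect_shape()      # shape sensor looks into the container
    hold(shape)                 # robot grips based on that detection
    if inspect():               # inspection device checks the holding condition
        transfer()              # acceptable: pass the workpiece to the next step
        return
    place_on_table()            # unacceptable: set the piece down
    shape = detect_shape()      # detect the piece again on the placement table
    hold(shape)                 # re-grip from the known placement
    transfer()
```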
  • Publication number: 20110224689
    Abstract: A surgical instrument is provided, including: at least one articulatable arm having a distal end, a proximal end, and at least one joint region disposed between the distal and proximal ends; an optical fiber bend sensor provided in the at least one joint region of the at least one articulatable arm; a detection system coupled to the optical fiber bend sensor, said detection system comprising a light source and a light detector for detecting light reflected by or transmitted through the optical fiber bend sensor to determine a position of at least one joint region of the at least one articulatable arm based on the detected light reflected by or transmitted through the optical fiber bend sensor; and a control system comprising a servo controller for effectuating movement of the arm.
    Type: Application
    Filed: March 16, 2011
    Publication date: September 15, 2011
    Applicant: Intuitive Surgical Operations, Inc.
    Inventors: David Q Larkin, David C. Shafer
  • Publication number: 20110218675
    Abstract: A robot system (10) includes: a processing section (32) that calculates positional information of a workpiece with respect to a visual sensor (13); a clock (35) that a robot controlling section (31) and the processing section access to check the present time; a first storing section (33) that sequentially stores first times at a predetermined cycle in combination with positional information items of an arm tip at the first times, respectively; a second storing section (34) that stores a second time when the visual sensor measures the workpiece; an arm tip position calculating section (41) that calculates positional information of the arm tip when the visual sensor measures the workpiece based on the second time, at least two of the first times before and after the second time among the first times and the positional information items of the arm tip corresponding to the respective first times; and a workpiece position calculating section (42) that calculates positional information of the grasped workpiece wi
    Type: Application
    Filed: January 27, 2011
    Publication date: September 8, 2011
    Applicant: FANUC CORPORATION
    Inventors: Kazunori BAN, Fumikazu Warashina, Makoto Yamada, Yuuta Namiki
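    The arm tip position at the visual-sensor measurement time (the "second time") is reconstructed from the logged positions just before and after it; a linear interpolation between the two nearest logged samples is one straightforward reading of that step, sketched below.

```python
import numpy as np

def arm_tip_at(second_time, first_times, tip_positions):
    """Interpolate the arm tip position at the sensor measurement time.

    first_times:   sorted 1-D array of logging time stamps (the 'first times').
    tip_positions: (N, 3) arm tip positions logged at those times.
    """
    t = np.asarray(first_times, dtype=float)
    p = np.asarray(tip_positions, dtype=float)
    i = np.searchsorted(t, second_time)            # first logged time after it
    if i == 0 or i == len(t):
        raise ValueError("measurement time outside the logged interval")
    alpha = (second_time - t[i - 1]) / (t[i] - t[i - 1])
    return (1.0 - alpha) * p[i - 1] + alpha * p[i]
```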
  • Patent number: 8010232
    Abstract: A technique is provided for wholly recognizing the surrounding environment by excluding the unknown environment that arises when parts of the robot's body hinder the robot's sight during operations. The robot of the present invention is provided with a body trunk including a head and a torso, at least one connected member that is connected to the body trunk by a joint in which a driving mechanism is provided, a body trunk side camera that is arranged on the body trunk, and a connected member side camera that is arranged on the connected member. Further, the robot is provided with a composite image creation unit that creates a composite image from a body trunk side image taken by the body trunk side camera and a connected member side image taken by the connected member side camera, such that a part of the body trunk side image is replaced with a part of the connected member side image so as to exclude the connected member from the body trunk side image.
    Type: Grant
    Filed: February 15, 2007
    Date of Patent: August 30, 2011
    Assignee: Toyota Jidosha Kabushiki Kaisha
    Inventors: Yuichiro Nakajima, Haeyeon Lee, Hideki Nomura
  • Patent number: 8010229
    Abstract: Provided are a method and apparatus for ensuring that a cleaning robot returns to a charge station. The method includes the steps of: (a) measuring a battery usable time, a running speed, and an actual return distance of a cleaning robot during a cleaning operation; (b) calculating an allowable return distance on the basis of the battery usable time and the running speed; (c) comparing the actual return distance with the allowable return distance; and (d) returning the cleaning robot to the charge station when the actual return distance is larger than the allowable return distance as a result of the comparison. Therefore, it is possible to prevent the cleaning robot from failing to return to the charge station, thereby providing convenience to the user.
    Type: Grant
    Filed: November 1, 2007
    Date of Patent: August 30, 2011
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Hyung Joo Kim, Chang Gyu Lim, Sung Ho Im, Dong Sun Lim
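    The return decision reduces to comparing the remaining return distance with a budget derived from battery time and speed; a one-function sketch of that comparison follows (the safety-margin parameter is an added assumption).

```python
def should_return_now(battery_usable_time, running_speed,
                      actual_return_distance, safety_margin=0.9):
    """Return True when the robot should head back to the charge station.

    allowable_return_distance = battery_usable_time * running_speed, optionally
    derated by a safety margin below 1.0 to trigger the return a little early.
    """
    allowable_return_distance = battery_usable_time * running_speed * safety_margin
    return actual_return_distance > allowable_return_distance
```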
  • Publication number: 20110208355
    Abstract: Motion information of a robot arm stored in a motion information database is acquired. A person manipulates the robot arm, and correction motion information at the time of the motion correction is acquired. An acquiring unit acquires environment information. A motion correction unit corrects the motion information while the robot arm is in motion. A control rule generating unit generates a control rule for allowing the robot arm to automatically operate based on the corrected motion information and the acquired environment information. The motion of the robot arm is controlled based on the generated control rule.
    Type: Application
    Filed: April 28, 2011
    Publication date: August 25, 2011
    Inventor: Yuko TSUSAKA
  • Publication number: 20110208358
    Abstract: A system for maintenance and inspection of structures located in hard-to-reach places, using a remote controlled arm. The system includes an arrangement for fixing the remote controlled arm to the structure; the remote controlled arm consists of at least two joints, has the ability to change working equipment, has a camera, and is controlled from a control centre.
    Type: Application
    Filed: July 2, 2009
    Publication date: August 25, 2011
    Inventors: Arve Gjelsten, Jon Anders Haegstad, Stale Karlsen, Geir Ingar Bjornsen, Bernt Schjetne, Martin Hasle
  • Publication number: 20110199476
    Abstract: A metrology system has an elongate stationary camera pixel array facing a workpiece transit path of a robot with a field of view corresponding to a workpiece diameter and extending transverse to the transit path portion, and a stationary elongate light emitting array generally parallel to the pixel array. An image control processor causes the camera to capture successive image frames while the robot is moving the workpiece through the transit path.
    Type: Application
    Filed: February 17, 2010
    Publication date: August 18, 2011
    Applicant: Applied Materials, Inc.
    Inventors: Abraham Ravid, Todd Egan, Karen Lingel, Mitchell DiSanto, Hari Kishore Ambal, Edward Budiarto
  • Publication number: 20110200970
    Abstract: A method of placing a dental implant analog in a physical model for use in creating a dental prosthesis is provided. The physical model, which is usually based on an impression of the patient's mouth or a scan of the patient's mouth, is prepared. The model is scanned. A three-dimensional computer model of the physical model is created and is used to develop the location of the dental implant. A robot then modifies the physical model to create an opening for the implant analog. The robot then places the implant analog within the opening at the location dictated by the three-dimensional computer model.
    Type: Application
    Filed: March 22, 2011
    Publication date: August 18, 2011
    Applicant: Biomet 3i, LLC
    Inventors: Bruce Berckmans, III, Zachary B. Suttin, Dan P. Rogers, T. Tait Robb, Alexis C. Goolik
  • Publication number: 20110196536
    Abstract: The present invention relates to an overhead transmission line inspection robot and system for inspecting transmission line components and right of way conditions. The line inspection robot includes at least one drive system for propelling the robot along a line, a platform adapted to pivot relative to the at least one drive system, and a control system adapted to control the robot.
    Type: Application
    Filed: February 10, 2011
    Publication date: August 11, 2011
    Applicant: ELECTRIC POWER RESEARCH INSTITUTE, INC.
    Inventors: Andrew John Phillips, Mark Major, Glynn R. Bartlett
  • Publication number: 20110196534
    Abstract: A system for inspecting an underground conduit from within comprises a data acquisition subsystem configured to be placed within the conduit and to move along at least a portion of the conduit to obtain data regarding the conduit. The system comprises a data storage subsystem configured to be placed within the conduit and to move along the conduit. The data storage subsystem receives and stores at least a portion of the data from the data acquisition subsystem for retrieval after the data acquisition subsystem has moved along the conduit.
    Type: Application
    Filed: December 28, 2010
    Publication date: August 11, 2011
    Applicant: Sewervue Technology Corp.
    Inventors: Csaba Ekes, Boriszlav Neducza
  • Publication number: 20110196535
    Abstract: The present invention relates to an overhead transmission line inspection robot and system for inspecting transmission line components and right of way conditions. The overhead transmission line inspection robot includes a communications and control system adapted to control the robot and transmit information and a drive system for propelling the robot along a shield wire to enable inspection over a large area. The robot further includes a camera adapted to inspect right of way and component conditions; a light detection and ranging (LiDar) sensor adapted to measure conductor position, vegetation, and nearby structures; and a global positioning system adapted to identify the robot's position and speed.
    Type: Application
    Filed: February 10, 2011
    Publication date: August 11, 2011
    Applicant: ELECTRIC POWER RESEARCH INSTITUTE, INC.
    Inventors: Andrew John Phillips, J. Mark Major, Glynn R. Bartlett
  • Patent number: 7996114
    Abstract: A workpiece picking device and method for reducing the cycle time of a picking operation of workpieces, by omitting or reducing the movement of a robot manipulator when an image of the workpieces is captured. An image processor of the picking device includes a camera controlling part for reading image data from a camera, a memory for storing the image data, a workpiece detecting part for extracting one or more images from the memory and detecting one or more workpieces in the image, and a workpiece selecting part for selecting a workpiece to be picked among the workpieces detected by the workpiece detecting part. The image processor further includes a stacked condition judging part for determining whether the condition of the stacked workpieces in a container is changed.
    Type: Grant
    Filed: May 25, 2007
    Date of Patent: August 9, 2011
    Assignee: Fanuc Ltd
    Inventors: Kazunori Ban, Keisuke Watanabe
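    The cycle-time saving described above comes from reusing a previously captured image when the stacked condition of the container has not changed, so the manipulator need not move clear of the camera again; a hedged sketch of that decision follows, with all callables hypothetical.

```python
def next_pick(capture_image, stack_changed, detect_workpieces,
              select_workpiece, cached_image=None):
    """Select the next workpiece, re-imaging the container only when needed.

    Returns (selected_workpiece, image_used_for_detection) so the caller can
    keep the image for the next cycle.
    """
    if cached_image is None or stack_changed():
        image = capture_image()      # stack changed: capture a fresh image
    else:
        image = cached_image         # stack unchanged: reuse the stored image
    workpieces = detect_workpieces(image)
    return select_workpiece(workpieces), image
```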
  • Publication number: 20110190936
    Abstract: The present invention relates to a portable power tool (1) comprising a sensor unit adapted to detect a working mark (17, 18) provided on a work piece to be processed using the portable power tool (1), said mark defining a target position of the portable power tool (1), and a control unit for determining a deviation of the actual position from the target position based on the signal output of the sensor unit, wherein a positioning unit is provided that is adapted to make an automatic correction of the portable power tool (1) from the actual position to the target position, and/or a signal output unit is provided for issuing signals that represent the determined deviation to a user for the purposes of making manual corrections.
    Type: Application
    Filed: May 28, 2009
    Publication date: August 4, 2011
    Applicant: Robert Bosch GmbH
    Inventors: Thilo Koeder, Joachim Platzer, Ulli Hoffmann, Jan Koegel
  • Publication number: 20110184558
    Abstract: The invention relates to a robot (R, 70) and to a method for controlling a robot (R, 70). The distance (d) between an object (P) and the robot (R, 70) and/or the derivative thereof or a first motion of the object (P) is detected by means of a non-contact distance sensor (20, 20′) arranged in or on a robot arm (M) of the robot (R, 70) and/or on or in an end effector (19′) fastened on the robot arm (M). The robot arm (M) is moved based on the first motion detected by means of the distance sensor (20, 20′), a target force or a target torque to be applied by the robot (R) is determined based on the distance (d) detected between the object (P) and the robot (R), and/or a function of the robot (R) or a parameterization of a function of the robot (R) is triggered based on the first motion detected and/or a target distance between the object (P) and the robot (R) and/or the derivative thereof detected by means of the distance sensor (20, 20′).
    Type: Application
    Filed: August 19, 2009
    Publication date: July 28, 2011
    Applicant: KUKA LABORATORIES GMBH
    Inventors: Dirk Jacob, Tobias Ortmaier
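    One of the behaviours claimed above is deriving a target force from the measured object distance; the mapping itself is left open in the abstract, so the linear ramp below is purely an illustrative assumption.

```python
def target_force_from_distance(distance, full_force=10.0, near=0.05, far=0.30):
    """Map a measured object distance (m) to a target contact force (N).

    Beyond `far` no force is demanded; inside `near` the full force is demanded;
    in between the demand ramps linearly.
    """
    if distance >= far:
        return 0.0
    if distance <= near:
        return full_force
    return full_force * (far - distance) / (far - near)
```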
  • Publication number: 20110174108
    Abstract: An elongate robotic arm comprising articulated segments and a channel that extends along the longitudinal axis of the arm and contains a stiffening member which includes a sensor for measuring the shape of the arm. Using the central channel for this purpose improves the ease and accuracy of shape measurement.
    Type: Application
    Filed: December 27, 2010
    Publication date: July 21, 2011
    Inventors: Andrew Crispin Graham, Robert Oliver Buckingham
  • Publication number: 20110172822
    Abstract: A mobile robot guest for interacting with a human resident performs a room-traversing search procedure prior to interacting with the resident, and may verbally query whether the resident being sought is present. Upon finding the resident, the mobile robot may facilitate a teleconferencing session with a remote third party, or interact with the resident in a number of ways. For example, the robot may carry on a dialogue with the resident, reinforce compliance with medication or other schedules, etc. In addition, the robot incorporates safety features for preventing collisions with the resident; and the robot may audibly announce and/or visibly indicate its presence in order to avoid becoming a dangerous obstacle. Furthermore, the mobile robot behaves in accordance with an integral privacy policy, such that any sensor recording or transmission must be approved by the resident.
    Type: Application
    Filed: July 23, 2010
    Publication date: July 14, 2011
    Inventors: Andrew Ziegler, Andrew Jones, Clara Vu, Matthew Cross, Ken Sinclair, Tony L. Campbell
  • Publication number: 20110155067
    Abstract: The invention provides a system for connecting a teat cup to a teat, and a teat cup of this type. A teat cup of this type includes an optical sensor which looks from the inside outwards through the opening of the teat cup. An image of, for example, a teat can thus be produced, and said image can be used very simply to assist a robot arm in guiding the teat cup to the teat.
    Type: Application
    Filed: March 11, 2011
    Publication date: June 30, 2011
    Applicant: LELY PATENT N.V.
    Inventor: Karel Van Den Berg
  • Publication number: 20110153077
    Abstract: Provided is a technique that enables a robot to be remotely controlled (by a server) and enables a robot component to access an external component (a component of a server), so that heterogeneous robots operating on the basis of different component models can cooperate. A component integration apparatus for collaboration of heterogeneous robots according to an embodiment of the present invention comprises: a standard interface unit that provides a common standard interface for controlling the components that control the individual functions of the robot; an adapter component that transmits commands to enable external components to call the components through the standard interface unit; and a proxy component that transmits commands to enable the components to call the external components through the standard interface unit.
    Type: Application
    Filed: December 15, 2010
    Publication date: June 23, 2011
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Young-Ho SUH, Kang-Woo LEE, Hyun KIM
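    The adapter/proxy arrangement around a common standard interface can be illustrated with a small sketch; the class and method names below are hypothetical and only mirror the roles named in the abstract (standard interface unit, adapter component, proxy component).

```python
class StandardInterface:
    """Common call format shared by robot-side and server-side components."""
    def call(self, component, command, **params):
        return component.handle(command, **params)

class AdapterComponent:
    """Lets an external (server-side) component drive a native robot component."""
    def __init__(self, interface, robot_component):
        self.interface, self.robot_component = interface, robot_component
    def handle(self, command, **params):
        return self.interface.call(self.robot_component, command, **params)

class ProxyComponent:
    """Lets a native robot component call out to an external component."""
    def __init__(self, interface, external_component):
        self.interface, self.external_component = interface, external_component
    def handle(self, command, **params):
        return self.interface.call(self.external_component, command, **params)

class GripperComponent:
    """Stand-in native component exposing the `handle` call the interface expects."""
    def handle(self, command, **params):
        return f"gripper: {command} {params}"

if __name__ == "__main__":
    iface = StandardInterface()
    adapter = AdapterComponent(iface, GripperComponent())
    print(adapter.handle("close", force=5))   # external caller -> robot component
```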
  • Publication number: 20110144805
    Abstract: A navigational control system for altering movement activity of a robotic device operating in a defined working area, comprising a transmitting subsystem integrated in combination with the robotic device, the transmitting subsystem comprising means for emitting a number of directed beams, each directed beam having a predetermined emission pattern, and a receiving subsystem functioning as a base station that includes a navigation control algorithm that defines a predetermined triggering event for the navigational control system and a set of detection units positioned within the defined working area in a known spaced-apart relationship, the set of detection units being configured and operative to detect one or more of the directed beams emitted by the transmitting system; and wherein the receiving subsystem is configured and operative to process the one or more detected directed beams under the control of the navigational control algorithm to determine whether the predetermined triggering event has occurred, an
    Type: Application
    Filed: December 13, 2010
    Publication date: June 16, 2011
    Inventors: Mark J. Chiappetta, Joseph L. Jones
  • Publication number: 20110141251
    Abstract: A set of images is acquired of a scene by a camera. The scene includes a moving object, and a relative difference of a motion of the camera and a motion of the object is substantially zero. Statistical properties of pixels in the images are determined, and a statistical method is applied to the statistical properties to identify pixels corresponding to the object.
    Type: Application
    Filed: December 10, 2009
    Publication date: June 16, 2011
    Inventors: Tim K. Marks, Ashok Veeraraghavan, Yuichi Taguchi
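    The abstract says only that "a statistical method" is applied to per-pixel statistics; per-pixel temporal variance with a threshold is used below as one plausible stand-in. Because the camera moves with the object, object pixels stay nearly constant across the image set while background pixels vary.

```python
import numpy as np

def segment_comoving_object(images, variance_threshold=50.0):
    """Return a boolean mask of pixels that are nearly constant across the set.

    images: equal-sized grayscale frames (2-D arrays). With zero relative motion
    between camera and object, the object appears static in the frames (low
    temporal variance) while the background streaks past (high variance).
    """
    stack = np.stack([np.asarray(im, dtype=float) for im in images], axis=0)
    variance = stack.var(axis=0)            # per-pixel variance over time
    return variance < variance_threshold
```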
  • Publication number: 20110135189
    Abstract: In a plurality of swarm intelligence-based mobile robots, each having multiple legs and multiple joints, each mobile robot includes: an environment recognition sensor for collecting sensed data about the surrounding environment of the mobile robot; a communication unit for performing communication with a remote controller, a parent robot managing at least one mobile robot, or the other mobile robots located within a predefined area; and a control unit for controlling the motions of the multiple legs and multiple joints to control movement of the mobile robot to a given destination based on control data transmitted from the remote controller through the communication unit, or based on communication with the other mobile robots within the predefined area, or based on the sensed data collected by the environment recognition sensor.
    Type: Application
    Filed: September 1, 2010
    Publication date: June 9, 2011
    Applicant: Electronics and Telecommunications Research Institute
    Inventor: Chang Eun LEE
  • Publication number: 20110127788
    Abstract: An optical detection device includes: a translucent unit that has elasticity; a light source unit that emits detection light toward the translucent unit; a light sensitive unit that is directed toward the translucent unit and has light sensitivity; and a detection unit that detects a target object based on the intensity of light received by the light sensitive unit.
    Type: Application
    Filed: November 30, 2010
    Publication date: June 2, 2011
    Applicant: SEIKO EPSON CORPORATION
    Inventor: Daisuke NAKANISHI
  • Publication number: 20110130864
    Abstract: Disclosed are a transport apparatus that holds and transports an object on a predetermined transport track using a transport portion provided at the leading end of an arm and is capable of acquiring the teaching information of a transport position using a normal transport operation, a position teaching method, and a sensor jig. A transmissive sensor (32) is provided in a sensor jig (30) such that the projection segments of an optical axis (41) and an optical axis (42) on a projection plane intersect with each other and neither the projection segment of the optical axis (41) nor the projection segment of the optical axis (42) is aligned with the X-direction or the Y-direction. During a position teaching operation, the sensor jig (30) is provided so as to be held by a wafer transport portion (24), thereby detecting target members (51, 52).
    Type: Application
    Filed: May 19, 2009
    Publication date: June 2, 2011
    Applicant: RORZE CORPORATION
    Inventor: Kenji Hirota