Optical Patents (Class 901/47)
  • Publication number: 20120265345
    Abstract: In this robot system, a control portion is configured to control a robot to grasp an object to be grasped by a grasping portion, and control a first imaging portion to examine the object to be grasped while driving a robot arm to change a posture of the object to be grasped multiple times.
    Type: Application
    Filed: February 16, 2012
    Publication date: October 18, 2012
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventor: Yoshimitsu NAKAHARA
  • Publication number: 20120265344
    Abstract: This robot system includes a first imaging portion detachably mounted to a robot arm and a control portion controlling the operation of the robot arm and a grasping portion, and the control portion is so formed as to detach the first imaging portion from the robot arm before moving an object to be grasped that is being grasped by the grasping portion to a prescribed processing position.
    Type: Application
    Filed: February 15, 2012
    Publication date: October 18, 2012
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventor: Yoshimitsu NAKAHARA
  • Publication number: 20120263347
    Abstract: A three-dimensional scanner according to one aspect of the embodiments includes an irradiation unit, an imaging unit, a position detecting unit, and a scanning-region determining unit. The irradiation unit emits a slit-shaped light beam while changing an irradiation position with respect to a measuring object. The imaging unit sequentially captures images of the measuring object irradiated with the light beam. The position detecting unit detects a position of the light beam in an image captured by the imaging unit by scanning the image. The scanning-region determining unit determines a scanning region in an image as a scanning target by the position detecting unit based on a position of the light beam in an image captured by the imaging unit before the image as a scanning target.
    Type: Application
    Filed: March 5, 2012
    Publication date: October 18, 2012
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventor: Yuji ICHIMARU
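The scanning-region determination described in this abstract — limiting the scan for the laser stripe to a window around the position found in the previous frame — can be sketched as follows. The function name, the brightest-pixel test, and the margin and threshold values are illustrative assumptions, not the patent's implementation.

```python
def find_stripe_row(column, prev_row=None, margin=20, threshold=200):
    """Locate the laser stripe in one image column (a list of pixel values).

    If the row found in the previous frame is known, only a window of
    +/- `margin` rows around it is scanned instead of the whole column,
    which is the essence of the scanning-region determination.
    """
    if prev_row is None:
        lo, hi = 0, len(column)                       # no prior: full scan
    else:
        lo = max(0, prev_row - margin)                # narrow window around
        hi = min(len(column), prev_row + margin + 1)  # the previous hit
    best_row, best_val = None, threshold
    for r in range(lo, hi):
        if column[r] > best_val:          # brightest pixel above threshold
            best_row, best_val = r, column[r]
    return best_row
```

On a 480-row image, a 20-row window cuts the per-column scan by an order of magnitude, which is the kind of speed-up such region limiting is after.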
  • Publication number: 20120265341
    Abstract: The robotic work object cell calibration method includes a work object, or emitter. Initially, the work object is placed in a selected position on a fixture or work piece on the shop floor. The work object emits a pair of beam-projecting lasers which intersect at a tool contact point and act as a crosshair. The robot tool is manipulated into the tool contact point. The work object then emits four plane-projecting lasers which are used to adjust the roll, yaw, and pitch of the robot tool relative to the tool contact point. The robotic work object cell calibration method of the present invention increases the accuracy of off-line programming and decreases robot teaching time.
    Type: Application
    Filed: March 7, 2012
    Publication date: October 18, 2012
    Inventor: Matthew E. Trompeter
  • Publication number: 20120265343
    Abstract: An autonomous coverage robot detection system includes an emitter configured to emit a directed beam, a detector configured to detect the directed beam and a controller configured to direct the robot in response to a signal detected by the detector. In some examples, the detection system detects a stasis condition of the robot. In some examples, the detection system detects a wall and can follow the wall in response to the detected signal.
    Type: Application
    Filed: December 8, 2011
    Publication date: October 18, 2012
    Inventors: Duane L. Gilbert, JR., Marcus R. Williams, Andrea M. Okerholm, Elaine H. Kristant, Sheila A. Longo, Daniel E. Kee, Marc D. Strauss
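The wall-following behavior mentioned above can be illustrated with a minimal bang-bang controller. The two-state policy, the (left_wheel, right_wheel) command convention, and the assumption that the emitter/detector pair faces the robot's right side are all illustrative, not the patent's control law.

```python
def wall_follow_command(beam_detected, base_speed=3, turn=1):
    """Minimal bang-bang wall follower: steer toward the wall until the
    side-facing beam reflection is detected, then steer away.

    Returns a (left_wheel, right_wheel) speed pair; a wall on the right
    is assumed.
    """
    if beam_detected:
        return (base_speed + turn, base_speed - turn)  # veer left, away from wall
    return (base_speed - turn, base_speed + turn)      # veer right, toward wall
```

Alternating between the two states makes the robot hug the wall at roughly the range where the reflected beam first becomes detectable.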
  • Patent number: 8290618
    Abstract: The position of a movable element such as the end of a robot arm (10) which has several degrees of freedom, being mounted on a base (11) and including a wrist mechanism, is determined by installing a multiplicity of base targets (32, 74) around the base of the robot and a multiplicity of arm targets (42, 74) around the base (14) of the wrist mechanism (15), and by using an optical means (90) that moves with the movable element to determine the positions of at least some of the base targets and of at least some of the arm targets. The optical means may be a laser tracker or a camera system (90), and it may be mounted on the part (13) of the robot arm to which the base (14) of the wrist mechanism (15) is connected. This enables existing robots (10) to achieve absolute positional accuracy relative to a fixed external frame of reference.
    Type: Grant
    Filed: March 3, 2008
    Date of Patent: October 16, 2012
    Assignee: CNOS Automations Software GmbH
    Inventor: Andreas Haralambos Demopoulos
  • Patent number: 8290621
    Abstract: Motion information of a robot arm stored in a motion information database is acquired. A person manipulates the robot arm, and correction motion information at the time of the motion correction is acquired. An acquiring unit acquires environment information. A motion correction unit corrects the motion information while the robot arm is in motion. A control rule generating unit generates a control rule for allowing the robot arm to automatically operate based on the corrected motion information and the acquired environment information. The motion of the robot arm is controlled based on the generated control rule.
    Type: Grant
    Filed: March 9, 2012
    Date of Patent: October 16, 2012
    Assignee: Panasonic Corporation
    Inventor: Yuko Tsusaka
  • Publication number: 20120257704
    Abstract: An inspection robot for inspecting a nuclear reactor that includes a hull and an on-board control mechanism that controls the operation of the inspection robot. The on-board control mechanism controls one or more sensors used to inspect one or more structures in the nuclear reactor as well as the movement of the inspection robot. A gimbal mechanism rotates the inspection robot hull by shifting the center-of-mass so that gravity and buoyancy forces generate a moment to rotate the hull in a desired direction. A camera is coupled to the gimbal mechanism for providing a visual display of the one or more structures in the nuclear reactor. The camera is allowed to rotate about an axis using the gimbal mechanism. The inspection robot communicates its findings with respect to the inspection tasks using a wireless communication link.
    Type: Application
    Filed: November 2, 2011
    Publication date: October 11, 2012
    Applicants: MITSUBISHI HEAVY INDUSTRIES, LTD., MASSACHUSETTS INSTITUTE OF TECHNOLOGY
    Inventors: Haruhiko Harry Asada, Anirban Mazumdar, Ian C. Rust, Jun Fujita
  • Publication number: 20120259465
    Abstract: A cleaning system including a first virtual wall, a second virtual wall and a cleaning robot is disclosed. The first virtual wall includes a first specific pattern. When light is projected onto the first specific pattern, a first specific reflected light is generated. The second virtual wall includes a second specific pattern. When the light is projected onto the second specific pattern, a second specific reflected light is generated. The cleaning robot, based on the first and the second specific reflected lights, obtains and records the positions of the first and the second virtual walls. The cleaning robot defines a first virtual line according to the recorded positions. A traveling path of the cleaning robot is limited by the first virtual line.
    Type: Application
    Filed: May 24, 2011
    Publication date: October 11, 2012
    Inventors: Shui-Shih CHEN, You-Wei Teng
  • Publication number: 20120253511
    Abstract: A robot arm apparatus includes an arm mechanism including a base member and a link pivotally connected to the base member for pivotal motion in a horizontal plane through a rotational shaft. The link holds a regular circular transport object at its distal end. The apparatus also includes an edge detector, provided on the base member, that detects two edges of the regular circular transport object as the link pivotally rotates with respect to the base member, a pivotal angle detector that detects a pivotal angle of the link with respect to the base member, and a center position calculator that calculates a center position of the regular circular transport object with respect to the link. The calculation is based on two pivotal angles detected by the pivotal angle detector when the edge detector detects the two edges of the regular circular transport object.
    Type: Application
    Filed: March 21, 2012
    Publication date: October 4, 2012
    Applicant: SINFONIA TECHNOLOGY CO., LTD.
    Inventors: Toru SAEKI, Yasumichi Mieno, Yuji Urabe, Toshio Kamigaki, Yoji Masui
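Under the simplifying assumptions that the transported object is a perfect circle and the edge detector is a fixed point sensor, the two edges cross the sensor symmetrically about the object's center, so the pivot angle at which the center crosses the sensor is the mean of the two recorded edge angles. A hedged sketch of the resulting calculation (the distance from the pivot axis to the disc center is an assumed, calibrated quantity):

```python
import math

def disc_center(theta_edge1, theta_edge2, center_radius):
    """Estimate the disc center in the base frame.

    `theta_edge1`/`theta_edge2` are the pivot angles (radians) at which
    the fixed edge detector saw the two edges; by symmetry the center
    passes the detector at their mean.  `center_radius` is the (assumed
    known) distance from the pivot axis to the disc center.
    """
    theta_c = (theta_edge1 + theta_edge2) / 2.0  # symmetry: mean edge angle
    return (center_radius * math.cos(theta_c),
            center_radius * math.sin(theta_c))
```

This is only the angular half of the patent's calculation; recovering a radial offset as well would need a second sensor pass or a known disc radius.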
  • Publication number: 20120253507
    Abstract: A high throughput parcel unloading system includes a robotic arm arrangement, including a cluster of robotic arms having grouping mechanisms. A conveyor system is also provided onto which parcels are placed by the robotic arm system. An image recognition system determines the position and arrangement of parcels within a container, and a control system is configured to receive image information from the image recognition system and control operation of the robotic arm system and conveyor system.
    Type: Application
    Filed: April 4, 2011
    Publication date: October 4, 2012
    Applicant: PALO ALTO RESEARCH CENTER INCORPORATED
    Inventors: Craig Eldershaw, Eric J. Shrader
  • Patent number: 8280551
    Abstract: A manipulator includes at least one camera capable of observing an end effector from a direction suitable for work. A rotating portion rotatable coaxially with the end effector is provided to a link adjacent to a link located at a manipulator tip end. At least one camera for recognizing a work piece as an object is arranged on the rotating portion through a camera platform. An actuator for controlling a rotation angle of the rotating portion is driven according to a rotation angle of the link located at the manipulator tip end, and thus the camera is arranged in a direction perpendicular to a plane where the end effector can move when the end effector performs a grip work. In an assembly work, the rotating portion is rotated such that the camera is arranged in a direction parallel to the plane where the end effector can move.
    Type: Grant
    Filed: May 12, 2010
    Date of Patent: October 2, 2012
    Assignee: Canon Kabushiki Kaisha
    Inventor: Kota Tani
  • Publication number: 20120239196
    Abstract: The subject disclosure is directed towards controlling a robot based upon sensing a user's natural and intuitive movements and expressions. User movements and/or facial expressions are captured by an image and depth camera, resulting in skeletal data and/or image data that is used to control a robot's operation, e.g., in a real time, remote (e.g., over the Internet) telepresence session. Robot components that may be controlled include robot “expressions” (e.g., audiovisual data output by the robot), robot head movements, robot mobility drive operations (e.g., to propel and/or turn the robot), and robot manipulator operations, e.g., an arm-like mechanism and/or hand-like mechanism.
    Type: Application
    Filed: March 15, 2011
    Publication date: September 20, 2012
    Applicant: Microsoft Corporation
    Inventors: Charles F. Olivier, III, Jean Sebastien Fouillade
  • Patent number: 8271129
    Abstract: A power-saving robot system includes at least one peripheral device and a mobile robot. The peripheral device includes a controller having an active mode and a hibernation mode, and a wireless communication component capable of activation in the hibernation mode. A controller of the robot has an activating routine that communicates with and temporarily activates the peripheral device, via wireless communication, from the hibernation mode. In another aspect, a robot system includes a network data bridge and a mobile robot. The network data bridge includes a broadband network interface, a wireless command interface, and a data bridge component. The data bridge component extracts serial commands received via the broadband network interface from an internet protocol, applies a command protocol thereto, and broadcasts the serial commands via the wireless interface. The mobile robot includes a wireless command communication component that receives the serial commands transmitted from the network data bridge.
    Type: Grant
    Filed: December 4, 2006
    Date of Patent: September 18, 2012
    Assignee: iRobot Corporation
    Inventors: Michael J. Halloran, Jeffrey W. Mammen, Tony L. Campbell, Jason S. Walker, Paul E. Sandin, John N. Billington, Jr., Daniel N. Ozick
  • Publication number: 20120233062
    Abstract: An automated vehicle charging system, which may be installed within a service-type station, provides for charging, recharging, or even discharging of the batteries of an electric vehicle. The system generally includes a dispenser having a cabinet containing all of the instrumentation desired for furnishing current information relative to the charging of a vehicle or otherwise, and includes boom means that are highly maneuverable, in order to bring the charging instrument into close proximity to the electrical receptacle of the vehicle being serviced, whether at a service-type station or a curbside type of charging system. Robotics may be used within the structure of these electrical charging systems to facilitate the charging of any vehicle by the customer, even at a self-service type of station.
    Type: Application
    Filed: February 27, 2012
    Publication date: September 13, 2012
    Inventor: Kevin Terrill Cornish
  • Publication number: 20120232697
    Abstract: Disclosed are a robot cleaner capable of performing a cleaning operation by selecting a cleaning algorithm suitable for the peripheral circumstances based on an analysis result of captured image information, and a controlling method thereof. The robot cleaner comprises an image sensor unit configured to capture image information when an operation instructing command is received, and a controller configured to analyze the image information captured by the image sensor unit, and configured to control a cleaning operation based on a first cleaning algorithm selected from a plurality of pre-stored cleaning algorithms based on a result of the analysis.
    Type: Application
    Filed: November 11, 2010
    Publication date: September 13, 2012
    Inventors: Jeihun Lee, Suuk Choe, Hyuksoo Son, Donghoon Yi, Younggie Kim, Jeongsuk Yoon, Seongsoo Lee, Taegon Park, Yiebin Kim, Yoojin Choi, Sangik Na, Seungmin Baek
  • Patent number: 8265793
    Abstract: A mobile robot provides telecommunication service between a remote user at a remote terminal and a local user in proximity to the mobile robot. The remote user can connect to the mobile robot via the Internet using a peer-to-peer VoIP protocol, and control the mobile robot to navigate about the mobile robot's environment. The mobile robot includes a microphone, a video camera and a speaker for providing telecommunication functionality between the remote user and the local user. Also, a hand-held RC unit permits the local user to navigate the mobile robot locally or to engage privacy mode for the mobile robot. When NAT or a firewall obstructs connection from the remote terminal to the mobile robot, an Internet server facilitates connection using methods such as STUN, TURN, or relaying.
    Type: Grant
    Filed: September 27, 2007
    Date of Patent: September 11, 2012
    Assignee: iRobot Corporation
    Inventors: Matthew Cross, Tony Campbell
  • Publication number: 20120226382
    Abstract: A robot-position detecting device includes: a position-data acquiring unit that acquires position data indicating actual positions of a robot; a position-data input unit that receives the position data output from the position-data acquiring unit; and a position calculating unit that calculates a computational position of the robot through linear interpolation using first and second position data input to the position-data input unit at different times.
    Type: Application
    Filed: February 13, 2012
    Publication date: September 6, 2012
    Applicant: SEIKO EPSON CORPORATION
    Inventor: Atsushi ASADA
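The linear interpolation in this abstract — computing a position at an arbitrary time from two position samples taken at different times — can be sketched as follows. The (timestamp, position-tuple) sample layout is an assumption:

```python
def interpolate_position(t, sample_a, sample_b):
    """Linearly interpolate the robot position at time `t` between two
    timestamped samples (t0, p0) and (t1, p1), where p0/p1 are coordinate
    tuples.  With t outside [t0, t1] this extrapolates, which is also how
    such a scheme can predict a position slightly ahead of the last sample.
    """
    (t0, p0), (t1, p1) = sample_a, sample_b
    if t1 == t0:
        return p0                     # degenerate: identical timestamps
    alpha = (t - t0) / (t1 - t0)      # 0 at t0, 1 at t1
    return tuple(a + alpha * (b - a) for a, b in zip(p0, p1))
```

Feeding in the two most recent encoder-derived samples yields a computational position at the exact instant an external event (e.g. a camera trigger) occurred.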
  • Publication number: 20120218405
    Abstract: A system and a method for monitoring painting quality of components, for example of motor-vehicle bodies, comprises a robot which moves a monitoring head to follow the components to be monitored while they move along a production line. The monitoring head moves with respect to the surface to be monitored and comprises both a source of light and a camera which receives the light emitted by the source of light which is reflected by the monitored surface. An electronic processing unit receives the signals coming from the camera and processes them according to different processing algorithms for detecting various categories of defects, specifically small defects, medium defects, and large defects.
    Type: Application
    Filed: November 29, 2011
    Publication date: August 30, 2012
    Inventors: Andrea Terreno, Alessandro Cisi, Giorgio Pasquettaz
  • Publication number: 20120215354
    Abstract: A semi-autonomous robot system (10) that includes scanning and scanned data manipulation that is utilized for controlling remote operation of a robot system within an operating environment.
    Type: Application
    Filed: October 27, 2010
    Publication date: August 23, 2012
    Applicant: BATTELLE MEMORIAL INSTITUTE
    Inventors: Darren P. Krasny, Richard L. Shoaf, Jeffrey D. Keip, Scott A. Newhouse, Timothy J. Lastrapes
  • Patent number: 8249747
    Abstract: A robot safety system configured to protect humans in the vicinity of a working robot (1, 11, 21, 31) against harmful impacts by said robot (1, 11, 21, 31), said safety system comprising a sensor system (3, 13, 23) and a safety controller (4, 14, 24) configured to establish an impact risk profile of the robot (1, 11, 21, 31) and deliver an operating signal to a robot controller (2, 12, 22) based on said impact risk profile, wherein the safety controller (4, 14, 24) is configured to establish the impact risk profile based on stored data and input signals, and that the stored data and input signals comprise stored impact data, stored data related to the path of the robot (1, 11, 21, 31), and signals from the sensor system of events in the vicinity of the robot (1, 11, 21, 31), such as a detected human (P1, P11, P21, P22, P31, P32) in the vicinity of the robot (1, 11, 21, 31).
    Type: Grant
    Filed: December 3, 2008
    Date of Patent: August 21, 2012
    Assignee: ABB Research Ltd
    Inventor: Soenke Kock
  • Publication number: 20120204807
    Abstract: In certain embodiments, a system includes a controller operable to access an image signal generated by a camera. The accessed image signal corresponds to one or more features of the rear of a dairy livestock. The controller is further operable to determine positions of each of the hind legs of the dairy livestock based on the accessed image signal. The controller is further operable to determine a position of an udder of the dairy livestock based on the accessed image signal and the determined positions of the hind legs of the dairy livestock. The controller is further operable to determine, based on the image signal and the determined position of the udder of the dairy livestock, a spray position from which a spray tool may apply disinfectant to the teats of the dairy livestock.
    Type: Application
    Filed: April 24, 2012
    Publication date: August 16, 2012
    Applicant: Technologies Holdings Corp.
    Inventors: Henk Hofman, Peter Willem van der Sluis, Ype Groensma
  • Publication number: 20120204805
    Abstract: In certain embodiments, a system includes a controller operable to access an image signal generated by a camera. The accessed image signal corresponds to one or more features of the rear of a dairy livestock. The controller is further operable to determine positions of each of the hind legs of the dairy livestock based on the accessed image signal. The controller is further operable to determine a position of an udder of the dairy livestock based on the accessed image signal and the determined positions of the hind legs of the dairy livestock. The controller is further operable to determine, based on the image signal and the determined position of the udder of the dairy livestock, a spray position from which a spray tool may apply disinfectant to the teats of the dairy livestock.
    Type: Application
    Filed: April 24, 2012
    Publication date: August 16, 2012
    Applicant: Technologies Holdings Corp.
    Inventors: Henk Hofman, Peter Willem van der Sluis, Ype Groensma
  • Publication number: 20120209433
    Abstract: Social robot formed by: an artificial vision system composed of webcam cameras; a voice recognition system formed by three microphones arranged in a triangular configuration; an expression system composed of an LED matrix, formed by a plurality of LEDs and a status LED, and eyelids formed by half-moons connected to gearwheels which engage with respective servomotors via transmission wheels; a speech synthesis system composed of loudspeakers; a system for detecting obstacles, formed by ultrasound sensors; and a movement system formed by two driving wheels.
    Type: Application
    Filed: October 8, 2010
    Publication date: August 16, 2012
    Applicant: THECORPORA, S.L.
    Inventor: Francisco Javier Paz Rodriguez
  • Publication number: 20120209429
    Abstract: A robot apparatus includes: an image pickup device; a goal-image storing unit that stores, according to sensitivity represented by an amount of change of a pixel value at the time when a target aligned with a goal position on an image at a pixel level is displaced by a displacement amount at a sub-pixel level, goal image data in a state in which the target is arranged; and a target detecting unit that calculates a coincidence evaluation value of the target on the basis of a comparison of image data including the target with the goal image data stored by the goal-image storing unit, and detects positional deviation of the target with respect to the goal position on the basis of the coincidence evaluation value.
    Type: Application
    Filed: February 8, 2012
    Publication date: August 16, 2012
    Applicant: SEIKO EPSON CORPORATION
    Inventors: Yukihiro YAMAGUCHI, Shingo KAGAMI, Kenji MATSUNAGA, Koichi HASHIMOTO
  • Publication number: 20120209432
    Abstract: A brain-based device (BBD) for moving in a real-world environment has sensors that provide data about the environment, actuators to move the BBD, and a hybrid controller which includes a neural controller having a simulated nervous system being a model of selected areas of the human brain and a non-neural controller based on a computational algorithmic network. The neural controller and non-neural controller interact with one another to control movement of the BBD.
    Type: Application
    Filed: April 4, 2012
    Publication date: August 16, 2012
    Applicant: NEUROSCIENCES RESEARCH FOUNDATION, INC.
    Inventors: Jason G. Fleischer, Botond Szatmary, Donald B. Hutson, Douglas A. Moore, James A. Snook, Gerald M. Edelman, Jeffrey L. Krichmar
  • Publication number: 20120209430
    Abstract: A position detection device for a horizontal articulated robot includes a camera for imaging the robot or a work as an imaging object, a control section for calculating a location of the imaging object from an image, an acquisition section (I/O) for obtaining the drive amounts of first and second electric motors of the robot, and a storage section for storing the calculated location of the imaging object and the drive amounts so as to correspond to each other. A common trigger signal for detecting the location of the imaging object is input to the camera and the I/O. The camera starts to image the imaging object in response to the input of the trigger signal. The I/O starts to obtain the drive amounts in response to the input of the trigger signal.
    Type: Application
    Filed: February 14, 2012
    Publication date: August 16, 2012
    Applicant: SEIKO EPSON CORPORATION
    Inventors: Katsuji IGARASHI, Atsushi ASADA, Nobuyuki SETSUDA, Chieko IUCHI
  • Publication number: 20120209431
    Abstract: A robotic system that can be used to treat a patient. The robotic system includes a mobile robot that has a camera. The mobile robot is controlled by a remote station that has a monitor. A physician can use the remote station to move the mobile robot into view of a patient. An image of the patient is transmitted from the robot camera to the remote station monitor. A medical personnel at the robot site can enter patient information into the system through a user interface. The patient information can be stored in a server. The physician can access the information from the remote station. The remote station may provide graphical user interfaces that display the patient information and provide both a medical tool and a patient management plan.
    Type: Application
    Filed: March 28, 2012
    Publication date: August 16, 2012
    Inventors: Timothy C. Wright, Fuji Lai, Marco Pinter, Yulun Wang
  • Publication number: 20120201448
    Abstract: A robotic device includes an imaging section adapted to take an image of an object having a hole, and generate an image data of the object including an inspection area of an image of the hole, a robot adapted to move the imaging section, an inspection area luminance value detection section adapted to detect a luminance value of the inspection area from the image data, a reference area luminance value detection section adapted to detect a luminance value of a reference area adjacent to the inspection area from the image data, and a determination section adapted to determine a state of the inspection area based on one of a ratio and a difference between the luminance value of the inspection area detected by the inspection area luminance value detection section and the luminance value of the reference area detected by the reference area luminance value detection section.
    Type: Application
    Filed: February 2, 2012
    Publication date: August 9, 2012
    Applicant: SEIKO EPSON CORPORATION
    Inventors: Takashi NAMMOTO, Koichi HASHIMOTO, Tomohiro INOUE
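The determination step — comparing the mean luminance of the inspection area against an adjacent reference area — might look like the following sketch. The ratio form, the threshold value, and the convention that a dark hole counts as present ("ok") are illustrative assumptions:

```python
def mean_luminance(pixels):
    """Average pixel value over a region given as a flat list."""
    return sum(pixels) / len(pixels)

def inspect_hole(inspection_pixels, reference_pixels, ratio_threshold=0.5):
    """Return 'ok' when the inspection area is markedly darker than the
    reference area (an open hole absorbs light), 'defect' otherwise."""
    ratio = mean_luminance(inspection_pixels) / mean_luminance(reference_pixels)
    return "ok" if ratio < ratio_threshold else "defect"
```

Using the adjacent reference area rather than a fixed threshold makes the test robust to overall lighting changes, which is the point of the ratio (or difference) comparison.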
  • Publication number: 20120197438
    Abstract: The dual arm robot includes a first arm including a first hand, a first visual sensor and a first force sensor, and a second arm including a second hand, a second visual sensor and a second force sensor, uses each visual sensor to detect positions of a lens barrel and a fixed barrel to hold and convey them to a central assembling area, uses the first visual sensor to measure a position of a flexible printed circuits to insert the flexible printed circuits into the fixed barrel, and uses outputs of the force sensors to fit and assemble the fixed barrel onto the lens barrel under force control. The dual arm robot converts a position coordinate of a workpiece detected by each visual sensor to a robot coordinate to calculate a trajectory of each hand and drive each arm, to thereby realize cooperative operation of the two arms.
    Type: Application
    Filed: November 29, 2010
    Publication date: August 2, 2012
    Applicant: CANON KABUSHIKI KAISHA
    Inventor: Kazunori Ogami
  • Publication number: 20120197439
    Abstract: A telepresence robot may include a drive system, a control system, an imaging system, and a mapping module. The mapping module may access a plan view map of an area and tags associated with the area. In various embodiments, each tag may include tag coordinates and tag information, which may include a tag annotation. A tag identification system may identify tags within a predetermined range of the current position and the control system may execute an action based on an identified tag whose tag information comprises a telepresence robot action modifier. The telepresence robot may rotate an upper portion independent from a lower portion. A remote terminal may allow an operator to control the telepresence robot using any combination of control methods, including by selecting a destination in a live video feed, by selecting a destination on a plan view map, or by using a joystick or other peripheral device.
    Type: Application
    Filed: January 27, 2012
    Publication date: August 2, 2012
    Applicants: INTOUCH HEALTH, IROBOT CORPORATION
    Inventors: Yulun Wang, Charles S. Jordan, Tim Wright, Michael Chan, Marco Pinter, Kevin Hanrahan, Daniel Sanchez, James Ballantyne, Cody Herzog, Blair Whitney, Fuji Lai, Kelton Temby, Eben Christopher Rauhut, Justin H. Kearns, Cheuk Wah Wong, Timothy Sturtevant Farlow
  • Patent number: 8234011
    Abstract: A technique to wholly recognize the surrounding environment may be provided by excluding unknown environment which arises due to parts of a body of a robot hindering the sight of the robot during operations. The robot of the present invention is provided with a body trunk including head and torso, at least one connected member that is connected to the body trunk by a joint in which a driving mechanism is provided, a body trunk side camera that is arranged on the body trunk, and a connected member side camera that is arranged on the connected member. Further, the robot is provided with a composite image creation unit that creates composite image of a body trunk side image taken by the body trunk side camera and a connected member side image taken by the connected member side camera, such that a part of the body trunk side image is replaced with a part of the connected member side image so as to exclude the connected member from the body trunk side image.
    Type: Grant
    Filed: July 21, 2011
    Date of Patent: July 31, 2012
    Assignee: Toyota Jidosha Kabushiki Kaisha
    Inventors: Yuichiro Nakajima, Haeyeon Lee, Hideki Nomura
  • Publication number: 20120191287
    Abstract: The present invention relates to a control method for the localization and navigation of a mobile robot, and a mobile robot using the same. More specifically, the localization and navigation of a mobile robot are controlled using inertial sensors and images: local direction descriptors are employed, the mobile robot changes its driving mode according to its conditions, and errors in localization are minimized.
    Type: Application
    Filed: July 28, 2009
    Publication date: July 26, 2012
    Applicant: YUJIN ROBOT CO., LTD.
    Inventors: Kyung Chul Shin, Seong Ju Park, Hee Kong Lee, Jae Young Lee, Hyung O Kim
  • Patent number: 8225892
    Abstract: A hybrid mobile robot includes a base link and a second link. The base link has a drive system and is adapted to function as a traction device and a turret. The second link is attached to the base link at a first joint. The second link has a drive system and is adapted to function as a traction device and to be deployed for manipulation. In another embodiment an invertible robot includes at least one base link and a second link. In another embodiment a mobile robot includes a chassis and a track drive pulley system including a tension and suspension mechanism. In another embodiment a mobile robot includes a wireless communication system.
    Type: Grant
    Filed: November 26, 2010
    Date of Patent: July 24, 2012
    Assignee: Pinhas Ben-Tzvi
    Inventor: Pinhas Ben-Tzvi
  • Publication number: 20120181096
    Abstract: A climbing robot for travelling over adhesive surfaces with endless traction mechanisms (14) and, fastened to them at a distance, controllable adhesive feet (21) that circulate with the endless traction mechanisms (14) along guides (17) in the travel plane, by means of which the adhesive sides of their adhesive elements (15) always point towards the travel surface and wherein the adhesive elements (15) that support and move the climbing robot are switched “ON” and lowered onto the adhesive surface and all of the other adhesive elements (15) are switched “OFF” and raised from the adhesive surface, wherein the climbing robot has square running gear (11), one foot plate (13) each with a guide (17) running around the edges for a multitude of adhesive feet (21) driven by traction mechanisms is arranged in two diagonally opposite corner areas (18) of the running gear (11), wherein the foot plates (13) are attached to a support bar (12) of the running gear (11) and the adhesive feet (21) and therefore their adhesive
    Type: Application
    Filed: September 25, 2010
    Publication date: July 19, 2012
    Inventor: Anton Niederberger
  • Publication number: 20120185097
    Abstract: A computer determines a first origin of a first coordinate system of a PCB, and controls a robotic arm to position a probe above the first origin. Furthermore, the computer determines a second origin of a second coordinate system of the robotic arm, and determines displacement values from the first origin to a test point in controlling movements of the robotic arm in the second coordinate system. A graph representing the test point is recognized in an image of the PCB, pixel value differences between the graph center and the image center are determined and converted to displacement correction values for controlling the movements of the robotic arm and determining 3D coordinates of the test point. The robotic arm is moved along a Z-axis of the second coordinate system to precisely position the probe on the test point of the PCB.
    Type: Application
    Filed: August 17, 2011
    Publication date: July 19, 2012
    Applicant: HON HAI PRECISION INDUSTRY CO., LTD.
    Inventors: SHEN-CHUN LI, HSIEN-CHUAN LIANG, SHOU-KUO HSU
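The abstract above describes converting the pixel offset between a recognized test-point graph and the image center into displacement corrections for the robotic arm. A minimal sketch of that conversion step, assuming simple per-axis millimeter-per-pixel calibration factors (the function name and numbers are invented for illustration, not the patented implementation):

```python
# Illustrative sketch: pixel offset between the recognized graph center
# and the image center, scaled by assumed calibration factors (mm/px),
# yields displacement corrections for the arm in the second coordinate
# system. Not the patented method; values are invented.

def displacement_correction(graph_center, image_center, mm_per_px_x, mm_per_px_y):
    """Return (dx, dy) arm corrections in millimeters."""
    dx_px = graph_center[0] - image_center[0]
    dy_px = graph_center[1] - image_center[1]
    return dx_px * mm_per_px_x, dy_px * mm_per_px_y

# Example: graph found 12 px right of and 8 px below center at 0.05 mm/px,
# so the arm should shift roughly 0.6 mm in x and 0.4 mm in y.
dx, dy = displacement_correction((332, 248), (320, 240), 0.05, 0.05)
```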
  • Publication number: 20120185115
    Abstract: A remotely-controlled robotic apparatus for deactivating explosive ordnance, such as IEDs. The remotely-controlled robotic apparatus is provided with a robotic portion that provides mobility, and a laser portion configured to be aimed at an object of interest (for example an object believed to comprise an IED) and to deactivate the object by destroying, damaging or disconnecting at least one of a control apparatus and a power supply from the object of interest. The remotely-controlled robotic apparatus provides the capability to dispose of IEDs without causing an explosion. The remotely-controlled robotic apparatus can include a camera and an illumination source for examining objects of interest. A remote control console is provided for the use of an operator in controlling the robotic apparatus.
    Type: Application
    Filed: October 5, 2007
    Publication date: July 19, 2012
    Inventor: Jason Dean
  • Publication number: 20120185094
    Abstract: A mobile robot that includes a drive system, a controller in communication with the drive system, and a volumetric point cloud imaging device supported above the drive system at a height of greater than about one foot above the ground and directed to be capable of obtaining a point cloud from a volume of space that includes a floor plane in a direction of movement of the mobile robot. The controller receives point cloud signals from the imaging device and issues drive commands to the drive system based at least in part on the received point cloud signals.
    Type: Application
    Filed: February 22, 2011
    Publication date: July 19, 2012
    Applicant: iRobot Corporation
    Inventors: Michael Rosenstein, Michael Halloran, Steven V. Shamlian, Chikyung Won, Mark Chiappetta
  • Publication number: 20120185093
    Abstract: A robot mounting device includes a pair of spaced-apart arms adapted to retain the robot body of a surveillance robot. The robot mounting device also includes a latching mechanism to secure the robot mounting device to a rifle. The positioning of the robot can be adjusted within the robot mounting device to sight a camera in the axle of the robot with respect to the rifle. The rifle can then be oriented to obtain visual imagery of an environment.
    Type: Application
    Filed: June 30, 2010
    Publication date: July 19, 2012
    Applicant: ReconRobotics, Inc.
    Inventors: Alex J. Kossett, Ernest Langdon, Wade D. Palashewski
  • Publication number: 20120185095
    Abstract: A mobile human interface robot that includes a base defining a vertical center axis and a forward drive direction and a holonomic drive system supported by the base. The drive system has first, second, and third driven drive wheels, each trilaterally spaced about the vertical center axis and having a drive direction perpendicular to a radial axis with respect to the vertical center axis. The robot further includes a controller in communication with the holonomic drive system, a torso supported above the base, and a touch sensor system in communication with the controller. The touch sensor system is responsive to human contact. The controller issues drive commands to the holonomic drive system based on a touch signal received from the touch sensor system.
    Type: Application
    Filed: February 22, 2011
    Publication date: July 19, 2012
    Applicant: iRobot Corporation
    Inventors: Michael Rosenstein, Chikyung Won, Geoffrey B. Lansberry, Steven V. Shamlian, Michael Halloran, Mark Chiappetta, Thomas P. Allen
  • Patent number: 8219352
    Abstract: Embodiments of the invention disclose a system and a method for determining a pose of a probe relative to an object by probing the object with the probe, comprising steps of: determining a probability of the pose using Rao-Blackwellized particle filtering, wherein a probability of a location of the pose is represented by a location of each particle, and a probability of an orientation of the pose is represented by a Gaussian distribution over orientation of each particle conditioned on the location of the particle, wherein the determining is performed for each subsequent probing until the probability of the pose concentrates around a particular pose; and estimating the pose of the probe relative to the object based on the particular pose.
    Type: Grant
    Filed: March 31, 2010
    Date of Patent: July 10, 2012
    Assignee: Mitsubishi Electric Research Laboratories, Inc.
    Inventors: Yuichi Taguchi, Tim Marks
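The abstract above outlines the Rao-Blackwellized structure: each particle samples a pose location, and the orientation is tracked analytically as a Gaussian conditioned on that location. A toy sketch of the reweighting step under that structure (the measurement model, particle layout, and all numbers are invented for illustration; this is not the patented algorithm):

```python
# Toy Rao-Blackwellized particle sketch: each particle carries a sampled
# location plus a Gaussian (mean, variance) over orientation conditioned
# on that location. A probe measurement reweights particles; the filter
# converges when weight mass concentrates around one pose. The Gaussian
# measurement model and all values here are invented for illustration.
import math

def likelihood(measured, predicted, sigma=0.1):
    d = measured - predicted
    return math.exp(-d * d / (2 * sigma * sigma))

def rbpf_update(particles, measured, predict):
    """particles: list of dicts {loc, ori_mean, ori_var, w}.
    predict(loc, ori_mean) gives the expected probe measurement."""
    for p in particles:
        # A full implementation would also update each particle's
        # orientation Gaussian analytically (Kalman-style) here; we
        # only reweight for brevity.
        p["w"] *= likelihood(measured, predict(p["loc"], p["ori_mean"]))
    total = sum(p["w"] for p in particles) or 1.0
    for p in particles:
        p["w"] /= total
    return particles

# Two candidate poses; the probe measures 1.0, matching the second.
particles = [
    {"loc": 0.0, "ori_mean": 0.0, "ori_var": 0.1, "w": 0.5},
    {"loc": 1.0, "ori_mean": 0.0, "ori_var": 0.1, "w": 0.5},
]
rbpf_update(particles, 1.0, lambda loc, ori: loc)
# Nearly all weight now concentrates on the second particle.
```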
  • Publication number: 20120173047
    Abstract: A self-propelled device is provided including a drive system, a spherical housing, and a biasing mechanism. The drive system includes one or more motors that are contained within the spherical housing. The biasing mechanism actively forces the drive system to continuously engage an interior of the spherical housing in order to cause the spherical housing to move.
    Type: Application
    Filed: January 3, 2012
    Publication date: July 5, 2012
    Inventors: Ian H. BERNSTEIN, Adam Wilson
  • Publication number: 20120173064
    Abstract: A coverage robot including a chassis, multiple drive wheel assemblies disposed on the chassis, and a cleaning assembly carried by the chassis. Each drive wheel assembly includes a drive wheel assembly housing, a wheel rotatably coupled to the housing, and a wheel drive motor carried by the drive wheel assembly housing and operable to drive the wheel. The cleaning assembly includes a cleaning assembly housing, a cleaning head rotatably coupled to the cleaning assembly housing, and a cleaning drive motor carried by the cleaning assembly housing and operable to drive the cleaning head. The wheel assemblies and the cleaning assembly are each separately and independently removable from respective receptacles of the chassis as complete units.
    Type: Application
    Filed: December 8, 2011
    Publication date: July 5, 2012
    Inventors: Chikyung Won, Selma Svendsen, Paul E. Sandin, Scott Thomas Burnett, Deepak Ramesh Kapoor, Stephen A. Hickey, Robert Rizzari, Zivhthan A.C. Dubrovsky
  • Publication number: 20120165984
    Abstract: A mobile robot apparatus includes a video recognition unit for recognizing a position of an opening button mounted around a door through video analysis after acquiring peripheral video information. Further, the mobile robot apparatus includes a mobile controller for performing an operation on the opening button at the position recognized by the video recognition unit to generate an opening selection signal, thereby allowing a door control apparatus to open the door according to the generated opening selection signal.
    Type: Application
    Filed: December 13, 2011
    Publication date: June 28, 2012
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Sang Seung KANG, Jae Hong KIM, Joo Chan SOHN, Chan Kyu PARK
  • Publication number: 20120165986
    Abstract: A robot system (10) for picking parts (41) from a bin (40) uses the image from one or more cameras (38) to determine if the robot gripper (24) has picked one part or more than one part, and uses one or more images from one or more cameras (38) to determine the position/orientation of a picked part. If the robot (12) has picked more than one part from the bin (40), an attempt is made to return the excess picked parts to the bin (40). The position/orientation of a picked part that does not meet predetermined criteria is changed.
    Type: Application
    Filed: August 26, 2010
    Publication date: June 28, 2012
    Applicant: ABB Research Ltd.
    Inventors: Thomas A. Fuhlbrigge, Carlos Martinez Martinez, Gregory F. Rossano, Steve W. West
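The picking logic in the abstract above (count picked parts from camera images, return extras to the bin, re-orient a part whose pose fails the criteria) reduces to simple control flow. A minimal sketch with invented function names and stub callbacks; this is not ABB's implementation:

```python
# Control-flow sketch of the described picking cycle. All callback
# names are invented; real checks would come from camera images.

def handle_pick(count_picked, return_excess, pose_ok, reorient):
    n = count_picked()              # parts in the gripper, per camera check
    if n == 0:
        return "retry"              # nothing picked; try again
    if n > 1:
        return_excess(n - 1)        # put the extra parts back in the bin
    if not pose_ok():               # pose measured from camera images
        reorient()                  # change the part's position/orientation
    return "ok"

# Example with stubs: two parts picked, pose acceptable, one returned.
events = []
result = handle_pick(lambda: 2,
                     lambda k: events.append(("return", k)),
                     lambda: True,
                     lambda: events.append("reorient"))
# result == "ok"; events == [("return", 1)]
```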
  • Publication number: 20120165985
    Abstract: The present invention relates to a wind turbine maintenance system and a method of maintenance therein. A wind turbine maintenance system is provided for carrying out a maintenance task in a nacelle of a wind turbine, comprising a maintenance robot, a detection unit for identifying a fault in a sub-system in the nacelle and generating fault information, a processor unit adapted to receive fault information from the detection unit and control the maintenance robot to perform a maintenance task, and a manipulation arm to perform the maintenance task on the identified sub-system. In another aspect, a method of carrying out a maintenance task in a wind turbine is provided.
    Type: Application
    Filed: December 28, 2011
    Publication date: June 28, 2012
    Applicant: VESTAS WIND SYSTEMS A/S
    Inventors: Qinghua XIA, Tieling ZHANG, Adrian LIEW
  • Patent number: 8209055
    Abstract: An exemplary system for sensing the state and position of a robot is provided. The system measures the acceleration and angular velocity of the robot and calculates a velocity and a displacement of the robot. The state of the robot is determined according to its acceleration and velocity vector. The system includes an alarm that activates according to the state of the robot. The system also compensates for any inaccuracy of the measured displacements.
    Type: Grant
    Filed: March 6, 2009
    Date of Patent: June 26, 2012
    Assignees: Hong Fu Jin Precision Industry (ShenZhen) Co., Ltd., Hon Hai Precision Industry Co., Ltd.
    Inventor: Wen Shu
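The abstract above describes integrating measured acceleration into a velocity and a displacement. A minimal dead-reckoning sketch of that integration step, assuming a fixed time step (not the patented system; a real one must also compensate for the drift this accumulates, as the abstract notes):

```python
# Dead-reckoning sketch: Euler integration of sampled acceleration into
# velocity and displacement at a fixed time step dt. Illustrative only.

def integrate(accels, dt):
    """accels: accelerations in m/s^2, one per time step.
    Returns (velocity, displacement) after the last sample."""
    v = 0.0
    x = 0.0
    for a in accels:
        v += a * dt          # v(t+dt) = v(t) + a*dt
        x += v * dt          # x(t+dt) = x(t) + v*dt
    return v, x

# Constant 1 m/s^2 for 2 s in 0.5 s steps:
v, x = integrate([1.0, 1.0, 1.0, 1.0], 0.5)
print(v, x)  # 2.0 2.5 (Euler integration overshoots the true 2.0 m)
```

The overshoot in the example is exactly the kind of accumulated inaccuracy the abstract says the system compensates for.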
  • Publication number: 20120158184
    Abstract: Disclosed is a makeup system based on expert knowledge including: a makeup robot controlled to apply a cosmetic to a face of a user; a makeup server expert system including makeup information associated with makeup application and command profile information created by programming operation commands of the makeup robot; and a makeup client system configured to download a command profile for controlling operation of the makeup robot from the makeup server expert system, and transmit the command profile to the makeup robot.
    Type: Application
    Filed: November 9, 2011
    Publication date: June 21, 2012
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Jin-Suk Ma, Do Hyung Kim, Sun Ja Kim
  • Publication number: 20120158180
    Abstract: An object gripping apparatus includes an image capturing unit for capturing a region including a plurality of works, an obtaining unit for obtaining distance information of the region, a measurement unit for measuring three-dimensional positions/orientations of a plurality of gripping-candidate works out of the plurality of works based on the image and distance information, thereby generating three-dimensional position/orientation information, a selection unit for selecting a gripping-target work based on the three-dimensional position/orientation information, a gripping unit for gripping the gripping-target work, and an updating unit for updating the three-dimensional position/orientation information by measuring three-dimensional positions/orientations of the gripping-candidate works at a time interval during gripping of the gripping-target work.
    Type: Application
    Filed: December 6, 2011
    Publication date: June 21, 2012
    Applicant: CANON KABUSHIKI KAISHA
    Inventors: Yuichiro Iio, Yusuke Mitarai
  • Publication number: 20120155775
    Abstract: A walking robot and a simultaneous localization and mapping method thereof in which odometry data acquired during movement of the walking robot are applied to image-based SLAM technology so as to improve accuracy and convergence of localization of the walking robot. The simultaneous localization and mapping method includes acquiring image data of a space about which the walking robot walks and rotational angle data of rotary joints relating to walking of the walking robot, calculating odometry data using kinematic data of respective links constituting the walking robot and the rotational angle data, and localizing the walking robot and mapping the space about which the walking robot walks using the image data and the odometry data.
    Type: Application
    Filed: December 15, 2011
    Publication date: June 21, 2012
    Applicant: Samsung Electronics Co., Ltd.
    Inventors: Sung Hwan AHN, Kyung Shik Roh, Suk June Yoon, Seung Yong Hyung
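The odometry step in the abstract above combines kinematic data of the links with measured joint angles. A minimal planar forward-kinematics sketch of that idea, with link lengths and angles invented for illustration (this is not the patented method, which fuses the result with image-based SLAM):

```python
# Forward-kinematics sketch for a planar chain of links: accumulate
# each joint angle and add each link's contribution to the end point.
# Link lengths and joint angles here are illustrative values only.
import math

def forward_kinematics(link_lengths, joint_angles):
    """Planar serial chain: returns the (x, y) of the chain's end."""
    x = y = 0.0
    theta = 0.0
    for length, q in zip(link_lengths, joint_angles):
        theta += q                    # accumulated joint angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y

# Two 0.5 m links with both joints at 0 rad: end point 1.0 m straight ahead.
x, y = forward_kinematics([0.5, 0.5], [0.0, 0.0])
print(x, y)  # 1.0 0.0
```

Chaining such transforms over a walking robot's leg links between footfalls yields the odometry increments that the abstract feeds into image-based SLAM.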