Optical Patents (Class 901/47)
  • Publication number: 20120152877
    Abstract: Solar energy and solar farms are used to generate energy and reduce dependence on oil (or for environmental purposes). Maintenance and repair of large farms using human technicians is difficult, expensive, and inefficient. Thus, here, we teach the use of robots with various functions and components, in various settings and for various purposes, to improve operations in large (or hard-to-access) farms: to automate, save money, reduce human mistakes, increase efficiency, and scale the solutions to very large areas.
    Type: Application
    Filed: December 16, 2010
    Publication date: June 21, 2012
    Inventor: Saied Tadayon
  • Publication number: 20120158179
    Abstract: According to an embodiment, a target trajectory that takes into account the hardware constraints of a robot is generated, based on results obtained by calculating, temporally interpolating, and estimating image feature amounts from a captured image.
    Type: Application
    Filed: September 13, 2011
    Publication date: June 21, 2012
    Applicant: KABUSHIKI KAISHA TOSHIBA
    Inventor: Junichiro Ooga
  • Publication number: 20120150346
    Abstract: A system of distributed control of an interactive animatronic show includes a plurality of animatronic actors, at least one of the actors having a processor and one or more motors controlled by the processor. The system also includes a network interconnecting each of the actors, and a plurality of sensors providing messages to the network, where the messages are indicative of processed information. Each processor executes software that schedules and/or coordinates an action of the corresponding actor in accordance with the sensor messages representative of attributes of an audience viewing the show and the readiness of the corresponding actor. Actions of the corresponding actor can include animation movements of the actor, responding to another actor, and/or responding to a member of the audience. The actions can result in movement of at least a component of the actor caused by control of the motor.
    Type: Application
    Filed: September 22, 2011
    Publication date: June 14, 2012
    Applicant: DISNEY ENTERPRISES, INC.
    Inventor: Alexis Paul Wieland
  • Patent number: 8200354
    Abstract: A method for verifying completion of a task is provided. In various embodiments, the method includes obtaining location coordinates of at least one location sensor within a work cell. The at least one sensor is affixed to a tool used to operate on a feature of a structure to be assembled, fabricated or inspected. The method additionally includes generating a virtual object locus based on the location coordinates of the at least one location sensor. The virtual object locus corresponds to a computerized schematic of the structure to be assembled and represents all possible locations of an object end of the tool within the work cell. The method further includes identifying one of a plurality of candidate features as the most likely to be the feature operated on by the tool. The identification is based on a probability calculation, for each of the candidate features, that the respective candidate feature is the feature operated on by the tool.
    Type: Grant
    Filed: April 21, 2006
    Date of Patent: June 12, 2012
    Assignee: The Boeing Company
    Inventors: Philip L. Freeman, Thomas E. Shepherd, Christopher K Zuver
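The candidate-feature identification in the Boeing patent above can be sketched as a likelihood comparison: score each candidate feature by how probable the measured tool-tip position is under that feature's nominal location, then take the most likely. This is a minimal illustration; the isotropic Gaussian model, feature names, and coordinates are assumptions, not taken from the patent.

```python
import math

def most_likely_feature(tip_xyz, candidates, sigma=2.0):
    """Pick the candidate feature most likely operated on by the tool.

    tip_xyz    -- estimated (x, y, z) of the tool's object end
    candidates -- dict mapping feature name -> nominal (x, y, z)
    sigma      -- assumed position uncertainty (same units as coordinates)

    Scores each candidate with an isotropic Gaussian likelihood of the
    measured tip position given the feature's nominal location.
    """
    def likelihood(feat_xyz):
        d2 = sum((a - b) ** 2 for a, b in zip(tip_xyz, feat_xyz))
        return math.exp(-d2 / (2.0 * sigma ** 2))

    scores = {name: likelihood(xyz) for name, xyz in candidates.items()}
    best = max(scores, key=scores.get)
    return best, scores

# Hypothetical features: the tip sits almost on "hole_A"
best, scores = most_likely_feature(
    tip_xyz=(10.2, 5.1, 0.3),
    candidates={"hole_A": (10.0, 5.0, 0.0), "hole_B": (25.0, 5.0, 0.0)},
)
```

A full implementation would derive the tip position from the virtual object locus rather than assume it directly.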
  • Publication number: 20120143374
    Abstract: Embodiments of the invention provide an approach for reproducing a human action with a robot. The approach includes receiving data representing motions and contact forces of the human as the human performs the action. The approach further includes approximating, based on the motions and contact forces data, the center of mass (CoM) trajectory of the human in performing the action. Finally, the approach includes generating a planned robot action for emulating the designated action by solving an inverse kinematics problem having the approximated human CoM trajectory as a hard constraint and the motion capture data as a soft constraint.
    Type: Application
    Filed: December 5, 2011
    Publication date: June 7, 2012
    Applicant: DISNEY ENTERPRISES, INC.
    Inventors: Michael MISTRY, Akihiko MURAI, Katsu YAMANE, Jessica Kate HODGINS
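The hard/soft constraint split in the Disney retargeting abstract above can be illustrated in one dimension: per frame, stay as close as possible to the motion-capture joint positions (soft) while exactly matching the approximated human CoM (hard). With mass-weighted least squares, the optimum is a uniform shift of all joints by the CoM error. All names and the 1-D simplification are illustrative assumptions, not the patent's actual solver.

```python
def retarget_frame(q_mocap, com_target, masses):
    """One frame of constrained retargeting (1-D sketch).

    Minimizes sum_i m_i * (q_i - q_mocap_i)^2 (soft: stay near motion
    capture) subject to the hard constraint that the mass-weighted CoM
    equals com_target. With these weights the optimum is a uniform
    shift of every joint by the CoM error.
    """
    total = sum(masses)
    com_mocap = sum(m * q for m, q in zip(masses, q_mocap)) / total
    shift = com_target - com_mocap  # CoM error to cancel exactly
    return [q + shift for q in q_mocap]

q = retarget_frame(q_mocap=[0.0, 1.0, 2.0], com_target=1.5,
                   masses=[1.0, 1.0, 1.0])
```

The real problem is a full inverse kinematics solve over joint angles; the uniform-shift closed form only holds in this simplified linear setting.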
  • Publication number: 20120143370
    Abstract: A robot positioning method includes the following steps. An optical sensing device is configured at a front end of a robot. Then, the optical sensing device captures a calibration plate image, and a relative position of the optical sensing device with respect to a calibration plate is calculated according to a Bundle Adjustment. A robot calibration method includes the following steps. An optical sensing device is driven to rotate around a reference axis of a calibration plate, so as to calculate a translation matrix between the calibration plate and the robot, and the optical sensing device is driven to translate along three orthogonal reference axes of the calibration plate, so as to calculate a rotation matrix between the calibration plate and the robot.
    Type: Application
    Filed: January 25, 2011
    Publication date: June 7, 2012
    Applicant: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE
    Inventors: Po-Huang Shieh, Shang-Chieh Lu, Bor-Tung Jiang, Kuo-Tang Huang, Chin-Kuei Chang
  • Publication number: 20120143028
    Abstract: A vital sign measurement robot which automatically measures vital signs, and a control method thereof. The vital sign measurement robot includes an input unit to receive vital sign measurement instructions; an image recognition unit to detect, when the instructions are received, the distance between the robot and a person whose vital signs are to be measured, as well as a measurement portion of the person's body; a control unit to move electrodes provided on the robot's hands to the measurement portion, once the distance and the measurement portion are detected; and a vital sign measurement unit to measure a vital sign once the electrodes are located at the measurement portion.
    Type: Application
    Filed: November 15, 2011
    Publication date: June 7, 2012
    Applicant: Samsung Electronics Co., Ltd.
    Inventors: Heum Yong PARK, Yong Jae KIM, Youn Baek LEE, Jeong Hun KIM, Kyung Shik ROH, Young Do KWON
  • Publication number: 20120143375
    Abstract: A milking robot for teat cup attachment includes a robot arm having a gripper for holding at least one teat cup at a time; an image recording device mounted on the robot arm and provided to record at least one image of the teats of a milking animal; and a control device provided to control the robot arm to position the teat cup at a teat of the milking animal based on the at least one image of her teats. The image recording device is, before being provided to record the at least one image of the teats of the milking animal, provided to record at least one image of her hind legs; and the control device is, before being provided to control the robot arm to attach the teat cup to the teat of the milking animal, provided to control the robot arm to move the teat cup between her hind legs, from her rear and towards her udder, based on the at least one image of her hind legs.
    Type: Application
    Filed: August 19, 2010
    Publication date: June 7, 2012
    Applicant: DELAVAL HOLDING AB
    Inventor: Marilyn Krukowski
  • Publication number: 20120135096
    Abstract: The compact resin molding machine is capable of efficiently performing a sequence of molding actions from feeding a work and resin to accommodating the molded work. The resin molding machine comprises: a work conveying mechanism including a robot, which has a robot hand for holding the work and which is capable of rotating and linearly moving; a work feeding section for feeding the work; a resin feeding section for feeding the resin; a press section including a molding die set, in which the work is resin-molded; a work accommodating section for accommodating the molded work; and a control section controlling the entire resin molding machine. The work feeding section, the resin feeding section, the press section and the work accommodating section are located to enclose a moving area of the robot of the work conveying mechanism.
    Type: Application
    Filed: November 23, 2011
    Publication date: May 31, 2012
    Inventors: Tetsuya MAEYAMA, Hidemichi KOBAYASHI, Shusaku TAGAMI, Yoshikazu MURAMATSU, Takayuki YAMAZAKI, Keiji KOYAMA, Hideaki NAKAZAWA, Hiroshi HARAYAMA, Kenji NISHIZAWA, Makoto KAWAGUCHI, Masahiko FUJISAWA, Hidetoshi OYA
  • Publication number: 20120121133
    Abstract: A face change detection system is provided, comprising an image input unit acquiring a plurality of input images, a face extraction unit extracting a face region of the input images, and a face change extraction unit detecting a face change in the input images by calculating an amount of change in the face region.
    Type: Application
    Filed: January 23, 2012
    Publication date: May 17, 2012
    Applicant: CRASID CO., LTD.
    Inventors: Heung-Joon PARK, Cheol-Gyun OH, Ik-Dong KIM, Jeong-Hun PARK, Yoon-Kyung SONG
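The "amount of change in the face region" computed by the face change detection system above can be sketched as frame differencing over the extracted face crop. The mean-absolute-difference metric and the data layout are illustrative assumptions; the publication does not fix a particular change metric here.

```python
def face_change_amount(prev_face, curr_face):
    """Amount of change between two equally sized face-region crops
    (lists of grayscale pixel rows), computed as the mean absolute
    pixel difference. A downstream detector would flag a face change
    when this amount exceeds a threshold.
    """
    total, n = 0, 0
    for row_p, row_c in zip(prev_face, curr_face):
        for p, c in zip(row_p, row_c):
            total += abs(p - c)
            n += 1
    return total / n

# Toy 2x2 crops: the lower half of the face region changed
amount = face_change_amount([[10, 10], [10, 10]],
                            [[10, 10], [30, 30]])
```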
  • Publication number: 20120116588
    Abstract: A robot system and a control method thereof in which, when a robot is located in a docking region, the robot calculates a distance by emitting infrared rays and detecting ultrasonic waves oscillated from a charging station, measures the distance from the charging station, and performs docking with the charging station. The distance between the robot and the charging station is precisely measured, thereby enabling smooth and correct docking of the robot with the charging station. Further, the robot emits infrared rays only while performing docking with the charging station, thus reducing the power consumption required for infrared ray emission, and a circuit in the charging station is woken up based on the infrared rays emitted from the robot, thus reducing power consumption of the charging station.
    Type: Application
    Filed: August 25, 2011
    Publication date: May 10, 2012
    Applicant: Samsung Electronics Co., Ltd.
    Inventors: Dong Hun LEE, Dong Min Shin
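The IR-plus-ultrasound ranging in the docking abstract above works because light travels effectively instantly compared with sound: the IR pulse synchronizes the two devices, and the delay until the station's ultrasonic reply arrives is one one-way acoustic travel time. A minimal sketch, assuming an idealized exchange with zero electronic latency:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def docking_distance(t_ir_emitted, t_ultrasound_detected):
    """Estimate robot-to-station distance from one IR/ultrasound exchange.

    The IR pulse is treated as arriving instantly (light is about a
    million times faster than sound), so the interval between emitting
    IR and detecting the station's ultrasonic reply is one one-way
    acoustic travel time.
    """
    dt = t_ultrasound_detected - t_ir_emitted
    return SPEED_OF_SOUND * dt

d = docking_distance(0.0, 0.01)  # a 10 ms delay corresponds to 3.43 m
```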
  • Patent number: 8175750
    Abstract: Motion information of a robot arm stored in a motion information database is acquired. A person manipulates the robot arm, and correction motion information at the time of the motion correction is acquired. An acquiring unit acquires environment information. A motion correction unit corrects the motion information while the robot arm is in motion. A control rule generating unit generates a control rule for allowing the robot arm to automatically operate based on the corrected motion information and the acquired environment information. The motion of the robot arm is controlled based on the generated control rule.
    Type: Grant
    Filed: April 28, 2011
    Date of Patent: May 8, 2012
    Assignee: Panasonic Corporation
    Inventor: Yuko Tsusaka
  • Publication number: 20120109377
    Abstract: Robotic, telerobotic, and/or telesurgical devices, systems, and methods take advantage of robotic structures and data to calculate changes in the focus of an image capture device in response to movement of the image capture device, a robotic end effector, or the like. As the size of an image of an object shown in the display device varies with changes in the separation distance between that object and the image capture device used to capture the image, a scale factor between a movement command input (such as movement of an input device) and a corresponding master/slave robotic movement command of the system may be changed. This may enhance the perceived correlation between the input commands and the robotic movements as they appear in the image presented to the system operator.
    Type: Application
    Filed: December 13, 2011
    Publication date: May 3, 2012
    Applicant: Intuitive Surgical Operations, Inc.
    Inventors: John D. Stern, Robert G. Younge, David S. Gere, Gunter D. Niemeyer
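The distance-dependent scale factor described in the Intuitive Surgical abstract above can be sketched as follows: since on-screen feature size shrinks roughly in proportion to camera-to-object separation, the master-to-slave motion scale is grown with separation so hand motion stays visually correlated with the motion seen in the image. The linear law, parameter names, and values are illustrative assumptions, not the patent's actual mapping.

```python
def motion_scale(separation, reference_separation=0.1, base_scale=0.5):
    """Scale factor applied to a master movement command.

    At the (hypothetical) reference separation the scale is base_scale;
    doubling the camera-to-object separation doubles the scale, keeping
    apparent on-screen motion per unit of hand motion roughly constant.
    """
    return base_scale * (separation / reference_separation)

def scaled_slave_motion(master_delta, separation):
    """Slave motion command produced from a master input displacement."""
    return master_delta * motion_scale(separation)

near = scaled_slave_motion(1.0, separation=0.05)  # close-up: finer motion
far = scaled_slave_motion(1.0, separation=0.20)   # far away: coarser motion
```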
  • Publication number: 20120109378
    Abstract: Disclosed are a robot refrigerator and a robot refrigerator system. The robot refrigerator can be remotely controlled. The robot refrigerator generates image information from a surrounding image and transmits the generated image information to a wireless communication device. The wireless communication device then remotely controls the robot refrigerator, or monitors and controls it in real time, so that the robot refrigerator can easily avoid obstacles and thus minimize its movement time. Thus, user convenience and system reliability can be improved.
    Type: Application
    Filed: August 6, 2010
    Publication date: May 3, 2012
    Inventors: Sangoh Kim, Sungil Park, Namgi Lee
  • Publication number: 20120109376
    Abstract: Disclosed are a robot cleaner and a method for controlling the same. The robot cleaner may prevent repeated executions of a cleaning operation by recognizing its position through an absolute position recognition unit, in the case that the cleaning operation is performed again after being forcibly stopped due to arbitrary causes. Further, the robot cleaner may resolve a position recognition error of a relative position recognition unit by using an image detection unit, and may effectively perform a cleaning operation based on a similarity between an image detected by the image detection unit and an image of the cleaning region. This may improve system efficiency and stability, and enhance the user's convenience.
    Type: Application
    Filed: October 25, 2011
    Publication date: May 3, 2012
    Inventors: Seongsu LEE, Yiebin KIM, Suuk CHOE, Donghoon YI, Seungmin BAEK
  • Publication number: 20120103367
    Abstract: A cleaning robot, a dirt recognition device thereof, and a cleaning method of the robot are disclosed. The recognition device includes an image collecting module and an image processing module. The image collecting module may be used for collecting the image information of the surface to be treated by the cleaning robot and sending the image information to the image processing module. The image processing module may divide the collected image information of the surface to be treated into N blocks, extract the image information of each block, and process the image information in order to determine the dirtiest surface to be treated, corresponding to one of the N blocks. Through the solution provided by the present invention, the cleaning robot can actively recognize dirt such as dust, so that it can get into the working area accurately and rapidly.
    Type: Application
    Filed: June 10, 2010
    Publication date: May 3, 2012
    Applicant: ECOVACS ROBOTICS (SUZHOU ) CO., LTD.
    Inventor: Jinju Tang
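The block-wise dirt recognition in the ECOVACS abstract above can be sketched as: split the surface image into a grid of blocks, score each block, and return the dirtiest one. Treating brighter pixels as dirt and using the mean intensity as the score are illustrative assumptions; the patent does not fix a particular dirt metric.

```python
def dirtiest_block(image, rows=2, cols=2):
    """Split a grayscale image (list of equal-length pixel rows) into
    rows*cols blocks and return the (row, col) index of the block with
    the highest mean intensity, treating brighter pixels as more dirt
    (an illustrative choice of metric).
    """
    h, w = len(image), len(image[0])
    bh, bw = h // rows, w // cols
    best, best_score = None, -1.0
    for r in range(rows):
        for c in range(cols):
            block = [image[y][x]
                     for y in range(r * bh, (r + 1) * bh)
                     for x in range(c * bw, (c + 1) * bw)]
            score = sum(block) / len(block)
            if score > best_score:
                best, best_score = (r, c), score
    return best

# Toy 4x4 image: the top-right quadrant carries the "dirt"
image = [[0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 0, 0],
         [0, 0, 0, 0]]
target = dirtiest_block(image)
```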
  • Publication number: 20120102374
    Abstract: A storage device testing system (100) includes at least one robotic arm (200) defining a first axis (205) substantially normal to a floor surface (10). The robotic arm is operable to rotate through a predetermined arc about, and extend radially from, the first axis. Multiple racks (300) are arranged around the robotic arm for servicing by the robotic arm. Each rack houses multiple test slots (310) that are each configured to receive a storage device transporter (550) configured to carry a storage device (500) for testing.
    Type: Application
    Filed: April 17, 2009
    Publication date: April 26, 2012
    Applicant: TERADYNE, INC.
    Inventors: Edward Garcia, Brian S. Merrow, Evgeny Polyakov, Walter Vahey, Eric L. Truebenbach
  • Publication number: 20120098958
    Abstract: According to the invention, the calibration measuring cycle is divided into several, particularly a plurality of partial cycles, with which one or more of the calibration measurements are associated. While maintaining the cycle, the partial cycles are now carried out in one of the positioning pauses such that the calibration measuring cycle is distributed over several, in particular a plurality of, positioning pauses and is integrated into the flow of the industrial process without interfering with the same.
    Type: Application
    Filed: June 24, 2010
    Publication date: April 26, 2012
    Applicant: LEICA GEOSYSTEMS AG
    Inventors: Bernhard Metzler, Bernd Walser, Beat Aebischer
  • Publication number: 20120098961
    Abstract: A shape measuring apparatus includes a laser emitter that emits a laser beam, a scanner that scans the laser beam emitted by the laser emitter over a region in which an object is placed, a camera that detects reflected light of the laser beam, a recognizer that performs three-dimensional measurement of the object on the basis of the detection result of the camera, and a controller that performs control so as to change a scanning range of the scanner in accordance with the region in which the object is placed, the region being detected by the camera.
    Type: Application
    Filed: September 21, 2011
    Publication date: April 26, 2012
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Hiroyuki HANDA, Ken Arie
  • Publication number: 20120101508
    Abstract: A movement compensating device of a surgical robot in which a surgical operation processing unit mounted with a surgical instrument is coupled to one end of a body section includes: an image information creating unit that creates image information corresponding to an image signal supplied from a camera unit; a recognition point information analyzing unit that creates analysis information on a distance and an angle between a recognition point recognized from image information pieces corresponding to a predetermined number of image frames and a predetermined reference point; a variation analyzing unit that creates variation information on the distance and angle between two continuously created analysis information pieces; and a control command creating and outputting unit that creates and outputs a control command for adjusting the position of the surgical operation processing unit so that the variation in distance and angle included in the variation information becomes 0 (zero).
    Type: Application
    Filed: October 19, 2011
    Publication date: April 26, 2012
    Inventors: Seung Wook CHOI, Min Kyu Lee, Dong Myung Min
  • Publication number: 20120095597
    Abstract: Provided is a robot cleaner, and more particularly a robot cleaner which detects whether a foreign material storage unit is separated. The robot cleaner includes a main body including a suction motor, a foreign material storage unit separably disposed within the main body, the foreign material storage unit storing foreign materials contained in sucked air, a foreign material cover for selectively shielding one side of the foreign material storage unit, and a detection unit for detecting whether the foreign material cover is opened.
    Type: Application
    Filed: July 6, 2009
    Publication date: April 19, 2012
    Inventors: Bong-Ju Kim, In-Bo Shim, Ji-Hoon Sung, Byung-Doo Yim, Sung-Guen Kim
  • Patent number: 8160745
    Abstract: A method for controlling a robot having at least one visual sensor. A target for a motion of the robot is defined. A motion control signal adapted for the robot reaching the target is calculated. A collision avoidance control signal based on the closest points of segments of the robot and a virtual object between the visual sensing means and the target is calculated. The motion control signal and the collision avoidance control signal are weighted and combined. The weight of the motion control signal is higher when a calculated collision risk is lower. The motion of the robot is controlled according to the combined signal so that no segment of the robot enters the space defined by the virtual object.
    Type: Grant
    Filed: March 21, 2008
    Date of Patent: April 17, 2012
    Assignee: Honda Research Institute Europe GmbH
    Inventor: Hisashi Sugiura
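The weighting scheme in the Honda collision-avoidance patent above, where the motion signal's weight is higher when the calculated collision risk is lower, can be sketched as a convex blend of the two control signals. The specific linear weighting law is an assumption for illustration; the patent only states the qualitative relationship.

```python
def blended_command(motion_cmd, avoidance_cmd, collision_risk):
    """Combine the target-reaching and collision-avoidance control signals.

    collision_risk is clamped to [0, 1]; the motion signal's weight is
    higher when the computed risk is lower, matching the patent's
    qualitative rule. The linear blend itself is an illustrative choice.
    """
    w_avoid = max(0.0, min(1.0, collision_risk))
    w_motion = 1.0 - w_avoid
    return [w_motion * m + w_avoid * a
            for m, a in zip(motion_cmd, avoidance_cmd)]

# With no risk, the pure motion command passes through; with high risk,
# the avoidance command dominates.
safe = blended_command([1.0, 0.0], [0.0, 1.0], collision_risk=0.0)
risky = blended_command([1.0, 0.0], [0.0, 1.0], collision_risk=0.9)
```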
  • Patent number: 8160747
    Abstract: Systems and methods are described for providing visual telepresence to an operator of a remotely controlled robot. The robot includes both video cameras and pose sensors. The system can also comprise a head-tracking sensor to monitor the orientation of the operator's head. These signals can be used to aim the video cameras. The controller receives both the video signals and the pose sensor signals from the robot, and optionally receives head-tracking signals from the head-tracking sensor. The controller stitches together the various video signals to form a composite video signal that maps to a robot view. The controller renders an image to a display from that portion of the composite video signal that maps to an operator view. The relationship of the operator view to the robot view is varied according to the signals from the pose sensors and the head-tracking sensor.
    Type: Grant
    Filed: October 24, 2008
    Date of Patent: April 17, 2012
    Assignee: Anybots, Inc.
    Inventors: Trevor Blackwell, Daniel Casner, Scott Wiley
  • Patent number: 8160746
    Abstract: A system and method for graphically allocating a robot's working space are provided. The system includes an image extractor, a task-allocating server and a robot. A graphic user interface (GUI) of the task-allocating server includes a robot's working scene area, a space attribute allocating area and a robot's task area. Thus, a user may assign one space area in the robot's working scene area a "wall" attribute, or another space area a "charging station" attribute. Meanwhile, by using the GUI, the user directly assigns the robot to execute a specific task at a certain area. Hence, the user or remote operator, through his or her recognition of the environment, enables the robot to provide safer and more effective service.
    Type: Grant
    Filed: February 15, 2008
    Date of Patent: April 17, 2012
    Assignee: Industrial Technology Research Institute
    Inventors: Wei-Han Wang, Ya-Hui Tsai, Yen-Chun Lin
  • Patent number: 8160748
    Abstract: A method and apparatus for removing wires from a bale includes a conveyor system for moving one or more bales and a de-wiring station positioned adjacent the conveyor system. The de-wiring station includes a robot with an end tool. A bale that is bound by one or more wires is transferred by the conveyor system to a position proximate the de-wiring station. The robot with end tool moves to sense the location of the wires, cut the wires, collect the wires and deposit the wires in a collection hopper.
    Type: Grant
    Filed: February 24, 2009
    Date of Patent: April 17, 2012
    Assignee: Automatic Handling International
    Inventors: Daniel J. Pienta, David M. Pienta
  • Publication number: 20120079670
    Abstract: A dust inflow sensing unit and a robot cleaner having the same. The dust inflow sensing unit includes a light emitting element to emit a beam having a transmission region, a light receiving element having a reception region overlapping the transmission region of the light emitting element, and a guide member to restrict the reception region of the light receiving element to a designated range until the reception region reaches the light emitting element.
    Type: Application
    Filed: September 23, 2011
    Publication date: April 5, 2012
    Applicant: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Sang Sik Yoon, Jun Pyo Hong, Hwi Chan Jang
  • Publication number: 20120082535
    Abstract: A submersible robot for operating a tool relative to a surface of an underwater structure has a tool holder movably mounted on a support assembly provided with a driving arrangement for movably holding the tool in operative position relative to the surface. Position and orientation of the support assembly relative to the surface is locked and adjusted by locking and levelling arrangements. A programmable control unit operates the driving, locking and levelling arrangements and the tool and receives measurements from a sensor unit. The control unit has an operation mode wherein a positioning of the robot is determined and controlled as function of an initial position for defining a first work area, and shifted positions of the robot for defining additional work areas, the work areas having overlapping portions with one another for tracking displacements of the robot relative to the surface of the structure using the sensor unit.
    Type: Application
    Filed: September 30, 2011
    Publication date: April 5, 2012
    Inventors: Luc PROVENCHER, Stéphan Gendron, René Morin, Michel Blain
  • Publication number: 20120072024
    Abstract: A robot system that includes a robot and a remote station. The remote station may be a personal computer coupled to the robot through a broadband network. A user at the remote station may receive both video and audio from a camera and a microphone of the robot, respectively. The remote station may include a visual display that displays both a first screen field and a second screen field. The first screen field may display a video image provided by a robot camera. The second screen field may display information such as patient records. The information from the second screen field may be moved to the first screen field and also transmitted to the robot for display by a robot monitor. The user at the remote station may annotate the information displayed by the robot monitor to provide a more active video-conferencing experience.
    Type: Application
    Filed: October 20, 2011
    Publication date: March 22, 2012
    Inventors: Yulun Wang, Charles S. Jordan, Jonathan Southard, Marco Pinter
  • Publication number: 20120072021
    Abstract: An object is highly precisely moved by an industrial robot to an end position by the following steps, which are repeated until the end position is reached within a specified tolerance: Recording a three-dimensional image by means of a 3-D image recording device. Determining the present position of the object in the spatial coordinate system from the position of the 3-D image recording device, the angular orientation of the 3-D image recording device detected by an angle measuring unit, the three-dimensional image, and the knowledge of features on the object. Calculating the position difference between the present position of the object and the end position. Calculating a new target position of the industrial robot, taking into consideration the compensation value, from the present position of the industrial robot and a value linked to the position difference. Moving the industrial robot to the new target position.
    Type: Application
    Filed: May 26, 2010
    Publication date: March 22, 2012
    Applicant: LEICA GEOSYSTEMS AG
    Inventors: Bernd Walser, Bernhard Metzler, Beat Aebischer, Knut Siercks, Bo Pettersson
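The measure-difference-move loop in the Leica abstract above can be sketched as a simple visual servoing iteration: measure the object's present position, compute the difference to the end position, command a corrective motion, and repeat until within tolerance. The 1-D plant, the proportional gain, and the callback names are illustrative assumptions.

```python
def servo_to_end_position(measure_pose, move_robot, end_pos,
                          tol=1e-3, gain=1.0, max_iter=100):
    """Repeat measure -> position difference -> move until within tolerance.

    measure_pose() returns the object's current position (a float in this
    1-D sketch); move_robot(step) commands a relative robot motion.
    Mirrors the patent's loop: image-based measurement, position
    difference, new target, move, repeat.
    """
    for _ in range(max_iter):
        diff = end_pos - measure_pose()
        if abs(diff) <= tol:
            return True
        move_robot(gain * diff)
    return False

# Toy 1-D plant where commanding a robot step directly shifts the object
pos = [0.0]
ok = servo_to_end_position(lambda: pos[0],
                           lambda step: pos.__setitem__(0, pos[0] + step),
                           end_pos=2.0)
```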
  • Publication number: 20120072023
    Abstract: A method of controlling a robot using a human-robot interface apparatus in two-way wireless communication with the robot includes displaying on a display interface a two-dimensional image, an object recognition support tool library, and an action support tool library. The method further includes receiving a selected object image representing a target object, comparing the selected object image with a plurality of registered object shape patterns, and automatically recognizing a registered object shape pattern associated with the target object if the target object is registered with the human-robot interface. The registered object shape pattern may be displayed on the display interface, and a selected object manipulation pattern selected from the action support tool library may be received. Control signals may be transmitted to the robot from the human-robot interface. Embodiments may also include human-robot apparatuses (HRI) programmed to remotely control a robot.
    Type: Application
    Filed: September 22, 2010
    Publication date: March 22, 2012
    Applicant: Toyota Motor Engineering & Manufacturing North America, Inc.
    Inventor: Yasuhiro Ota
  • Patent number: 8140188
    Abstract: An example method for allowing a robot to assist with a task, the task being carried out in an environment including one or more non-human objects each having associated object locations, comprises detecting one or more changes in object locations within the environment, predicting a task requirement (such as a future object location change, or task goal) by comparing the change in the object location with stored data, the stored data including object location changes associated with previously observed tasks; and providing robotic assistance to achieve the task requirement. Example apparatus are also disclosed.
    Type: Grant
    Filed: February 18, 2008
    Date of Patent: March 20, 2012
    Assignees: Toyota Motor Engineering & Manufacturing North America, Inc., Toyota Motor Corporation
    Inventors: Mori Takemitsu, Steven F. Kalik
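The task-prediction step in the Toyota patent above, comparing observed object-location changes against stored data from previously observed tasks, can be sketched as a nearest-match lookup. Scoring by simple set overlap is an illustrative stand-in for the patent's comparison, and all task names and change labels are hypothetical.

```python
def predict_task_requirement(observed, task_library):
    """Predict the ongoing task from observed object-location changes.

    observed     -- list of (object, location_change) pairs seen so far
    task_library -- dict mapping task name -> list of location changes
                    associated with that previously observed task
    Returns the task whose stored changes overlap the observation most
    (simple set overlap, an illustrative matching rule).
    """
    def overlap(stored):
        return len(set(observed) & set(stored))
    return max(task_library, key=lambda task: overlap(task_library[task]))

task = predict_task_requirement(
    observed=[("cup", "table->sink")],
    task_library={
        "wash_dishes": [("cup", "table->sink"), ("plate", "table->sink")],
        "set_table":   [("cup", "cupboard->table")],
    },
)
```

Having predicted the task, the robot could then assist with the remaining object-location changes that the matched task implies (here, moving the plate).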
  • Publication number: 20120059517
    Abstract: A system comprises: a measurement unit adapted to measure a position/orientation of at least one target object based on an image obtained by capturing the at least one target object; a selection unit adapted to select at least one grippable target object based on the position/orientation; a determination unit adapted to determine, as an object to be gripped, a grippable target object in a state with a highest priority from the at least one grippable target object based on priorities set in advance for states including gripping positions/directions; a gripping unit adapted to grip the object to be gripped in the state with the highest priority; and a changing unit adapted to change the state of the gripped object, to a state in which the gripped object is assembled to the other object.
    Type: Application
    Filed: September 6, 2011
    Publication date: March 8, 2012
    Applicant: CANON KABUSHIKI KAISHA
    Inventor: Osamu Nomura
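The priority-based selection in the Canon gripping abstract above can be sketched as: from the objects judged grippable by the position/orientation measurement, pick the one whose gripping state carries the highest pre-set priority. The state names and the lower-rank-is-higher-priority encoding are illustrative assumptions.

```python
def choose_grip(grippable, priorities):
    """Choose the object and gripping state to use.

    grippable  -- list of (object_id, state) pairs judged grippable from
                  the measured positions/orientations
    priorities -- dict mapping state -> priority rank set in advance
                  (lower rank = higher priority)
    Returns the grippable pair whose state has the highest priority.
    """
    return min(grippable, key=lambda pair: priorities[pair[1]])

# Hypothetical bin-picking scene: a top grip is preferred over a side grip
pick = choose_grip([("partA", "side_grip"), ("partB", "top_grip")],
                   priorities={"top_grip": 0, "side_grip": 1})
```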
  • Publication number: 20120051595
    Abstract: Disclosed are a mobile robot and a controlling method of the same. The mobile robot is capable of reducing position recognition error and performing precise position recognition, even in the occurrence of a change of external illumination, through geometric constraints, when recognizing its position using a low-quality camera (e.g., a camera having a low resolution). Furthermore, feature points may be extracted from images detected using a low-quality camera, and the feature points may be robustly matched with each other even in the occurrence of a change of external illumination, through geometric constraints due to feature lines. This may enhance the performance of the conventional camera-based position recognition method, which is susceptible to an illumination change, and improve the efficiency of the system.
    Type: Application
    Filed: August 26, 2011
    Publication date: March 1, 2012
    Inventors: Seongsu Lee, Sangik Na, Yiebin Kim, Seungmin Baek
  • Publication number: 20120053724
    Abstract: A robot system includes a manipulator; a work table arranged within a movement extent of the manipulator; an imaging unit for taking a two-dimensional image of the workpieces loaded on the work table; a workpiece supply unit for supplying workpieces onto the work table; and a control system for controlling operations of the manipulator and the imaging unit. The control system includes an imaging control unit for controlling the imaging unit to take the two-dimensional image of the workpieces loaded on the work table, a workpiece detecting unit for detecting a position and a posture of each of the workpieces loaded on the work table by comparing the two-dimensional image taken by the imaging unit with templates stored in advance, and a manipulator control unit for operating the manipulator to perform a work with respect to the workpieces detected by the workpiece detecting unit.
    Type: Application
    Filed: August 30, 2011
    Publication date: March 1, 2012
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Takeshi OKAMOTO, Kenji Matsufuji, Takurou Yano, Takuya Murayama, Yoshihisa Nagano
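The template-comparison step described in the abstract above can be illustrated with a minimal sketch. This is not the patent's actual matcher (which is unspecified and also estimates posture); it is a plain NumPy normalized cross-correlation search that returns the best-matching position of a stored template in a 2D image:

```python
import numpy as np

def match_template(image, template):
    """Slide the template over the image and return the (row, col) of the
    best normalized cross-correlation score, plus the score itself."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:  # skip constant patches (correlation undefined)
                continue
            score = (p * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Toy image: a bright 3x3 "workpiece" embedded at (4, 5) in a dark background.
image = np.zeros((10, 12))
blob = np.array([[1, 2, 1], [2, 5, 2], [1, 2, 1]], dtype=float)
image[4:7, 5:8] = blob
template = blob.copy()
pos, score = match_template(image, template)
print(pos)  # (4, 5)
```

A production system would use an optimized matcher (e.g. an FFT-based correlation) and search over rotations to recover posture as well as position.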
  • Publication number: 20120053728
    Abstract: The present invention relates to an object-learning robot and a corresponding method. The robot comprises a gripper (14) for holding an object (11) to be taught to the robot (10); an optical system (16) having a field of view for introducing the object (11) to the robot (10) and for observing the gripper (14) and the object (11) held by the gripper (14); an input device (26) for providing the robot (10) with an object identity of the object to be learned; a controller (24) for controlling the motion of the gripper (14) according to a predetermined movement pattern; and an image processing means (28) for analyzing image data obtained from the optical system (16) to identify the object (11) for association with the object identity. This enables the robot to learn the identity of new objects in a dynamic environment, even without an offline period for learning.
    Type: Application
    Filed: April 13, 2010
    Publication date: March 1, 2012
    Applicant: KONINKLIJKE PHILIPS ELECTRONICS N.V.
    Inventors: Boudewijn Theodorus, Harry Broers
  • Publication number: 20120053727
    Abstract: A technique for wholly recognizing the surrounding environment is provided by eliminating the unknown regions that arise when parts of the robot's own body block its view during operation. The robot of the present invention is provided with a body trunk including a head and a torso, at least one connected member that is connected to the body trunk by a joint in which a driving mechanism is provided, a body trunk side camera that is arranged on the body trunk, and a connected member side camera that is arranged on the connected member. Further, the robot is provided with a composite image creation unit that creates a composite image from a body trunk side image taken by the body trunk side camera and a connected member side image taken by the connected member side camera, such that a part of the body trunk side image is replaced with a part of the connected member side image so as to exclude the connected member from the body trunk side image.
    Type: Application
    Filed: July 21, 2011
    Publication date: March 1, 2012
    Inventors: Yuichiro Nakajima, Haeyeon Lee, Hideki Nomura
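The image-substitution idea in the entry above (replacing self-occluded pixels in the torso-camera view with pixels from the arm-camera view) reduces, at its core, to masked compositing. A minimal sketch, assuming the two views have already been registered into a common frame and an occlusion mask for the robot's own arm is available:

```python
import numpy as np

def composite(trunk_img, arm_img, occlusion_mask):
    """Build a composite view: wherever the mask marks the robot's own arm
    in the trunk-camera image, substitute the arm-camera pixels."""
    out = trunk_img.copy()
    out[occlusion_mask] = arm_img[occlusion_mask]
    return out

trunk = np.full((4, 4), 7)       # scene as seen from the torso camera
arm_view = np.full((4, 4), 3)    # same region as seen from the arm camera
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True            # pixels where the arm blocks the torso view
result = composite(trunk, arm_view, mask)
print(result[1, 1], result[0, 0])  # 3 7
```

The hard parts in practice are computing the mask from the arm's kinematics and warping the arm-camera image into the torso camera's viewpoint; both are outside this sketch.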
  • Publication number: 20120048208
    Abstract: In certain embodiments, a system includes a controller operable to access an image signal generated by a camera. The accessed image signal corresponds to one or more features of the rear of a dairy livestock. The controller is further operable to determine positions of each of the hind legs of the dairy livestock based on the accessed image signal. The controller is further operable to determine a position of an udder of the dairy livestock based on the accessed image signal and the determined positions of the hind legs of the dairy livestock. The controller is further operable to determine, based on the image signal and the determined position of the udder of the dairy livestock, a spray position from which a spray tool may apply disinfectant to the teats of the dairy livestock.
    Type: Application
    Filed: April 28, 2011
    Publication date: March 1, 2012
    Applicant: Technologies Holdings Corp.
    Inventors: Henk Hofman, Peter Willem van der Sluis, Ype Groensma
  • Publication number: 20120046820
    Abstract: Embodiments of the invention provide systems and methods for obstacle avoidance. In some embodiments, a robotically controlled vehicle capable of operating in one or more modes may be provided. Examples of such modes include teleoperation, waypoint navigation, follow, and manual mode. The vehicle may include an obstacle detection and avoidance system capable of being implemented with one or more of the vehicle modes. A control system may be provided to operate and control the vehicle in the one or more modes. The control system may include a robotic control unit and a vehicle control unit.
    Type: Application
    Filed: August 18, 2011
    Publication date: February 23, 2012
    Inventors: James ALLARD, Kathleen A. WIENHOLD, William Robert NORRIS, Anthony Francis CATALFANO
  • Publication number: 20120024142
    Abstract: A robotized arm (2) is installable on a vehicle (1) by a supporting plate (12), on which the arm is mounted. The vehicle includes a conventional passenger compartment (11), adapted to house operators. At the free end of the arm is supported an armament (3) and the robotized arm enables the armament to have at least four degrees of freedom in space.
    Type: Application
    Filed: May 25, 2011
    Publication date: February 2, 2012
    Applicant: OTO MELARA S.p.A.
    Inventor: Giuliano Franceschi
  • Publication number: 20120024091
    Abstract: A linear-motion telescopic mechanism according to the present invention includes a plurality of block members (22) by which an arbitrary arm length is achieved: the block members (22) are rigidly connected to each other so as to elongate a linear-motion telescopic joint (J3). Conversely, by separating the block members (22) one by one from the rigid alignment, the linear-motion telescopic joint (J3) is contracted. The block members (22) released from the rigid alignment remain serially connected, but not rigidly. That is, the released block members (22) can flex in any direction, and can therefore be housed compactly inside a support member (1).
    Type: Application
    Filed: December 17, 2009
    Publication date: February 2, 2012
    Applicant: KAWABUCHI MECHANICAL ENGINEERING LABORATORY, INC.
    Inventors: Ichiro Kawabuchi, Woo-Keun Yoon, Tetsuo Kotoku
  • Patent number: 8108072
    Abstract: In one embodiment of the invention, a method for a robotic system is disclosed to track one or more robotic instruments. The method includes generating kinematics information for the robotic instrument within a field of view of a camera; capturing image information in the field of view of the camera; and adaptively fusing the kinematics information and the image information together to determine pose information of the robotic instrument. Additionally disclosed is a robotic medical system with a tool tracking sub-system. The tool tracking sub-system receives raw kinematics information and video image information of the robotic instrument to generate corrected kinematics information for the robotic instrument by adaptively fusing the raw kinematics information and the video image information together.
    Type: Grant
    Filed: September 30, 2007
    Date of Patent: January 31, 2012
    Assignee: Intuitive Surgical Operations, Inc.
    Inventors: Wenyi Zhao, Christopher J J Hasser, William C. Nowlin, Brian D. Hoffman
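The "adaptive fusion" of kinematics and image information described in the entry above can be illustrated, in its simplest scalar form, by an inverse-variance-weighted (Kalman-style) update. This is only a sketch of the general principle, not the patent's method; the numbers are made up for illustration:

```python
def fuse(kin_est, kin_var, vis_est, vis_var):
    """Fuse a kinematics-based pose estimate with a vision-based one,
    weighting each by its inverse variance (the scalar Kalman update)."""
    k = kin_var / (kin_var + vis_var)          # gain: trust vision more when kinematics is noisy
    fused = kin_est + k * (vis_est - kin_est)  # corrected estimate
    fused_var = (1.0 - k) * kin_var            # uncertainty shrinks after fusion
    return fused, fused_var

# Kinematics puts the tool tip at 10.0 mm (noisy joints, variance 4.0);
# the camera puts it at 10.8 mm (variance 1.0). Fusion lands nearer the camera.
est, var = fuse(10.0, 4.0, 10.8, 1.0)
print(round(est, 2), round(var, 2))  # 10.64 0.8
```

A full tool tracker would apply this update per degree of freedom of the pose, with the gain adapted online to image quality and joint-encoder noise.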
  • Publication number: 20120019627
    Abstract: Disclosed are a mobile robot with a single camera, capable of cleaning its surroundings and of more precisely building a 3D map of those surroundings including a plurality of feature points, and a method by which the robot recognizes its 3D surroundings. According to the method, images of the surroundings are captured, and a preset number of particles associated with feature points of a first image are projected onto a second image, based on matching information for feature points extracted from the two sequentially captured images, thereby extracting 3D information about the surroundings.
    Type: Application
    Filed: March 12, 2010
    Publication date: January 26, 2012
    Inventors: Yoo-Jin Choi, Young-Gie Kim, Jeong-Suk Yoon, Seong-Soo Lee, Yie-Bin Kim, Seung-Min Baek, Sang-Ik Na, Su-Uk Choe, Dong-Hoon Yi, Jei-Hun Lee
  • Publication number: 20120016522
    Abstract: An automatic process for the identification and working of defects (12) on used tyres (6) is disclosed, comprising the steps of mounting the used tyre (6) on rotative support and centring means (7), starting the rotation of the tyre (6), automatically scanning the working surface (30) of the tyre (6) to identify the morphology of defects (12) by means of automatic defect-identification means (8, 41), comparing the morphology of the identified defect (12) with a virtual library of reference defects (12′) to select the working and the tool (9) to be used, loading the chosen tool (9) on an anthropomorphic robot (31), and executing the selected working on the defect (12) to form a crater (13) suitable for being coated, said steps being managed by a computerized control unit (11) comprising said virtual library of defects (12′).
    Type: Application
    Filed: March 24, 2010
    Publication date: January 19, 2012
    Inventor: Leonardo Cappuccini
  • Patent number: 8095237
    Abstract: A method of three-dimensional object location and guidance to allow robotic manipulation of an object with variable position and orientation using a sensor array which is a collection of one or more sensors capable of forming a single image.
    Type: Grant
    Filed: August 6, 2003
    Date of Patent: January 10, 2012
    Assignee: RoboticVISIONTech LLC
    Inventors: Babak Habibi, Simona Pescaru, Mohammad Sameti, Remus Florinel Boca
  • Publication number: 20120004775
    Abstract: A robot apparatus includes a robot mechanism having a plurality of joints, and actuators that drive joint axes of the robot mechanism. The robot apparatus includes a robot controller that controls the driving of the actuators based on a cost function that is a function of torque reference inputs for the actuators.
    Type: Application
    Filed: March 10, 2010
    Publication date: January 5, 2012
    Applicant: TOYOTA JIDOSHA KABUSHIKI KAISHA
    Inventor: Fukashi Andoh
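The entry above describes driving actuators by minimizing a cost function of the torque reference inputs. As a hedged single-joint illustration (not the patent's controller), one can penalize both acceleration-tracking error and torque magnitude, for which the quadratic cost J(τ) = (a_des − (τ − g)/m)² + λτ² has the closed-form minimizer below; the symbols m (effective inertia), g (gravity torque), and λ (torque penalty) are illustrative:

```python
def torque_from_cost(a_des, m, g, lam):
    """Minimize J(tau) = (a_des - (tau - g)/m)**2 + lam * tau**2 analytically.
    Setting dJ/dtau = 0 gives tau = (m*a_des + g) / (lam*m**2 + 1).
    lam = 0 recovers exact inverse dynamics tau = m*a_des + g; lam > 0
    trades tracking accuracy for smaller torque reference inputs."""
    return (m * a_des + g) / (lam * m * m + 1.0)

# With no torque penalty we get the exact inverse-dynamics torque...
print(torque_from_cost(2.0, 1.5, 0.5, 0.0))        # 3.5
# ...and a positive penalty shrinks the commanded torque.
print(torque_from_cost(2.0, 1.5, 0.5, 0.1) < 3.5)  # True
```

A multi-joint version replaces the scalars with the inertia matrix and a positive-definite weighting on the torque vector, solved as a small least-squares problem per control cycle.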
  • Publication number: 20120004145
    Abstract: An oligonucleotide spotting robot for spotting oligonucleotide probes into a silicon wafer on which an array of lab-on-a-chip (LOC) devices are fabricated, each LOC device having a digital memory for data related to the reagents loaded into the LOC device, the oligonucleotide dispensing robot having an array of reservoirs for containing the oligonucleotide probes suspended in a liquid, an array of ejectors, each of the ejectors being configured for fluid communication with a corresponding one of the reservoirs respectively, a mounting surface for detachably mounting the array of LOC devices for movement relative to the ejectors, and, a control processor for operative control of the ejectors and the mounting surface, wherein, the control processor is configured to activate the ejectors, move the ejectors selected for activation into registration with one or more of the LOC devices and download the data specifically relevant to each of the LOC devices into the digital memory of that LOC device.
    Type: Application
    Filed: June 1, 2011
    Publication date: January 5, 2012
    Inventors: Mehdi Azimi, Kia Silverbrook
  • Publication number: 20110320039
    Abstract: A robot calibration system includes a robot, a calibration tool, a plane calibration board, a camera, and a controller. The calibration tool is assembled to the robot and is controlled by the robot to move along a preset trajectory. The plane calibration board is located under the calibration tool and has a plurality of characteristic corner points on one surface thereof. The camera is configured for capturing an image of the plane calibration board. The controller is electrically connected to the robot and to the camera; it stores a preset control program for controlling the robot and the camera, and is configured for calibrating the robot. A method for calibrating a robot for use with such a robot calibration system is also disclosed.
    Type: Application
    Filed: March 22, 2011
    Publication date: December 29, 2011
    Applicants: HON HAI PRECISION INDUSTRY CO., LTD., HONG FU JIN PRECISION INDUSTRY (ShenZhen) CO., LTD.
    Inventors: YUAN-CHE HSU, DU-XUE ZHANG, SHUI-PING WEI
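The calibration entry above relates corner points observed by the camera to positions commanded by the robot. The core least-squares step of such a calibration can be sketched with the Kabsch (SVD) algorithm, shown here in 2D with synthetic points; the patent's actual procedure and parameterization are not specified, so this is only an illustration of the underlying math:

```python
import numpy as np

def best_rigid_transform(p_robot, p_camera):
    """Least-squares rotation R and translation t mapping camera-frame
    points onto robot-frame points (Kabsch/SVD algorithm, 2D)."""
    pr_c = p_robot - p_robot.mean(axis=0)    # center both point sets
    pc_c = p_camera - p_camera.mean(axis=0)
    H = pc_c.T @ pr_c                        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = p_robot.mean(axis=0) - R @ p_camera.mean(axis=0)
    return R, t

# Synthetic check: corner points rotated 90 degrees and shifted.
theta = np.pi / 2
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([2.0, -1.0])
corners = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
robot_pts = (R_true @ corners.T).T + t_true  # where the robot reports each corner
R, t = best_rigid_transform(robot_pts, corners)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

A real system works in 3D, first recovering the board's corner positions from the image via the camera intrinsics before solving for the camera-to-robot transform.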
  • Publication number: 20110320042
    Abstract: Methods and systems to improve operator control of mobile robots are disclosed. The invention comprises in various embodiments the aggregation of multiple image feeds to improve operator situational awareness and the dynamic selection of command reference frames to improve operator intuitive control. The disclosed methods and systems reduce operator workload, reduce task completion times, and extend the capabilities of mobile manipulation systems.
    Type: Application
    Filed: June 27, 2011
    Publication date: December 29, 2011
    Applicant: American Android Corp.
    Inventors: David A. Handelman, Haldun Komsuoglu, Gordon H. Franken
  • Publication number: 20110312854
    Abstract: An oligonucleotide spotting robot for spotting oligonucleotide probes into an array of lab-on-a-chip (LOC) devices, each having a digital memory for data related to the oligonucleotide probes loaded into that LOC device, the oligonucleotide dispensing robot having an array of reservoirs for containing the oligonucleotide probes suspended in a liquid, an array of ejectors, each of the ejectors being configured for fluid communication with a corresponding one of the reservoirs respectively, a mounting surface for detachably mounting the array of LOC devices for movement relative to the ejectors, and, a control processor for operative control of the ejectors and the mounting surface, wherein, the control processor is configured to activate the ejectors, move the ejectors selected for activation into registration with one or more of the LOC devices and download the data specifically relevant to each of the LOC devices into the digital memory of that LOC device.
    Type: Application
    Filed: June 1, 2011
    Publication date: December 22, 2011
    Inventors: Kia Silverbrook, Mehdi Azimi
  • Publication number: 20110311127
    Abstract: A motion space presentation device includes: a work area generation unit configured to generate a three-dimensional region in which the movable robot operates; an image capture unit configured to capture a real image; a position and posture detection unit configured to detect an image capture position and an image capture direction of the image capture unit; and an overlay unit configured to selectively superimpose either an image of a segment approximation model of the movable robot as viewed in the image capture direction from the image capture position, or an image of the three-dimensional region as viewed in the image capture direction from the image capture position, on the real image captured by the image capture unit, according to the difficulty of recognizing each image.
    Type: Application
    Filed: August 30, 2011
    Publication date: December 22, 2011
    Inventors: Kenji MIZUTANI, Taichi SATO