Optical Patents (Class 901/47)
  • Patent number: 9576832
    Abstract: An article transport vehicle is provided in which it is difficult for the transported article support portions to interfere with the supported portion of the transported article, while transmission of vibrations to the transported article is reduced, during travel of the travel member or during vertical movement of the vertically movable portion. A support unit, which is vertically moved relative to a travel member, is provided with a guiding supporting portion for guiding and supporting transported article support portions for supporting a supported portion of a transported article such that the transported article support portions can be moved to support positions and to retracted positions, and a damper element which is located between a vertically movable portion and the guiding supporting portion, receives load from the guiding supporting portion, and is elastically deformable in a vertical direction.
    Type: Grant
    Filed: November 10, 2015
    Date of Patent: February 21, 2017
    Assignee: Daifuku Co., Ltd.
    Inventor: Daichi Tomida
  • Patent number: 9043016
    Abstract: Certain embodiments of the present invention provide robotic control modules for use in a robotic control system of a vehicle, including structures, systems and methods, that can provide (i) a robotic control module that has multiple functional circuits, such as a processor and accompanying circuits, an actuator controller, an actuator amplifier, a packet network switch, and a power supply integrated into a mountable and/or stackable package/housing; (ii) a robotic control module with the noted complement of circuits that is configured to reduce heat, reduce space, and shield sensitive components from electromagnetic noise; (iii) a robotic control system utilizing robotic control modules whose functionality is sufficiently interchangeable to allow modules to be swapped; and (iv) a robotic control system that distributes the functionality and processing among a plurality of robotic control modules in a vehicle.
    Type: Grant
    Filed: October 20, 2006
    Date of Patent: May 26, 2015
    Assignees: Deere & Company, iRobot Corporation
    Inventors: Mikhail O. Filippov, Osa Fitch, Scott P. Keller, John O'Connor, David S. Zendzian, Nadim El Fata, Kevin Larsen, Arlen Eugene Meuchel, Mark David Schmaltz, James Allard, Chris A. De Roo, William Robert Norris, Andrew Julian Norby, Christopher David Glenn Turner
  • Patent number: 9039685
    Abstract: A surgical instrument is provided, including: at least one articulatable arm having a distal end, a proximal end, and at least one joint region disposed between the distal and proximal ends; an optical fiber bend sensor provided in the at least one joint region of the at least one articulatable arm; a detection system coupled to the optical fiber bend sensor, said detection system comprising a light source and a light detector for detecting light reflected by or transmitted through the optical fiber bend sensor to determine a position of at least one joint region of the at least one articulatable arm based on the detected light reflected by or transmitted through the optical fiber bend sensor; and a control system comprising a servo controller for effectuating movement of the arm.
    Type: Grant
    Filed: March 16, 2011
    Date of Patent: May 26, 2015
    Assignee: INTUITIVE SURGICAL OPERATIONS, INC.
    Inventors: David Q. Larkin, David C. Shafer
  • Publication number: 20150142252
    Abstract: A robotic device includes a housing configured to house a mobile device. The robotic device also includes an articulating image director aligned with a field of view of a camera of the mobile device. The housing of the robotic device is positioned at an angle to provide a forward view or rear facing view to the camera via the articulating image director.
    Type: Application
    Filed: July 31, 2014
    Publication date: May 21, 2015
    Inventor: Donald Bolden HUTSON
  • Publication number: 20150135459
    Abstract: Provided is a blade maintenance device for a wind turbine. The blade maintenance device includes: a body that travels along a leading edge of a blade; support units that extend from the body to both sides of the blade and support the sides of the blade; and a maintenance unit installed on at least one of the body and the support units so as to perform maintenance of an outer side of the blade.
    Type: Application
    Filed: August 22, 2012
    Publication date: May 21, 2015
    Applicant: SAMSUNG HEAVY IND. CO., LTD.
    Inventors: Byung Kyu Lee, Hong Gyeoum Kim, Jong Hwan Lee, Young Seok Cho, Young Youl Ha, In Chul Ha, Dong Ki Han
  • Publication number: 20150142171
    Abstract: Disclosed are methods adapted to calibrate a robot gripper to a camera. The method includes providing a robot with a coupled moveable gripper, providing one or more cameras, providing a target scene having one or more fixed target points, moving the gripper and capturing images of the target scene at two or more imaging locations, recording positions in the gripper coordinate system for each of the imaging locations, recording images in a camera coordinate system, and processing the images and positions to determine a gripper-to-camera transformation between the gripper coordinate system and the camera coordinate system. The transformation may be accomplished by nonlinear least-squares minimization, such as the Levenberg-Marquardt method. Robot calibration apparatus for carrying out the method are disclosed, as are other aspects.
    Type: Application
    Filed: August 10, 2012
    Publication date: May 21, 2015
    Applicant: SIEMENS HEALTHCARE DIAGNOSTICS INC.
    Inventors: Gang Li, Yakup Genc, Siddharth Chhatpar, Daniel Sacco, Sandeep Naik, Alexander Gelbman, Roy Barr
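    The entry above formulates gripper-to-camera calibration as a nonlinear least-squares problem (e.g. Levenberg-Marquardt). The patent publishes no code; as a minimal illustrative stand-in, the sketch below recovers a planar (2-D) rigid gripper-to-camera transform in closed form (Procrustes alignment), which minimizes the same sum-of-squared-residuals objective for this simplified case. Function names and the 2-D restriction are assumptions, not from the patent.

```python
import math

def fit_rigid_2d(gripper_pts, camera_pts):
    """Fit theta, tx, ty so that camera_pt = R(theta) @ gripper_pt + t,
    minimizing the sum of squared residuals (2-D Procrustes)."""
    n = len(gripper_pts)
    # Centroids of both point sets
    gx = sum(p[0] for p in gripper_pts) / n
    gy = sum(p[1] for p in gripper_pts) / n
    cx = sum(q[0] for q in camera_pts) / n
    cy = sum(q[1] for q in camera_pts) / n
    # Accumulate cross-covariance terms for the optimal rotation
    s_cos = s_sin = 0.0
    for (px, py), (qx, qy) in zip(gripper_pts, camera_pts):
        ax, ay = px - gx, py - gy
        bx, by = qx - cx, qy - cy
        s_cos += ax * bx + ay * by
        s_sin += ax * by - ay * bx
    theta = math.atan2(s_sin, s_cos)
    # Translation follows from mapping the gripper centroid to the camera centroid
    tx = cx - (math.cos(theta) * gx - math.sin(theta) * gy)
    ty = cy - (math.sin(theta) * gx + math.cos(theta) * gy)
    return theta, tx, ty
```

For the full 3-D, multi-camera case the patent describes, the residuals are nonlinear in the rotation parameters, which is where an iterative solver such as Levenberg-Marquardt comes in.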
  • Patent number: 9032811
    Abstract: A robot apparatus includes an arm that includes an outer skin and a detector that detects the deformation of the outer skin. The detector includes a sending unit that sends a signal, a receiving unit that receives the signal, and a transmission route that is provided along the outer skin so as to lead the signal. The detector detects the deformation of the outer skin based on whether a signal reaches the receiving unit.
    Type: Grant
    Filed: January 21, 2014
    Date of Patent: May 19, 2015
    Assignee: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Nobukazu Miyauchi, Tamio Nakamura, Zenta Nakamoto, Dai Kouno
  • Patent number: 9037336
    Abstract: A robot system includes a planar sign, a robot, a distance direction sensor, and a controller. The controller is configured to control the robot and includes a map data memory and a progress direction determining device. The map data memory is configured to store map data of a predetermined running path including a position of the planar sign. The progress direction determining device is configured to compare a detection result of the distance direction sensor and the stored map data so as to determine a progress direction of the robot.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: May 19, 2015
    Assignee: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Dai Kouno, Tamio Nakamura
  • Patent number: 9037296
    Abstract: Disclosed are a robot cleaner, and a system and method for remotely controlling the robot cleaner. Since the robot cleaner is connected to a terminal through a network, the robot cleaner can be controlled in more various manners. A situation inside a house can be checked in real time from the outside, and the situation can be rapidly handled according to a state of the robot cleaner. Since the terminal and the robot cleaner are connected to each other through a network, a voice call function and an interphone function can be implemented. Furthermore, the robot cleaner can perform automatic cleaning, manual cleaning, and reservation cleaning in an autonomous manner, or in a remote-controlled manner.
    Type: Grant
    Filed: September 6, 2012
    Date of Patent: May 19, 2015
    Assignee: LG Electronics Inc.
    Inventors: Suuk Choe, Sunghun Lee, Junho Jang, Hyunwoong Park, Seungmin Baek, Yiebin Kim
  • Publication number: 20150134115
    Abstract: A method of operating a robot includes receiving image data from an image capture device of the robot. The image data is representative of a glyph viewed by the image capture device on the display of a computing device within a field of view of the image capture device. The method further includes determining, at a controller, a command message based on the glyph represented in the image data and issuing a command to at least one resource or component of the robot based on the command message.
    Type: Application
    Filed: November 12, 2013
    Publication date: May 14, 2015
    Applicant: iRobot Corporation
    Inventors: Sherman Gong, Rogelio Neumann
  • Publication number: 20150134079
    Abstract: Provided are a walk-assistive robot and a method of controlling the same. The method of controlling the walk-assistive robot includes: obtaining ground information, that is, information regarding the ground in a walking direction; determining control patterns of the walk-assistive robot by analyzing the obtained ground information; and controlling the walk-assistive robot based on the determined control patterns.
    Type: Application
    Filed: August 28, 2014
    Publication date: May 14, 2015
    Inventors: Suk June YOON, Kyung Shik ROH, Young Bo SHIM, Young Do KWON, Sung Hwan AHN, Hyo Seok HWANG
  • Publication number: 20150131896
    Abstract: A safety monitoring system for human-machine symbiosis is provided, including a spatial image capturing unit, an image recognition unit, a human-robot-interaction safety monitoring unit, and a process monitoring unit. The spatial image capturing unit, disposed in a working area, acquires at least two skeleton images. The image recognition unit generates at least two spatial gesture images corresponding to the at least two skeleton images, based on information of changes in position of the at least two skeleton images with respect to time. The human-robot-interaction safety monitoring unit generates a gesture distribution based on the at least two spatial gesture images and a safety distance. The process monitoring unit determines whether the gesture distribution meets a safety criterion.
    Type: Application
    Filed: March 25, 2014
    Publication date: May 14, 2015
    Applicant: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE
    Inventors: Jhen-Jia Hu, Hau-Wei Wang, Chung-Ning Huang
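    The safety criterion in the entry above reduces to a concrete geometric check: the gesture distribution is safe only if tracked human joints stay at least a safety distance away from the robot. A minimal sketch of such a check (the function name and point-list representation are illustrative assumptions, not from the patent):

```python
def meets_safety_criterion(human_joints, robot_points, safety_distance):
    """Return True only if every tracked human joint keeps at least
    safety_distance from every monitored point on the robot."""
    def dist_sq(a, b):
        # Squared Euclidean distance avoids an unnecessary sqrt per pair
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return all(dist_sq(h, r) >= safety_distance ** 2
               for h in human_joints for r in robot_points)
```

In the patented system this check would run per frame on the skeleton-derived gesture distribution rather than on raw points.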
  • Patent number: 9031697
    Abstract: The present teachings provide a method of controlling a remote vehicle having an end effector and an image sensing device. The method includes obtaining an image of an object with the image sensing device, determining a ray from a focal point of the image to the object based on the obtained image, positioning the end effector of the remote vehicle to align with the determined ray, and moving the end effector along the determined ray to approach the object.
    Type: Grant
    Filed: April 16, 2012
    Date of Patent: May 12, 2015
    Assignee: iRobot Corporation
    Inventors: Wesley Hanan Huang, Emilie A. Phillips
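    The method above computes a ray from the camera's focal point through the imaged object, then moves the end effector along that ray. Under a standard pinhole-camera assumption (intrinsics fx, fy, cx, cy; these names are illustrative, not from the patent), the ray and a point along it can be sketched as:

```python
import math

def pixel_to_ray(u, v, fx, fy, cx, cy):
    """Unit ray direction in the camera frame through pixel (u, v),
    assuming a pinhole camera model with no lens distortion."""
    x = (u - cx) / fx
    y = (v - cy) / fy
    n = math.sqrt(x * x + y * y + 1.0)
    return (x / n, y / n, 1.0 / n)

def point_along_ray(origin, direction, s):
    """Point at distance s from origin along a unit ray direction."""
    return tuple(o + s * d for o, d in zip(origin, direction))
```

The end effector would then be servoed through successive points along the ray until it reaches the object.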
  • Publication number: 20150122183
    Abstract: A method and system to compensate for stray light errors in time of flight (TOF) camera systems uses reference targets in the field of view (FOV) that can be used to measure stray light. In different embodiments, one or more reference targets are used.
    Type: Application
    Filed: January 9, 2015
    Publication date: May 7, 2015
    Inventor: Thierry Oggier
  • Publication number: 20150127160
    Abstract: A robot includes a holding unit configured to hold an object, an image pickup unit, and a predetermined first portion. The image pickup unit picks up images of the holding unit and the object using the first portion as a background.
    Type: Application
    Filed: November 4, 2014
    Publication date: May 7, 2015
    Inventor: Yukihiro YAMAGUCHI
  • Publication number: 20150127156
    Abstract: A remote controlled robot with a head that supports a monitor and is coupled to a mobile platform. The mobile robot also includes an auxiliary camera coupled to the mobile platform by a boom. The mobile robot is controlled by a remote control station. By way of example, the robot can be remotely moved about an operating room. The auxiliary camera extends from the boom so that it provides a relatively close view of a patient or other item in the room. An assistant in the operating room may move the boom and the camera. The boom may be connected to a robot head that can be remotely moved by the remote control station.
    Type: Application
    Filed: January 12, 2015
    Publication date: May 7, 2015
    Inventors: Yulun Wang, Charles S. Jordan, Kevin Hanrahan, Daniel Sanchez, Marco Pinter
  • Publication number: 20150125035
    Abstract: To perform robust position and orientation measurement even in situations where noise exists, an image including a target object is obtained, an approximate position and orientation of the target object in the obtained image are obtained, information related to a shadow region of the target object in the obtained image is estimated, the approximate position and orientation are corrected on the basis of the estimated shadow-region information, and a position and orientation of the target object in the image are derived on the basis of the corrected approximate position and orientation and held model information.
    Type: Application
    Filed: November 3, 2014
    Publication date: May 7, 2015
    Inventors: Sonoko Miyatani, Kazuhiko Kobayashi
  • Publication number: 20150124242
    Abstract: A scanning optical range finder in a mobile robot includes an optical emitter circuit, a non-imaging optical element, an optical detector circuit, and a ranging circuit. The non-imaging optical element is arranged to receive optical signals at an entrance aperture thereof responsive to operation of the optical emitter circuit, and to direct the optical signals to an output aperture thereof. The optical detector circuit is configured to receive the optical signals from the output aperture of the non-imaging optical element, and to generate detection signals based on respective phase differences of the optical signals relative to corresponding outputs of the optical emitter circuit. The ranging circuit is configured to calculate a range of a target from the phase differences indicated by the detection signals. Related devices and methods of operation are also discussed.
    Type: Application
    Filed: October 31, 2014
    Publication date: May 7, 2015
    Inventors: Travis Pierce, Jamie Milliken, Marc Wilga
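    The ranging step described above, recovering range from the phase difference between emitted and received modulated light, has a standard closed form: d = c·Δφ / (4π·f_mod), with ranges repeating every half modulation wavelength. A hedged sketch of that relationship (the patent's actual circuit-level computation is not published here):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_phase(phase_rad, mod_freq_hz):
    """Range implied by a measured phase shift of the modulated signal:
    the light travels out and back, hence the factor of 4*pi."""
    return (C * phase_rad) / (4.0 * math.pi * mod_freq_hz)

def ambiguity_interval(mod_freq_hz):
    """Ranges wrap every half modulation wavelength (c / 2f)."""
    return C / (2.0 * mod_freq_hz)
```

At a 10 MHz modulation frequency, for example, the unambiguous interval is roughly 15 m, which is why practical scanners often combine multiple modulation frequencies.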
  • Publication number: 20150120055
    Abstract: An image acquisition unit acquires an image including an object, and a controller starts a visual servo using the acquired image, on the basis of at least one of an error in calibration, an error in installation of a robot, an error resulting from the rigidity of the robot, an error of a position where the robot has gripped the object, an error regarding imaging, and an error regarding a work environment. Additionally, the controller starts the visual servo when the distance between one point of a working unit of the robot and the object is equal to or greater than 2 mm.
    Type: Application
    Filed: October 29, 2014
    Publication date: April 30, 2015
    Inventors: Hiroshi MIYAZAWA, Yukihiro YAMAGUCHI, Nobuhiro KARITO
  • Publication number: 20150114236
    Abstract: A robotic cooking apparatus cooks dishes using a computer recipe provided by a user and ingredients stored within it, and can adjust the recipe based on user preference data. The cooking apparatus comprises an ingredient input section, cooking stations, and pre-processing and post-processing stations. The input section has trays and can be kept at room temperature or refrigerated for storing perishable ingredients. A robot head assembly is able to transfer, identify, and measure ingredients as well as take inventory periodically. The cooking apparatus is connected to a network and can be operated, and its operation monitored, remotely by a fixed or handheld device.
    Type: Application
    Filed: June 2, 2011
    Publication date: April 30, 2015
    Inventors: Shambhu Nath Roy, Riya Bhattacharya
  • Publication number: 20150120127
    Abstract: A mobile unit according to an embodiment includes a main body, a moving mechanism, a sensor, a recognizer, a first movement adjuster, a second landmark recognizer, and a second movement adjuster. The moving mechanism moves the main body. The sensor detects a distance and a direction to an object around the main body. The recognizer recognizes a landmark based on a detection result of the sensor. The first movement adjuster controls the moving mechanism such that the main body is moved to a target position based on the landmark. If the distance to the landmark has become smaller than a first threshold, the second landmark recognizer recognizes a second landmark. The second movement adjuster controls the moving mechanism such that the main body is moved to the target position based on the second landmark.
    Type: Application
    Filed: October 29, 2014
    Publication date: April 30, 2015
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Taku SHIKINA, Tamio NAKAMURA, Dai KOUNO, Takashi NISHIMURA
  • Publication number: 20150120057
    Abstract: A mobile robot including a robot body, a drive system supporting the robot body, and a controller in communication with the drive system. The robot also includes an actuator moving a portion of the robot body through a volume of space adjacent the mobile robot and a sensor pod in communication with the controller. The sensor pod includes a collar rotatably supported and having a curved wall formed at least partially as a surface of revolution about a vertical axis. The sensor pod also includes a volumetric point cloud sensor housed by the collar and observing the volume of space adjacent the robot from within the collar along an observation axis extending through the curved wall. A collar actuator rotates the collar and the volumetric point cloud sensor together about the collar axis.
    Type: Application
    Filed: December 30, 2014
    Publication date: April 30, 2015
    Applicant: iRobot Corporation
    Inventors: Cheuk Wah Wong, Eben Rauhut, Brian C. Benson, JR., Peter J. Lydon, Michael T. Rosenstein, Michael Halloran, Steven V. Shamlian, Chikyung Won, Mark Chiappetta, Justin H. Kearns, Orjeta Taka, Robert Todd Pack, Timothy S. Farlow, Jasper Fourways Vicenti
  • Publication number: 20150120047
    Abstract: A control device includes a reception unit that receives first operation information and second operation information different from the first operation information; and a process unit that instructs a robot to execute operations based on the first operation information and the second operation information using a plurality of captured images of an imaged target object, the images being captured multiple times while the robot moves from a first posture to a second posture different from the first posture.
    Type: Application
    Filed: October 22, 2014
    Publication date: April 30, 2015
    Inventors: Masaki MOTOYOSHI, Kenji ONDA, Hiroyuki KAWADA, Mitsuhiro INAZUMI
  • Publication number: 20150112482
    Abstract: A teaching system includes an image generating unit, a projecting unit, a work line generating unit, an arithmetic unit, and a job generating unit. The image generating unit generates a virtual image including a robot and a workpiece having a processed surface to be processed by the robot. The projecting unit generates a projection plane orthogonal to a normal direction of a desired point on the processed surface selected on the virtual image and projects the processed surface onto the projection plane. The work line generating unit generates a work line for the robot based on setting contents received via the projection plane. The arithmetic unit calculates a teaching value including the position and the posture of the robot at each of the target points. The job generating unit generates a job program for operating the robot in an actual configuration based on the teaching value.
    Type: Application
    Filed: September 12, 2014
    Publication date: April 23, 2015
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventor: Koichi KUWAHARA
  • Publication number: 20150107915
    Abstract: A low profile stepper robot is described and claimed herein. The robot includes a plurality of foot assemblies. Each foot assembly includes a suction cup, a vacuum generator, and a valve, with the vacuum generator being operationally connected to the suction cup. A conduit connects a source of operational fluid flow to the vacuum generators, and the valves allow or prevent fluid flow to the vacuum generators. Actuators are positioned between the foot assemblies and the robot base. The actuators provide for linear and rotational displacement of the foot assemblies, allowing the robot to walk and turn along an inspection surface.
    Type: Application
    Filed: October 21, 2014
    Publication date: April 23, 2015
    Inventors: S. William Glass, III, Bradley A. Thigpen, Robert A. Furter
  • Patent number: 9014848
    Abstract: A robot system includes a mobile robot having a controller executing a control system for controlling operation of the robot, a cloud computing service in communication with the controller of the robot, and a remote computing device in communication with the cloud computing service. The remote computing device communicates with the robot through the cloud computing service.
    Type: Grant
    Filed: February 22, 2011
    Date of Patent: April 21, 2015
    Assignee: iRobot Corporation
    Inventors: Timothy S. Farlow, Michael Rosenstein, Michael Halloran, Chikyung Won, Steven V. Shamlian, Mark Chiappetta
  • Patent number: 9010465
    Abstract: Devices, systems, and methods for inspecting and objectively analyzing the condition of a roof are presented. A vehicle adapted for traversing and inspecting an irregular terrain includes a chassis having a bottom surface that defines a higher ground clearance at an intermediate location, thereby keeping the center of mass low when crossing roof peaks. In another embodiment, the drive tracks include partially collapsible treads made of resilient foam. A system for inspecting a roof includes a lift system and a remote computer for analyzing data. Vehicles and systems may gather and analyze data, and generate revenue by providing data, analysis, and reports for a fee to interested parties.
    Type: Grant
    Filed: June 13, 2014
    Date of Patent: April 21, 2015
    Assignee: Tobor Technology, LLC
    Inventors: Michael D. Slawinski, Dennis L. Guthrie
  • Publication number: 20150105964
    Abstract: Techniques are presented that optimize performance of simultaneous localization and mapping (SLAM) processes for mobile devices, typically a mobile robot. In one embodiment, erroneous particles are introduced into the particle filtering process of localization. Monitoring the weights of the erroneous particles relative to the particles maintained for SLAM provides a verification that the robot is localized and detection that it is no longer localized. In another embodiment, cell-based grid mapping of a mobile robot's environment also monitors cells for changes in their probability of occupancy. Cells with a changing occupancy probability are marked as dynamic, and updating of such cells to the map is suspended or modified until their individual occupancy probabilities have stabilized.
    Type: Application
    Filed: November 17, 2014
    Publication date: April 16, 2015
    Applicant: NEATO ROBOTICS, INC.
    Inventors: Boris SOFMAN, Vladimir ERMAKOV, Mark EMMERICH, Steven ALEXANDER, Nathaniel David MONSON
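    The decoy-particle idea in the entry above can be reduced to a simple weight comparison: if the deliberately erroneous particles score nearly as well as the SLAM particles, the filter has lost its grip on the pose. The sketch below is a speculative reconstruction of that test, not the patent's actual method; the function name, averaging scheme, and ratio threshold are all assumptions.

```python
def localization_check(slam_weights, decoy_weights, ratio=5.0):
    """Deem the robot localized when its SLAM particles score substantially
    better, on average, than deliberately erroneous decoy particles."""
    avg_slam = sum(slam_weights) / len(slam_weights)
    avg_decoy = sum(decoy_weights) / len(decoy_weights)
    return avg_slam > ratio * avg_decoy
```

When the check fails, a SLAM system would typically trigger relocalization rather than continue updating the map from an untrusted pose.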
  • Publication number: 20150105907
    Abstract: A robot includes a control unit that controls a movable unit of the robot to move an end point of the movable unit closer to a target position, and an image acquisition unit that acquires a target image as an image containing the end point when the end point is in the target position, and a current image as an image containing the end point when the end point is in a current position. The control unit controls movement of the movable unit based on the current image and the target image and on output from a force detection unit that detects a force acting on the movable unit.
    Type: Application
    Filed: October 9, 2014
    Publication date: April 16, 2015
    Inventors: Seiji AISO, Hiroshi HASEGAWA, Mitsuhiro INAZUMI, Nobuhiro KARITO
  • Publication number: 20150105908
    Abstract: Systems and methods are provided for precise robotic operations without the need for special or task-specific components. In one implementation, a spatial adjustment system that is physically separate from the robotic manipulator supports the target workpiece and works in concert with the robotic manipulator to perform tasks with high spatial precision.
    Type: Application
    Filed: October 10, 2014
    Publication date: April 16, 2015
    Inventors: Yuri A. Ivanov, Rodney Brooks
  • Patent number: 9008757
    Abstract: Systems and methods that utilize optical sensors and non-optical sensors to determine the position and/or orientation of objects. A navigation system includes an optical sensor for receiving optical signals from markers and a non-optical sensor, such as a gyroscope, for generating non-optical data. A navigation computer determines positions and/or orientations of objects based on optical and non-optical data.
    Type: Grant
    Filed: September 24, 2013
    Date of Patent: April 14, 2015
    Assignee: Stryker Corporation
    Inventor: Chunwu Wu
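    The abstract above states only that positions and orientations are determined from optical data plus non-optical (gyroscope) data; the patent's actual fusion algorithm is not given here. As a generic illustration, a complementary filter is one common way to combine a drift-prone gyroscope rate with an absolute optical angle measurement. The function name and the 0.98 blend factor are assumptions, not taken from the patent.

```python
def complementary_update(angle, gyro_rate, optical_angle, dt, alpha=0.98):
    """One fusion step: integrate the gyro rate for a smooth, fast estimate,
    then pull it toward the absolute optical measurement to cancel drift."""
    predicted = angle + gyro_rate * dt  # dead-reckoned from the gyroscope
    return alpha * predicted + (1.0 - alpha) * optical_angle
```

Run per sample, the estimate tracks fast motion via the gyro while the small (1 - alpha) correction keeps it anchored to the optical markers between line-of-sight interruptions.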
  • Publication number: 20150100461
    Abstract: A mobile robotic system allows multiple users to visit authentic places without physically being there. Users with variable requirements are able to take part in controlling a single controllable device simultaneously; users take part in controlling the robot's movement according to their interests. A system administrator selects and defines criteria for the robot's movement; the mobile robot, with video and audio devices on it, is remotely controlled by a server which selects the robot's movement according to the users' and system administrator's criteria. The server provides information to users; the robot's location influences the content of the information. Such a robotic system may be used for shopping and for visiting museums and other public tourist attractions over the Internet.
    Type: Application
    Filed: October 4, 2013
    Publication date: April 9, 2015
    Inventors: Dan Baryakar, Andreea Baryakar
  • Publication number: 20150100157
    Abstract: A humanoid robot is provided, the robot being capable of holding a dialog with at least one user, the dialog using two modes of voice recognition, one open and the other closed, the closed mode being defined by a concept characterizing a dialog sequence. The dialog may also be influenced by events that are neither speech nor a text. The robot is capable of executing behaviors and generating expressions and emotions. It has the advantage of considerably reducing programming time and latency of execution of dialog sequences, providing a fluency and naturalness close to human dialogs.
    Type: Application
    Filed: April 3, 2013
    Publication date: April 9, 2015
    Inventors: David Houssin, Gwennael Gate
  • Publication number: 20150097084
    Abstract: A spacecraft system and method includes a platform with a dock and an umbilical payout device. A robot is connected to an umbilical paid out by the umbilical payout device and is repeatedly deployable from the dock. The robot includes one or more imagers, an inertial measurement unit, and a plurality of thrusters. A command module receives image data from the one or more robot imagers and orientation data from the inertial measurement unit. An object recognition module is configured to recognize one or more objects from the received image data. The command module determines the robot's orientation with respect to an object and issues thruster control commands to control movement of the robot based on the robot's orientation. The combination of the space platform and the robot on an umbilical line can be used for towing another object to a different orbital location, for inspection (including self-inspection of the robot-carrying platform), and for robotic servicing.
    Type: Application
    Filed: October 3, 2014
    Publication date: April 9, 2015
    Inventors: James J. Szabo, Vladimir Hruby, Thomas R. Roy, Craig DeLuccia, Jay Andrews, W. Dan Williams, Matthew R. Berlin, Jesse V. Gray
  • Patent number: 9002515
    Abstract: The present embodiments relate to a monitoring system for a medical device, wherein the medical device comprises a robot and an image recording part which can be moved by the robot. Provision is made for a radiation source which is attached to the medical device, and for a radiation receiver which is situated remotely from the medical device and is for receiving radiation that is emitted from the radiation source. A comparison entity compares the point of impact of radiation on the radiation receiver with one or more predetermined points of impact of radiation on the radiation receiver. The invention further relates to a corresponding method for monitoring a medical device.
    Type: Grant
    Filed: January 22, 2010
    Date of Patent: April 7, 2015
    Assignee: Siemens Aktiengesellschaft
    Inventors: Oliver Hornung, Donal Medlar
  • Patent number: 9002098
    Abstract: Described is a robotic visual perception system for determining a position and pose of a three-dimensional object. The system receives an external input to select an object of interest. The system also receives visual input from a sensor of a robotic controller that senses the object of interest. Rotation-invariant shape features and appearance are extracted from the sensed object of interest and a set of object templates. A match is identified between the sensed object of interest and an object template using shape features. The match between the sensed object of interest and the object template is confirmed using appearance features. The sensed object is then identified, and a three-dimensional pose of the sensed object of interest is determined. Based on the determined three-dimensional pose of the sensed object, the robotic controller is used to grasp and manipulate the sensed object of interest.
    Type: Grant
    Filed: December 19, 2012
    Date of Patent: April 7, 2015
    Assignee: HRL Laboratories, LLC
    Inventors: Suhas E. Chelian, Rashmi N. Sundareswara, Heiko Hoffmann
  • Publication number: 20150094851
    Abstract: A robot control system detects a position and a direction of each user by a plurality of range image sensors provided in an exhibition hall. A central controller records a user's inspection actions, from the time the user enters the exhibition hall until the user leaves, to generate an inspection action table. When the user attends again, the central controller reads the history of inspection actions from the inspection action table. The central controller then chooses from an utterance content table an utterance content containing a phrase that mentions an inspection action included in the history from the previous attendance, determines the utterance content, and makes a robot output the determined utterance content to the user.
    Type: Application
    Filed: September 26, 2014
    Publication date: April 2, 2015
    Applicant: HONDA MOTOR CO., LTD.
    Inventors: Koji Kawabe, Taro Yokoyama, Takayuki Kanda, Satoru Satake, Takamasa Iio, Kotaro Hayashi, Florent Ferreri
  • Publication number: 20150094856
    Abstract: A robotic control method for a camera (30) having an optical view and a robot (40) having an end-effector (42) and one or more joints (41) for maneuvering the end-effector (42). The robotic control method involves acquisition of a digital video frame (32) illustrating an image as optically viewed by the camera (30), and execution of visual servoing for controlling a pose of the end-effector (42) relative to an image feature within the digital video frame (32).
    Type: Application
    Filed: December 18, 2014
    Publication date: April 2, 2015
    Inventors: ALEKSANDRA POPOVIC, PAUL THIENPHRAPA
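The core of visual servoing is driving an image-space error toward zero. A minimal proportional step, with gain and camera geometry as illustrative assumptions rather than anything from the publication, might look like:

```python
def visual_servo_step(feature_px, target_px, gain=0.1):
    """One step of a proportional image-based visual-servoing law:
    command an in-image velocity proportional to the pixel error
    between the tracked feature and its target position."""
    ex = target_px[0] - feature_px[0]
    ey = target_px[1] - feature_px[1]
    return (gain * ex, gain * ey)
```

Iterating this step moves the feature toward the target; a full system would map the image-space velocity through the camera model and robot Jacobian into joint commands.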
  • Publication number: 20150089880
    Abstract: A method of manufacturing PDC drill bits includes inspecting a plurality of cutters. The method further includes inspecting a plurality of pockets of a bit body. A cutter of the plurality of cutters is assigned to a pocket of the plurality of pockets based on the inspection of the plurality of cutters and the inspection of the plurality of pockets. A robot positions the cutter inside the pocket and applies heat to a brazing material to produce a molten brazing material within the pocket.
    Type: Application
    Filed: September 27, 2013
    Publication date: April 2, 2015
    Applicant: VAREL EUROPE S.A.S.
    Inventors: Alfazazi Dourfaye, Gary M. Thigpen
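The cutter-to-pocket assignment can be framed as a small matching problem. The sketch below brute-forces the assignment that minimizes total diameter mismatch; the mismatch criterion is a hypothetical stand-in, since the application only states that assignment is based on both inspections:

```python
from itertools import permutations

def assign_cutters(cutter_diams, pocket_diams):
    """Assign each inspected cutter to an inspected pocket by searching
    all permutations and minimizing total absolute diameter mismatch.
    Returns (mapping of cutter index -> pocket index, total mismatch)."""
    n = len(cutter_diams)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        cost = sum(abs(cutter_diams[i] - pocket_diams[perm[i]])
                   for i in range(n))
        if cost < best_cost:
            best_perm, best_cost = perm, cost
    return dict(enumerate(best_perm)), best_cost
```

Brute force is fine for the dozens of cutters on a bit; a production system with larger counts would use the Hungarian algorithm instead.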
  • Publication number: 20150094854
    Abstract: A system including a mobile telepresence robot, a telepresence computing device in wireless communication with the robot, and a host computing device in wireless communication with the robot and the telepresence computing device. The host computing device relays User Datagram Protocol traffic between the robot and the telepresence computing device through a firewall.
    Type: Application
    Filed: October 13, 2014
    Publication date: April 2, 2015
    Applicant: iRobot Corporation
    Inventors: Mathew Cross, Tony L. Campbell
  • Publication number: 20150094855
    Abstract: The present invention concerns an imitation learning method for a multi-axis manipulator (7,7′). This method comprises the steps of capturing, at a set of successive waypoints (10,11) in a teach-in trajectory (4) of a user-operated training tool, spatial data comprising position and orientation of the training tool (3) in a Cartesian space; selecting, from among said set of successive waypoints (10,11), a subset of waypoints (11) starting from a first waypoint (11) of said set of successive waypoints (10,11), wherein for each subsequent waypoint (11) to be selected a difference in position and/or orientation with respect to a last previously selected waypoint (11) exceeds a predetermined threshold; fitting a set trajectory (4′) in said Cartesian space to said selected subset of waypoints (11); and converting said set trajectory into motion commands in a joint space of said multi-axis manipulator (7,7′).
    Type: Application
    Filed: May 3, 2013
    Publication date: April 2, 2015
    Inventors: Jérôme Chemouny, Stéphane Clerambault, Samuel Pinault
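The waypoint-selection step the abstract describes (keep the first waypoint, then keep each waypoint whose position or orientation differs from the last kept one by more than a threshold) can be sketched as below; waypoints are simplified to (x, y, z, yaw) tuples rather than full position-plus-orientation data:

```python
import math

def select_waypoints(waypoints, pos_thresh=0.05, ang_thresh=0.1):
    """Subsample a teach-in trajectory: always keep the first waypoint,
    then keep each waypoint whose position OR orientation differs from
    the last *kept* waypoint by more than a threshold."""
    selected = [waypoints[0]]
    for wp in waypoints[1:]:
        last = selected[-1]
        if (math.dist(wp[:3], last[:3]) > pos_thresh
                or abs(wp[3] - last[3]) > ang_thresh):
            selected.append(wp)
    return selected
```

Comparing against the last *kept* waypoint (rather than the immediately preceding raw sample) is what bounds the spacing of the subset: slow, dense motion collapses to a few waypoints, while fast motion keeps more.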
  • Patent number: 8996175
    Abstract: Robots may manipulate objects based on sensor input about the objects and/or the environment in conjunction with data structures representing primitive tasks and, in some embodiments, objects and/or locations associated therewith. The data structures may be created by instantiating respective prototypes during training by a human trainer.
    Type: Grant
    Filed: September 17, 2012
    Date of Patent: March 31, 2015
    Assignee: Rethink Robotics, Inc.
    Inventors: Bruce Blumberg, Rodney Brooks, Christopher J. Buehler, Noelle Dye, Gerry Ens, Natan Linder, Michael Siracusa, Michael Sussman, Matthew M. Williamson
  • Patent number: 8996174
    Abstract: In accordance with various embodiments, a user interface embedded into a robot facilitates robot training via direct and intuitive physical interactions.
    Type: Grant
    Filed: September 17, 2012
    Date of Patent: March 31, 2015
    Assignee: Rethink Robotics, Inc.
    Inventors: Rodney Brooks, Bruce Blumberg, Noelle Dye, Paula Long
  • Patent number: 8996167
    Abstract: In accordance with various embodiments, a user interface embedded into a robot facilitates robot training via direct and intuitive physical interactions. In some embodiments, the user interface includes a wrist cuff that, when grasped by the user, switches the robot into zero-force gravity-compensated mode.
    Type: Grant
    Filed: September 17, 2012
    Date of Patent: March 31, 2015
    Assignee: Rethink Robotics, Inc.
    Inventors: Natan Linder, Rodney Brooks, Michael Sussman, Bruce Blumberg, Noelle Dye, Michael Caine, Elaine Y. Chen
  • Patent number: 8996228
    Abstract: Methods and systems for construction zone object detection are described. A computing device may be configured to receive, from a LIDAR, a 3D point cloud of a road on which a vehicle is travelling. The 3D point cloud may comprise points corresponding to light reflected from objects on the road. Also, the computing device may be configured to determine sets of points in the 3D point cloud representing an area within a threshold distance from a surface of the road. Further, the computing device may be configured to identify construction zone objects in the sets of points. Further, the computing device may be configured to determine a likelihood of existence of a construction zone, based on the identification. Based on the likelihood, the computing device may be configured to modify a control strategy of the vehicle; and control the vehicle based on the modified control strategy.
    Type: Grant
    Filed: September 5, 2012
    Date of Patent: March 31, 2015
    Assignee: Google Inc.
    Inventors: David Ian Ferguson, Dirk Haehnel, Ian Mahon
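The first filtering step the abstract describes, keeping only points within a threshold distance of the road surface, can be sketched as below; a flat road at a known elevation is assumed for illustration, whereas the patent would estimate the actual road surface:

```python
import numpy as np

def points_near_road(cloud, threshold=2.0, road_z=0.0):
    """Filter a LIDAR point cloud (N x 3 array of x, y, z) down to the
    points within `threshold` meters above the road surface, before any
    construction-zone objects (cones, barrels, signs) are identified."""
    dz = cloud[:, 2] - road_z
    return cloud[(dz >= 0.0) & (dz <= threshold)]
```

This crops away overpasses, tree canopy, and below-road returns so that the later object-identification stage only considers points where cones and barriers can physically sit.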
  • Publication number: 20150088310
    Abstract: Devices, systems, and methods for social behavior of a telepresence robot are disclosed herein. A telepresence robot may include a drive system, a control system, an object detection system, and a social behaviors component. The drive system is configured to move the telepresence robot. The control system is configured to control the drive system to drive the telepresence robot around a work area. The object detection system is configured to detect a human in proximity to the telepresence robot. The social behaviors component is configured to provide instructions to the control system to cause the telepresence robot to operate according to a first set of rules when a presence of one or more humans is not detected and operate according to a second set of rules when the presence of one or more humans is detected.
    Type: Application
    Filed: November 21, 2014
    Publication date: March 26, 2015
    Inventors: Marco Pinter, Fuji Lai, Daniel Steven Sanchez, James Ballantyne, David Bjorn Roe, Yulun Wang, Charles S. Jordan, Orjeta Taka, Cheuk Wah Wong
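The rule-set switching the abstract describes can be sketched as a simple selector; the specific rule values (speeds, clearances) are illustrative assumptions, not from the publication:

```python
class SocialBehaviors:
    """Two rule sets selected by human presence: one for operating with
    no human detected, a more conservative one when a human is nearby."""
    SOLO = {"max_speed": 1.0, "min_clearance": 0.3}    # m/s, m
    SOCIAL = {"max_speed": 0.5, "min_clearance": 1.0}

    def rules_for(self, humans_detected):
        """Return the active rule set for the control system."""
        return self.SOCIAL if humans_detected else self.SOLO
```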
  • Patent number: 8983776
    Abstract: A robotic apparatus for traversing a selected area autonomously that senses orientation relative to “environmental” signals. The robotic apparatus is provided in two models, a master that can record directive and “environmental signal” readings, or that can record received location information, to provide at least one command recorded on a machine-readable medium representing an instruction for traversing an area of interest, and a slave that lacks the recording capability. Both master and slave models can replay recorded commands, and compare the expected orientation from the command with an actual orientation sensed during autonomous operation. If an error exceeding a predetermined value is observed, a corrective action is taken. The robotic apparatus is able to utilize a tool to perform a task.
    Type: Grant
    Filed: April 26, 2007
    Date of Patent: March 17, 2015
    Inventor: Jason A. Dean
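The replay-and-correct loop the abstract describes compares the orientation expected from a recorded command with the orientation actually sensed, and acts only when the error exceeds a limit. A sketch of that comparison, with the threshold and the linear correction as illustrative assumptions:

```python
def heading_correction(expected_deg, actual_deg, max_error_deg=5.0):
    """Compare the heading expected from a replayed command with the
    sensed heading; return a corrective turn (degrees) when the wrapped
    error exceeds the threshold, else 0.0 (no corrective action)."""
    # Wrap the error into (-180, 180] so 350 vs 5 degrees reads as +15.
    err = (actual_deg - expected_deg + 180.0) % 360.0 - 180.0
    return -err if abs(err) > max_error_deg else 0.0
```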
  • Publication number: 20150073646
    Abstract: A mobile robot that includes a drive system, a controller in communication with the drive system, and a volumetric point cloud imaging device supported above the drive system at a height of greater than about one foot above the ground and directed so as to be capable of obtaining a point cloud from a volume of space that includes a floor plane in a direction of movement of the mobile robot. The controller receives point cloud signals from the imaging device and issues drive commands to the drive system based at least in part on the received point cloud signals.
    Type: Application
    Filed: November 14, 2014
    Publication date: March 12, 2015
    Applicant: iRobot Corporation
    Inventors: Michael T. Rosenstein, Chikyung Won, Michael Halloran, Steven V. Shamlian, Mark Chiappetta
  • Publication number: 20150073595
    Abstract: A control apparatus for a master slave robot causes a force information correcting unit to correct force information in accordance with a feature of a target object on a screen, using target object information calculated by a target object information calculation unit. An operator can thus apply appropriate force while watching a picture projected on a display to perform a task.
    Type: Application
    Filed: September 2, 2014
    Publication date: March 12, 2015
    Inventors: Yudai FUDABA, Yuko TSUSAKA
  • Publication number: 20150073596
    Abstract: A master slave robot receives force presentation according to a picture watched by the operator operating the master slave robot. The control apparatus for the master slave robot causes a force information correcting unit to correct force information in accordance with magnification percentage information acquired by a displayed information acquiring unit, such that the force information is increased as the magnification percentage becomes larger. The operator can thus apply appropriate force while watching the picture projected on a display to perform a task.
    Type: Application
    Filed: September 3, 2014
    Publication date: March 12, 2015
    Inventors: Yudai FUDABA, Yuko TSUSAKA
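The correction the abstract describes, presented force growing with display magnification, can be sketched as below; a simple linear law with a tunable gain is an illustrative assumption, as the publication does not specify the scaling function:

```python
def corrected_force(raw_force_n, magnification, gain=1.0):
    """Scale the force (newtons) presented to the operator so that it
    increases with the display magnification percentage/factor."""
    if magnification <= 0:
        raise ValueError("magnification must be positive")
    return gain * magnification * raw_force_n
```

Scaling force with zoom keeps the operator's perceived stiffness consistent with what the enlarged picture suggests, so fine work under high magnification feels proportionally firmer.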