Optical Patents (Class 901/47)
  • Patent number: 11925513
    Abstract: Systems and methods for camera control within surgical robotic systems are provided. One system includes a computing device, multiple robot assemblies, and a surgeon console. Each robot assembly among the multiple robot assemblies includes a robotic arm. A robotic arm of a first robot assembly is coupled to an image capture device. Robotic arms of at least a subset of robot assemblies, different from the first robot assembly, are coupled to surgical instruments. The surgeon console includes multiple handles, each communicatively coupled to a robot assembly coupled to a surgical instrument. The surgeon console is configured to transmit to the computing device one or more packets that include data related to a movement of at least one handle. The computing device is configured to calculate a new position of the image capture device and transmit instructions to the first robot assembly to move the image capture device to the new position.
    Type: Grant
    Filed: March 2, 2023
    Date of Patent: March 12, 2024
    Assignee: COVIDIEN LP
    Inventors: William Peine, Albert Dvornik
  • Patent number: 11922645
    Abstract: Disclosed is a system and method for operating an imaging system. The imaging system may move or be moved to acquire image data of a subject at different positions relative to the subject. The image data may, thereafter, be combined to form a single image.
    Type: Grant
    Filed: March 18, 2021
    Date of Patent: March 5, 2024
    Assignee: Medtronic Navigation, Inc.
    Inventors: Xavier Tomas Fernandez, Andre Souza, Robert A. Simpson, Kyo C. Jin, Hong Li, Xiaodong Tao, Patrick A. Helm, Michael P. Marrama
  • Patent number: 11904481
    Abstract: A remote control system includes: an imaging unit that shoots an environment in which a device to be operated including an end effector is located; a recognition unit that recognizes objects that can be grasped by the end effector based on a shot image of the environment shot by the imaging unit; an operation terminal that displays the shot image and receives handwritten input information input to the displayed shot image; and an estimation unit that, based on the objects that can be grasped and the handwritten input information input to the shot image, estimates an object to be grasped which has been requested to be grasped by the end effector from among the objects that can be grasped and estimates a way of performing a grasping motion by the end effector, the grasping motion having been requested to be performed with regard to the object to be grasped.
    Type: Grant
    Filed: November 3, 2020
    Date of Patent: February 20, 2024
    Assignee: TOYOTA JIDOSHA KABUSHIKI KAISHA
    Inventor: Takashi Yamamoto
  • Patent number: 11897144
    Abstract: A lab system accesses a first protocol for performance by a first robot in a first lab. The first protocol includes a set of steps, each associated with an operation, reagent, and equipment. For each of one or more steps, the lab system modifies the step by: (1) identifying one or more replacement operations that achieve an equivalent or substantially similar result as a performance of the operation, (2) identifying replacement equipment that operates substantially similarly to the equipment, and/or (3) identifying one or more replacement reagents that, when substituted for the reagent, do not substantially affect the performance of the step. The lab system generates a modified protocol by replacing one or more of the set of steps with the modified steps. The lab system selects a second lab including a second robot and configures the second robot to perform the modified protocol in the second lab.
    Type: Grant
    Filed: August 2, 2021
    Date of Patent: February 13, 2024
    Assignee: Artificial, Inc.
    Inventors: Jeff Washington, Geoffrey J. Budd, Nikhita Singh, Jake Sganga, Alexander Li Honda
  • Patent number: 11787061
    Abstract: A method of operating a mobile robot includes displaying, on a display unit, a photographing menu including a first photographing item for allowing the mobile robot to perform a specific motion for photographing and a second photographing item for allowing the mobile robot to photograph a user, displaying a screen for guiding a motion setting of the mobile robot on the display unit when the first photographing item is selected, performing, by the mobile robot, a corresponding motion based on an input motion setting for a first reference time when the motion setting of the mobile robot is input, and displaying a result screen on the display unit after the first reference time has elapsed.
    Type: Grant
    Filed: May 5, 2022
    Date of Patent: October 17, 2023
    Assignee: LG ELECTRONICS INC.
    Inventors: Junhee Yeo, Seunghee Kim, Yongjae Kim
  • Patent number: 11763923
    Abstract: Systems and methods for enabling determination and notification of an omitted event in a surgical procedure are disclosed. A system may include at least one processor configured to implement a method including accessing frames of video captured during a specific surgical procedure and accessing stored data identifying a recommended sequence of events for the surgical procedure. The method may include comparing the accessed frames with the recommended sequence of events to identify an indication of a deviation between the specific surgical procedure and the recommended sequence of events for the surgical procedure. The method may include determining a name of an intraoperative surgical event associated with the deviation and providing a notification of the deviation including the name of the intraoperative surgical event associated with the deviation.
    Type: Grant
    Filed: May 31, 2022
    Date of Patent: September 19, 2023
    Assignee: THEATOR INC.
    Inventors: Tamir Wolf, Dotan Asselmann
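The comparison this abstract describes, matching observed surgical events against a recommended sequence to surface an omitted step, can be sketched as a simple in-order subsequence check. This is only an illustration of the final comparison step, not the patented method; the function and event names are assumptions.

```python
# Minimal sketch: report the first recommended event that never appears,
# in order, among the events detected from video. Names are illustrative.
def first_omitted_event(recommended, detected):
    """Return the first recommended event missing from the detected order."""
    it = iter(detected)
    for event in recommended:
        for seen in it:
            if seen == event:
                break
        else:  # detected events exhausted before this event appeared
            return event
    return None

# A detected sequence that skips "irrigation":
omitted = first_omitted_event(
    ["incision", "irrigation", "closure"],
    ["incision", "closure"],
)
```

A real system would derive the detected sequence from video frame analysis; the subsequence walk above is only the deviation-identification step at the end.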
  • Patent number: 11732440
    Abstract: Provided is a system that can supply appropriate information to one operator so that the operator can grasp how a working machine is being remotely operated by another operator. In a first remote operation apparatus 10, a sight line detector 112 detects a sight line of a first operator and transmits sight line detection data corresponding to that sight line. In a second remote operation apparatus 20, a second image output device 221 displays a designated image region, which spreads with reference to the first operator's sight line as indicated by the sight line detection data, within an environmental image corresponding to captured image data acquired by an imaging device 401 of a working machine 40, in a form different from that of the image region surrounding it.
    Type: Grant
    Filed: December 2, 2019
    Date of Patent: August 22, 2023
    Assignee: Kobelco Construction Machinery Co., Ltd.
    Inventors: Masaki Otani, Hitoshi Sasaki, Seiji Saiki, Yoichiro Yamazaki
  • Patent number: 11623400
    Abstract: A modular light for removably attaching to a bioprinting robot end effector, where the light includes: an annular modular light ring housing with an annular opening for receiving the end effector of the bioprinting robot; the housing substantially surrounding a dispensing tip of the end effector; a power supply interface to receive electrical power from the end effector; a plurality of LEDs positioned annularly around the end effector within the annular modular light ring housing, where the plurality of LEDs are spaced in at least two annular rows, where each of the at least two annular rows are at a unique elevational position within the annular modular light ring housing with respect to a light output plane of the annular modular light ring housing; the LEDs are in electrical communication with the power supply interface; and a controller communicatively coupled with the LEDs and the power supply interface.
    Type: Grant
    Filed: February 3, 2021
    Date of Patent: April 11, 2023
    Assignee: ADVANCED SOLUTIONS LIFE SCIENCES, LLC
    Inventors: Scott Cambron, Andrew Davis Blum
  • Patent number: 11440179
    Abstract: A system for robot teaching based on RGB-D images and a teach pendant, including an RGB-D camera, a host computer, a posture teach pendant, and an AR teaching system which includes an AR registration card, an AR module, a virtual robot model, a path planning unit and a posture teaching unit. The RGB-D camera collects RGB images and depth images of a physical working environment in real time. In the path planning unit, path points of a robot end effector are selected, and the 3D coordinates of the path points in the basic coordinate system of the virtual robot model are calculated; the posture teaching unit records the received posture data as the postures of a path point where the virtual robot model is located, so that the virtual robot model is driven to move according to the postures and positions of the path points, thereby completing the robot teaching.
    Type: Grant
    Filed: February 28, 2020
    Date of Patent: September 13, 2022
    Assignee: QINGDAO UNIVERSITY OF TECHNOLOGY
    Inventors: Chengjun Chen, Yong Pan, Dongnian Li, Jun Hong
  • Patent number: 11399470
    Abstract: An electro-mechanical device cuts plant stems at a desired angle after a stem is inserted into a stem shaft and a sensor triggers activation of a blade along a blade path at a desired angle. A portable device powered by batteries can be used with plantings where they are growing or a powered device can be used for line production or high volume work.
    Type: Grant
    Filed: April 3, 2021
    Date of Patent: August 2, 2022
    Inventor: Robert Kaleck
  • Patent number: 9576832
    Abstract: An article transport vehicle is provided in which the transported article support portions are unlikely to interfere with the supported portion of the transported article, while transmission of vibrations to the transported article is reduced during travel of the travel member or during vertical movement of the vertically movable portion. A support unit which is vertically moved relative to a travel member is provided with a guiding supporting portion for guiding and supporting transported article support portions for supporting a supported portion of a transported article such that the transported article support portions can be moved to support positions and to retracted positions, and a damper element which is located between a vertically movable portion and the guiding supporting portion, receives load from the guiding supporting portion, and is elastically deformable in a vertical direction.
    Type: Grant
    Filed: November 10, 2015
    Date of Patent: February 21, 2017
    Assignee: Daifuku Co., Ltd.
    Inventor: Daichi Tomida
  • Patent number: 9043016
    Abstract: Certain embodiments of the present invention provide robotic control modules for use in a robotic control system of a vehicle, including structures, systems and methods, that can provide (i) a robotic control module that has multiple functional circuits, such as a processor and accompanying circuits, an actuator controller, an actuator amplifier, a packet network switch, and a power supply integrated into a mountable and/or stackable package/housing; (ii) a robotic control module with the noted complement of circuits that is configured to reduce heat, reduce space, shield sensitive components from electro-magnetic noise; (iii) a robotic control system utilizing robotic control modules that include the sufficiently interchangeable functionality allowing for interchangeability of modules; and (iv) a robotic control system that distributes the functionality and processing among a plurality of robotic control modules in a vehicle.
    Type: Grant
    Filed: October 20, 2006
    Date of Patent: May 26, 2015
    Assignees: Deere & Company, iRobot Corporation
    Inventors: Mikhail O. Filippov, Osa Fitch, Scott P. Keller, John O'Connor, David S. Zendzian, Nadim El Fata, Kevin Larsen, Arlen Eugene Meuchel, Mark David Schmaltz, James Allard, Chris A. De Roo, William Robert Norris, Andrew Julian Norby, Christopher David Glenn Turner
  • Patent number: 9039685
    Abstract: A surgical instrument is provided, including: at least one articulatable arm having a distal end, a proximal end, and at least one joint region disposed between the distal and proximal ends; an optical fiber bend sensor provided in the at least one joint region of the at least one articulatable arm; a detection system coupled to the optical fiber bend sensor, said detection system comprising a light source and a light detector for detecting light reflected by or transmitted through the optical fiber bend sensor to determine a position of at least one joint region of the at least one articulatable arm based on the detected light reflected by or transmitted through the optical fiber bend sensor; and a control system comprising a servo controller for effectuating movement of the arm.
    Type: Grant
    Filed: March 16, 2011
    Date of Patent: May 26, 2015
    Assignee: INTUITIVE SURGICAL OPERATIONS, INC.
    Inventors: David Q. Larkin, David C. Shafer
  • Publication number: 20150142171
    Abstract: Disclosed are methods adapted to calibrate a robot gripper to a camera. The method includes providing a robot with a coupled moveable gripper, providing one or more cameras, providing a target scene having one or more fixed target points, moving the gripper and capturing images of the target scene at two or more imaging locations, recording positions in the gripper coordinate system for each of the imaging locations, recording images in a camera coordinate system, and processing the images and positions to determine a gripper-to-camera transformation between the gripper coordinate system and the camera coordinate system. The transformation may be accomplished by nonlinear least-squares minimization, such as the Levenberg-Marquardt method. Robot calibration apparatus for carrying out the method are disclosed, as are other aspects.
    Type: Application
    Filed: August 10, 2012
    Publication date: May 21, 2015
    Applicant: SIEMENS HEALTHCARE DIAGNOSTICS INC.
    Inventors: Gang Li, Yakup Genc, Siddharth Chhatpar, Daniel Sacco, Sandeep Naik, Alexander Gelbman, Roy Barr
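The abstract names nonlinear least-squares minimization, such as Levenberg-Marquardt, for recovering the gripper-to-camera transformation. As a hedged sketch of that idea, reduced to a planar rigid transform so it stays short: all names and the synthetic data are illustrative, and `scipy.optimize.least_squares` merely stands in for whatever solver an implementation would use.

```python
# Sketch: fit a planar rigid transform (rotation angle, translation)
# mapping gripper-frame points to camera-frame points, using the
# Levenberg-Marquardt method the abstract mentions.
import numpy as np
from scipy.optimize import least_squares

def residuals(params, gripper_pts, camera_pts):
    """Residuals between transformed gripper points and camera points."""
    theta, tx, ty = params
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return ((gripper_pts @ R.T + np.array([tx, ty])) - camera_pts).ravel()

# Synthetic correspondences: true transform is a 30 deg rotation plus (1, -2)
rng = np.random.default_rng(0)
gripper_pts = rng.uniform(-1.0, 1.0, size=(10, 2))
th = np.deg2rad(30.0)
R_true = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
camera_pts = gripper_pts @ R_true.T + np.array([1.0, -2.0])

# method="lm" selects Levenberg-Marquardt
fit = least_squares(residuals, x0=[0.0, 0.0, 0.0],
                    args=(gripper_pts, camera_pts), method="lm")
```

A full 3D hand-eye calibration would instead parameterize a rotation (e.g. as a quaternion or axis-angle) plus a 3D translation, but the minimization structure is the same.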
  • Publication number: 20150135459
    Abstract: Provided is a blade maintenance device for a wind turbine. The blade maintenance device for the wind turbine includes: a body that travels along a leading edge of a blade; support units that extend from the body to both sides of the blade and support the sides of the blade; and a maintenance unit installed at at least one of the body and the support units so as to perform maintenance of an outer side of the blade.
    Type: Application
    Filed: August 22, 2012
    Publication date: May 21, 2015
    Applicant: SAMSUNG HEAVY IND. CO., LTD.
    Inventors: Byung Kyu Lee, Hong Gyeoum Kim, Jong Hwan Lee, Young Seok Cho, Young Youl Ha, In Chul Ha, Dong Ki Han
  • Publication number: 20150142252
    Abstract: A robotic device includes a housing configured to house a mobile device. The robotic device also includes an articulating image director aligned with a field of view of a camera of the mobile device. The housing of the robotic device is positioned at an angle to provide a forward view or rear facing view to the camera via the articulating image director.
    Type: Application
    Filed: July 31, 2014
    Publication date: May 21, 2015
    Inventor: Donald Bolden HUTSON
  • Patent number: 9037336
    Abstract: A robot system includes a planar sign, a robot, a distance direction sensor, and a controller. The controller is configured to control the robot and includes a map data memory and a progress direction determining device. The map data memory is configured to store map data of a predetermined running path including a position of the planar sign. The progress direction determining device is configured to compare a detection result of the distance direction sensor and the stored map data so as to determine a progress direction of the robot.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: May 19, 2015
    Assignee: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Dai Kouno, Tamio Nakamura
  • Patent number: 9037296
    Abstract: Disclosed are a robot cleaner, and a system and method for remotely controlling the robot cleaner. Since the robot cleaner is connected to a terminal through a network, the robot cleaner can be controlled in more various manners. A situation inside a house can be checked in real time from the outside, and the situation can be rapidly handled according to a state of the robot cleaner. Since the terminal and the robot cleaner are connected to each other through a network, a voice call function and an interphone function can be implemented. Furthermore, the robot cleaner can perform automatic cleaning, manual cleaning, and reservation cleaning in an autonomous manner, or in a remote-controlled manner.
    Type: Grant
    Filed: September 6, 2012
    Date of Patent: May 19, 2015
    Assignee: LG Electronics Inc.
    Inventors: Suuk Choe, Sunghun Lee, Junho Jang, Hyunwoong Park, Seungmin Baek, Yiebin Kim
  • Patent number: 9032811
    Abstract: A robot apparatus includes an arm that includes an outer skin and a detector that detects the deformation of the outer skin. The detector includes a sending unit that sends a signal, a receiving unit that receives the signal, and a transmission route that is provided along the outer skin so as to lead the signal. The detector detects the deformation of the outer skin based on whether a signal reaches the receiving unit.
    Type: Grant
    Filed: January 21, 2014
    Date of Patent: May 19, 2015
    Assignee: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Nobukazu Miyauchi, Tamio Nakamura, Zenta Nakamoto, Dai Kouno
  • Publication number: 20150134115
    Abstract: A method of operating a robot includes receiving image data from an image capture device of the robot. The image data is representative of a glyph viewed by the image capture device on the display of a computing device within a field of view of the image capture device. The method further includes determining, at a controller, a command message based on the glyph represented in the image data and issuing a command to at least one resource or component of the robot based on the command message.
    Type: Application
    Filed: November 12, 2013
    Publication date: May 14, 2015
    Applicant: iRobot Corporation
    Inventors: Sherman Gong, Rogelio Neumann
  • Publication number: 20150131896
    Abstract: A safety monitoring system for human-machine symbiosis is provided, including a spatial image capturing unit, an image recognition unit, a human-robot-interaction safety monitoring unit, and a process monitoring unit. The spatial image capturing unit, disposed in a working area, acquires at least two skeleton images. The image recognition unit generates at least two spatial gesture images corresponding to the at least two skeleton images, based on information of changes in position of the at least two skeleton images with respect to time. The human-robot-interaction safety monitoring unit generates a gesture distribution based on the at least two spatial gesture images and a safety distance. The process monitoring unit determines whether the gesture distribution meets a safety criterion.
    Type: Application
    Filed: March 25, 2014
    Publication date: May 14, 2015
    Applicant: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE
    Inventors: Jhen-Jia Hu, Hau-Wei Wang, Chung-Ning Huang
  • Publication number: 20150134079
    Abstract: Provided are a walk-assistive robot and a method of controlling the same. The method of controlling the walk-assistive robot includes: obtaining ground information regarding the ground in a walking direction; determining control patterns of the walk-assistive robot by analyzing the obtained ground information; and controlling the walk-assistive robot based on the determined control patterns.
    Type: Application
    Filed: August 28, 2014
    Publication date: May 14, 2015
    Inventors: Suk June YOON, Kyung Shik ROH, Young Bo SHIM, Young Do KWON, Sung Hwan AHN, Hyo Seok HWANG
  • Patent number: 9031697
    Abstract: The present teachings provide a method of controlling a remote vehicle having an end effector and an image sensing device. The method includes obtaining an image of an object with the image sensing device, determining a ray from a focal point of the image to the object based on the obtained image, positioning the end effector of the remote vehicle to align with the determined ray, and moving the end effector along the determined ray to approach the object.
    Type: Grant
    Filed: April 16, 2012
    Date of Patent: May 12, 2015
    Assignee: iRobot Corporation
    Inventors: Wesley Hanan Huang, Emilie A. Phillips
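The "determine a ray from a focal point of the image to the object" step can be illustrated with standard pinhole-camera back-projection. The intrinsics and names below are assumptions for the sketch, not values from the patent.

```python
# Sketch: back-project a pixel through a pinhole model to get a unit ray
# from the focal point, then step a point along that ray toward the object.
import numpy as np

def pixel_to_ray(u, v, fx, fy, cx, cy):
    """Unit direction of the viewing ray through pixel (u, v)."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

def point_along_ray(origin, direction, distance):
    """Position reached by moving `distance` along the ray."""
    return origin + distance * direction

# A pixel at the principal point looks straight down the optical axis
ray = pixel_to_ray(320.0, 240.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
tip = point_along_ray(np.zeros(3), ray, 0.5)
```

Aligning the end effector with `ray` and incrementing `distance` reproduces the approach-along-the-ray motion the abstract describes.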
  • Publication number: 20150122183
    Abstract: A method and system to compensate for stray light errors in time of flight (TOF) camera systems uses reference targets in the field of view (FOV) that can be used to measure stray light. In different embodiments, one or more reference targets are used.
    Type: Application
    Filed: January 9, 2015
    Publication date: May 7, 2015
    Inventor: Thierry Oggier
  • Publication number: 20150124242
    Abstract: A scanning optical range finder in a mobile robot includes an optical emitter circuit, a non-imaging optical element, an optical detector circuit, and a ranging circuit. The non-imaging optical element is arranged to receive optical signals at an entrance aperture thereof responsive to operation of the optical emitter circuit, and to direct the optical signals to an output aperture thereof. The optical detector circuit is configured to receive the optical signals from the output aperture of the non-imaging optical element, and to generate detection signals based on respective phase differences of the optical signals relative to corresponding outputs of the optical emitter circuit. The ranging circuit is configured to calculate a range of a target from the phase differences indicated by the detection signals. Related devices and methods of operation are also discussed.
    Type: Application
    Filed: October 31, 2014
    Publication date: May 7, 2015
    Inventors: Travis Pierce, Jamie Milliken, Marc Wilga
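The ranging circuit's phase-to-range step follows the standard continuous-wave relation: for a signal modulated at frequency f, a measured phase shift φ implies a range d = c·φ / (4π·f). A minimal sketch, where the modulation frequency is an assumed example rather than a value from the patent:

```python
# Sketch: convert a measured phase difference to range for a
# phase-based optical rangefinder, as described in the abstract.
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_range(phi_rad, f_mod_hz):
    """Range implied by phase shift phi_rad at modulation frequency f_mod_hz."""
    return C * phi_rad / (4.0 * math.pi * f_mod_hz)

# A pi/2 phase shift at 10 MHz modulation
r = phase_to_range(math.pi / 2, 10e6)
```

Note the 4π in the denominator accounts for the round trip: the light travels to the target and back, so the one-way range is half the round-trip distance.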
  • Publication number: 20150127160
    Abstract: A robot includes a holding unit configured to hold an object, an image pickup unit, and a predetermined first portion of the robot. The image pickup unit picks up images of the holding unit and the object using the first portion as a background.
    Type: Application
    Filed: November 4, 2014
    Publication date: May 7, 2015
    Inventor: Yukihiro YAMAGUCHI
  • Publication number: 20150125035
    Abstract: To perform robust position and orientation measurement even in a situation where noise exist, an image including a target object is obtained, an approximate position and orientation of the target object included in the obtained image are obtained, information related to a shadow region of the target object in the obtained image is estimated, the approximate position and orientation are corrected on the basis of the estimated information related to the shadow region, and a position and orientation of the target object in the image are derived on the basis of the corrected approximate position and orientation and held model information.
    Type: Application
    Filed: November 3, 2014
    Publication date: May 7, 2015
    Inventors: Sonoko Miyatani, Kazuhiko Kobayashi
  • Publication number: 20150127156
    Abstract: A remote controlled robot with a head that supports a monitor and is coupled to a mobile platform. The mobile robot also includes an auxiliary camera coupled to the mobile platform by a boom. The mobile robot is controlled by a remote control station. By way of example, the robot can be remotely moved about an operating room. The auxiliary camera extends from the boom so that it provides a relatively close view of a patient or other item in the room. An assistant in the operating room may move the boom and the camera. The boom may be connected to a robot head that can be remotely moved by the remote control station.
    Type: Application
    Filed: January 12, 2015
    Publication date: May 7, 2015
    Inventors: Yulun Wang, Charles S. Jordan, Kevin Hanrahan, Daniel Sanchez, Marco Pinter
  • Publication number: 20150114236
    Abstract: A robotic cooking apparatus that can cook dishes using a computer recipe provided by a user and ingredients stored in it, and that is able to adjust the recipe based on user preference data. The cooking apparatus comprises an ingredient input section, cooking stations, and pre-processing and post-processing stations. The input section has trays and can be kept at room temperature or refrigerated for storing perishable ingredients. A robot head assembly is able to transfer, identify, and measure ingredients, as well as take inventory periodically. The cooking apparatus is connected to a network and can be operated remotely by a fixed or handheld device, and its operation can be monitored remotely.
    Type: Application
    Filed: June 2, 2011
    Publication date: April 30, 2015
    Inventors: Shambhu Nath Roy, Riya Bhattacharya
  • Publication number: 20150120055
    Abstract: An image acquisition unit acquires an image including an object, and a controller starts a visual servo using the acquired image, on the basis of at least one of an error in calibration, an error in installation of a robot, an error resulting from the rigidity of the robot, an error of a position where the robot has gripped the object, an error regarding imaging, and an error regarding a work environment. Additionally, the controller starts the visual servo when the distance between one point of a working unit of the robot and the object is equal to or greater than 2 mm.
    Type: Application
    Filed: October 29, 2014
    Publication date: April 30, 2015
    Inventors: Hiroshi MIYAZAWA, Yukihiro YAMAGUCHI, Nobuhiro KARITO
  • Publication number: 20150120127
    Abstract: A mobile unit according to an embodiment includes a main body, a moving mechanism, a sensor, a recognizer, a first movement adjuster, a second landmark recognizer, and a second movement adjuster. The moving mechanism moves the main body. The sensor detects a distance and a direction to an object around the main body. The recognizer recognizes a landmark based on a detection result of the sensor. The first movement adjuster controls the moving mechanism such that the main body is moved to a target position based on the landmark. If the distance to the landmark has become smaller than a first threshold, the second landmark recognizer recognizes a second landmark. The second movement adjuster controls the moving mechanism such that the main body is moved to the target position based on the second landmark.
    Type: Application
    Filed: October 29, 2014
    Publication date: April 30, 2015
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Taku SHIKINA, Tamio NAKAMURA, Dai KOUNO, Takashi NISHIMURA
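The two-stage behavior described here (track the first landmark until the distance drops below a threshold, then hand off to a second landmark) reduces to a simple selection rule inside the control loop. A minimal sketch with illustrative names and an assumed threshold:

```python
# Sketch of the landmark hand-off: once the mobile unit is closer to the
# first landmark than a threshold, the second movement adjuster takes over
# using the second landmark. Names and threshold are illustrative.
def choose_landmark(dist_to_first, first_landmark, second_landmark,
                    first_threshold=1.0):
    """Pick which landmark the movement adjuster should currently track."""
    if dist_to_first < first_threshold:
        return second_landmark  # close enough: switch to the finer landmark
    return first_landmark

current = choose_landmark(0.5, "dock_sign", "dock_marker")
```

In practice the second landmark would typically be a smaller, more precisely localizable feature, so the switch trades coarse long-range guidance for fine short-range guidance.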
  • Publication number: 20150120057
    Abstract: A mobile robot including a robot body, a drive system supporting the robot body, and a controller in communication with the drive system. The robot also includes an actuator moving a portion of the robot body through a volume of space adjacent the mobile robot and a sensor pod in communication with the controller. The sensor pod includes a collar rotatably supported and having a curved wall formed at least partially as a surface of revolution about a vertical axis. The sensor pod also includes a volumetric point cloud sensor housed by the collar and observing the volume of space adjacent the robot from within the collar along an observation axis extending through the curved wall. A collar actuator rotates the collar and the volumetric point cloud sensor together about the collar axis.
    Type: Application
    Filed: December 30, 2014
    Publication date: April 30, 2015
    Applicant: iRobot Corporation
    Inventors: Cheuk Wah Wong, Eben Rauhut, Brian C. Benson, JR., Peter J. Lydon, Michael T. Rosenstein, Michael Halloran, Steven V. Shamlian, Chikyung Won, Mark Chiappetta, Justin H. Kearns, Orjeta Taka, Robert Todd Pack, Timothy S. Farlow, Jasper Fourways Vicenti
  • Publication number: 20150120047
    Abstract: A control device includes a reception unit that receives first operation information and second operation information different from the first operation information; and a process unit that instructs a robot to execute operations based on the first operation information and the second operation information using a plurality of captured images of an imaged target object, the images being captured multiple times while the robot moves from a first posture to a second posture different from the first posture.
    Type: Application
    Filed: October 22, 2014
    Publication date: April 30, 2015
    Inventors: Masaki MOTOYOSHI, Kenji ONDA, Hiroyuki KAWADA, Mitsuhiro INAZUMI
  • Publication number: 20150107915
    Abstract: A low profile stepper robot is described and claimed herein. The robot includes a plurality of foot assemblies. Each foot assembly includes a suction cup, a vacuum generator, and a valve, with the vacuum generator being operationally connected to the suction cup. A conduit connects a source of operational fluid flow to the vacuum generators, and the valves allow or prevent fluid flow to the vacuum generators. Actuators are positioned between the foot assemblies and the robot base. The actuators provide for linear and rotational displacement of the foot assemblies, allowing the robot to walk and turn along an inspection surface.
    Type: Application
    Filed: October 21, 2014
    Publication date: April 23, 2015
    Inventors: S. William Glass, III, Bradley A. Thigpen, Robert A. Furter
  • Publication number: 20150112482
    Abstract: A teaching system includes an image generating unit, a projecting unit, a work line generating unit, an arithmetic unit, and a job generating unit. The image generating unit generates a virtual image including a robot and a workpiece having a processed surface to be processed by the robot. The projecting unit generates a projection plane orthogonal to a normal direction of a desired point on the processed surface selected on the virtual image and projects the processed surface onto the projection plane. The work line generating unit generates a work line for the robot based on setting contents received via the projection plane. The arithmetic unit calculates a teaching value including the position and the posture of the robot at each of the target points. The job generating unit generates a job program for operating the robot in an actual configuration based on the teaching value.
    Type: Application
    Filed: September 12, 2014
    Publication date: April 23, 2015
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventor: Koichi KUWAHARA
  • Patent number: 9010465
    Abstract: Devices, systems, and methods for inspecting and objectively analyzing the condition of a roof are presented. A vehicle adapted for traversing and inspecting an irregular terrain includes a chassis having a bottom surface that defines a higher ground clearance at an intermediate location, thereby keeping the center of mass low when crossing roof peaks. In another embodiment, the drive tracks include partially collapsible treads made of resilient foam. A system for inspecting a roof includes a lift system and a remote computer for analyzing data. Vehicles and systems may gather and analyze data, and generate revenue by providing data, analysis, and reports for a fee to interested parties.
    Type: Grant
    Filed: June 13, 2014
    Date of Patent: April 21, 2015
    Assignee: Tobor Technology, LLC
    Inventors: Michael D. Slawinski, Dennis L. Guthrie
  • Patent number: 9014848
    Abstract: A robot system includes a mobile robot having a controller executing a control system for controlling operation of the robot, a cloud computing service in communication with the controller of the robot, and a remote computing device in communication with the cloud computing service. The remote computing device communicates with the robot through the cloud computing service.
    Type: Grant
    Filed: February 22, 2011
    Date of Patent: April 21, 2015
    Assignee: iRobot Corporation
    Inventors: Timothy S. Farlow, Michael Rosenstein, Michael Halloran, Chikyung Won, Steven V. Shamlian, Mark Chiappetta
  • Publication number: 20150105907
    Abstract: A robot includes a control unit that controls a movable unit of the robot to move an end point of the movable unit closer to a target position, and an image acquisition unit that acquires a target image as an image containing the end point when the end point is in the target position, and a current image as an image containing the end point when the end point is in a current position. The control unit controls movement of the movable unit based on the current image, the target image, and output from a force detection unit that detects a force acting on the movable unit.
    Type: Application
    Filed: October 9, 2014
    Publication date: April 16, 2015
    Inventors: Seiji AISO, Hiroshi HASEGAWA, Mitsuhiro INAZUMI, Nobuhiro KARITO
  • Publication number: 20150105964
    Abstract: Techniques are disclosed that optimize performance of simultaneous localization and mapping (SLAM) processes for mobile devices, typically a mobile robot. In one embodiment, erroneous particles are deliberately introduced into the particle filtering process of localization. Monitoring the weights of the erroneous particles relative to the particles maintained for SLAM verifies that the robot is localized, or detects when it no longer is. In another embodiment, cell-based grid mapping of a mobile robot's environment also monitors cells for changes in their probability of occupancy. Cells with a changing occupancy probability are marked as dynamic, and updating of such cells in the map is suspended or modified until their individual occupancy probabilities have stabilized.
    Type: Application
    Filed: November 17, 2014
    Publication date: April 16, 2015
    Applicant: NEATO ROBOTICS, INC.
    Inventors: Boris SOFMAN, Vladimir ERMAKOV, Mark EMMERICH, Steven ALEXANDER, Nathaniel David MONSON
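The erroneous-particle check described in the abstract above can be illustrated with a minimal sketch (the function name, weight scales, and `margin` parameter are illustrative assumptions, not the patented method): decoy particles placed at deliberately wrong poses are scored by the same sensor model as the real particles, and localization is trusted only while the real particles clearly outscore them.

```python
def is_localized(real_weights, decoy_weights, margin=5.0):
    """Heuristic localization check: compare the average importance
    weight of the filter's real SLAM particles against deliberately
    erroneous (decoy) particles scored by the same sensor model.
    The robot is considered localized only while the real particles
    outscore the decoys by at least `margin`."""
    avg_real = sum(real_weights) / len(real_weights)
    avg_decoy = sum(decoy_weights) / len(decoy_weights)
    return avg_real >= margin * avg_decoy
```

When the decoy weights creep up toward the real ones (for instance after the robot is moved unexpectedly, or the map no longer matches the environment), the check fails and relocalization can be triggered.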
  • Publication number: 20150105908
    Abstract: Systems and methods for providing precise robotic operations without the need for special or task-specific components utilize, in one implementation, a spatial adjustment system that is physically separate from the robotic manipulator, supports the target workpiece, and works in concert with the robotic manipulator to perform tasks with high spatial precision.
    Type: Application
    Filed: October 10, 2014
    Publication date: April 16, 2015
    Inventors: Yuri A. Ivanov, Rodney Brooks
  • Patent number: 9008757
    Abstract: Systems and methods that utilize optical sensors and non-optical sensors to determine the position and/or orientation of objects. A navigation system includes an optical sensor for receiving optical signals from markers and a non-optical sensor, such as a gyroscope, for generating non-optical data. A navigation computer determines positions and/or orientations of objects based on optical and non-optical data.
    Type: Grant
    Filed: September 24, 2013
    Date of Patent: April 14, 2015
    Assignee: Stryker Corporation
    Inventor: Chunwu Wu
  • Publication number: 20150100157
    Abstract: A humanoid robot is provided, the robot being capable of holding a dialog with at least one user, the dialog using two modes of voice recognition, one open and the other closed, the closed mode being defined by a concept characterizing a dialog sequence. The dialog may also be influenced by events that are neither speech nor text. The robot is capable of executing behaviors and generating expressions and emotions. It has the advantage of considerably reducing programming time and the latency of executing dialog sequences, providing fluency and naturalness close to those of human dialogs.
    Type: Application
    Filed: April 3, 2013
    Publication date: April 9, 2015
    Inventors: David Houssin, Gwennael Gate
  • Publication number: 20150097084
    Abstract: A spacecraft system and method includes a platform with a dock and an umbilical payout device. A robot is connected to an umbilical paid out by the umbilical payout device and is repeatedly deployable from the dock. The robot includes one or more imagers, an inertial measurement unit, and a plurality of thrusters. A command module receives image data from the one or more robot imagers and orientation data from the inertial measurement unit. An object recognition module is configured to recognize one or more objects from the received image data. The command module determines the robot's orientation with respect to an object and issues thruster control commands to control movement of the robot based on the robot's orientation. The combination of the space platform and the robot on an umbilical line can be used for towing another object to a different orbital location, for inspection (including self-inspection of the robot-carrying platform), and for robotic servicing.
    Type: Application
    Filed: October 3, 2014
    Publication date: April 9, 2015
    Inventors: James J. Szabo, Vladimir Hruby, Thomas R. Roy, Craig DeLuccia, Jay Andrews, W. Dan Williams, Matthew R. Berlin, Jesse V. Gray
  • Publication number: 20150100461
    Abstract: A mobile robotic system allows multiple users to visit authentic places without physically being there. Users with varying requirements can take part in controlling a single controllable device simultaneously, each steering the robot's movement according to their interest. A system administrator selects and defines criteria for the robot's movement; the mobile robot, carrying video and audio devices, is remotely controlled by a server that selects the robot's movement according to the users' and system administrator's criteria. The server provides information to users; the robot's location influences the content of the information. Such a robotic system may be used for shopping and for visiting museums and other public tourist attractions over the Internet.
    Type: Application
    Filed: October 4, 2013
    Publication date: April 9, 2015
    Inventors: Dan Baryakar, Andreea Baryakar
  • Patent number: 9002515
    Abstract: The present embodiments relate to a monitoring system for a medical device, wherein the medical device comprises a robot and an image recording part which can be moved by the robot. Provision is made for a radiation source which is attached to the medical device, and for a radiation receiver which is situated remotely from the medical device and is for receiving radiation that is emitted from the radiation source. A comparison entity compares the point of impact of radiation on the radiation receiver with one or more predetermined points of impact of radiation on the radiation receiver. The invention further relates to a corresponding method for monitoring a medical device.
    Type: Grant
    Filed: January 22, 2010
    Date of Patent: April 7, 2015
    Assignee: Siemens Aktiengesellschaft
    Inventors: Oliver Hornung, Donal Medlar
  • Patent number: 9002098
    Abstract: Described is a robotic visual perception system for determining a position and pose of a three-dimensional object. The system receives an external input to select an object of interest. The system also receives visual input from a sensor of a robotic controller that senses the object of interest. Rotation-invariant shape features and appearance are extracted from the sensed object of interest and a set of object templates. A match is identified between the sensed object of interest and an object template using shape features. The match between the sensed object of interest and the object template is confirmed using appearance features. The sensed object is then identified, and a three-dimensional pose of the sensed object of interest is determined. Based on the determined three-dimensional pose of the sensed object, the robotic controller is used to grasp and manipulate the sensed object of interest.
    Type: Grant
    Filed: December 19, 2012
    Date of Patent: April 7, 2015
    Assignee: HRL Laboratories, LLC
    Inventors: Suhas E. Chelian, Rashmi N. Sundareswara, Heiko Hoffmann
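The match-then-confirm flow in the abstract above can be sketched generically. In this sketch the similarity functions are placeholders for whatever rotation-invariant shape and appearance descriptors an implementation actually uses, and the thresholds are illustrative assumptions:

```python
def identify_object(sensed, templates, shape_sim, app_sim,
                    shape_thresh=0.8, app_thresh=0.7):
    """Stage 1: shortlist templates whose shape features match the
    sensed object.  Stage 2: confirm the strongest shape matches
    using appearance features.  Returns the matching template, or
    None if no template passes both stages."""
    shortlist = [t for t in templates if shape_sim(sensed, t) >= shape_thresh]
    # Check the strongest shape matches first.
    for t in sorted(shortlist, key=lambda t: shape_sim(sensed, t), reverse=True):
        if app_sim(sensed, t) >= app_thresh:
            return t
    return None
```

Once a template is confirmed, its stored geometry can seed the three-dimensional pose estimate used for grasping.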
  • Publication number: 20150094855
    Abstract: The present invention concerns an imitation learning method for a multi-axis manipulator (7,7′). This method comprises the steps of capturing, at a set of successive waypoints (10,11) in a teach-in trajectory (4) of a user-operated training tool, spatial data comprising position and orientation of the training tool (3) in a Cartesian space; selecting, from among said set of successive waypoints (10,11), a subset of waypoints (11) starting from a first waypoint (11) of said set of successive waypoints (10,11), wherein for each subsequent waypoint (11) to be selected a difference in position and/or orientation with respect to a last previously selected waypoint (11) exceeds a predetermined threshold; fitting a set trajectory (4′) in said Cartesian space to said selected subset of waypoints (11); and converting said set trajectory into motion commands in a joint space of said multi-axis manipulator (7,7′).
    Type: Application
    Filed: May 3, 2013
    Publication date: April 2, 2015
    Inventors: Jérôme Chemouny, Stéphane Clerambault, Samuel Pinault
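The threshold-based waypoint selection step can be sketched as follows. This is a simplified stand-in in which each waypoint is a 3-D position plus a single yaw angle rather than a full Cartesian pose, and the threshold values are illustrative assumptions:

```python
import math

def select_waypoints(waypoints, pos_threshold=0.01, ang_threshold=0.05):
    """Subsample a teach-in trajectory: keep a waypoint only when its
    position or orientation differs from the last *kept* waypoint by
    more than a threshold.  Each waypoint is ((x, y, z), yaw)."""
    if not waypoints:
        return []
    kept = [waypoints[0]]
    for pos, yaw in waypoints[1:]:
        last_pos, last_yaw = kept[-1]
        if (math.dist(pos, last_pos) > pos_threshold
                or abs(yaw - last_yaw) > ang_threshold):
            kept.append((pos, yaw))
    return kept
```

The selected subset would then be fitted with a smooth set trajectory and converted into joint-space motion commands via inverse kinematics.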
  • Publication number: 20150094854
    Abstract: A system including a mobile telepresence robot, a telepresence computing device in wireless communication with the robot, and a host computing device in wireless communication with the robot and the telepresence computing device. The host computing device relays User Datagram Protocol traffic between the robot and the telepresence computing device through a firewall.
    Type: Application
    Filed: October 13, 2014
    Publication date: April 2, 2015
    Applicant: iRobot Corporation
    Inventors: Mathew Cross, Tony L. Campbell
  • Publication number: 20150089880
    Abstract: A method of manufacturing PDC drill bits includes inspecting a plurality of cutters. The method further includes inspecting a plurality of pockets of a bit body. A cutter of the plurality of cutters is assigned to a pocket of the plurality of pockets based on the inspection of the plurality of cutters and the inspection of the plurality of pockets. A robot positions the cutter inside the pocket and applies heat to a brazing material to produce a molten brazing material within the pocket.
    Type: Application
    Filed: September 27, 2013
    Publication date: April 2, 2015
    Applicant: VAREL EUROPE S.A.S.
    Inventors: Alfazazi Dourfaye, Gary M. Thigpen
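The inspection-driven pairing of cutters to pockets amounts to an assignment problem. A greedy sketch follows (the use of measured diameters and a best-fit criterion is an illustrative assumption, not the patented procedure):

```python
def assign_cutters(cutter_diams, pocket_diams):
    """Greedy pairing: give each pocket, in order, the remaining
    cutter whose measured diameter best matches the pocket's,
    minimizing the braze gap.  Returns (pocket_index, cutter_index)
    pairs; an optimal variant would solve a full assignment problem."""
    remaining = dict(enumerate(cutter_diams))
    assignment = []
    for p_idx, p_d in enumerate(pocket_diams):
        c_idx = min(remaining, key=lambda i: abs(remaining[i] - p_d))
        assignment.append((p_idx, c_idx))
        del remaining[c_idx]
    return assignment
```

With the pairing fixed, the robot can place each cutter in its assigned pocket before the brazing step.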
  • Publication number: 20150094856
    Abstract: A robotic control method for a camera (30) having an optical view and a robot (40) having an end-effector (42) and one or more joints (41) for maneuvering the end-effector (42). The robotic control method involves an acquisition of a digital video frame (32) illustrating an image as optically viewed by the camera (30), and an execution of visual servoing for controlling a pose of the end-effector (42) relative to an image feature within the digital video frame (32).
    Type: Application
    Filed: December 18, 2014
    Publication date: April 2, 2015
    Inventors: ALEKSANDRA POPOVIC, PAUL THIENPHRAPA
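Visual servoing of the kind the abstract describes reduces, in its simplest image-based form, to a proportional control law on the image-space error. The following is a generic textbook sketch, not the patented method; a real controller would map the commanded image-plane velocity through the image Jacobian to joint velocities:

```python
def visual_servo_step(feature_px, target_px, gain=0.5):
    """One iteration of a minimal image-based visual servoing loop:
    command an image-plane velocity proportional to the error between
    the tracked feature and its desired pixel location."""
    ex = target_px[0] - feature_px[0]
    ey = target_px[1] - feature_px[1]
    return (gain * ex, gain * ey)

# Iterating drives the tracked feature toward the target in the image.
pos, target = [100.0, 40.0], (160.0, 120.0)
for _ in range(20):
    vx, vy = visual_servo_step(pos, target)
    pos[0] += vx
    pos[1] += vy
```

With a gain below 1 the image error shrinks geometrically each iteration, so the feature converges smoothly to the target pixel.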