Vision Sensor (e.g., Camera, Photocell) Patents (Class 700/259)
  • Patent number: 9918603
    Abstract: A sensor module and a robot cleaner including the sensor module may provide accurate sensing of an obstacle and prevent an erroneous sensing of an obstacle. The robot cleaner may include a body including a cleaning unit to remove foreign substances from a surface of a floor, a cover to cover an upper portion of the body, a sensor module including an obstacle sensor mounted to sense an obstacle, and a sensor window provided at one side of the sensor module. The sensor module may include a light emitting device to emit light through the sensor window, a light receiving reflector on which light reflected from the obstacle is incident, and a light shielding portion interposed between the light emitting device and the light receiving reflector to block the light emitted from the light emitting device from being incident upon the light receiving reflector.
    Type: Grant
    Filed: April 8, 2014
    Date of Patent: March 20, 2018
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Byung Chan Kim, Sang Sik Yoon, Kyong Su Kim, Dong Won Kim, Jea Yun So, Yeon Kyu Jeong
  • Patent number: 9892559
    Abstract: A PC includes: an image data setting section for (i) setting, to background image data indicative of a background image, apparatus image data obtained by capturing an image of an apparatus and serving as referential image data to be referred to for identifying the apparatus, and (ii) setting, to the apparatus image data, dynamic part image data indicative of a dynamic part image positioned on the apparatus image; and an address setting section for (i) associating, with the dynamic part image data, (a) an address for specifying a storage area of a memory in which storage area data to be accessed by a portable terminal device is stored and (b) address substitutive information to be substituted for the address, and (ii) generating address display data to be used to display the address substitutive information instead of the address.
    Type: Grant
    Filed: April 2, 2015
    Date of Patent: February 13, 2018
    Assignee: Digital Electronics Corporation
    Inventors: Minoru Yoshida, Toru Terada
  • Patent number: 9868072
    Abstract: An interactive play device, method, and apparatus are disclosed that include means for generating a plurality of interactions, input control mechanisms, means for storing responses to interactions, and control means to select the next interaction based on memorized responses. This invention provides a new class of interactive play devices, founded on personalizing a play device so that its current functionality is based on how the player has interacted with it in prior playing sessions. The invention also discloses a doll device and a car device, which operate in a plurality of states that mimic human behavior. Further, the specification describes a game in which the player is challenged to transform a play device from an initial state to a desired state by providing appropriate responses to interactions initiated by the device.
    Type: Grant
    Filed: December 7, 2005
    Date of Patent: January 16, 2018
    Assignee: Interactive Play Devices LLC
    Inventor: Nabil N. Ghaly
  • Patent number: 9841271
    Abstract: A three-dimensional measurement apparatus selects points corresponding to geometric features of a three-dimensional shape model of a target object, projects a plurality of selected points corresponding to the geometric features onto a range image based on approximate values indicating the position and orientation of the target object and imaging parameters at the time of imaging of the range image, searches regions of predetermined ranges respectively from the plurality of projected points for geometric features on the range image which correspond to the geometric features of the three-dimensional shape model, and associates these geometric features with each other. The apparatus then calculates the position and orientation of the target object using differences of distances on a three-dimensional space between the geometric features of the three-dimensional shape model and those on the range image, which are associated with each other.
    Type: Grant
    Filed: February 18, 2011
    Date of Patent: December 12, 2017
    Assignee: Canon Kabushiki Kaisha
    Inventors: Yusuke Nakazato, Kazuhiko Kobayashi
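    As an informal illustration of the association step described in the abstract above (projecting model features into the range image with approximate pose values and searching nearby regions for corresponding range-image features), the following sketch pairs projected model points with nearby valid depth pixels. The pinhole projection, the square search window, and all names are assumptions for illustration only, not the patented method.
```python
import numpy as np

def associate_model_points(model_pts, R, t, K, range_img, window=5):
    """Project 3D model feature points into a range image using an approximate
    pose (R, t) and intrinsics K, then pair each projection with the nearest
    valid depth pixel inside a small search window.

    range_img[v, u] holds depth in the camera frame (0 where invalid).
    Returns a list of (model_point, measured_3d_point) pairs.
    """
    cam_pts = (R @ model_pts.T).T + t            # model points in camera frame
    uv = (K @ cam_pts.T).T
    uv = uv[:, :2] / uv[:, 2:3]                  # perspective division -> pixels

    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    h, w = range_img.shape
    pairs = []
    for p_model, (u, v) in zip(model_pts, uv):
        u0, v0 = int(round(u)), int(round(v))
        best, best_d2 = None, None
        for dv in range(-window, window + 1):    # search a (2*window+1)^2 region
            for du in range(-window, window + 1):
                uu, vv = u0 + du, v0 + dv
                if 0 <= uu < w and 0 <= vv < h and range_img[vv, uu] > 0:
                    d2 = du * du + dv * dv
                    if best_d2 is None or d2 < best_d2:
                        best_d2, best = d2, (uu, vv)
        if best is not None:
            uu, vv = best
            z = range_img[vv, uu]                # back-project depth pixel to 3D
            pt3d = np.array([(uu - cx) * z / fx, (vv - cy) * z / fy, z])
            pairs.append((p_model, pt3d))        # 3D-3D correspondence for pose fit
    return pairs
```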
  • Patent number: 9817395
    Abstract: This disclosure describes, according to some embodiments, a robot network comprising a plurality of robot units capable of providing navigational guidance to patrons requesting navigational assistance. In an example method, upon receiving a request from a first patron, the method selects an available robot unit from among a plurality of robot units comprising the robot network based on an estimated time to reach the first patron's location. The estimated time to reach the first patron's location is determined based on status updates of other of the robot units comprising the robot network. The method further assigns the available robot unit to the first patron, navigates the assigned robot unit to the first patron's location, engages the first patron using the assigned robot unit, and guides the first patron toward the requested destination using the assigned robot unit.
    Type: Grant
    Filed: March 31, 2016
    Date of Patent: November 14, 2017
    Assignee: TOYOTA JIDOSHA KABUSHIKI KAISHA
    Inventors: Emrah Akin Sisbot, Halit Bener Suay
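    The dispatch step in the abstract above (selecting an available robot unit based on its estimated time to reach the patron, derived from status updates of the other units) can be pictured with a small sketch. The data model and names below are assumptions, not the patented implementation.
```python
from dataclasses import dataclass

@dataclass
class RobotStatus:
    robot_id: str
    available: bool
    eta_seconds: float   # estimated time to reach the patron's location

def select_robot(status_updates: list[RobotStatus]) -> str | None:
    """Return the id of the available robot with the smallest ETA, or None."""
    candidates = [s for s in status_updates if s.available]
    if not candidates:
        return None
    return min(candidates, key=lambda s: s.eta_seconds).robot_id

# Example: unit "r2" reports the shortest estimated arrival time, so it is assigned.
print(select_robot([RobotStatus("r1", True, 42.0),
                    RobotStatus("r2", True, 18.5),
                    RobotStatus("r3", False, 5.0)]))   # -> "r2"
```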
  • Patent number: 9789612
    Abstract: A method of operating a robot includes electronically receiving images and augmenting the images by overlaying a representation of the robot on the images. The robot representation includes user-selectable portions. The method includes electronically displaying the augmented images and receiving an indication of a selection of at least one user-selectable portion of the robot representation. The method also includes electronically displaying an intent to command the selected at least one user-selectable portion of the robot representation, receiving an input representative of a user interaction with at least one user-selectable portion, and issuing a command to the robot based on the user interaction.
    Type: Grant
    Filed: March 13, 2017
    Date of Patent: October 17, 2017
    Assignee: IROBOT DEFENSE HOLDINGS, INC.
    Inventors: Orin P. F. Hoffman, Peter Keefe, Eric Smith, John Wang, Andrew Labrecque, Brett Ponsler, Susan Macchia, Brian J. Madge
  • Patent number: 9764468
    Abstract: Apparatus and methods for training and operating robotic devices. A robotic controller may comprise a predictor apparatus configured to generate motor control output. The predictor may be operable in accordance with a learning process based on a teaching signal comprising the control output. An adaptive controller block may provide control output that may be combined with the predicted control output. The predictor learning process may be configured to learn the combined control signal. Predictor training may comprise a plurality of trials. During an initial trial, the control output may be capable of causing a robot to perform a task. During intermediate trials, individual contributions from the controller block and the predictor may be inadequate for the task. Upon learning, the control knowledge may be transferred to the predictor so as to enable task execution in the absence of subsequent inputs from the controller. Control output and/or predictor output may comprise multi-channel signals.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: September 19, 2017
    Assignee: Brain Corporation
    Inventors: Eugene Izhikevich, Oleg Sinyavskiy, Jean-Baptiste Passot
  • Patent number: 9764484
    Abstract: A robot system includes a robot installed on a floor surface, an irradiating unit irradiating visible light onto the floor surface, and a forcible stopping unit forcibly stopping the robot from moving when an abnormality occurs. The irradiating unit is controlled by a control unit such that visible light is radiated onto motion and latent areas on the floor surface. The motion area is the area occupied on the floor surface by the space within which the robot is allowed to move during execution of a task. The latent area is the area occupied on the floor surface by the space within which the robot is likely to move until it is forcibly stopped by the forcible stopping unit. The control unit sets the latent area depending on whether or not an execution acceleration of the robot is higher than or equal to a predetermined acceleration.
    Type: Grant
    Filed: July 1, 2016
    Date of Patent: September 19, 2017
    Assignee: DENSO WAVE INCORPORATED
    Inventor: Syu Katayama
  • Patent number: 9740191
    Abstract: Systems and methods for calibrating the location of an end effector-carrying apparatus relative to successive workpieces before the start of a production manufacturing operation. The location calibration is performed using a positioning system. These disclosed methodologies allow an operator to program (or teach) the robot motion path once and reuse that path for subsequent structures by using relative location feedback from a measurement system to adjust the position and orientation offset of the robot relative to the workpiece. When each subsequent workpiece comes into the robotic workcell, its location (i.e., position and orientation) relative to the robot may be different than the first workpiece that was used when developing the initial program. The disclosed systems and methods can also be used to compensate for structural differences between workpieces intended to have identical structures.
    Type: Grant
    Filed: February 12, 2015
    Date of Patent: August 22, 2017
    Assignee: The Boeing Company
    Inventors: James J. Troy, Barry A. Fetzer, Scott W. Lea, Gary E. Georgeson
  • Patent number: 9741108
    Abstract: An image processing apparatus, connected to an imaging part to capture an image of workpieces conveyed on a conveyer, includes an interface that receives a signal indicating a travel distance of the conveyer, an interface that communicates with a control device for controlling a moving machine disposed downstream of an imaging area of the imaging part, a positional information acquisition unit that processes the image captured by the imaging part and thereby acquires positional information of a pre-registered workpiece in the image, a travel distance obtaining unit that obtains the travel distance of the conveyer synchronized with the control device, an initiating unit that initiates the capturing by the imaging part in response to an imaging command, and a transmission unit that transmits, to the control device, the positional information and the travel distance upon the capturing of the image used to acquire the positional information.
    Type: Grant
    Filed: August 2, 2013
    Date of Patent: August 22, 2017
    Assignee: OMRON Corporation
    Inventors: Yasuyuki Ikeda, Yuichi Doi, Naoya Nakashita
  • Patent number: 9678654
    Abstract: A wearable computing device includes a head-mounted display (HMD) that provides a field of view in which at least a portion of the environment of the wearable computing device is viewable. The HMD is operable to display images superimposed over the field of view. When the wearable computing device determines that a target device is within its environment, the wearable computing device obtains target device information related to the target device. The target device information may include information that defines a virtual control interface for controlling the target device and an identification of a defined area of the target device on which the virtual control image is to be provided. The wearable computing device controls the HMD to display the virtual control image as an image superimposed over the defined area of the target device in the field of view.
    Type: Grant
    Filed: December 19, 2014
    Date of Patent: June 13, 2017
    Assignee: Google Inc.
    Inventors: Adrian Wong, Xiaoyu Miao
  • Patent number: 9659363
    Abstract: A positioning apparatus includes: a calculation unit that calculates an amount of deviation between a position of a feature point of a reference workpiece and a feature point of a workpiece by comparing a relative position of an imaging unit with respect to a table when the workpiece is imaged by the imaging unit with a reference relative position, and comparing a position of a feature point of the workpiece in the image of the workpiece imaged by the imaging unit with a reference point image position; and a program changing unit that generates a correction amount such that the amount of deviation calculated by the calculation unit becomes zero, and thereby changes a program of the machine tool.
    Type: Grant
    Filed: February 8, 2016
    Date of Patent: May 23, 2017
    Assignee: FANUC CORPORATION
    Inventor: Kenichi Ogawa
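    A rough, hypothetical sketch of the correction idea in the abstract above: compare the current relative imaging position and the feature-point image position against their reference values, and emit a correction that drives the deviation to zero. The simple additive model, the pixel-to-millimeter scale, and all names are assumptions, not the patented calculation.
```python
import numpy as np

def compute_correction(rel_pos, ref_rel_pos, feat_px, ref_feat_px, px_to_mm):
    """Return a correction (machine coordinates, mm) so that the deviation
    between the workpiece feature and the reference feature becomes zero.

    rel_pos / ref_rel_pos : imaging unit position relative to the table (mm)
    feat_px / ref_feat_px : feature-point position in the image (pixels)
    px_to_mm              : image scale (mm per pixel)
    """
    deviation = (np.asarray(rel_pos, float) - np.asarray(ref_rel_pos, float)) \
              + (np.asarray(feat_px, float) - np.asarray(ref_feat_px, float)) * px_to_mm
    return -deviation   # shifting the program by -deviation cancels the error

print(compute_correction([10.0, 5.0], [10.2, 5.1], [322, 240], [320, 244], 0.05))
```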
  • Patent number: 9616573
    Abstract: An end effector can be moved to a target position and calibration loads can be reduced, even if there is an error in a kinematic operation in a robot main body and/or cameras. A positional deviation integral torque, obtained by integrating a value corresponding to a positional deviation, is applied to joints while being superimposed with torques based on angular deviations. If movements of the joints stop or are about to stop before the end effector reaches a target position due to an error in a kinematic operation, the positional deviation integral torque increases with time, moving the joints and moving the end effector to the target position. Thus, the end effector can be reliably moved to the target position by the positional deviation integral torque, and calibration loads can be reduced.
    Type: Grant
    Filed: May 23, 2013
    Date of Patent: April 11, 2017
    Assignee: THE RITSUMEIKAN TRUST
    Inventors: Sadao Kawamura, Hiroyuki Onishi
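    The control idea in the abstract above, superimposing a torque based on an integrated positional deviation onto torques based on angular deviations so that the arm keeps moving even when kinematic errors would make it stall short of the target, might be sketched as follows. The gains, the Jacobian-transpose mapping, and all names are assumptions, not the patented controller.
```python
import numpy as np

class DeviationIntegralController:
    def __init__(self, kp_joint, ki_pos, dt):
        self.kp_joint = kp_joint    # gain on joint-angle deviation
        self.ki_pos = ki_pos        # gain on the integrated positional deviation
        self.dt = dt
        self.integral = None        # integrated end-effector positional deviation

    def torque(self, q_ref, q, jacobian, x_target, x_end):
        """Joint torques = angular-deviation torques superimposed with a
        positional-deviation integral torque mapped through the Jacobian."""
        x_err = np.asarray(x_target, float) - np.asarray(x_end, float)
        if self.integral is None:
            self.integral = np.zeros_like(x_err)
        self.integral += x_err * self.dt        # keeps growing while the error remains
        tau_angle = self.kp_joint * (np.asarray(q_ref, float) - np.asarray(q, float))
        tau_integral = self.ki_pos * (np.asarray(jacobian, float).T @ self.integral)
        return tau_angle + tau_integral         # superimposed torques applied to joints
```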
  • Patent number: 9604363
    Abstract: A pickup device for picking up a target object from a plurality of objects randomly piled up in a container, and for placing the target object in a predetermined posture to a target location is provided. The device includes an approximate position obtaining part for obtaining information on an approximate position of the target object, based on information on a height distribution of the objects in the container, which is obtained by a first visual sensor. The device also includes a placement operation controlling part for controlling a robot so as to bring the target object into a predetermined position and posture relative to the target location, based on information on a position and posture of the target object relative to a robot, which is obtained by a second visual sensor.
    Type: Grant
    Filed: October 30, 2013
    Date of Patent: March 28, 2017
    Assignee: Fanuc Corporation
    Inventor: Kazunori Ban
  • Patent number: 9604365
    Abstract: An article transferring device with a robot. An image processing section includes an article detecting section for executing image capturing and detection of articles that move according to a conveying motion of a conveyor, with a first period allowing all of the articles to be captured and detected, and obtaining initial position information of each of the articles; and an article tracking section for executing image capturing and detection of the articles that move according to the conveying motion of the conveyor, with a second period shorter than the first period, and obtaining shifted position information of each article iteratively with the second period, the shifted position information being based on the initial position information. A robot controlling section is configured to control the robot by using the shifted position information, so as to make the robot hold and transfer each article while following the conveying motion of the conveyor.
    Type: Grant
    Filed: December 1, 2015
    Date of Patent: March 28, 2017
    Assignee: FANUC CORPORATION
    Inventors: Ichiro Kanno, Kentarou Koga
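    The two-period scheme in the abstract above can be illustrated with a toy sketch: a slower detection cycle registers initial positions of all articles, while a faster tracking cycle shifts each known article by the measured conveyor travel. The 1-D conveyor model and all names are assumptions.
```python
import itertools

class ArticleTracker:
    def __init__(self):
        self.articles = {}          # article id -> current position along the conveyor
        self._ids = itertools.count()

    def detection_cycle(self, detected_positions):
        """Slow cycle (first period): register initial positions of all articles."""
        for pos in detected_positions:
            self.articles[next(self._ids)] = pos

    def tracking_cycle(self, conveyor_travel):
        """Fast cycle (second period): shift every known article by the measured
        conveyor travel since the last cycle and return the updated positions."""
        for aid in self.articles:
            self.articles[aid] += conveyor_travel
        return dict(self.articles)

tracker = ArticleTracker()
tracker.detection_cycle([0.10, 0.25])     # first (long) period: initial positions
print(tracker.tracking_cycle(0.02))       # second (short) period: shifted positions
```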
  • Patent number: 9586319
    Abstract: A robot-position detecting device includes: a position-data acquiring unit that acquires position data indicating actual positions of a robot; a position-data input unit that receives the position data output from the position-data acquiring unit; and a position calculating unit that calculates a computational position of the robot through linear interpolation using first and second position data input to the position-data input unit at different times.
    Type: Grant
    Filed: May 23, 2014
    Date of Patent: March 7, 2017
    Assignee: Seiko Epson Corporation
    Inventor: Atsushi Asada
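    The linear interpolation described in the abstract above is simple enough to sketch directly: given two position samples taken at different times, estimate the position at an intermediate time. Names are illustrative assumptions.
```python
def interpolate_position(t1, p1, t2, p2, t):
    """Linearly interpolate (or extrapolate) the position at time t from samples
    (t1, p1) and (t2, p2) taken at different times (t2 != t1)."""
    alpha = (t - t1) / (t2 - t1)
    return [a + alpha * (b - a) for a, b in zip(p1, p2)]

# Computational position halfway between the two samples:
print(interpolate_position(0.00, [10.0, 0.0], 0.01, [10.4, 0.2], 0.005))
```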
  • Patent number: 9585725
    Abstract: Embodiments include a robotic mechanism that may be utilized to position prosthetic implants in a patient's body. During joint replacement surgery and other surgical procedures, prosthetic implants may be placed in a patient's body. The robotic mechanism may be utilized to control movement of a cutting tool during resection of bone in a patient's body. The robotic mechanism includes a programmable computer which is connected with the force transmitting member by a motor. A force measurement assembly is connected with the force transmitting member and the computer. The output from the force measurement assembly is indicative of a resistance encountered by the force transmitting member. A position sensor is connected with the force transmitting member and the computer. The position sensor has an output indicative of the position of the force transmitting member.
    Type: Grant
    Filed: June 21, 2013
    Date of Patent: March 7, 2017
    Assignee: P Tech, LLC
    Inventor: Peter M Bonutti
  • Patent number: 9575492
    Abstract: A device and method are provided, the method including providing a device capable of at least semi-autonomous operation and enabling the device to autonomously gather at least one item and secure that item against unauthorized access while providing selective authorized access to the item while the item is in possession of the device.
    Type: Grant
    Filed: February 27, 2015
    Date of Patent: February 21, 2017
    Assignee: VECNA TECHNOLOGIES, INC.
    Inventors: Daniel Theobald, Thomas Allen, Josh Ornstein
  • Patent number: 9567080
    Abstract: A method and apparatus for determining actions for entities (4, 6) such that a goal is accomplished and constraints are satisfied. The method comprises: determining an initial plan comprising actions that, if performed by the entities (4, 6), would accomplish the goal; determining that a constraint would not be satisfied if the initial plan was implemented; and iteratively performing steps (i) to (v) until a final plan that accomplishes the goal and satisfies the constraints is determined. Step (i) comprises identifying a constraint that is not satisfied in part of the current plan. Step (ii) comprises determining a remedy that, if implemented, satisfies the identified constraint. Step (iii) comprises updating the goal specification to include the remedy. Step (iv) comprises, using the updated goal specification, determining a further plan that accomplishes the goal and the remedy. Step (v) comprises determining whether or not the further plan satisfies each constraint.
    Type: Grant
    Filed: May 2, 2014
    Date of Patent: February 14, 2017
    Assignee: BAE SYSTEMS plc
    Inventors: John Paterson Bookless, Markus Deittert, Richard Norman Herring, Elizabeth Jane Cullen
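    The iterative repair loop of steps (i) through (v) in the abstract above might look schematically like the following. The planner, the constraint checks, and the remedy function are caller-supplied placeholders, and representing the goal specification as a list is an assumption; only the control flow is illustrated.
```python
def plan_with_constraint_repair(goal_spec, constraints, make_plan, find_remedy,
                                max_iterations=20):
    """goal_spec: list of goal requirements; constraints: callables plan -> bool;
    make_plan: goal_spec -> plan; find_remedy: (plan, constraint) -> requirement."""
    plan = make_plan(goal_spec)                       # initial plan for the goal
    for _ in range(max_iterations):
        violated = next((c for c in constraints if not c(plan)), None)
        if violated is None:
            return plan                               # every constraint satisfied
        remedy = find_remedy(plan, violated)          # step (ii): derive a remedy
        goal_spec = goal_spec + [remedy]              # step (iii): extend the goal spec
        plan = make_plan(goal_spec)                   # step (iv): re-plan
    raise RuntimeError("no constraint-satisfying plan found")
```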
  • Patent number: 9527212
    Abstract: A mobile self-propelled robot for autonomously carrying out activities. The robot includes a drive module for moving the robot over a floor area; a processing module for carrying out the activities during a processing stage; at least one sensor module for detecting information relating to the structure of the surroundings; and a detector module configured to detect a displacement of the robot prior to or during the processing stage. Further, the robot includes a navigation module configured to navigate the robot over the floor area during the processing stage using a map of the surroundings, to store and manage one or more maps of the surroundings, and to carry out a self-positioning process if the detector module has detected a displacement of the robot. During the self-positioning process, the presence and the location of the robot within the stored maps are detected.
    Type: Grant
    Filed: February 8, 2013
    Date of Patent: December 27, 2016
    Assignee: Robart GmbH
    Inventors: Harold Artes, Dominik Seethaler, Michael Schahpar
  • Patent number: 9521994
    Abstract: In a method for image guided prostate cancer needle biopsy, a first registration is performed to match a first image of a prostate to a second image of the prostate. Third images of the prostate are acquired and compounded into a three-dimensional (3D) image. The prostate in the compounded 3D image is segmented to show its border. A second registration and then a third registration different from the second registration is performed on distance maps generated from the prostate borders of the first image and the compounded 3D image, wherein the first and second registrations are based on a biomechanical property of the prostate. A region of interest in the first image is mapped to the compounded 3D image or a fourth image of the prostate acquired with the second modality.
    Type: Grant
    Filed: May 6, 2010
    Date of Patent: December 20, 2016
    Assignee: Siemens Healthcare GmbH
    Inventors: Ali Kamen, Wolfgang Wein, Parmeshwar Khurd, Mamadou Diallo, Ralf Nanke, Jens Fehre, Berthold Kiefer, Martin Requardt, Clifford Weiss
  • Patent number: 9504369
    Abstract: A cleaning robot may have a body, a moving unit provided on the body to move the body in a cleaning space, a cleaning unit provided on the body to clean a floor of the cleaning space, a floor image obtaining unit configured to obtain a floor image of the cleaning space, and a control unit configured to determine if a foreign substance is present on the floor of the cleaning space based on the floor image and control the moving unit to move the body to a position of the foreign substance. By obtaining an image of the floor to be cleaned, the cleaning robot detects foreign substances that are not positioned on its moving track and, when a foreign substance is detected, moves to its position to perform cleaning.
    Type: Grant
    Filed: May 8, 2014
    Date of Patent: November 29, 2016
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Jea Yun So, Sang Sik Yoon, Joon Hyung Kwon, Shin Kim
  • Patent number: 9468152
    Abstract: A method of pruning plants includes generating a first series of images of a plant, identifying a first object displayed in the images as a fruit of the plant, collecting data associated with a state of the fruit, identifying a second object displayed in the images as a suspect plant component of the plant, comparing a parameter of the suspect plant component to a reference parameter associated with plant components to be pruned, in response to determining that the parameter of the suspect plant component sufficiently matches the reference parameter, identifying the suspect plant component as a plant component to be pruned from the plant, advancing an automated pruner toward the plant component, operating the automated pruner to sever the plant component from the plant, and while the automated pruner is operated to sever the plant component, generating a second series of images of one or more additional plants.
    Type: Grant
    Filed: June 9, 2015
    Date of Patent: October 18, 2016
    Assignee: Harvest Moon Automation Inc.
    Inventors: Stephen Jens, Janice Huxley Jens, Edward Dickinson
  • Patent number: 9469030
    Abstract: A telepresence robot may include a drive system, a control system, an imaging system, and a mapping module. The mapping module may access a plan view map of an area and tags associated with the area. In various embodiments, each tag may include tag coordinates and tag information, which may include a tag annotation. A tag identification system may identify tags within a predetermined range of the current position and the control system may execute an action based on an identified tag whose tag information comprises a telepresence robot action modifier. The telepresence robot may rotate an upper portion independent from a lower portion. A remote terminal may allow an operator to control the telepresence robot using any combination of control methods, including by selecting a destination in a live video feed, by selecting a destination on a plan view map, or by using a joystick or other peripheral device.
    Type: Grant
    Filed: October 27, 2015
    Date of Patent: October 18, 2016
    Assignees: INTOUCH TECHNOLOGIES, INC., IROBOT CORPORATION
    Inventors: Yulun Wang, Charles S. Jordan, Tim Wright, Michael Chan, Marco Pinter, Kevin Hanrahan, Daniel Sanchez, James Ballantyne, Cody Herzog, Blair Whitney, Fuji Lai, Kelton Temby, Eben Christopher Rauhut, Justin H. Kearns, Cheuk Wah Wong, Timothy Sturtevant Farlow
  • Patent number: 9440351
    Abstract: A method and/or computer program product controls operations of a robotic device. A robotic device receives a first signal from a positioning hardware device that is worn by a user. The first signal describes a relative location between the user and the robotic device. A second signal describes an angle between the user and the robotic device and between the user and the user-selected object. Based on the first signal, the second signal, and a record of object positions of objects within a predefined area of the user, the identification and location of the user-selected object is determined. A determination is made regarding whether or not the robotic device is authorized to perform a specific task on the user-selected object based on the location of the user-selected object. If authorized, the robotic device performs the specific task on the user-selected object.
    Type: Grant
    Filed: October 30, 2014
    Date of Patent: September 13, 2016
    Assignee: International Business Machines Corporation
    Inventors: James E. Bostick, John M. Ganci, Jr., Sarbajit K. Rakshit, Craig M. Trim
  • Patent number: 9404074
    Abstract: An incubation apparatus, including a temperature-controlled room adjusted to a predetermined environmental condition and incubating a sample in an incubation container inside the temperature-controlled room, includes a carrying apparatus, an imaging section, and an image analyzing section. The carrying apparatus carries the incubation container in the temperature-controlled room. The imaging section photographs the whole of the incubation container inside the temperature-controlled room. The image analyzing section analyzes an operation state of the incubation apparatus or an incubating environment state of the sample based on a total observation image of the incubation container photographed by the imaging section, and outputs an error signal notifying an abnormality of the operation state or the incubating environment state in accordance with the analysis result.
    Type: Grant
    Filed: November 10, 2006
    Date of Patent: August 2, 2016
    Assignee: NIKON CORPORATION
    Inventor: Yasujiro Kiyota
  • Patent number: 9400498
    Abstract: An unmanned systems operator control system includes a hand held controller with a set of switches and control enumeration software specially configured to report a superset of virtual switches based on the physical switches. A core unit includes a first unmanned system control application subscribing to a first switch subset of the superset and outputting commands controlling a first unmanned system based on activation of the set of switches. A second unmanned system control application subscribes to a second switch subset of the superset and outputs commands controlling a second unmanned system based on activation of the set of switches. A mode switching subsystem is configured, in a first state, to map the set of switches to the first switch subset and, in a second state, to map the set of switches to the second switch subset.
    Type: Grant
    Filed: March 18, 2015
    Date of Patent: July 26, 2016
    Assignee: Foster-Miller, Inc.
    Inventors: Kurt Bruck, Boian Bentchev, Julie Shapiro, Todd Graham, Daniel Deguire
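    The virtual-switch mapping described in the abstract above can be sketched as follows: the physical switches are exposed as a larger set of virtual switches, each control application subscribes to its own subset, and a mode switch decides which subset the physical switches currently drive. The mode names, the dictionary-based mapping, and the callback interface are assumptions, not the patented design.
```python
class OperatorControlUnit:
    def __init__(self, physical_switches):
        # Each physical switch is exposed once per mode as a distinct virtual switch.
        self.modes = {
            "uas": {s: f"uas/{s}" for s in physical_switches},   # first switch subset
            "ugv": {s: f"ugv/{s}" for s in physical_switches},   # second switch subset
        }
        self.active_mode = "uas"
        self.subscribers = {}          # virtual switch id -> callback

    def subscribe(self, virtual_switch, callback):
        self.subscribers[virtual_switch] = callback

    def set_mode(self, mode):
        self.active_mode = mode        # remap the physical switches to another subset

    def on_switch(self, physical_switch, value):
        """Route a physical switch activation to whichever application owns the
        corresponding virtual switch in the active mode."""
        virtual = self.modes[self.active_mode][physical_switch]
        handler = self.subscribers.get(virtual)
        if handler:
            handler(value)

ocu = OperatorControlUnit(["trigger", "hat_up"])
ocu.subscribe("uas/trigger", lambda v: print("UAS camera shutter:", v))
ocu.subscribe("ugv/trigger", lambda v: print("UGV gripper:", v))
ocu.on_switch("trigger", 1)     # routed to the first unmanned system application
ocu.set_mode("ugv")
ocu.on_switch("trigger", 1)     # same physical switch, now routed to the second
```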
  • Patent number: 9358681
    Abstract: A work hanging apparatus includes a hanger line continuously conveying hangers each having a hook, a robot that has a hand with which a work having a hole is held and transfers the held work to a hanging location set in the hanger line, a controller controlling a movement of the hand to catch the hook of one of the hangers with the hole of the held work at the hanging location, a hole deviation detector that detects a positional deviation of the hole of the work, an attitude deviation detector that detects an attitudinal deviation of the hanger, and a corrector that corrects the movement of the hand according to the positional and attitudinal deviations.
    Type: Grant
    Filed: March 7, 2013
    Date of Patent: June 7, 2016
    Assignee: NHK Spring Co., Ltd.
    Inventors: Kotaro Nukui, Takashi Yajima
  • Patent number: 9333648
    Abstract: A robot control method and apparatus estimates an error based on information of an object obtained using a motor drive current sensor, a force sensor or a tactile sensor mounted at a robot hand and information of the object obtained using an optical sensor and compensates for the error. The robot control method and apparatus includes measuring information of an object using an optical sensor or calculating information of the object input by a user, moving a robot hand to the object based on the information of the object, controlling the thumb and fingers of the robot hand to contact the object based on the information of the object, determining whether the thumb and fingers have contacted the object through a motor drive current sensor, a force sensor or a tactile sensor, and grasping the object depending upon whether the thumb and the fingers have contacted the object.
    Type: Grant
    Filed: October 18, 2011
    Date of Patent: May 10, 2016
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Ji Young Kim, Yong Jae Kim, Kyung Shik Roh, Young Bo Shim
  • Patent number: 9333435
    Abstract: A customized 3D printing robot that is configured such that components of the robot are produced using a 3D printing technology. In particular, the customized 3D printing robot enables a user to directly select an appearance design for each component and assemble the components so that an entire appearance design of the robot can be customized.
    Type: Grant
    Filed: December 30, 2014
    Date of Patent: May 10, 2016
    Assignee: SEJONG INDUSTRY-ACADEMIA COOPERATION FOUNDATION, HONGIK UNIVERSITY
    Inventor: Na-young Kim
  • Patent number: 9332219
    Abstract: The present invention relates to a telepresence device that is capable of enabling various types of verbal or non-verbal interaction between a remote user and a local user. In accordance with an embodiment, the telepresence device may include a camera combined with direction control means; a projector provided on a top of the camera, and configured such that a direction of the projector is controlled by the direction control means along with a direction of the camera; and a control unit configured to control the direction of the camera by operating the direction control means, to extract an object, at which a remote user gazes, from an image acquired by the camera whose direction has been controlled, to generate a projection image related to the object, and to project the projection image around the object by controlling the projector.
    Type: Grant
    Filed: November 18, 2013
    Date of Patent: May 3, 2016
    Assignee: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Jounghuem Kwon, Bumjae You, Shinyoung Kim, Kwangkyu Lee
  • Patent number: 9302391
    Abstract: An object gripping apparatus includes an image capturing unit for capturing a region including a plurality of works, an obtaining unit for obtaining distance information of the region, a measurement unit for measuring three-dimensional positions/orientations of a plurality of gripping-candidate works out of the plurality of works based on the image and distance information, thereby generating three-dimensional position/orientation information, a selection unit for selecting a gripping-target work based on the three-dimensional position/orientation information, a gripping unit for gripping the gripping-target work, and an updating unit for updating the three-dimensional position/orientation information by measuring three-dimensional positions/orientations of the gripping-candidate works at a time interval during gripping of the gripping-target work.
    Type: Grant
    Filed: December 6, 2011
    Date of Patent: April 5, 2016
    Assignee: CANON KABUSHIKI KAISHA
    Inventors: Yuichiro Iio, Yusuke Mitarai
  • Patent number: 9299118
    Abstract: Method and apparatus for measuring dimensionality of a component are disclosed herein. At least a first image may be collected from a first camera of an aperture or a first precision countersink of the component. The first camera may be in a first position and may operate in one or more luminance environments. At least a second image may be collected from a second camera of the aperture or a second precision countersink of the component. The second camera may be in a second position and may operate in an alternative lighting environment to differentiate characteristic features including hole and countersink against their intersection with the component. Positional and process refinement parameters may be calculated for, and provided to, one or more numerically controlled mechanisms, based on the collected first and second images.
    Type: Grant
    Filed: April 18, 2012
    Date of Patent: March 29, 2016
    Assignee: The Boeing Company
    Inventor: Michael D. McGraw
  • Patent number: 9289897
    Abstract: An apparatus for the automated removal of workpieces arranged in a container, comprising a first object recognition device for detecting the workpieces, a first gripper for picking and removing the workpieces from the container, and a control for evaluating the data of the first object recognition device, for track planning, and for controlling the first gripper; wherein an intermediate station is provided on which the first gripper places the workpiece after removal from the container, and a positioning apparatus positions the workpieces more accurately starting from the intermediate station and/or singularizes them.
    Type: Grant
    Filed: June 27, 2013
    Date of Patent: March 22, 2016
    Assignee: Liebherr-Verzahntechnik GmbH
    Inventors: Thomas Mattern, David Haenschke, Bernhard Riedmiller, Alois Mundt
  • Patent number: 9241630
    Abstract: A mounted or portable device that allows for instant communication and notification between two parties is provided. Directed primarily at elderly family members and the like who live on their own, such a device includes a video screen for display of various media in order to provide memory inducements, as well as video communication on demand. Furthermore, such a device permits a remote user to contact the person with a request for instant verification of the person's condition. Failure to respond to such a request leads to automatically sent communications to pre-assigned recipients that attention for the user may be required, thereby seeking the recipient closest in proximity at that specific moment to provide such attention and/or notification to the proper authorities that medical or other help is needed. The system also allows the direct user to contact medical, police, or other authorities directly if immediate attention is needed.
    Type: Grant
    Filed: November 5, 2012
    Date of Patent: January 26, 2016
    Inventor: Frazier Watkins
  • Patent number: 9238304
    Abstract: Example systems and methods allow for dynamic updating of a plan to move objects using a robotic device. One example method includes determining a virtual environment by one or more processors based on sensor data received from one or more sensors, the virtual environment representing a physical environment containing a plurality of physical objects, developing a plan, based on the virtual environment, to cause a robotic manipulator to move one or more of the physical objects in the physical environment, causing the robotic manipulator to perform a first action according to the plan, receiving updated sensor data from the one or more sensors after the robotic manipulator performs the first action, modifying the virtual environment based on the updated sensor data, determining one or more modifications to the plan based on the modified virtual environment, and causing the robotic manipulator to perform a second action according to the modified plan.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: January 19, 2016
    Assignee: Industrial Perception, Inc.
    Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser
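    The sense-plan-act-replan loop summarized in the abstract above might be sketched as follows. The environment builder, planner, plan modifier, and robot interface are caller-supplied placeholders, and representing the plan as a list of actions is an assumption; only the control flow is shown.
```python
def move_objects(sensors, robot, build_environment, make_plan, modify_plan):
    """sensors.read() -> raw sensor data; build_environment -> virtual environment;
    make_plan -> list of actions; modify_plan -> updated list of actions."""
    env = build_environment(sensors.read())        # virtual model of the physical scene
    plan = make_plan(env)                          # plan to move the physical objects
    while plan:
        action = plan.pop(0)
        robot.perform(action)                      # execute the next action
        env = build_environment(sensors.read())    # re-sense after the action
        plan = modify_plan(plan, env)              # adapt the remaining actions
```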
  • Patent number: 9233470
    Abstract: Example methods and systems for determining 3D scene geometry by projecting patterns of light onto a scene are provided. In an example method, a first projector may project a first random texture pattern having a first wavelength and a second projector may project a second random texture pattern having a second wavelength. A computing device may receive sensor data that is indicative of an environment as perceived from a first viewpoint of a first optical sensor and a second viewpoint of a second optical sensor. Based on the received sensor data, the computing device may determine corresponding features between sensor data associated with the first viewpoint and sensor data associated with the second viewpoint. And based on the determined corresponding features, the computing device may determine an output including a virtual representation of the environment that includes depth measurements indicative of distances to at least one object.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: January 12, 2016
    Assignee: Industrial Perception, Inc.
    Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee
  • Patent number: 9230275
    Abstract: A studio arrangement comprising a plurality of article transports, each article transport having a respective machine-readable code for tracking an article, an image capture device located at a first position on an axis wherein the image capture device is configured to capture images of articles placed on a surface wherein the axis intersects the surface at a second position, one or more light sources each positioned along a respective second axis that intersects the surface at a respective position wherein each light source is configured to illuminate articles placed on the surface, and a system comprising one or more data processing apparatus, the system configured to, for each of a plurality of articles, perform operations comprising: reading the code of a respective one of the transports, the transport being separate from the surface, in response to reading the code, causing the image capture device to acquire an image of the article, associating the article with the code, the image, and one or more param
    Type: Grant
    Filed: March 26, 2015
    Date of Patent: January 5, 2016
    Assignee: ThredUp, Inc.
    Inventors: John Hugh Voris, Michael Santhanam, Tom Ryan Gaffney, Vivek Daver, Syed Muhammad Usamah
  • Patent number: 9213891
    Abstract: An information processing device 200 of the present invention includes: a recognition result acquiring means 201 for acquiring respective recognition result information outputted by a plurality of recognition engines 211, 212 and 213 executing different recognition processes on recognition target data; and an integration recognition result outputting means 202 for outputting a new recognition result obtained by integrating the respective recognition result information acquired from the plurality of recognition engines. The recognition result acquiring means 201 is configured to acquire the respective recognition result information in a data format common to the plurality of recognition engines, from the plurality of recognition engines. The integration recognition result outputting means 202 is configured to integrate the respective recognition result information based on the respective recognition result information, and output as the new recognition result.
    Type: Grant
    Filed: January 19, 2012
    Date of Patent: December 15, 2015
    Assignee: NEC CORPORATION
    Inventors: Shinichiro Kamei, Nobuhisa Shiraishi, Takeshi Arikuma
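    The integration idea in the abstract above (several recognition engines reporting results in one common data format, merged into a single new result) can be sketched with a toy example. The record format and the keep-the-highest-confidence merge rule are assumptions, not the patented integration method.
```python
from dataclasses import dataclass

@dataclass
class RecognitionResult:       # common data format shared by all engines
    target_id: str
    label: str
    confidence: float

def integrate(results: list[RecognitionResult]) -> dict[str, str]:
    """Merge per-engine results: keep the highest-confidence label per target."""
    best: dict[str, RecognitionResult] = {}
    for r in results:
        if r.target_id not in best or r.confidence > best[r.target_id].confidence:
            best[r.target_id] = r
    return {tid: r.label for tid, r in best.items()}

print(integrate([RecognitionResult("obj1", "person", 0.72),     # from engine 211
                 RecognitionResult("obj1", "employee", 0.91),   # from engine 212
                 RecognitionResult("obj2", "forklift", 0.66)])) # from engine 213
```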
  • Patent number: 9168656
    Abstract: A telepresence robot may include a drive system, a control system, an imaging system, and a mapping module. The mapping module may access a plan view map of an area and tags associated with the area. In various embodiments, each tag may include tag coordinates and tag information, which may include a tag annotation. A tag identification system may identify tags within a predetermined range of the current position and the control system may execute an action based on an identified tag whose tag information comprises a telepresence robot action modifier. The telepresence robot may rotate an upper portion independent from a lower portion. A remote terminal may allow an operator to control the telepresence robot using any combination of control methods, including by selecting a destination in a live video feed, by selecting a destination on a plan view map, or by using a joystick or other peripheral device.
    Type: Grant
    Filed: July 14, 2015
    Date of Patent: October 27, 2015
    Assignees: INTOUCH TECHNOLOGIES, INC., IROBOT CORPORATION
    Inventors: Yulun Wang, Charles S. Jordan, Tim Wright, Michael Chan, Marco Pinter, Kevin Hanrahan, Daniel Sanchez, James Ballantyne, Cody Herzog, Blair Whitney, Fuji Lai, Kelton Temby, Eben Christopher Rauhut, Justin H. Kearns, Cheuk Wah Wong, Timothy Sturtevant Farlow
  • Patent number: 9149932
    Abstract: A robot picking system includes a robot that picks up a work in a first stocker accommodating a plurality of works, a control device that controls an operation of the robot, and an image acquiring device that acquires image data including information related to the plurality of works. The control device includes a candidate data generating unit that generates candidate data including information of candidate works that are candidates of a picking-up target using the image data, and a target work selecting unit that selects a target work that is a picking-up target from the candidate works using the candidate data.
    Type: Grant
    Filed: February 17, 2014
    Date of Patent: October 6, 2015
    Assignee: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Yosuke Kamiya, Shingo Ando
  • Patent number: 9144905
    Abstract: Described is a system for identifying functional elements of objects for robotic manipulation. The system causes a robot to manipulate an object, the robot having an audio sensor and touch sensors. A three-dimensional (3D) location of each audio event that produces a response during manipulation of the object is recorded. Additionally, a 3D location of each tactile event that produces a response during manipulation of the object is recorded. A 3D location of each audio and tactile event in 3D space is then determined. A 3D audio point cloud and a 3D tactile point cloud are generated. Then, the 3D audio point cloud and the 3D tactile point cloud are registered with a 3D model point cloud of the object. Finally, an annotated 3D model point cloud of the object is generated, which encodes the location of a functional element of the object.
    Type: Grant
    Filed: January 21, 2014
    Date of Patent: September 29, 2015
    Assignee: HRL Laboratories, LLC
    Inventors: Jivko Sinapov, Heiko Hoffmann
  • Patent number: 9132548
    Abstract: A robot picking system includes a robot including a gripper that picks up a target work in a first stocker accommodating a plurality of works, a control device that controls an operation of the robot, and an image acquiring device that acquires image data including information related to the target work. The control device includes a trajectory calculating unit that sets a first trajectory including a first zone in which a posture of the gripper is changed and a second zone in which the gripper having the changed posture approaches the target work that is a picking-up target.
    Type: Grant
    Filed: February 17, 2014
    Date of Patent: September 15, 2015
    Assignee: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Yosuke Kamiya, Shingo Ando
  • Patent number: 9125690
    Abstract: A device and method for determining a position of a medical device or part of a patient's body uses first and second position detection devices to obtain first and second positions, respectively, of the medical device or part of the patient's body, wherein the first position detection device is separate from the second position detection device. A first priority is assigned to the first position and a second priority is assigned to the second position, wherein the first and second priority are based on at least one input variable, and the first and second priority define a first and second weight factor to be applied to the respective first and second position. The position of the medical device or part of the patient's body is determined from the combination of the first position and the first weight factor, and the second position and the second weight factor.
    Type: Grant
    Filed: May 11, 2007
    Date of Patent: September 8, 2015
    Assignee: Brainlab AG
    Inventor: Richard Wohlgemuth
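    The weighted combination described in the abstract above reduces, in the simplest reading, to a weighted average of the two detected positions. The normalization step and the example weights below are assumptions.
```python
import numpy as np

def fuse_positions(pos1, w1, pos2, w2):
    """Combine two position estimates using their priority-derived weight factors."""
    p1, p2 = np.asarray(pos1, float), np.asarray(pos2, float)
    return (w1 * p1 + w2 * p2) / (w1 + w2)

# Example: the first detection device is trusted more than the second here.
print(fuse_positions([12.0, 4.1, 30.2], 0.8, [12.3, 4.0, 30.0], 0.2))
```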
  • Patent number: 9124873
    Abstract: This invention provides a system and method for determining correspondence between camera assemblies in a 3D vision system implementation having a plurality of cameras arranged at different orientations with respect to a scene involving microscopic and near microscopic objects under manufacture moved by a manipulator, so as to acquire contemporaneous images of a runtime object and determine the pose of the object for the purpose of guiding manipulator motion. At least one of the camera assemblies includes a non-perspective lens. The searched 2D object features of the acquired non-perspective image, corresponding to trained object features in the non-perspective camera assembly can be combined with the searched 2D object features in images of other camera assemblies, based on their trained object features to generate a set of 3D features and thereby determine a 3D pose of the object.
    Type: Grant
    Filed: October 24, 2013
    Date of Patent: September 1, 2015
    Assignee: Cognex Corporation
    Inventors: Lifeng Liu, Aaron S. Wallack, Cyril C. Marrion, Jr., David J. Michael
  • Patent number: 9113065
    Abstract: A network camera and a network camera system are provided in which an image can be securely distributed to a designated distribution recipient. A network camera system 100 is connected to a communication network 130 and has a network camera 110 and a plurality of PCs 120. The network camera 110 comprises an image pickup section 111 for picking up the image, a sending and receiving section 115 for distributing the picked-up image to the PC 120, an image sending monitoring section 116 for monitoring whether or not the picked-up image is distributed to the designated PC 120 among the plurality of PCs 120, and an accumulation memory 119 for accumulating the picked-up image when the distribution of the image to the designated PC 120 is interrupted; the PC 120 comprises an image receiving section 121 for receiving the picked-up image distributed thereto.
    Type: Grant
    Filed: November 16, 2006
    Date of Patent: August 18, 2015
    Assignee: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD.
    Inventors: Tsuyoshi Ogata, Yasuji Nakamura, Hirotaka Fujimura, Junichi Hamada
  • Patent number: 9110456
    Abstract: A robot machining system with a flexible manipulator has an end effector which may hold either the tool that performs the machining or the workpiece to be machined. A signal representative of a force applied by the tool to the workpiece is used to control either the relative motion between the tool and the workpiece to give a controlled material removal rate (CMRR), or the relative position between the tool and the workpiece to provide deformation compensation, or both CMRR and deformation compensation. A force sensor provides the signal for deformation compensation. For CMRR the signal may be obtained either from a force sensor or from the current flowing in the motor of the robot's spindle. The force sensor can be mounted either on the robot or together with either the tool or the workpiece adjacent to the robot.
    Type: Grant
    Filed: September 6, 2005
    Date of Patent: August 18, 2015
    Assignee: ABB Research Ltd.
    Inventors: Hui Zhang, ZhongXue Gan, Jianjun Wang, George Zhang
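    The controlled material removal rate (CMRR) idea in the abstract above might be sketched as a simple force-feedback adjustment of the feed rate: compare the measured process force (from a force sensor or the spindle motor current) with a reference force and speed up or slow down accordingly. The proportional gain, limits, and names are illustrative assumptions, not the patented controller.
```python
def adjust_feed_rate(feed_rate, measured_force, reference_force,
                     gain=0.001, min_feed=1.0, max_feed=50.0):
    """Proportional correction of the feed rate (mm/s) from the force error (N).
    Too much force -> slow down; too little force -> speed up."""
    error = reference_force - measured_force
    new_feed = feed_rate + gain * error
    return max(min_feed, min(max_feed, new_feed))   # keep the feed within limits

feed = 20.0
for force in (180.0, 230.0, 205.0):       # simulated force readings, target 200 N
    feed = adjust_feed_rate(feed, force, 200.0)
    print(round(feed, 3))
```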
  • Patent number: 9104981
    Abstract: This robot teaching system includes a teaching tool that has an operation portion operated by a user and that specifies teaching positions, a measuring portion measuring positions and postures of the teaching tool, and a control portion determining the teaching positions for a robot. The robot teaching system is configured to specify the teaching positions continuously while the user operates the operation portion of the teaching tool.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: August 11, 2015
    Assignee: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Yukiko Sawada, Tomoyuki Sekiyama, Kenichi Motonaga
  • Patent number: 9089966
    Abstract: A workpiece pick-up apparatus including: a hand for gripping a workpiece; a robot for bringing the hand into a desired gripping position or posture; a sensor for performing three-dimensional measurement of the workpiece to obtain workpiece measurement data; a storage medium for accumulating at least hand profile data; an information processing unit for calculating the gripping position or posture based on data from the sensor and data from the storage medium; and a control unit for controlling the robot based on the gripping position or posture calculated by the information processing unit. The information processing unit includes an optimum gripping candidate creating section for directly deriving the gripping position or posture based on the workpiece measurement data and the hand profile data.
    Type: Grant
    Filed: July 13, 2011
    Date of Patent: July 28, 2015
    Assignee: MITSUBISHI ELECTRIC CORPORATION
    Inventors: Yukiyasu Domae, Yasuo Kitaaki, Haruhisa Okuda, Kazuhiko Sumi
  • Patent number: 9085424
    Abstract: An extendable boom conveyor system having a first boom element, a controller, an input device, and a second boom element. The controller, using the input device, can adjust the position of the first boom element to maintain a distance relative to a user. A method performed by an extendable conveyor system can include detecting a position of a user using the input device, moving the first boom element to a position relative to the position of the user, detecting a user input using the input device, moving the first boom element to a new position according to the user input, and transporting parcels loaded by the user on the conveying surfaces.
    Type: Grant
    Filed: May 2, 2014
    Date of Patent: July 21, 2015
    Assignee: Siemens Industry, Inc.
    Inventors: Michael D. Carpenter, Jeffrey Eugene Gilb, James M. Pippin