Vision Sensor (e.g., Camera, Photocell) Patents (Class 700/259)
  • Patent number: 10322508
    Abstract: A method is provided for operating a robotic device having a kinematic chain of movable components. The method includes: detecting respective values of at least one characteristic of a plurality of the movable components by sensors arranged on the kinematic chain or in the vicinity of the kinematic chain; ascertaining a maximum value based on the detected values; comparing the ascertained maximum value with a predefined first safety limit by a controller of the robotic device; and adjusting the at least one characteristic or a further characteristic of the kinematic chain when the ascertained maximum value has a predefined relationship with the first safety limit, in order to increase the operating safety of the robotic device.
    Type: Grant
    Filed: November 25, 2015
    Date of Patent: June 18, 2019
    Assignees: KUKA Roboter GmbH, Siemens Aktiengesellschaft
    Inventors: Abhinav Gulhar, Philip Mewes, Holger Mönnich, Sabine Thürauf
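The safety check described in this abstract (take the maximum across several sensor readings, compare it against a safety limit, and adjust an operating characteristic when the limit relationship holds) can be sketched as follows. This is a minimal illustration only; the function name and the proportional speed back-off are assumptions, not the patented method:

```python
def check_and_adjust(sensor_values, safety_limit, current_speed):
    """Compare the maximum detected characteristic (e.g. joint torque)
    against a predefined safety limit, and reduce the operating speed
    when the limit is exceeded.

    Illustrative sketch only; the scaling policy is an assumption.
    """
    max_value = max(sensor_values)       # ascertain the maximum value
    if max_value > safety_limit:         # predefined relationship: limit exceeded
        # adjust a further characteristic (here: operating speed)
        return current_speed * (safety_limit / max_value)
    return current_speed
```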
  • Patent number: 10209131
    Abstract: A system and method for aligning a light beam in a spectroscopic measuring device such as a pump-probe device is provided. The system and method comprise a first motorized mirror (66b) positioned to receive and transmit a light beam (60a); a second motorized mirror (66c) positioned relative to the first mirror to receive the light beam from the first mirror and transmit the light beam to a delay line (64); a third mirror (78) positioned to receive the light beam from the delay line and transmit said light beam to a detector (80); and a computer-based processor (82) in communication with the detector and the first and second mirrors, the processor configured to a) receive and process data relating to the light beam from the detector, and b) cause movement of the first and second mirrors to change an angle of the mirrors based on the data relating to the light beam.
    Type: Grant
    Filed: August 19, 2016
    Date of Patent: February 19, 2019
    Inventors: Alexander Sobolev, Nikita Kurakin, Alexey Gusev
  • Patent number: 10196141
    Abstract: A method, device, and system for detecting transparent elements in a vehicle environment are described. In some examples, this may include accessing a first image depicting a scene. The first image may include a first polarization characteristic. A second image depicting the scene may also be accessed. The second image may include a second polarization characteristic. It may be determined that the scene includes an element having transparent properties based at least in part on the first image and the second image.
    Type: Grant
    Filed: September 21, 2016
    Date of Patent: February 5, 2019
    Assignee: Amazon Technologies, Inc.
    Inventors: Chengwu Cui, Barry James O'Brien
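The core idea of this abstract is that a transparent element reflects or transmits light differently under different polarizations, so two images of the same scene with different polarization characteristics will disagree where transparent surfaces are present. A minimal sketch of that comparison, assuming a simple per-pixel difference threshold (the patented determination is more involved):

```python
def detect_transparent_regions(img_pol_a, img_pol_b, threshold=0.3):
    """Flag pixels whose intensity differs strongly between two images
    of the same scene captured with different polarization filters.

    Images are nested lists of normalized intensities; the threshold
    test is an illustrative assumption."""
    mask = []
    for row_a, row_b in zip(img_pol_a, img_pol_b):
        mask.append([abs(a - b) > threshold for a, b in zip(row_a, row_b)])
    return mask
```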
  • Patent number: 10170146
    Abstract: According to one embodiment, a method for mounting a wireless capture device to an accessor to capture images of an operation in a data storage library includes mounting a bracket to the accessor, where the accessor is configured to transport data storage cartridges in a library environment. Moreover, the bracket is configured to receive a wireless image capture device therein, and the bracket is mounted in a position to orient the wireless image capture device to capture images while the accessor is transporting data storage cartridges.
    Type: Grant
    Filed: February 6, 2018
    Date of Patent: January 1, 2019
    Inventors: Luis R. Macias, Shawn M. Nave
  • Patent number: 10148915
    Abstract: The invention relates to the field of security, more particularly to a lighting system with monitoring and alarm functions. The illuminating module supplies sufficient light for the monitoring area of the monitoring module, improving the clarity of the monitoring data collected by the monitoring module, and the mobile terminal can remotely control the monitoring module and the illuminating module via control instructions, providing convenience for the user's remote operation. In addition, the mobile terminal can easily acquire the illuminating information and the monitoring data, which realizes remote monitoring and helps users grasp the situation in the monitoring area; when a large amount of smoke appears, or human activity appears in a region of the monitoring area that should be unoccupied, the system promptly reminds the user, providing a good security effect.
    Type: Grant
    Filed: August 17, 2016
    Date of Patent: December 4, 2018
    Inventors: Maojun Cao, Yunhai Du, Hu Fang, Bin Yu, Yinyong Zhou
  • Patent number: 10134123
    Abstract: The present disclosure provides a method and a system for inspecting goods. The method comprises steps of: obtaining a transmission image of inspected goods; processing the transmission image to obtain a suspicious region; extracting local texture features of the suspicious region and classifying the local texture features of the suspicious region based on a pre-created model to obtain a classification result; extracting a contour line shape feature of the suspicious region and comparing the contour line shape feature with a pre-created standard template to obtain a comparison result; and determining that the suspicious region contains a high atomic number matter based on the classification result and the comparison result.
    Type: Grant
    Filed: September 28, 2016
    Date of Patent: November 20, 2018
    Assignees: Tsinghua University, Nuctech Company Limited
    Inventors: Yuanjing Li, Ziran Zhao, Yaohong Liu, Qili Wang, Qiang Li, Jianping Gu
  • Patent number: 10107897
    Abstract: A method for evaluating the distance type of the measured distance comprises a sample extracting step for extracting a plurality of preliminary samples around a predicted pose; a reference set calculating step for calculating a reference distance set corresponding to each preliminary sample through applying each preliminary sample to a reference distance calculating algorithm which is previously registered, wherein the reference distance set comprises reference distances corresponding to each of a plurality of distance types; a distance type extracting step for extracting a distance type corresponding to each of the reference distance sets based on the smallest distance error among the distance errors between the measured distance and each of the reference distances composing the reference distance set; and a distance type evaluating step for evaluating a distance type of the measured distance based on the distance type which is extracted in correspondence with each reference distance set.
    Type: Grant
    Filed: October 15, 2014
    Date of Patent: October 23, 2018
    Assignee: Korea University Research and Business Foundation
    Inventors: Woo Jin Chung, Ji Woong Kim
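The extraction and evaluation steps above can be sketched compactly: for each reference distance set (one per preliminary sample), pick the distance type whose reference distance has the smallest error against the measured distance, then combine the per-set picks. A minimal sketch, assuming a majority vote for the final evaluating step (the abstract does not specify the combination rule):

```python
from collections import Counter

def evaluate_distance_type(reference_sets, measured):
    """Each reference set is a dict mapping a distance type to its
    reference distance. Extract the smallest-error type per set, then
    take a majority vote across sets (an illustrative assumption)."""
    picks = []
    for ref_set in reference_sets:
        # distance type with the smallest error against the measurement
        best = min(ref_set, key=lambda t: abs(ref_set[t] - measured))
        picks.append(best)
    return Counter(picks).most_common(1)[0][0]
```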
  • Patent number: 10093022
    Abstract: A robotic system validates brake bleeding by detecting one or more forces generated by a machine assembly acting to move a brake lever of a vehicle in order to open a valve of an air brake system of the vehicle. The system also detects displacement of the machine assembly as the machine assembly acts to move the brake lever, monitors one or more sounds generated during and/or after the machine assembly acts to move the brake lever, and determines that the brake lever has been moved to a position to open the valve of the air brake system to release the air brake system based on the one or more forces that are detected, the displacement that is detected, and/or the one or more sounds that are monitored.
    Type: Grant
    Filed: March 2, 2016
    Date of Patent: October 9, 2018
    Inventors: Omar Al Assad, Douglas Forman, Charles Theurer, Balajee Kannan, Huan Tan
  • Patent number: 10054934
    Abstract: An industrial automation system includes a first computing device that communicatively couples to a second computing device via a communication network. The first computing device receives image data capturing a visual representation of industrial automation equipment in operation via the communication network, in which the image data is acquired by the second computing device; determines an identity of the industrial automation equipment based at least in part on the image data; determines relevant information regarding the industrial automation equipment based at least in part on the identity of the industrial automation equipment; and instructs the industrial automation equipment, the second computing device, the first computing device, the industrial automation system, a user, or any combination thereof to perform one or more operations in the industrial automation system based at least in part on the relevant information.
    Type: Grant
    Filed: June 30, 2015
    Date of Patent: August 21, 2018
    Inventors: Jimi R. Michalscheck, Jessica L. Korpela, Kyle K. Reissner, David A. Vasko, Matthew W. Fordenwalt, John J. Jauquet, Matthew R. Ericsson, Andrew Wilber, Kelly A. Michalscheck
  • Patent number: 10052755
    Abstract: A robot system includes a conveying device, a robot, a control unit for controlling operation of the robot by transmitting a drive signal to a motor of the robot, and for outputting a driving speed control signal to the conveying device, and an encoder for detecting a conveying speed of the conveying device in a regular operation mode, where, when the robot is to be operated in a teach mode and a test mode, the control unit transmits a drive signal to the robot such that an operation speed is reduced by a received override value, and outputs the driving speed control signal such that the conveying speed of the conveying device is reduced by the override value.
    Type: Grant
    Filed: August 7, 2017
    Date of Patent: August 21, 2018
    Inventor: Kentaro Koga
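The key point of this abstract is that the same override value scales both the robot's operation speed and the conveyor's conveying speed, so the robot keeps tracking the conveyor in teach and test modes. A minimal sketch; the percentage semantics of the override value are an assumption:

```python
def apply_override(robot_speed, conveyor_speed, override_pct):
    """Reduce both the robot operation speed and the conveying speed by
    the same override value (here interpreted as a percentage), so the
    relative motion between robot and conveyor is preserved."""
    factor = override_pct / 100.0
    return robot_speed * factor, conveyor_speed * factor
```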
  • Patent number: 9984291
    Abstract: An information processing apparatus includes a distance information acquisition unit configured to acquire distance information with respect to a measurement target object to determine interference between the object and a container for storing the object when the object is to be gripped, a derivation unit configured to derive a position and an orientation of the measurement target object based on the distance information and a partial model representing a partial shape of the measurement target object, and an interference determination unit configured, based on the position and the orientation derived by the derivation unit and a whole model representing a whole shape of the measurement target object, to determine whether interference has occurred between the measurement target object and the container for storing the measurement target object.
    Type: Grant
    Filed: March 22, 2016
    Date of Patent: May 29, 2018
    Inventor: Hiroshi Okazaki
  • Patent number: 9962055
    Abstract: The present disclosure relates to a mute operation method and a mute operation apparatus for an automatic cleaning device, the mute operation method includes: receiving a mute instruction; planning a mute cleaning path according to the mute instruction; switching to a mute mode and performing a cleaning operation according to the mute cleaning path.
    Type: Grant
    Filed: July 29, 2016
    Date of Patent: May 8, 2018
    Assignee: Xiaomi Inc.
    Inventors: Pengfei Zhang, Yongfeng Xia, Heng Qu, Tiejun Liu
  • Patent number: 9956688
    Abstract: A system for predicting a robotic power disconnection includes: a controller; and a robot controllable by the controller, the robot including: a power connector configured to provide power to the robot; and a sensor operably connected to the controller, the sensor configured to detect a change in a field that varies with a changing condition of the power connector, the sensor further configured to alert the controller regarding the change in the field, the controller configured to adjust current through the power connector in response to the alert.
    Type: Grant
    Filed: April 12, 2016
    Date of Patent: May 1, 2018
    Assignee: Fetch Robotics, Inc.
    Inventors: Michael Ferguson, Derek King
  • Patent number: 9948411
    Abstract: An automatic system level testing (ASLT) system for testing smart devices is disclosed. The system comprises a system controller operable to be coupled with a smart device in an enclosure, wherein the system controller comprises a memory comprising test logic and a processor. The system also comprises the enclosure, wherein the enclosure comprises a plurality of components, the plurality of components comprising: (i) a robotic arm comprising a stylus, wherein the stylus is operable to manipulate the smart device to simulate human interaction therewith; and (ii) a platform comprising a device holder, wherein the device holder is operable to receive a smart device inserted therein. The processor is configured to automatically control the smart device and the plurality of components in accordance with the test logic.
    Type: Grant
    Filed: November 23, 2015
    Date of Patent: April 17, 2018
    Assignee: W2BI, INC.
    Inventors: Derek Diperna, Ira Leventhal, Keith Schaub, Artun Kutchuk
  • Patent number: 9927814
    Abstract: A method for localization of robots in a territory of interest includes: providing a mobile robot comprising a processor configured to compute an estimated pose of the mobile robot in a map of a territory of interest using a particle filter comprising a particle; updating, by the processor, a pose in the map of the particle; deciding, by the processor, whether to retain the particle for the next cycle of the particle filter or to eliminate the particle for the next cycle of the particle filter; and sampling the particle filter, by the processor, so as to achieve localization of robots in a territory of interest.
    Type: Grant
    Filed: March 28, 2016
    Date of Patent: March 27, 2018
    Assignee: Fetch Robotics, Inc.
    Inventors: Melonee Wise, Michael Ferguson
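The cycle described in this abstract (update each particle's pose, decide whether to retain or eliminate it, then sample the filter) can be sketched as a minimal particle filter step. The `likelihood` callback, the retention threshold, and weighted resampling with replacement are illustrative assumptions, not the patented method:

```python
import random

def particle_filter_cycle(particles, motion, likelihood, min_weight=1e-3):
    """One cycle of a minimal particle filter for robot localization.

    `particles` is a list of (x, y) poses, `motion` an (dx, dy) motion
    estimate, and `likelihood` maps a pose to a non-negative sensor
    weight. Particles with negligible weight are eliminated before
    resampling (assumed scheme)."""
    # update each particle's pose in the map
    moved = [(x + motion[0], y + motion[1]) for (x, y) in particles]
    # weight, then decide which particles to retain for the next cycle
    weighted = [(p, likelihood(p)) for p in moved]
    retained = [(p, w) for (p, w) in weighted if w > min_weight]
    if not retained:
        return moved  # degenerate case: keep all particles unweighted
    poses, weights = zip(*retained)
    # sample (with replacement) proportionally to the weights
    return random.choices(poses, weights=weights, k=len(particles))
```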
  • Patent number: 9927966
    Abstract: In a method for enabling an interaction between an assistance device and a medical apparatus and/or an operator and/or a patient, input information is acquired by an input interface of the assistance device, the input information including information of the medical apparatus and/or of the operator and/or of the patient. The acquired input information is transferred from the input interface to a computing unit. The input information is processed by the computing unit so as to generate an interaction instruction on the basis of the processed input information. The generated interaction instruction is transferred from the computing unit to an interaction interface of the assistance device, and an interaction with an environment of the assistance device is performed on the basis of the interaction instruction via the interaction interface.
    Type: Grant
    Filed: March 6, 2015
    Date of Patent: March 27, 2018
    Assignee: Siemens Aktiengesellschaft
    Inventor: Stefan Popescu
  • Patent number: 9918603
    Abstract: A sensor module and a robot cleaner including the sensor module may provide accurate sensing of an obstacle and prevent an erroneous sensing of an obstacle. The robot cleaner may include a body including a cleaning unit to remove foreign substances from a surface of a floor, a cover to cover an upper portion of the body, a sensor module including an obstacle sensor mounted to sense an obstacle, and a sensor window provided at one side of the sensor module. The sensor module may include a light emitting device to emit light through the sensor window, a light receiving reflector on which light reflected from the obstacle is incident, and a light shielding portion interposed between the light emitting device and the light receiving reflector to block the light emitted from the light emitting device from being incident upon the light receiving reflector.
    Type: Grant
    Filed: April 8, 2014
    Date of Patent: March 20, 2018
    Inventors: Byung Chan Kim, Sang Sik Yoon, Kyong Su Kim, Dong Won Kim, Jea Yun So, Yeon Kyu Jeong
  • Patent number: 9921559
    Abstract: Condition control of a robot cleaner is performed, or a service is provided to the user of the robot cleaner, to improve the convenience of the user. Various data items obtained through a network connection are used in the condition control or service provision to estimate/determine a behavior, condition, or request of the user. Specifically, the operation of the robot cleaner is controlled based on operations of other associated devices disposed in the same room where the robot cleaner runs.
    Type: Grant
    Filed: May 26, 2016
    Date of Patent: March 20, 2018
    Inventor: Tomomi Tsubota
  • Patent number: 9892559
    Abstract: A PC includes: an image data setting section for (i) setting, to background image data indicative of a background image, apparatus image data obtained by capturing an image of an apparatus and serving as referential image data to be referred to for identifying the apparatus, and (ii) setting, to the apparatus image data, dynamic part image data indicative of a dynamic part image positioned on the apparatus image; and an address setting section for (i) associating, with the dynamic part image data, (a) an address for specifying a storage area of a memory in which storage area data to be accessed by a portable terminal device is stored and (b) address substitutive information to be substituted for the address, and (ii) generating address display data to be used to display the address substitutive information instead of the address.
    Type: Grant
    Filed: April 2, 2015
    Date of Patent: February 13, 2018
    Assignee: Digital Electronics Corporation
    Inventors: Minoru Yoshida, Toru Terada
  • Patent number: 9868072
    Abstract: An interactive play device, method and apparatus, is disclosed that includes means for generating a plurality of interactions, input control mechanisms, means for storing responses to interactions, and control means to select the next interaction based on memorized responses. This invention provides a new class of interactive play devices, which is founded on personalizing a play device so that its current functionality is based on how the player has interacted with it in prior playing sessions. The invention also discloses a doll device and a car device, which operate in a plurality of states that mimic human behavior. Further, the specification describes a game during which the player is challenged to transform a play device from an initial state to a desired state by providing appropriate responses to interactions initiated by the device.
    Type: Grant
    Filed: December 7, 2005
    Date of Patent: January 16, 2018
    Assignee: Interactive Play Devices LLC
    Inventor: Nabil N. Ghaly
  • Patent number: 9841271
    Abstract: A three-dimensional measurement apparatus selects points corresponding to geometric features of a three-dimensional shape model of a target object, projects a plurality of selected points corresponding to the geometric features onto a range image based on approximate values indicating the position and orientation of the target object and imaging parameters at the time of imaging of the range image, searches regions of predetermined ranges respectively from the plurality of projected points for geometric features on the range image which correspond to the geometric features of the three-dimensional shape model, and associates these geometric features with each other. The apparatus then calculates the position and orientation of the target object using differences of distances on a three-dimensional space between the geometric features of the three-dimensional shape model and those on the range image, which are associated with each other.
    Type: Grant
    Filed: February 18, 2011
    Date of Patent: December 12, 2017
    Assignee: Canon Kabushiki Kaisha
    Inventors: Yusuke Nakazato, Kazuhiko Kobayashi
  • Patent number: 9817395
    Abstract: This disclosure describes, according to some embodiments, a robot network comprising a plurality of robot units capable of providing navigational guidance to patrons requesting navigational assistance. In an example method, upon receiving a request from a first patron, the method selects an available robot unit from among a plurality of robot units comprising the robot network based on an estimated time to reach the first patron's location. The estimated time to reach the first patron's location is determined based on status updates of other of the robot units comprising the robot network. The method further assigns the available robot unit to the first patron, navigates the assigned robot unit to the first patron's location, engages the first patron using the assigned robot unit, and guides the first patron toward the requested destination using the assigned robot unit.
    Type: Grant
    Filed: March 31, 2016
    Date of Patent: November 14, 2017
    Inventors: Emrah Akin Sisbot, Halit Bener Suay
  • Patent number: 9789612
    Abstract: A method of operating a robot includes electronically receiving images and augmenting the images by overlaying a representation of the robot on the images. The robot representation includes user-selectable portions. The method includes electronically displaying the augmented images and receiving an indication of a selection of at least one user-selectable portion of the robot representation. The method also includes electronically displaying an intent to command the selected at least one user-selectable portion of the robot representation, receiving an input representative of a user interaction with at least one user-selectable portion, and issuing a command to the robot based on the user interaction.
    Type: Grant
    Filed: March 13, 2017
    Date of Patent: October 17, 2017
    Inventors: Orin P. F. Hoffman, Peter Keefe, Eric Smith, John Wang, Andrew Labrecque, Brett Ponsler, Susan Macchia, Brian J. Madge
  • Patent number: 9764468
    Abstract: Apparatus and methods for training and operating of robotic devices. Robotic controller may comprise a predictor apparatus configured to generate motor control output. The predictor may be operable in accordance with a learning process based on a teaching signal comprising the control output. An adaptive controller block may provide control output that may be combined with the predicted control output. The predictor learning process may be configured to learn the combined control signal. Predictor training may comprise a plurality of trials. During initial trial, the control output may be capable of causing a robot to perform a task. During intermediate trials, individual contributions from the controller block and the predictor may be inadequate for the task. Upon learning, the control knowledge may be transferred to the predictor so as to enable task execution in absence of subsequent inputs from the controller. Control output and/or predictor output may comprise multi-channel signals.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: September 19, 2017
    Assignee: Brain Corporation
    Inventors: Eugene Izhikevich, Oleg Sinyavskiy, Jean-Baptiste Passot
  • Patent number: 9764484
    Abstract: A robot system includes a robot installed on a floor surface, an irradiating unit irradiating visible light onto the floor surface, and a forcible stopping unit forcibly stopping the robot from moving when an abnormality occurs. The irradiating unit is controlled by a control unit such that visible light is radiated onto motion and latent areas on the floor surface. The motion area is an area occupied on the floor surface by a space within which the robot is allowed to move during execution of a task. The latent area is an area occupied on the floor surface by a space within which the robot is likely to move until the robot is forcibly stopped from moving by the forcible stopping unit. The control unit sets the latent area depending on whether or not an execution acceleration of the robot is higher than or equal to a predetermined acceleration.
    Type: Grant
    Filed: July 1, 2016
    Date of Patent: September 19, 2017
    Inventor: Syu Katayama
  • Patent number: 9740191
    Abstract: Systems and methods for calibrating the location of an end effector-carrying apparatus relative to successive workpieces before the start of a production manufacturing operation. The location calibration is performed using a positioning system. These disclosed methodologies allow an operator to program (or teach) the robot motion path once and reuse that path for subsequent structures by using relative location feedback from a measurement system to adjust the position and orientation offset of the robot relative to the workpiece. When each subsequent workpiece comes into the robotic workcell, its location (i.e., position and orientation) relative to the robot may be different than the first workpiece that was used when developing the initial program. The disclosed systems and methods can also be used to compensate for structural differences between workpieces intended to have identical structures.
    Type: Grant
    Filed: February 12, 2015
    Date of Patent: August 22, 2017
    Assignee: The Boeing Company
    Inventors: James J. Troy, Barry A. Fetzer, Scott W. Lea, Gary E. Georgeson
  • Patent number: 9741108
    Abstract: An image processing apparatus, connected to an imaging part to capture an image of workpieces conveyed on a conveyer, includes an interface that receives a signal indicating a travel distance of the conveyer, an interface that communicates with a control device for controlling a moving machine disposed downstream of an imaging area of the imaging part, a positional information acquisition unit that processes the image captured by the imaging part and thereby acquires positional information of a pre-registered workpiece in the image, a travel distance obtaining unit that obtains the travel distance of the conveyer synchronized with the control device, an initiating unit that initiates the capturing by the imaging part in response to an imaging command, and a transmission unit that transmits, to the control device, the positional information and the travel distance upon the capturing of the image used to acquire the positional information.
    Type: Grant
    Filed: August 2, 2013
    Date of Patent: August 22, 2017
    Assignee: OMRON Corporation
    Inventors: Yasuyuki Ikeda, Yuichi Doi, Naoya Nakashita
  • Patent number: 9678654
    Abstract: A wearable computing device includes a head-mounted display (HMD) that provides a field of view in which at least a portion of the environment of the wearable computing device is viewable. The HMD is operable to display images superimposed over the field of view. When the wearable computing device determines that a target device is within its environment, the wearable computing device obtains target device information related to the target device. The target device information may include information that defines a virtual control interface for controlling the target device and an identification of a defined area of the target device on which the virtual control image is to be provided. The wearable computing device controls the HMD to display the virtual control image as an image superimposed over the defined area of the target device in the field of view.
    Type: Grant
    Filed: December 19, 2014
    Date of Patent: June 13, 2017
    Assignee: Google Inc.
    Inventors: Adrian Wong, Xiaoyu Miao
  • Patent number: 9659363
    Abstract: A positioning apparatus includes: a calculation unit that calculates an amount of deviation between a position of a feature point of a reference workpiece and a feature point of a workpiece by comparing a relative position of an imaging unit with respect to a table when the workpiece is imaged by the imaging unit with a reference relative position, and comparing a position of a feature point of the workpiece in the image of the workpiece imaged by the imaging unit with a reference point image position; and a program changing unit that generates a correction amount such that the amount of deviation calculated by the calculation unit becomes zero, and thereby changes a program of the machine tool.
    Type: Grant
    Filed: February 8, 2016
    Date of Patent: May 23, 2017
    Inventor: Kenichi Ogawa
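The calculation unit above compares a measured feature-point position against a reference position and generates a correction amount that drives the deviation to zero. A minimal sketch of that offset computation; the 2-D representation and sign convention are assumptions:

```python
def correction_amount(reference_point, measured_point):
    """Correction offset that makes the positional deviation between a
    workpiece feature point and its reference position zero when added
    to the measured position (applied as a machine-tool program change)."""
    return (reference_point[0] - measured_point[0],
            reference_point[1] - measured_point[1])
```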
  • Patent number: 9616573
    Abstract: An end effector can be moved to a target position and calibration loads can be reduced, even if there is an error in a kinematic operation in a robot main body and/or cameras. A positional deviation integral torque obtained by integrating a value corresponding to a positional deviation is applied to joints while being superimposed with torques based on angular deviations. If movements of the joints stop or are about to stop before the end effector reaches a target position due to an error in a kinematic operation, the positional deviation integral torque increases with time, to move said joints and move the end effector to the target position. Thus, the end effector can be reliably moved to the target position by the positional deviation integral torque, and calibration loads can be reduced.
    Type: Grant
    Filed: May 23, 2013
    Date of Patent: April 11, 2017
    Inventors: Sadao Kawamura, Hiroyuki Onishi
  • Patent number: 9604365
    Abstract: An article transferring device with a robot. An image processing section includes an article detecting section for executing image capturing and detection of articles that move according to a conveying motion of a conveyor, with a first period allowing all of the articles to be captured and detected, and obtain initial position information of each of all articles; and an article tracking section for executing image capturing and detection of the articles that move according to the conveying motion of the conveyor, with a second period shorter than the first period, and obtain shifted position information of each article iteratively with the second period, the shifted position information being based on the initial position information. A robot controlling section is configured to control the robot by using the shifted position information, so as to make the robot hold and transfer each article while following the conveying motion of the conveyor.
    Type: Grant
    Filed: December 1, 2015
    Date of Patent: March 28, 2017
    Inventors: Ichiro Kanno, Kentarou Koga
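The tracking step in this abstract derives shifted position information from the initial detection plus the conveying motion accumulated since. A minimal sketch, assuming a 1-D conveying direction along x and a conveyor travel counter (the names and geometry are illustrative):

```python
def shifted_position(initial_xy, travel_at_detection, travel_now):
    """Estimate an article's current position from its initially detected
    position plus the conveyor travel accumulated since detection."""
    dx = travel_now - travel_at_detection
    return (initial_xy[0] + dx, initial_xy[1])
```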
  • Patent number: 9604363
    Abstract: A pickup device for picking up a target object from a plurality of objects randomly piled up in a container, and for placing the target object in a predetermined posture to a target location is provided. The device includes an approximate position obtaining part for obtaining information on an approximate position of the target object, based on information on a height distribution of the objects in the container, which is obtained by a first visual sensor. The device also includes a placement operation controlling part for controlling a robot so as to bring the target object into a predetermined position and posture relative to the target location, based on information on a position and posture of the target object relative to a robot, which is obtained by a second visual sensor.
    Type: Grant
    Filed: October 30, 2013
    Date of Patent: March 28, 2017
    Assignee: Fanuc Corporation
    Inventor: Kazunori Ban
  • Patent number: 9586319
    Abstract: A robot-position detecting device includes: a position-data acquiring unit that acquires position data indicating actual positions of a robot; a position-data input unit that receives the position data output from the position-data acquiring unit; and a position calculating unit that calculates a computational position of the robot through linear interpolation using first and second position data input to the position-data input unit at different times.
    Type: Grant
    Filed: May 23, 2014
    Date of Patent: March 7, 2017
    Assignee: Seiko Epson Corporation
    Inventor: Atsushi Asada
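    The linear interpolation described in this abstract reduces to a standard two-sample formula. A minimal sketch, with illustrative names and 2-D positions (the patent does not specify this interface):

    ```python
    # Estimate a robot position at time t by linearly interpolating between
    # two timestamped position samples (t1, p1) and (t2, p2).
    def interpolate_position(t, t1, p1, t2, p2):
        alpha = (t - t1) / (t2 - t1)  # fraction of the way from t1 to t2
        return tuple(a + alpha * (b - a) for a, b in zip(p1, p2))

    # Two position samples taken 20 ms apart; query the midpoint in time:
    p = interpolate_position(0.010, 0.0, (100.0, 50.0), 0.020, (102.0, 54.0))
    # Midway between the samples -> (101.0, 52.0)
    ```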
  • Patent number: 9585725
    Abstract: Embodiments include a robotic mechanism that may be utilized to position prosthetic implants in a patient's body. During joint replacement surgery and other surgical procedures, prosthetic implants may be placed in a patient's body. The robotic mechanism may be utilized to control movement of a cutting tool during resection of bone in a patient's body. The robotic mechanism includes a programmable computer which is connected with the force transmitting member by a motor. A force measurement assembly is connected with the force transmitting member and the computer. The output from the force measurement assembly is indicative of a resistance encountered by the force transmitting member. A position sensor is connected with the force transmitting member and the computer. The position sensor has an output indicative of the position of the force transmitting member.
    Type: Grant
    Filed: June 21, 2013
    Date of Patent: March 7, 2017
    Assignee: P Tech, LLC
    Inventor: Peter M Bonutti
  • Patent number: 9575492
    Abstract: A device and method are provided, the method including providing a device capable of at least semi-autonomous operation and enabling the device to autonomously gather at least one item and secure that item against unauthorized access while providing selective authorized access to the item while the item is in possession of the device.
    Type: Grant
    Filed: February 27, 2015
    Date of Patent: February 21, 2017
    Inventors: Daniel Theobald, Thomas Allen, Josh Ornstein
  • Patent number: 9567080
    Abstract: A method and apparatus for determining actions for entities (4, 6) such that a goal is accomplished and constraints are satisfied. The method comprises: determining an initial plan comprising actions that, if performed by the entities (4, 6), would accomplish the goal; determining that a constraint would not be satisfied if the initial plan was implemented; and iteratively performing steps (i) to (v) until a final plan that accomplishes the goal and satisfies the constraints is determined. Step (i) comprises identifying a constraint that is not satisfied in part of the current plan. Step (ii) comprises determining a remedy that, if implemented, satisfies the identified constraint. Step (iii) comprises updating the goal specification to include the remedy. Step (iv) comprises, using the updated goal specification, determining a further plan that accomplishes the goal and the remedy. Step (v) comprises determining whether or not the further plan satisfies each constraint.
    Type: Grant
    Filed: May 2, 2014
    Date of Patent: February 14, 2017
    Assignee: BAE SYSTEMS plc
    Inventors: John Paterson Bookless, Markus Deittert, Richard Norman Herring, Elizabeth Jane Cullen
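    The iterative repair loop of steps (i) to (v) above can be sketched as follows. The planner and constraint model here are toy stand-ins chosen for illustration; the patent's actual planner, constraints, and remedies are not specified at this level.

    ```python
    # Toy plan-repair loop: grow the goal specification by one "remedy" per
    # unsatisfied constraint until every constraint holds.
    def make_plan(goals):
        """Toy planner: a plan is simply the list of goals it achieves."""
        return list(goals)

    def violated_constraints(plan, constraints):
        """Toy model: a constraint is satisfied once its remedy is in the plan."""
        return [c for c in constraints if c["remedy"] not in plan]

    def plan_with_repair(goal, constraints, max_iters=10):
        goals = [goal]
        plan = make_plan(goals)                    # initial plan
        for _ in range(max_iters):
            unsatisfied = violated_constraints(plan, constraints)
            if not unsatisfied:                    # step (v): all satisfied
                return plan
            remedy = unsatisfied[0]["remedy"]      # steps (i)-(ii)
            goals.append(remedy)                   # step (iii)
            plan = make_plan(goals)                # step (iv)
        raise RuntimeError("no constraint-satisfying plan found")

    plan = plan_with_repair(
        "reach_waypoint",
        [{"remedy": "avoid_zone_A"}, {"remedy": "maintain_comms"}],
    )
    # -> ["reach_waypoint", "avoid_zone_A", "maintain_comms"]
    ```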
  • Patent number: 9527212
    Abstract: A mobile self-propelled robot for autonomously carrying out actions. The robot includes a drive module for moving the robot over a floor area; a processing module for carrying out the activities during a processing stage; at least one sensor module for detecting information relating to the structure of the surroundings; a detector module configured to detect a displacement of the robot prior to or during the processing stage. Further, the robot includes a navigation module configured to navigate the robot over the floor area during the processing stage using a map of the surroundings, to store and manage one or more maps of the surroundings, and to carry out a self-positioning process if the detector module has detected a displacement of the robot. During the self-positioning process, the presence and the location of the robot within the stored maps are detected.
    Type: Grant
    Filed: February 8, 2013
    Date of Patent: December 27, 2016
    Assignee: Robart GmbH
    Inventors: Harold Artes, Dominik Seethaler, Michael Schahpar
  • Patent number: 9521994
    Abstract: In a method for image guided prostate cancer needle biopsy, a first registration is performed to match a first image of a prostate to a second image of the prostate. Third images of the prostate are acquired and compounded into a three-dimensional (3D) image. The prostate in the compounded 3D image is segmented to show its border. A second registration and then a third registration different from the second registration is performed on distance maps generated from the prostate borders of the first image and the compounded 3D image, wherein the first and second registrations are based on a biomechanical property of the prostate. A region of interest in the first image is mapped to the compounded 3D image or a fourth image of the prostate acquired with the second modality.
    Type: Grant
    Filed: May 6, 2010
    Date of Patent: December 20, 2016
    Assignee: Siemens Healthcare GmbH
    Inventors: Ali Kamen, Wolfgang Wein, Parmeshwar Khurd, Mamadou Diallo, Ralf Nanke, Jens Fehre, Berthold Kiefer, Martin Requardt, Clifford Weiss
  • Patent number: 9504369
    Abstract: A cleaning robot may have a body, a moving unit provided on the body to move the body in a cleaning space, a cleaning unit provided on the body to clean a floor of the cleaning space, a floor image obtaining unit configured to obtain a floor image of the cleaning space, and a control unit configured to determine if a foreign substance is present on the floor of the cleaning space based on the floor image, and control the moving unit to move the body to a position of the foreign substance, in which the cleaning robot, by obtaining an image of the floor to be cleaned, detects a foreign substance that is not positioned on a moving track of the cleaning robot, and when the foreign substance is detected, moves to the position of the foreign substance to perform cleaning.
    Type: Grant
    Filed: May 8, 2014
    Date of Patent: November 29, 2016
    Inventors: Jea Yun So, Sang Sik Yoon, Joon Hyung Kwon, Shin Kim
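    The floor-image check described above can be illustrated by comparing the current floor image against a clean reference and reporting the position of any region that differs. The 2x2 "images", reference values, and threshold below are toy assumptions, not the patent's actual vision pipeline.

    ```python
    # Toy foreign-substance detection by per-pixel comparison with a clean
    # reference floor image (values are normalized intensities).
    CLEAN_FLOOR = [
        [0.1, 0.1],
        [0.1, 0.1],
    ]
    THRESHOLD = 0.3  # intensity difference above this is treated as "foreign"

    def find_foreign_substance(floor_image):
        """Return (row, col) of the first pixel differing from the clean
        reference by more than THRESHOLD, or None if the floor looks clean."""
        for r, row in enumerate(floor_image):
            for c, value in enumerate(row):
                if abs(value - CLEAN_FLOOR[r][c]) > THRESHOLD:
                    return (r, c)
        return None

    target = find_foreign_substance([[0.1, 0.1], [0.9, 0.1]])
    # -> (1, 0): the control unit would drive the body to this position
    ```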
  • Patent number: 9469030
    Abstract: A telepresence robot may include a drive system, a control system, an imaging system, and a mapping module. The mapping module may access a plan view map of an area and tags associated with the area. In various embodiments, each tag may include tag coordinates and tag information, which may include a tag annotation. A tag identification system may identify tags within a predetermined range of the current position and the control system may execute an action based on an identified tag whose tag information comprises a telepresence robot action modifier. The telepresence robot may rotate an upper portion independent from a lower portion. A remote terminal may allow an operator to control the telepresence robot using any combination of control methods, including by selecting a destination in a live video feed, by selecting a destination on a plan view map, or by using a joystick or other peripheral device.
    Type: Grant
    Filed: October 27, 2015
    Date of Patent: October 18, 2016
    Inventors: Yulun Wang, Charles S. Jordan, Tim Wright, Michael Chan, Marco Pinter, Kevin Hanrahan, Daniel Sanchez, James Ballantyne, Cody Herzog, Blair Whitney, Fuji Lai, Kelton Temby, Eben Christopher Rauhut, Justin H. Kearns, Cheuk Wah Wong, Timothy Sturtevant Farlow
  • Patent number: 9468152
    Abstract: A method of pruning plants includes generating a first series of images of a plant, identifying a first object displayed in the images as a fruit of the plant, collecting data associated with a state of the fruit, identifying a second object displayed in the images as a suspect plant component of the plant, comparing a parameter of the suspect plant component to a reference parameter associated with plant components to be pruned, in response to determining that the parameter of the suspect plant component sufficiently matches the reference parameter, identifying the suspect plant component as a plant component to be pruned from the plant, advancing an automated pruner toward the plant component, operating the automated pruner to sever the plant component from the plant, and while the automated pruner is operated to sever the plant component, generating a second series of images of one or more additional plants.
    Type: Grant
    Filed: June 9, 2015
    Date of Patent: October 18, 2016
    Assignee: Harvest Moon Automation Inc.
    Inventors: Stephen Jens, Janice Huxley Jens, Edward Dickinson
  • Patent number: 9440351
    Abstract: A method and/or computer program product controls operations of a robotic device. A robotic device receives a first signal from a positioning hardware device that is worn by a user. The first signal describes a relative location between the user and the robotic device. A second signal describes an angle between the user and the robotic device and between the user and the user-selected object. Based on the first signal, the second signal, and a record of object positions of objects within a predefined area of the user, the identification and location of the user-selected object is determined. A determination is made regarding whether or not the robotic device is authorized to perform a specific task on the user-selected object based on the location of the user-selected object. If authorized, the robotic device performs the specific task on the user-selected object.
    Type: Grant
    Filed: October 30, 2014
    Date of Patent: September 13, 2016
    Assignee: International Business Machines Corporation
    Inventors: James E. Bostick, John M. Ganci, Jr., Sarbajit K. Rakshit, Craig M. Trim
  • Patent number: 9404074
    Abstract: An incubation apparatus, including a temperature-controlled room adjusted to be a predetermined environment condition, and incubating a sample of an incubation container inside the temperature-controlled room, includes a carrying apparatus, an imaging section, and an image analyzing section. The carrying apparatus carries the incubation container in the temperature-controlled room. The imaging section photographs the whole of the incubation container inside the temperature-controlled room. The image analyzing section analyzes an operation state of the incubation apparatus or an incubating environment state of the sample based on a total observing image of the incubation container photographed at the imaging section, and outputs an error signal notifying an abnormality of the operation state or the incubating environment state in accordance with the analysis result.
    Type: Grant
    Filed: November 10, 2006
    Date of Patent: August 2, 2016
    Inventor: Yasujiro Kiyota
  • Patent number: 9400498
    Abstract: An unmanned systems operator control system includes a hand held controller with a set of switches and control enumeration software specially configured to report a superset of virtual switches based on the physical switches. A core unit includes a first unmanned system control application subscribing to a first switch subset of the superset and outputting commands controlling a first unmanned system based on activation of the set of switches. A second unmanned system control application subscribes to a second switch subset of the superset and outputs commands controlling a second unmanned system based on activation of the set of switches. A mode switching subsystem is configured, in a first state, to map the set of switches to the first switch subset and, in a second state, to map the set of switches to the second switch subset.
    Type: Grant
    Filed: March 18, 2015
    Date of Patent: July 26, 2016
    Assignee: Foster-Miller, Inc.
    Inventors: Kurt Bruck, Boian Bentchev, Julie Shapiro, Todd Graham, Daniel Deguire
  • Patent number: 9358681
    Abstract: A work hanging apparatus includes a hanger line continuously conveying hangers each having a hook, a robot that has a hand with which a work having a hole is held and transfers the held work to a hanging location set in the hanger line, a controller controlling a movement of the hand to catch the hook of one of the hangers with the hole of the held work at the hanging location, a hole deviation detector that detects a positional deviation of the hole of the work, an attitude deviation detector that detects an attitudinal deviation of the hanger, and a corrector that corrects the movement of the hand according to the positional and attitudinal deviations.
    Type: Grant
    Filed: March 7, 2013
    Date of Patent: June 7, 2016
    Assignee: NHK Spring Co., Ltd.
    Inventors: Kotaro Nukui, Takashi Yajima
  • Patent number: 9333648
    Abstract: A robot control method and apparatus estimates an error based on information of an object obtained using a motor drive current sensor, a force sensor or a tactile sensor mounted at a robot hand and information of the object obtained using an optical sensor and compensates for the error. The robot control method and apparatus includes measuring information of an object using an optical sensor or calculating information of the object input by a user, moving a robot hand to the object based on the information of the object, controlling the thumb and fingers of the robot hand to contact the object based on the information of the object, determining whether the thumb and fingers have contacted the object through a motor drive current sensor, a force sensor or a tactile sensor, and grasping the object depending upon whether the thumb and the fingers have contacted the object.
    Type: Grant
    Filed: October 18, 2011
    Date of Patent: May 10, 2016
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Ji Young Kim, Yong Jae Kim, Kyung Shik Roh, Young Bo Shim
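    The contact check described above can be sketched as a threshold test: a digit is treated as touching the object once its motor drive current (or force reading) exceeds a limit, and grasping proceeds only when the thumb and every finger report contact. The threshold and current values below are illustrative assumptions, not values from the patent.

    ```python
    # Toy contact detection from motor drive current readings.
    CONTACT_CURRENT = 0.8  # amps; current above this implies contact

    def finger_in_contact(drive_current):
        """A digit pressing on the object draws more current than one in
        free motion, so a high reading is taken as contact."""
        return drive_current > CONTACT_CURRENT

    def ready_to_grasp(thumb_current, finger_currents):
        """True only when the thumb and every finger register contact."""
        return finger_in_contact(thumb_current) and all(
            finger_in_contact(c) for c in finger_currents
        )

    # Thumb and all three fingers pressing on the object -> ready to grasp:
    assert ready_to_grasp(1.1, [0.9, 1.0, 0.95])
    # One finger still in free motion (low current) -> not ready:
    assert not ready_to_grasp(1.1, [0.9, 0.2, 0.95])
    ```

    A real system would combine this with the optical-sensor pose estimate to decide where to close the hand, per the abstract.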
  • Patent number: 9333435
    Abstract: A customized 3D printing robot that is configured such that components of the robot are produced using a 3D printing technology. In particular, the customized 3D printing robot enables a user to directly select an appearance design for each component and assemble the components so that an entire appearance design of the robot can be customized.
    Type: Grant
    Filed: December 30, 2014
    Date of Patent: May 10, 2016
    Inventor: Na-young Kim
  • Patent number: 9332219
    Abstract: The present invention relates to a telepresence device that is capable of enabling various types of verbal or non-verbal interaction between a remote user and a local user. In accordance with an embodiment, the telepresence device may include a camera combined with direction control means; a projector provided on a top of the camera, and configured such that a direction of the projector is controlled by the direction control means along with a direction of the camera; and a control unit configured to control the direction of the camera by operating the direction control means, to extract an object, at which a remote user gazes, from an image acquired by the camera whose direction has been controlled, to generate a projection image related to the object, and to project the projection image around the object by controlling the projector.
    Type: Grant
    Filed: November 18, 2013
    Date of Patent: May 3, 2016
    Inventors: Jounghuem Kwon, Bumjae You, Shinyoung Kim, Kwangkyu Lee
  • Patent number: 9302391
    Abstract: An object gripping apparatus includes an image capturing unit for capturing a region including a plurality of works, an obtaining unit for obtaining distance information of the region, a measurement unit for measuring three-dimensional positions/orientations of a plurality of gripping-candidate works out of the plurality of works based on the image and distance information, thereby generating three-dimensional position/orientation information, a selection unit for selecting a gripping-target work based on the three-dimensional position/orientation information, a gripping unit for gripping the gripping-target work, and an updating unit for updating the three-dimensional position/orientation information by measuring three-dimensional positions/orientations of the gripping-candidate works at a time interval during gripping of the gripping-target work.
    Type: Grant
    Filed: December 6, 2011
    Date of Patent: April 5, 2016
    Inventors: Yuichiro Iio, Yusuke Mitarai
  • Patent number: 9299118
    Abstract: Method and apparatus for measuring dimensionality of a component are disclosed herein. At least a first image may be collected from a first camera of an aperture or a first precision countersink of the component. The first camera may be in a first position and may operate in one or more luminance environments. At least a second image may be collected from a second camera of the aperture or a second precision countersink of the component. The second camera may be in a second position and may operate in an alternative lighting environment to differentiate characteristic features including hole and countersink against their intersection with the component. Positional and process refinement parameters may be calculated for, and provided to, one or more numerically controlled mechanisms, based on the collected first and second images.
    Type: Grant
    Filed: April 18, 2012
    Date of Patent: March 29, 2016
    Assignee: The Boeing Company
    Inventor: Michael D. McGraw