Patents Examined by Harry Y Oh
  • Patent number: 11709491
    Abstract: In some implementations, a UAV flight system can dynamically adjust UAV flight operations based on radio frequency (RF) signal data. For example, the flight system can determine an initial flight plan for inspecting an RF transmitter and configure a UAV to perform an aerial inspection of the RF transmitter. Once airborne, the UAV can collect RF signal data and the flight system can automatically adjust the flight plan to avoid RF signal interference and/or damage to the UAV based on the collected RF signal data. In some implementations, the UAV can collect RF signal data and generate a three-dimensional received signal strength map that describes the received signal strength at various locations within a volumetric area around the RF transmitter. In some implementations, the UAV can collect RF signal data and determine whether an RF signal transmitter is properly aligned.
    Type: Grant
    Filed: September 17, 2021
    Date of Patent: July 25, 2023
    Assignee: Skydio, Inc.
    Inventors: Bernard J. Michini, Fabien Blanc-Paques, Logan Kaminski
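    As a rough illustration of the three-dimensional received signal strength map described in the abstract above, the sketch below bins RF samples collected along a flight into a coarse grid; the cell size, threshold, and all names are assumptions, not the patented implementation.

      from collections import defaultdict

      def build_rss_map(samples, cell_size=5.0):
          """samples: iterable of (x, y, z, rssi_dbm) tuples in transmitter-local meters.
          Returns {(i, j, k): mean RSSI in dBm} for each occupied grid cell."""
          sums = defaultdict(float)
          counts = defaultdict(int)
          for x, y, z, rssi in samples:
              cell = (int(x // cell_size), int(y // cell_size), int(z // cell_size))
              sums[cell] += rssi
              counts[cell] += 1
          return {cell: sums[cell] / counts[cell] for cell in sums}

      def should_reroute(rss_map, cell, limit_dbm=-30.0):
          """Flag a cell whose average received power suggests adjusting the flight plan."""
          return rss_map.get(cell, float("-inf")) > limit_dbm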
  • Patent number: 11701782
    Abstract: Disclosed is a moving robot including: a voice input unit configured to receive a voice input of a user; a first display capable of receiving a touch input; a second display larger than the first display; and a controller configured to perform control such that a screen to be displayed in response to the voice input or the touch input is displayed on at least one of the first display or the second display, based on a type and an amount of information included in the screen. Accordingly, it is possible to provide information and services more effectively using the two displays.
    Type: Grant
    Filed: June 28, 2018
    Date of Patent: July 18, 2023
    Assignee: LG ELECTRONICS INC.
    Inventors: Eunji Roh, Sunju Lee, Jina Lee
  • Patent number: 11701773
    Abstract: Training and/or using a recurrent neural network model for visual servoing of an end effector of a robot. In visual servoing, the model can be utilized to generate, at each of a plurality of time steps, an action prediction that represents a prediction of how the end effector should be moved to cause the end effector to move toward a target object. The model can be viewpoint invariant in that it can be utilized across a variety of robots having vision components at a variety of viewpoints and/or can be utilized for a single robot even when a viewpoint, of a vision component of the robot, is drastically altered. Moreover, the model can be trained based on a large quantity of simulated data that is based on simulator(s) performing simulated episode(s) in view of the model. One or more portions of the model can be further trained based on a relatively smaller quantity of real training data.
    Type: Grant
    Filed: December 4, 2018
    Date of Patent: July 18, 2023
    Assignee: GOOGLE LLC
    Inventors: Alexander Toshev, Fereshteh Sadeghi, Sergey Levine
  • Patent number: 11697211
    Abstract: A mobile robot operation method according to an aspect of the present invention includes: a step for receiving a guidance destination input; a step for generating a global path to the received guidance destination; a step for generating a left travel guideline and a right travel guideline on the left side and the right side of the generated global path; and a step for generating a local path within a travelable range between the left travel guideline and the right travel guideline. Accordingly, the robot operation method may generate a safe and optimal guidance path when providing a guidance service.
    Type: Grant
    Filed: June 12, 2018
    Date of Patent: July 11, 2023
    Assignee: LG ELECTRONICS INC.
    Inventors: Joongtae Park, Hyoungrock Kim
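    A minimal sketch of generating left and right travel guidelines by offsetting a global path, in the spirit of the abstract above; the corridor half-width and the simple 2D normal construction are illustrative assumptions.

      import math

      def travel_guidelines(global_path, half_width=0.8):
          """global_path: list of (x, y) waypoints. Returns (left_line, right_line),
          each a list of points offset perpendicular to the path by half_width meters."""
          left, right = [], []
          for (x0, y0), (x1, y1) in zip(global_path, global_path[1:]):
              heading = math.atan2(y1 - y0, x1 - x0)
              nx, ny = -math.sin(heading), math.cos(heading)   # unit normal to the left
              left.append((x0 + half_width * nx, y0 + half_width * ny))
              right.append((x0 - half_width * nx, y0 - half_width * ny))
          return left, right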
  • Patent number: 11698263
    Abstract: A navigational system for a host vehicle may comprise at least one processing device. The processing device may be programmed to receive a first output and a second output associated with the host vehicle and identify a representation of a target object in the first output. The processing device may determine whether a characteristic of the target object triggers a navigational constraint by verifying the identification of the target object based on the first output and, if the at least one navigational constraint is not verified based on the first output, then verifying the identification of the target object based on a combination of the first output and the second output. In response to the verification, the processing device may cause at least one navigational change to the host vehicle.
    Type: Grant
    Filed: June 26, 2020
    Date of Patent: July 11, 2023
    Assignee: MOBILEYE VISION TECHNOLOGIES LTD.
    Inventors: Amnon Shashua, Shai Shalev-Shwartz, Shaked Shammah
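    A hypothetical sketch of the two-stage verification flow described in the abstract above: a constraint is triggered if the detection is verified from the first output alone or, failing that, from the first and second outputs combined. The confidence thresholds and the noisy-or fusion are assumptions, not the patented method.

      def constraint_triggered(conf_first, conf_second, strong=0.9, combined=0.8):
          """conf_first/conf_second: detection confidences in [0, 1] for the same target
          (conf_second is None when the second output has no matching detection)."""
          if conf_first >= strong:                      # verified on the first output alone
              return True
          if conf_second is not None:                   # fall back to combined evidence
              fused = 1.0 - (1.0 - conf_first) * (1.0 - conf_second)
              return fused >= combined
          return False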
  • Patent number: 11690690
    Abstract: A robotic surgical system for treating a patient is disclosed including a surgical tool movable relative to the patient and a user input device configured to remotely control the surgical tool. The surgical tool includes a shaft and an end effector. The user input device includes a base and a controller movable to effect a first control motion and a second control motion. The controller includes a first accessibility mode and a second accessibility mode. The robotic surgical system further includes a control circuit configured to receive a motion control signal from the user input device, determine a controller accessibility mode, permit the first control motion in response to the motion control signal in the first accessibility mode and in the second accessibility mode, and permit the second control motion in response to the motion control signal in the second accessibility mode but not the first accessibility mode.
    Type: Grant
    Filed: March 15, 2019
    Date of Patent: July 4, 2023
    Assignee: Cilag GmbH International
    Inventors: Clinton W. Denlinger, Gregory W. Johnson, Charles J. Scheib, Jeffrey S. Swayze, Joshua D. Young, Benjamin D. Dickerson
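    A minimal sketch of gating the two control motions on a controller accessibility mode, as the abstract above describes: the first motion is permitted in either mode, the second only in the second mode. Mode and motion names are illustrative.

      from enum import Enum

      class AccessibilityMode(Enum):
          FIRST = 1
          SECOND = 2

      def permitted_motions(mode):
          """Return the set of control motions the control circuit passes through."""
          motions = {"first_control_motion"}            # allowed in both modes
          if mode is AccessibilityMode.SECOND:
              motions.add("second_control_motion")      # allowed only in the second mode
          return motions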
  • Patent number: 11693415
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating cut-in probabilities of agents surrounding a vehicle. One of the methods includes obtaining agent trajectory data for one or more agents in an environment; obtaining vehicle trajectory data of a vehicle in the environment; and processing a network input generated from the agent trajectory data and vehicle trajectory data using a neural network to generate a cut-in output, wherein the cut-in output comprises respective cut-in probabilities for each of a plurality of locations in the environment, wherein the respective cut-in probability for each location that is a current location of one of the one or more agents characterizes a likelihood that the agent in the current location will intersect with a planned future location of the vehicle within a predetermined amount of time.
    Type: Grant
    Filed: November 6, 2019
    Date of Patent: July 4, 2023
    Assignee: Waymo LLC
    Inventors: Khaled Refaat, Chi Pang Lam
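    An interface-level sketch, under assumed shapes and names, of the cut-in prediction described above: agent and vehicle trajectories are rasterized into a network input, and the per-location probabilities returned by the (not shown) neural network are read out at each agent's current cell.

      import numpy as np

      GRID = (64, 64)  # discretized top-down view of the environment

      def make_network_input(agent_trajs, vehicle_traj):
          """Rasterize past agent trajectories and the vehicle's planned trajectory into
          a 2-channel grid, a stand-in for the real feature encoding."""
          agents = np.zeros(GRID, dtype=np.float32)
          ego = np.zeros(GRID, dtype=np.float32)
          for traj in agent_trajs:
              for i, j in traj:
                  agents[i, j] = 1.0
          for i, j in vehicle_traj:
              ego[i, j] = 1.0
          return np.stack([agents, ego])

      def read_cut_in_probabilities(cut_in_output, agent_current_cells):
          """cut_in_output: (H, W) array of per-location cut-in probabilities.
          Returns the probability for each agent at its current grid cell."""
          return {cell: float(cut_in_output[cell]) for cell in agent_current_cells}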
  • Patent number: 11690677
    Abstract: A robotic surgical system includes an eye gaze sensing system in conjunction with a visual display of a camera image from a surgical work site. Detected gaze of a surgeon towards the display is used as input to the system. This input may be used by the system to assign an instrument to a control input device (when the user is prompted to look at the instrument), or it may be used as input to a computer vision algorithm to aid in object differentiation and seeding information, facilitating identification/differentiation of instruments, anatomical features or regions.
    Type: Grant
    Filed: December 31, 2018
    Date of Patent: July 4, 2023
    Assignee: Asensus Surgical US, Inc.
    Inventor: Kevin Andrew Hufford
  • Patent number: 11691277
    Abstract: Grasping of an object, by an end effector of a robot, based on a grasp strategy that is selected using one or more machine learning models. The grasp strategy utilized for a given grasp is one of a plurality of candidate grasp strategies. Each candidate grasp strategy defines a different group of one or more values that influence performance of a grasp attempt in a manner that is unique relative to the other grasp strategies. For example, value(s) of a grasp strategy can define a grasp direction for grasping the object (e.g., “top”, “side”), a grasp type for grasping the object (e.g., “pinch”, “power”), grasp force applied in grasping the object, pre-grasp manipulations to be performed on the object, and/or post-grasp manipulations to be performed on the object.
    Type: Grant
    Filed: July 19, 2021
    Date of Patent: July 4, 2023
    Assignee: X DEVELOPMENT LLC
    Inventors: Umashankar Nagarajan, Bianca Homberg
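    A small sketch of representing a grasp strategy as a group of values (direction, grasp type, force, pre-/post-grasp manipulations) and selecting among candidates with a learned scorer, per the abstract above; the fields and the scoring callable are assumptions.

      from dataclasses import dataclass, field
      from typing import Callable, List

      @dataclass
      class GraspStrategy:
          direction: str                      # e.g. "top" or "side"
          grasp_type: str                     # e.g. "pinch" or "power"
          force_n: float                      # commanded grasp force in newtons
          pre_grasp: List[str] = field(default_factory=list)
          post_grasp: List[str] = field(default_factory=list)

      def select_strategy(candidates: List[GraspStrategy],
                          score: Callable[[GraspStrategy], float]) -> GraspStrategy:
          """Pick the candidate strategy the (machine-learned) scorer rates highest."""
          return max(candidates, key=score)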
  • Patent number: 11690685
    Abstract: A handheld user interface device for controlling a robotic system may include a member, a housing at least partially disposed around the member and configured to be held in the hand of a user, and a tracking sensor system disposed on the member and configured to detect at least one of position and orientation of at least a portion of the device. At least one of the detected position of the portion of the device and detected orientation of the portion of the device is correlatable to a control of the robotic system.
    Type: Grant
    Filed: July 26, 2021
    Date of Patent: July 4, 2023
    Assignee: Verb Surgical Inc.
    Inventors: Joan Savall, Allegra Anna Lenta Shum, Jose Luis Cordoba, Yiqi Zeng
  • Patent number: 11687084
    Abstract: A computer-implemented method for determining a control trajectory for a robotic device. The method includes: performing information-theoretic model predictive control, applying a control trajectory sample prior in each time step to obtain a control trajectory for a given time horizon; and determining the control trajectory sample prior depending on a data-driven trajectory prediction model which is trained to output a control trajectory sample as the control trajectory sample prior based on an actual state of the robotic device.
    Type: Grant
    Filed: July 29, 2021
    Date of Patent: June 27, 2023
    Assignee: ROBERT BOSCH GMBH
    Inventors: Andrey Rudenko, Luigi Palmieri, Kai Oliver Arras
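    A rough sketch of an MPPI-style (information-theoretic MPC) update in which the sampling mean at each planning step comes from a learned trajectory prior, loosely following the abstract above; the dynamics, cost, and prior are placeholder callables and all parameters are assumptions.

      import numpy as np

      def mppi_step(state, prior_fn, dynamics_fn, cost_fn,
                    horizon=20, samples=256, noise_std=0.1, temperature=1.0):
          """prior_fn(state, horizon) -> (horizon, u_dim) control trajectory sample prior.
          Returns the weighted-average control trajectory for the given horizon."""
          mean = prior_fn(state, horizon)
          noise = np.random.randn(samples, *mean.shape) * noise_std
          controls = mean[None] + noise                         # perturbed rollouts
          costs = np.empty(samples)
          for k in range(samples):
              x, c = state, 0.0
              for u in controls[k]:
                  x = dynamics_fn(x, u)
                  c += cost_fn(x, u)
              costs[k] = c
          weights = np.exp(-(costs - costs.min()) / temperature)
          weights /= weights.sum()
          return np.tensordot(weights, controls, axes=1)        # cost-weighted average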
  • Patent number: 11672193
    Abstract: A method is provided for operating a self-propelled agricultural working machine that has at least one working element and a driver assistance system for generating control actions within the working machine. A sensor arrangement for generating surroundings information is provided, and the driver assistance system generates the control actions based on the surroundings information. The sensor arrangement comprises a camera-based sensor system and a laser-based sensor system, each of which generates sensor information regarding a predetermined, relevant surroundings area of the working machine. The sensor information of the camera-based sensor system is present as starting camera images. The starting camera images are segmented into image segments by an image processing system according to a segmentation rule, and the segmented camera images are combined by a sensor fusion module with the sensor information from the laser-based sensor system.
    Type: Grant
    Filed: September 28, 2018
    Date of Patent: June 13, 2023
    Assignee: CLAAS E-SYSTEMS GMBH
    Inventors: Dennis Neitemeier, Boris Kettelhoit, Andreas Skiba, Thilo Krause
  • Patent number: 11667034
    Abstract: A computerized system and method are provided. A robotic manipulator (12) is arranged to grasp objects (20). A gripper (16) is attached to robotic manipulator (12), which includes an imaging sensor (14). During motion of robotic manipulator (12), imaging sensor (14) is arranged to capture images providing different views of objects in the environment of the robotic manipulator. A processor (18) is configured to find, based on the different views, candidate grasp locations and trajectories to perform a grasp of a respective object in the environment of the robotic manipulator. Processor (18) is configured to calculate respective values indicative of grasp quality for the candidate grasp locations, and, based on the calculated respective values indicative of grasp quality for the candidate grasp locations, processor (18) is configured to select a grasp location likely to result in a successful grasp of the respective object.
    Type: Grant
    Filed: February 12, 2020
    Date of Patent: June 6, 2023
    Assignee: Siemens Aktiengesellschaft
    Inventors: Heiko Claussen, Martin Sehr, Eugen Solowjow, Chengtao Wen, Juan L. Aparicio Ojea
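    A minimal sketch of scoring candidate grasp locations gathered from different views and selecting the one most likely to succeed, as the abstract above describes; the quality model is a placeholder callable and the acceptance threshold is an assumption.

      def select_grasp(candidates, quality_fn, min_quality=0.5):
          """candidates: list of dicts like {'location': (x, y, z), 'trajectory': ...}.
          quality_fn maps a candidate to a grasp-quality score in [0, 1]."""
          scored = [(quality_fn(c), c) for c in candidates]
          best_score, best = max(scored, key=lambda sc: sc[0])
          return best if best_score >= min_quality else None    # no confident grasp found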
  • Patent number: 11667301
    Abstract: A system performs modeling and simulation of non-stationary traffic entities for testing and development of modules used in an autonomous vehicle system. The system uses a machine learning based model that predicts hidden context attributes for traffic entities that may be encountered by a vehicle in traffic. The system generates simulation data for testing and development of modules that help navigate autonomous vehicles. The generated simulation data may be image or video data including representations of traffic entities, for example, pedestrians, bicyclists, and other vehicles. The system may generate simulation data using generative adversarial neural networks.
    Type: Grant
    Filed: December 10, 2019
    Date of Patent: June 6, 2023
    Assignee: Perceptive Automata, Inc.
    Inventors: Kshitij Misra, Samuel English Anthony
  • Patent number: 11664265
    Abstract: In an embodiment, a robotic arm includes: a base; at least one link secured to the base; a gripper secured to the at least one link, wherein the gripper comprises a finger, the gripper is configured to secure a wafer while the at least one link is in motion, and the gripper is configured to release the wafer while the at least one link is stopped; and a sensor disposed on the finger, the sensor configured to collect sensor data characterizing the robotic arm's interaction with a semiconductor processing chamber while the wafer is secured using the finger.
    Type: Grant
    Filed: April 9, 2021
    Date of Patent: May 30, 2023
    Assignee: Taiwan Semiconductor Manufacturing Co., Ltd.
    Inventors: Yan-Hong Liu, Ming-Feng Chen, Che-fu Chen, Hung-Wen Chen
  • Patent number: 11650591
    Abstract: This specification describes trajectory planning for robotic devices. A robotic navigation system can obtain, for each of multiple time steps, data representing an environment of a robot at the time step. The system generates a series of occupancy maps for the multiple time steps, and uses the series of occupancy maps to determine occupancy predictions for one or more future time steps. Each occupancy prediction can identify predicted locations of obstacles in the environment of the robot at a different one of the future time steps. A planned trajectory can be determined for the robot using the occupancy predictions, and the robot initiates travel along the planned trajectory.
    Type: Grant
    Filed: May 20, 2021
    Date of Patent: May 16, 2023
    Assignee: X Development LLC
    Inventor: David Millard
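    A hypothetical sketch of the planning loop described above: a short history of occupancy maps is fed to a (placeholder) predictor, and candidate trajectories that cross predicted obstacles are rejected. The map representation, history handling, and selection criterion are assumptions.

      def plan(candidate_trajectories, occupancy_history, predict_fn, future_steps=10):
          """occupancy_history: recent occupancy maps (oldest first), each a dict {cell: bool}.
          predict_fn(history, future_steps) -> list of predicted maps, one per future step.
          candidate_trajectories: lists of (t, cell) pairs. Returns a collision-free one."""
          predictions = predict_fn(list(occupancy_history), future_steps)

          def collision_free(traj):
              return all(not predictions[t].get(cell, False)
                         for t, cell in traj if t < future_steps)

          feasible = [traj for traj in candidate_trajectories if collision_free(traj)]
          return min(feasible, key=len) if feasible else None   # e.g. shortest feasible path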
  • Patent number: 11641803
    Abstract: The invention relates to a method and system for picking up and collecting plant matter, in particular plant embryos. To pick up the plant matter, a pick-up unit is used that is mounted to a robotic arm. According to the invention, two separate imaging steps are performed at two different positions of the pick-up unit. The first imaging step is performed to identify an isolated piece of plant matter. The second imaging step is performed when the pick-up unit is at a confirming position and enables a verification of whether a piece of plant matter has been picked up or not. The confirming position is in between the position of the pick-up unit for picking up plant matter and the position for depositing plant matter in suitable receptacles.
    Type: Grant
    Filed: April 8, 2019
    Date of Patent: May 9, 2023
    Assignee: RIJK ZWAAN ZAADTEELT EN ZAADHANDEL B.V.
    Inventors: Wilhelmus Petrus Adrianus Roeland Voermans, Mark Van Den Berg, Kevin Cornelis Adrianus Gerardus Verbocht
  • Patent number: 11642798
    Abstract: Disclosed are a method and a system for charging a robot. A method for charging a robot according to an embodiment of the present disclosure includes monitoring a battery level of a first robot which is providing a service, determining a charging robot for charging the first robot, from a plurality of second robots, when a battery level of the first robot falls below a first threshold level, and transmitting an instruction to move to a target position to the determined charging robot, in which determining the charging robot comprises determining the charging robot based at least partly on distances between the first robot and the second robots and battery levels of the second robots. Embodiments of the present disclosure may be implemented by executing an artificial intelligence algorithm and/or a machine learning algorithm in a 5G environment connected for the Internet of Things.
    Type: Grant
    Filed: January 15, 2020
    Date of Patent: May 9, 2023
    Assignee: LG ELECTRONICS INC.
    Inventors: Jae Ho Kwak, Won Hong Jeong
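    A minimal sketch of the selection rule described in the abstract above: once the serving robot's battery falls below a threshold, a charging robot is chosen from the idle fleet by trading off its own battery level against its distance to the serving robot. The thresholds and weighting are assumptions.

      def pick_charging_robot(first_robot, second_robots,
                              low_battery=0.2, min_charge=0.5, distance_weight=0.1):
          """Robots are dicts with 'position' (x, y) and 'battery' in [0, 1]."""
          if first_robot["battery"] >= low_battery:
              return None                                   # no charging needed yet
          def distance(a, b):
              return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
          candidates = [r for r in second_robots if r["battery"] >= min_charge]
          if not candidates:
              return None
          # Prefer nearby charging robots that still have plenty of charge to give.
          return max(candidates,
                     key=lambda r: r["battery"]
                                   - distance_weight * distance(r["position"],
                                                                first_robot["position"]))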
  • Patent number: 11634135
    Abstract: A device of the present disclosure controls a vehicle speed of a vehicle such that the vehicle speed matches a set vehicle speed when a preceding vehicle is not detected, and controls the vehicle speed of the vehicle such that the vehicle follows the preceding vehicle when the preceding vehicle is detected. When a two-wheeled vehicle is detected as the preceding vehicle and the vehicle speed is controlled such that the vehicle follows the two-wheeled vehicle, the device controls the vehicle speed as if no preceding vehicle were detected, provided that a condition allowing the vehicle to overtake the two-wheeled vehicle is satisfied and an intention of the driver to overtake the two-wheeled vehicle is detected.
    Type: Grant
    Filed: April 15, 2021
    Date of Patent: April 25, 2023
    Assignee: TOYOTA JIDOSHA KABUSHIKI KAISHA
    Inventor: Harue Matsumoto
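    A small sketch of the speed-control decision described above: the vehicle normally follows the detected preceding vehicle, but a two-wheeled preceding vehicle is treated as absent (the set speed is resumed) when overtaking is both permitted and intended. The predicate and field names are illustrative assumptions.

      def target_speed(set_speed, preceding, overtake_allowed, driver_intends_overtake):
          """preceding: None, or a dict like {'type': 'two_wheeler', 'speed': 8.0} (m/s)."""
          if preceding is None:
              return set_speed                              # plain cruise control
          if (preceding["type"] == "two_wheeler"
                  and overtake_allowed and driver_intends_overtake):
              return set_speed                              # treat the two-wheeler as not detected
          return min(set_speed, preceding["speed"])         # follow the preceding vehicle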
  • Patent number: 11633855
    Abstract: Provided are a robot controller and a robot control method that do not need a logic command to be associated with a teaching position for a robot, and that are thus capable of executing the logic command at a desired position and a desired timing. A robot controller includes: an operation command interpretation unit that interprets an operation command program describing a teaching operation and a teaching position for a robot, and that generates an operation command; a logic command interpretation unit that interprets a logic command program describing a logic command instructing a machining process to be performed by the robot and an execution position for the logic command, independently from the teaching operation and the teaching position, and that generates the logic command that includes the execution position; and a command execution unit that executes the operation command and the logic command.
    Type: Grant
    Filed: September 15, 2020
    Date of Patent: April 25, 2023
    Assignee: FANUC CORPORATION
    Inventor: Takahiro Tanaka