Patents Examined by Sohana Tanju Khayer
  • Patent number: 11839982
    Abstract: Various systems and methods for controlling a target device are disclosed. For example, a system includes a user computing device including a user interface, a user motion database communicatively coupled to the user computing device, a controller, a controller motion database communicatively coupled to the controller, and a target device communicatively coupled to the controller. The user computing device can be configured to connect the user motion database and the controller motion database to share corresponding sets of motion instructions in real-time in response to receiving a sync indication from the user computing device. The target device can be configured to implement the corresponding sets of motion instructions in real-time on the target device.
    Type: Grant
    Filed: November 10, 2022
    Date of Patent: December 12, 2023
    Assignee: Animax Designs, Inc.
    Inventor: Harold Fred Bufford, Jr.
  • Patent number: 11837100
    Abstract: The invention relates to operation of unmanned aircraft systems (UAS) in the National Airspace System (NAS) using Remote Identification (Remote ID) and tracking. Specifically, the invention relates to pre-flight initialization and programming of UAS with the personality data needed to implement Remote ID functionality. The personality data includes UAS identity and configuration data. The pre-flight programming process promotes mission flexibility and efficient pre-flight preparation by reusing configuration and identity data elements that are common from mission to mission and reprogramming only the delta data. Operating modes may be used to easily reconfigure programming data to common mission-specific requirements.
    Type: Grant
    Filed: June 27, 2020
    Date of Patent: December 5, 2023
    Assignee: RUMFERT, LLC
    Inventor: David Dale Jensen
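    A minimal Python sketch of the delta-data idea described in the abstract above: only the personality-data fields that change from one mission to the next are reprogrammed. The dictionary representation and field names are illustrative assumptions, not the patented data format.
      def compute_delta(stored: dict, next_mission: dict) -> dict:
          # keep only the personality-data fields whose values differ from what is
          # already programmed; everything common is reused as-is
          return {k: v for k, v in next_mission.items() if stored.get(k) != v}

      # hypothetical personality data: identity fields stay, mission fields change
      stored  = {"uas_id": "UAS-1234", "operator_id": "OP-77", "mission_mode": "survey"}
      mission = {"uas_id": "UAS-1234", "operator_id": "OP-77", "mission_mode": "delivery"}
      print(compute_delta(stored, mission))   # {'mission_mode': 'delivery'}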
  • Patent number: 11835958
    Abstract: Methods and systems for training a motion planner for an autonomous vehicle are described. A trajectory evaluator agent of the motion planner receives state data defining a current state of the autonomous vehicle and an environment at a current time step. Based on the current state, a trajectory is selected. A reward is calculated based on performance of the selected trajectory in the current state. State data is received for a next state of the autonomous vehicle and the environment at a next time step. Parameters of the trajectory evaluator agent are updated, based on the current state, the selected trajectory, the calculated reward, and the next state, to assign an evaluation value for the selected trajectory that reflects the calculated reward and the expected performance of the selected trajectory in future states.
    Type: Grant
    Filed: July 28, 2020
    Date of Patent: December 5, 2023
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Kasra Rezaee, Peyman Yadmellat
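    A minimal Python sketch of the parameter update described in the abstract above, assuming a tabular evaluator over a finite set of candidate trajectories; the table, learning rate, and discount factor are illustrative assumptions rather than Huawei's implementation.
      import numpy as np

      class TrajectoryEvaluator:
          def __init__(self, n_states, n_trajectories, alpha=0.1, gamma=0.99):
              self.q = np.zeros((n_states, n_trajectories))  # evaluation values
              self.alpha, self.gamma = alpha, gamma

          def select(self, state):
              # pick the candidate trajectory with the highest current evaluation value
              return int(np.argmax(self.q[state]))

          def update(self, state, trajectory, reward, next_state):
              # move the evaluation value toward the calculated reward plus the
              # discounted best value achievable from the next state
              target = reward + self.gamma * np.max(self.q[next_state])
              self.q[state, trajectory] += self.alpha * (target - self.q[state, trajectory])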
  • Patent number: 11821733
    Abstract: A terrain referenced navigation system for a vehicle is disclosed and includes an inertial measurement unit and one or more generic terrain sensors configured to collect terrain-dependent data. The terrain referenced navigation system includes one or more processors in electronic communication with the generic terrain sensors and the inertial measurement unit, and a memory coupled to the processors. The memory stores data into one or more databases and program code that, when executed by the processors, causes the terrain referenced navigation system to determine a predicted terrain value based on a terrain value, where the terrain value is retrieved from one or more sensor-specific terrain databases. The resulting pre-Kalman filter processing values are sent to a Kalman filter, which determines navigation corrections and sensor corrections based on those values.
    Type: Grant
    Filed: January 21, 2020
    Date of Patent: November 21, 2023
    Assignee: The Boeing Company
    Inventors: Rongsheng Li, Chang Jin Yoo, Cody Gruebele
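    A minimal one-dimensional Python sketch of the terrain-aided correction described in the abstract above: the residual between a sensed terrain value and the predicted terrain value from a database drives a Kalman update. The scalar position state, the database lookup, and the noise value are illustrative assumptions, not Boeing's implementation.
      def kalman_terrain_update(x_est, p_est, measured_elev, terrain_db, r_meas=4.0):
          """x_est: predicted along-track position, p_est: its variance,
          measured_elev: terrain-dependent sensor value,
          terrain_db: callable mapping position -> stored terrain elevation."""
          predicted_elev = terrain_db(x_est)            # predicted terrain value
          # numerical slope of the terrain profile (measurement Jacobian)
          h = (terrain_db(x_est + 1.0) - terrain_db(x_est - 1.0)) / 2.0
          s = h * p_est * h + r_meas                    # innovation covariance
          k = p_est * h / s                             # Kalman gain
          residual = measured_elev - predicted_elev     # pre-Kalman-filter value
          x_corr = x_est + k * residual                 # navigation correction
          p_corr = (1.0 - k * h) * p_est
          return x_corr, p_corr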
  • Patent number: 11822344
    Abstract: A method and corresponding device for the decentralized cooperative coordination of at least two vehicles includes: determining an abstract driving maneuver to realize at least one nominal driving maneuver by a first vehicle, determining driving maneuver information regarding the first vehicle's at least one abstract driving maneuver, transmitting the first vehicle's driving maneuver information to at least one further vehicle, calculating or reconstructing the first vehicle's nominal trajectory and/or nominal driving maneuver based on the received driving maneuver information by the further vehicle, comparing the first vehicle's reconstructed nominal trajectory and/or the reconstructed nominal driving maneuver with the further vehicle's nominal trajectory and/or nominal driving maneuver and, if a conflict exists between the nominal trajectory and/or the nominal driving maneuver of the first vehicle and the nominal trajectory and/or the nominal driving maneuver of the further vehicle, transmitting a cooperati…
    Type: Grant
    Filed: January 28, 2019
    Date of Patent: November 21, 2023
    Inventors: Sebastian Strunck, Thomas Grotendorst, Jonas Schönichen
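    A minimal Python sketch of the reconstruction-and-comparison step described in the abstract above, assuming the transmitted driving maneuver information carries a sampled nominal trajectory; the message layout, the 0.5 s time tolerance, and the 2 m separation threshold are illustrative assumptions.
      import math

      def reconstruct_trajectory(maneuver_info):
          # in this sketch the driving-maneuver information already carries the
          # sampled nominal trajectory of the sending vehicle: [(t, x, y), ...]
          return maneuver_info["nominal_trajectory"]

      def has_conflict(own_traj, other_traj, min_separation=2.0):
          # conflict if both vehicles occupy nearly the same place at nearly the same time
          for (t1, x1, y1), (t2, x2, y2) in zip(own_traj, other_traj):
              if abs(t1 - t2) < 0.5 and math.hypot(x1 - x2, y1 - y2) < min_separation:
                  return True
          return False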
  • Patent number: 11819288
    Abstract: A surgical robotic system senses position or orientation of an object, which may be a trocar that has a magnetic field. Magnetic field sensors are coupled to a surgical robotic arm. A machine learning model coupled to the magnetic field sensors is trained to output three-dimensional position and/or three-dimensional orientation of the trocar or other object. Other aspects are also described.
    Type: Grant
    Filed: March 19, 2020
    Date of Patent: November 21, 2023
    Assignee: Verb Surgical Inc.
    Inventors: Bernhard A. Fuerst, Dennis Moses, Pablo Garcia Kilroy
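    A minimal Python sketch of the learning setup described in the abstract above: a model is trained to map magnetic field sensor readings to a three-dimensional position and orientation. The use of scikit-learn, the nine-sensor layout, the random training data, and the six-value pose output are illustrative assumptions, not Verb Surgical's model.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      X = rng.normal(size=(1000, 9))   # readings from 9 magnetic field sensors on the arm
      y = rng.normal(size=(1000, 6))   # target: x, y, z position and roll, pitch, yaw

      model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
      model.fit(X, y)                  # train the sensor-to-pose mapping
      pose = model.predict(X[:1])      # estimated trocar position and orientation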
  • Patent number: 11806100
    Abstract: A robotic surgical system for performing surgery includes a robotic arm having a force and/or torque control sensor coupled to the end-effector and configured to hold a first surgical tool. The robotic system further includes an actuator that provides controlled movement of the robotic arm and/or positioning of the end-effector. The system further includes a tracking detector having optical markers for real-time detection of (i) surgical tool position and/or end-effector position and (ii) patient position. The system also includes a feedback system for moving the end-effector to a planned trajectory based on a threshold distance between the planned trajectory and the actual trajectory.
    Type: Grant
    Filed: June 21, 2021
    Date of Patent: November 7, 2023
    Inventor: Szymon Kostrzewski
  • Patent number: 11794350
    Abstract: A controller is provided for interactive classification and recognition of an object in a scene using tactile feedback. The controller includes an interface configured to transmit and receive control and sensor signals from a robot arm, gripper signals from a gripper attached to the robot arm, tactile signals from sensors attached to the gripper, and signals from at least one vision sensor; a memory module to store robot control programs and a classifier and recognition model; and a processor to generate control signals based on the control program and a grasp pose on the object, configured to control the robot arm to grasp the object with the gripper.
    Type: Grant
    Filed: October 22, 2020
    Date of Patent: October 24, 2023
    Assignee: Mitsubishi Electric Research Laboratories, Inc.
    Inventors: Radu Ioan Corcodel, Siddarth Jain, Jeroen van Baar
  • Patent number: 11794345
    Abstract: A robotic system comprising a master robotic system, and a first robotic system comprising a first mobile platform operable to move about a surface, and comprising a first manipulator. The robotic system can comprise a second robotic system comprising a second mobile platform operable to move about the surface, and comprising a second manipulator. A control module can be associated with the master robotic system and the first and second robotic systems, and can be operable in a paired control mode to facilitate paired control of the first and second robotic systems to move about the ground surface, and operable in an unpaired control mode to facilitate non-paired control of a selected one of the first or second robotic systems.
    Type: Grant
    Filed: December 31, 2020
    Date of Patent: October 24, 2023
    Assignee: Sarcos Corp.
    Inventors: Fraser M. Smith, Marc X. Olivier
  • Patent number: 11797021
    Abstract: A robot includes a driver; a camera; and a processor configured to: during an interaction session in which a first user identified in an image obtained through the camera is set as an interaction subject, perform an operation corresponding to a user command received from the first user, and determine whether interruption by a second user identified in an image obtained through the camera occurs, and based on determining that the interruption by the second user occurred, control the driver such that the robot performs a feedback motion for the interruption.
    Type: Grant
    Filed: December 16, 2020
    Date of Patent: October 24, 2023
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Jaemin Chun, Youngsun Kim, Minseok Han, Segwon Han
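    A minimal Python sketch of the interaction loop implied by the abstract above: while the first user is the interaction subject, commands are served, and a feedback motion is triggered when a second user interrupts. Every method on the hypothetical robot object is an illustrative assumption.
      def interaction_session(robot, first_user):
          robot.set_interaction_subject(first_user)         # first user identified in the image
          while robot.session_active():
              frame = robot.camera.capture()
              command = robot.receive_command(first_user)
              if command is not None:
                  robot.execute(command)                    # operation for the user command
              if robot.detect_interruption(frame, exclude=first_user):
                  robot.drive_feedback_motion()             # e.g. briefly face the interrupter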
  • Patent number: 11766779
    Abstract: Disclosed is a moving robot capable of recognizing a waiting line and a method for controlling the same. One embodiment provides a method for operating a moving robot, the method comprising: starting moving from a predefined moving start point toward a predefined moving end point; acquiring a waiting line region image by photographing a predefined waiting line region during the moving; searching for an end point of a waiting line formed in the waiting line region using the waiting line region image; terminating the moving when the end point of the waiting line is detected; setting an operation mode based on a length of the waiting line calculated using the end point of the waiting line; and operating in the set operation mode while returning to the moving start point.
    Type: Grant
    Filed: May 4, 2020
    Date of Patent: September 26, 2023
    Assignee: LG ELECTRONICS INC.
    Inventors: Kang Uk Kim, Minjung Kim, Yeonsoo Kim, Hyoungrock Kim, Hyoung Seok Kim, Dong Ki Noh
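    A minimal Python sketch of the mode selection described in the abstract above: once the end point of the waiting line is detected, the robot chooses its operation mode from the calculated line length. The thresholds and mode names are illustrative assumptions, not LG's values.
      def select_operation_mode(line_length_m: float) -> str:
          if line_length_m < 2.0:
              return "idle_guidance"        # short line: offer general guidance on the way back
          if line_length_m < 10.0:
              return "queue_information"    # medium line: announce the expected wait
          return "queue_management"         # long line: actively guide the queue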
  • Patent number: 11738468
    Abstract: A robot system includes a robot configured to perform a work to a workpiece, and a user interface configured to remotely manipulate the robot. The robot includes a robotic arm, a robot hand attached to the robotic arm and configured to perform the work to the workpiece, and an acceleration sensor attached to the robot hand. The robot system further includes a speaker configured to output an acceleration signal from the acceleration sensor as perceptual information.
    Type: Grant
    Filed: May 16, 2018
    Date of Patent: August 29, 2023
    Assignee: KAWASAKI JUKOGYO KABUSHIKI KAISHA
    Inventors: Yasuhiko Hashimoto, Nobuyasu Shimomura, Masayuki Kamon, Hideyuki Ryu
  • Patent number: 11731283
    Abstract: A method for checking a safety area of a robot with an augmented reality human machine interface (AR-HMI) that comprises a display and a video camera. The method includes: acquiring, at the AR-HMI, a robot type of the robot, displaying, in the display of the AR-HMI, a virtual robot image of at least part of a robot of the robot type in a manner such that the virtual robot image overlays an actual robot image of the robot of the robot type in the video camera of the AR-HMI, aligning a position of the virtual robot image with a position of the actual robot image by moving the AR-HMI in three-dimensional space, confirming the alignment of the position between the virtual robot image and the actual robot image, and displaying a virtual first safety cell area around the virtual robot image in the confirmed position as an overlay of the actual robot image in the display of the AR-HMI.
    Type: Grant
    Filed: January 18, 2021
    Date of Patent: August 22, 2023
    Assignee: MAGNA STEYR FAHRZEUGTECHNIK GMBH & CO KG
    Inventors: Andreas Huber, Tomaz Kukovec, Christoph Monschein
  • Patent number: 11697211
    Abstract: A mobile robot operation method according to an aspect of the present invention includes: a step for receiving a guidance destination input; a step for generating a global path to the received guidance destination; a step for generating a left travel guideline and a right travel guideline on the left side and the right side of the generated global path; and a step for generating a local path within a travelable range between the left travel guideline and the right travel guideline. Accordingly, the robot operation method may generate a safe and optimal guidance path when providing a guidance service.
    Type: Grant
    Filed: June 12, 2018
    Date of Patent: July 11, 2023
    Assignee: LG ELECTRONICS INC.
    Inventors: Joongtae Park, Hyoungrock Kim
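    A minimal Python sketch of the guideline construction described in the abstract above: the left and right travel guidelines are lateral offsets of the global path, and the local path is later constrained to lie between them. The 1.5 m offset and the array layout are illustrative assumptions.
      import numpy as np

      def travel_guidelines(global_path: np.ndarray, offset: float = 1.5):
          """global_path: (N, 2) waypoints of the global path to the guidance destination."""
          headings = np.gradient(global_path, axis=0)
          headings /= np.linalg.norm(headings, axis=1, keepdims=True)    # unit tangents
          normals = np.stack([-headings[:, 1], headings[:, 0]], axis=1)  # left-hand normals
          left = global_path + offset * normals
          right = global_path - offset * normals
          return left, right   # the local path is generated between these two guidelines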
  • Patent number: 11691277
    Abstract: Grasping of an object, by an end effector of a robot, based on a grasp strategy that is selected using one or more machine learning models. The grasp strategy utilized for a given grasp is one of a plurality of candidate grasp strategies. Each candidate grasp strategy defines a different group of one or more values that influence performance of a grasp attempt in a manner that is unique relative to the other grasp strategies. For example, value(s) of a grasp strategy can define a grasp direction for grasping the object (e.g., “top”, “side”), a grasp type for grasping the object (e.g., “pinch”, “power”), grasp force applied in grasping the object, pre-grasp manipulations to be performed on the object, and/or post-grasp manipulations to be performed on the object.
    Type: Grant
    Filed: July 19, 2021
    Date of Patent: July 4, 2023
    Assignee: X DEVELOPMENT LLC
    Inventors: Umashankar Nagarajan, Bianca Homberg
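    A minimal Python sketch of the grasp-strategy representation in the abstract above; the fields mirror the example values (grasp direction, grasp type, force, pre- and post-grasp manipulations), while the selection helper and its scoring function stand in for the machine learning models and are illustrative assumptions.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class GraspStrategy:
          direction: str                 # e.g. "top", "side"
          grasp_type: str                # e.g. "pinch", "power"
          force_n: float                 # grasp force to apply, in newtons
          pre_grasp: List[str] = field(default_factory=list)
          post_grasp: List[str] = field(default_factory=list)

      def select_strategy(candidates, score_fn):
          # score_fn stands in for the learned model that rates each candidate strategy
          return max(candidates, key=score_fn)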
  • Patent number: 11693415
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating cut-in probabilities of agents surrounding a vehicle. One of the methods includes obtaining agent trajectory data for one or more agents in an environment; obtaining vehicle trajectory data of a vehicle in the environment; and processing a network input generated from the agent trajectory data and vehicle trajectory data using a neural network to generate a cut-in output, wherein the cut-in output comprises respective cut-in probabilities for each of a plurality of locations in the environment, wherein the respective cut-in probability for each location that is a current location of one of the one or more agents characterizes a likelihood that the agent in the current location will intersect with a planned future location of the vehicle within a predetermined amount of time.
    Type: Grant
    Filed: November 6, 2019
    Date of Patent: July 4, 2023
    Assignee: Waymo LLC
    Inventors: Khaled Refaat, Chi Pang Lam
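    A minimal Python sketch of the interface described in the abstract above: features from the agent and vehicle trajectory data form the network input, and the output is one cut-in probability per map location. The PyTorch MLP, feature sizes, and location count are illustrative assumptions, not Waymo's network.
      import torch
      import torch.nn as nn

      class CutInModel(nn.Module):
          def __init__(self, traj_features=64, n_locations=256):
              super().__init__()
              self.net = nn.Sequential(
                  nn.Linear(2 * traj_features, 128), nn.ReLU(),
                  nn.Linear(128, n_locations))

          def forward(self, agent_traj, vehicle_traj):
              x = torch.cat([agent_traj, vehicle_traj], dim=-1)   # network input
              return torch.sigmoid(self.net(x))                   # per-location cut-in probabilities

      probs = CutInModel()(torch.randn(1, 64), torch.randn(1, 64))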
  • Patent number: 11691291
    Abstract: Disclosed herein are an apparatus and method for generating robot interaction behavior. The method for generating robot interaction behavior includes generating a co-speech gesture of the robot corresponding to utterance input of a user, generating a nonverbal behavior of the robot, that is, a sequence of next joint positions of the robot estimated from the joint positions of the user and the current joint positions of the robot based on a pre-trained neural network model for robot pose estimation, and generating a final behavior using at least one of the co-speech gesture and the nonverbal behavior.
    Type: Grant
    Filed: November 27, 2020
    Date of Patent: July 4, 2023
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Woo-Ri Ko, Do-Hyung Kim, Jae-Hong Kim, Young-Woo Yoon, Jae-Yeon Lee, Min-Su Jang
  • Patent number: 11685051
    Abstract: A robotic assembly for servicing equipment, the robotic assembly including an area configured to receive components associated with a workscope of the equipment; an environmental capture device configured to capture information associated with an environment in which the robotic assembly is disposed; and one or more computing devices configured to: locate the equipment in the environment; autonomously navigate the robotic assembly through the environment to the equipment; and autonomously adjust a position of the robotic assembly in response to the workscope.
    Type: Grant
    Filed: October 29, 2020
    Date of Patent: June 27, 2023
    Assignees: General Electric Company, Oliver Crispin Robotics Limited
    Inventors: Andrew Crispin Graham, David Scott Diwinsky
  • Patent number: 11687084
    Abstract: A computer-implemented method for determining a control trajectory for a robotic device. The method includes: performing information-theoretic model predictive control, applying a control trajectory sample prior in each time step, to obtain a control trajectory for a given time horizon; and determining the control trajectory sample prior depending on a data-driven trajectory prediction model that is trained to output a control trajectory sample as the control trajectory sample prior based on an actual state of the robotic device.
    Type: Grant
    Filed: July 29, 2021
    Date of Patent: June 27, 2023
    Assignee: ROBERT BOSCH GMBH
    Inventors: Andrey Rudenko, Luigi Palmieri, Kai Oliver Arras
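    A minimal Python sketch of information-theoretic MPC with a data-driven sample prior, as described in the abstract above: control trajectories are sampled around a prior predicted from the actual state, rolled out, and combined with exponential weights. prior_model, rollout_cost, and all hyperparameters are illustrative assumptions, not Bosch's implementation.
      import numpy as np

      def mppi_step(state, prior_model, rollout_cost, n_samples=256,
                    noise_std=0.2, temperature=1.0):
          prior = prior_model(state)                 # (horizon, control_dim) sample prior
          noise = np.random.normal(0.0, noise_std, size=(n_samples,) + prior.shape)
          samples = prior + noise                    # control trajectories sampled around the prior
          costs = np.array([rollout_cost(state, u) for u in samples])
          weights = np.exp(-(costs - costs.min()) / temperature)
          weights /= weights.sum()                   # information-theoretic (softmin) weights
          return np.tensordot(weights, samples, axes=1)   # weighted control trajectory for this step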
  • Patent number: 11687074
    Abstract: The present disclosure relates to technology that controls a remote moving body based on collaboration between the moving body and a human. A method for controlling the moving body includes acquiring, from a user, a first biosignal indicating an intention to start operation of the moving body, operating the moving body, determining a surrounding situation of the moving body that autonomously controls the driving, providing the user with surrounding information of the moving body for inducing path setting, acquiring, from the user, a second biosignal evoked by recognition of the surrounding information, setting a driving direction of the moving body, commanding the moving body to automatically perform a driving operation to be carried out in the set driving direction, and acquiring, from the user, a third biosignal responsive to recognition of a driving error and correcting the driving direction of the moving body to induce driving path resetting.
    Type: Grant
    Filed: September 18, 2020
    Date of Patent: June 27, 2023
    Assignee: Korea Institute of Science and Technology
    Inventors: Laehyun Kim, Da-Hye Kim, Seung-jun Oh, Eon Jo Hong