Patents Examined by Sohana Tanju Khayer
-
Patent number: 11822344
Abstract: A method and corresponding device for the decentralized cooperative coordination of at least two vehicles includes: determining an abstract driving maneuver to realize at least one nominal driving maneuver by a first vehicle, determining driving maneuver information regarding the first vehicle's at least one abstract driving maneuver, transmitting the first vehicle's driving maneuver information to at least one further vehicle, calculating or reconstructing the first vehicle's nominal trajectory and/or nominal driving maneuver based on the received driving maneuver information by the further vehicle, comparing the first vehicle's reconstructed nominal trajectory and/or the reconstructed nominal driving maneuver with the further vehicle's nominal trajectory and/or nominal driving maneuver and, if a conflict exists between the nominal trajectory and/or the nominal driving maneuver of the first vehicle and the nominal trajectory and/or the nominal driving maneuver of the further vehicle, transmitting a cooperati…
Type: Grant
Filed: January 28, 2019
Date of Patent: November 21, 2023
Inventors: Sebastian Strunck, Thomas Grotendorst, Jonas Schönichen
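A minimal sketch (not from the patent) of the conflict check this abstract describes: the further vehicle reconstructs the first vehicle's nominal trajectory from received maneuver information and compares it with its own trajectory, time step by time step. The straight-line reconstruction, all names, and the 2 m separation threshold are illustrative assumptions.

```python
def reconstruct_trajectory(start, heading, speed, steps, dt=1.0):
    """Rebuild a straight-line nominal trajectory from abstract maneuver info."""
    x, y = start
    dx, dy = heading
    return [(x + dx * speed * dt * i, y + dy * speed * dt * i) for i in range(steps)]

def trajectories_conflict(traj_a, traj_b, min_separation=2.0):
    """Conflict if the vehicles get closer than min_separation at the same time step."""
    return any(
        ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 < min_separation
        for (ax, ay), (bx, by) in zip(traj_a, traj_b)
    )

# Two vehicles approaching head-on in adjacent positions: a conflict is detected.
first = reconstruct_trajectory(start=(0.0, 0.0), heading=(1.0, 0.0), speed=10.0, steps=5)
other = reconstruct_trajectory(start=(40.0, 1.0), heading=(-1.0, 0.0), speed=10.0, steps=5)
print(trajectories_conflict(first, other))  # True: they meet near x = 20
```

On a detected conflict, the patented method would go on to transmit a cooperative message; that step is outside this sketch.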
-
Patent number: 11806100
Abstract: A robotic surgical system for performing surgery, the system includes a robotic arm having a force and/or torque control sensor coupled to the end-effector and configured to hold a first surgical tool. The robotic system further includes an actuator that enables controlled movement of the robotic arm and/or positioning of the end-effector. The system further includes a tracking detector having optical markers for real-time detection of (i) surgical tool position and/or end-effector position and (ii) patient position. The system also includes a feedback system for moving the end-effector to a planned trajectory based on a threshold distance between the planned trajectory and the actual trajectory.
Type: Grant
Filed: June 21, 2021
Date of Patent: November 7, 2023
Inventor: Szymon Kostrzewski
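An illustrative sketch of the threshold-based feedback this abstract describes: if the end-effector's actual position deviates from the planned trajectory point by more than a threshold, command a corrective motion back toward the plan. The function name, the gain, and the 1 mm threshold are assumptions, not values from the patent.

```python
def correction_step(planned, actual, threshold=1.0, gain=0.5):
    """Return a corrective displacement, or zero motion while within tolerance."""
    error = [p - a for p, a in zip(planned, actual)]
    distance = sum(e * e for e in error) ** 0.5
    if distance <= threshold:
        return (0.0, 0.0, 0.0)          # within tolerance: no correction
    return tuple(gain * e for e in error)  # proportional move back toward the plan

print(correction_step((0, 0, 10), (0, 0, 10.5)))  # deviation 0.5: stays put
print(correction_step((0, 0, 10), (0, 0, 14)))    # deviation 4: corrective move
```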
-
Patent number: 11797021
Abstract: A robot includes a driver; a camera; and a processor configured to: during an interaction session in which a first user identified in an image obtained through the camera is set as an interaction subject, perform an operation corresponding to a user command received from the first user, and determine whether interruption by a second user identified in an image obtained through the camera occurs, and based on determining that the interruption by the second user occurred, control the driver such that the robot performs a feedback motion for the interruption.
Type: Grant
Filed: December 16, 2020
Date of Patent: October 24, 2023
Assignee: SAMSUNG ELECTRONICS CO., LTD.
Inventors: Jaemin Chun, Youngsun Kim, Minseok Han, Segwon Han
-
Patent number: 11794350
Abstract: A controller is provided for interactive classification and recognition of an object in a scene using tactile feedback. The controller includes an interface configured to transmit and receive control signals, sensor signals from a robot arm, gripper signals from a gripper attached to the robot arm, tactile signals from sensors attached to the gripper, and signals from at least one vision sensor; a memory module to store robot control programs and a classifier and recognition model; and a processor configured to generate control signals based on the control program and a grasp pose on the object, and to control the robot arm to grasp the object with the gripper.
Type: Grant
Filed: October 22, 2020
Date of Patent: October 24, 2023
Assignee: Mitsubishi Electric Research Laboratories, Inc.
Inventors: Radu Ioan Corcodel, Siddarth Jain, Jeroen van Baar
-
Patent number: 11794345
Abstract: A robotic system comprising a master robotic system, and a first robotic system comprising a first mobile platform operable to move about a surface, and comprising a first manipulator. The robotic system can comprise a second robotic system comprising a second mobile platform operable to move about the surface, and comprising a second manipulator. A control module can be associated with the master robotic system and the first and second robotic systems, and can be operable in a paired control mode to facilitate paired control of the first and second robotic systems to move about the ground surface, and operable in an unpaired control mode to facilitate non-paired control of a selected one of the first or second robotic systems.
Type: Grant
Filed: December 31, 2020
Date of Patent: October 24, 2023
Assignee: Sarcos Corp.
Inventors: Fraser M. Smith, Marc X. Olivier
-
Patent number: 11766779
Abstract: Disclosed is a moving robot capable of recognizing a waiting line and a method for controlling the same. One embodiment provides a method for operating a moving robot, the method comprising: starting moving from a predefined moving start point toward a predefined moving end point; acquiring a waiting line region image by photographing a predefined waiting line region during the moving; searching for an end point of a waiting line formed in the waiting line region using the waiting line region image; terminating the moving when the end point of the waiting line is detected; setting an operation mode based on a length of the waiting line calculated using the end point of the waiting line; and operating in the set operation mode while returning to the moving start point.
Type: Grant
Filed: May 4, 2020
Date of Patent: September 26, 2023
Assignee: LG ELECTRONICS INC.
Inventors: Kang Uk Kim, Minjung Kim, Yeonsoo Kim, Hyoungrock Kim, Hyoung Seok Kim, Dong Ki Noh
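A minimal sketch of the mode-setting step this abstract describes: once the end of the waiting line is detected, the line length (from the moving start point to the detected end point) selects an operation mode. The mode names and length cutoffs are illustrative assumptions, not values from the patent.

```python
def line_length(start_point, end_point):
    """Euclidean length of the waiting line between the two detected points."""
    (x1, y1), (x2, y2) = start_point, end_point
    return ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5

def set_operation_mode(length, short_cutoff=5.0, long_cutoff=15.0):
    """Map the measured waiting-line length to an operation mode."""
    if length < short_cutoff:
        return "idle_guidance"      # short line: offer individual guidance
    if length < long_cutoff:
        return "queue_information"  # medium line: announce the expected wait
    return "call_for_support"       # long line: request additional service

length = line_length((0.0, 0.0), (8.0, 6.0))  # detected end point 10 m away
print(set_operation_mode(length))
```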
-
Patent number: 11738468
Abstract: A robot system includes a robot configured to perform work on a workpiece, and a user interface configured to remotely manipulate the robot. The robot includes a robotic arm, a robot hand attached to the robotic arm and configured to perform the work on the workpiece, and an acceleration sensor attached to the robot hand. The robot system further includes a speaker configured to output an acceleration signal from the acceleration sensor as perceptual information.
Type: Grant
Filed: May 16, 2018
Date of Patent: August 29, 2023
Assignee: KAWASAKI JUKOGYO KABUSHIKI KAISHA
Inventors: Yasuhiko Hashimoto, Nobuyasu Shimomura, Masayuki Kamon, Hideyuki Ryu
-
Patent number: 11731283
Abstract: A method for checking a safety area of a robot with an augmented reality human machine interface (AR-HMI) that comprises a display and a video camera. The method includes: acquiring, at the AR-HMI, a robot type of the robot, displaying, in the display of the AR-HMI, a virtual robot image of at least part of a robot of the robot type in a manner such that the virtual robot image overlays an actual robot image of the robot of the robot type in the video camera of the AR-HMI, aligning a position of the virtual robot image with a position of the actual robot image by moving the AR-HMI in three-dimensional space, confirming the alignment of the position between the virtual robot image and the actual robot image, and displaying a virtual first safety cell area around the virtual robot image in the confirmed position as an overlay of the actual robot image in the display of the AR-HMI.
Type: Grant
Filed: January 18, 2021
Date of Patent: August 22, 2023
Assignee: MAGNA STEYR FAHRZEUGTECHNIK GMBH & CO KG
Inventors: Andreas Huber, Tomaz Kukovec, Christoph Monschein
-
Patent number: 11697211
Abstract: A mobile robot operation method according to an aspect of the present invention includes: a step for receiving a guidance destination input; a step for generating a global path to the received guidance destination; a step for generating a left travel guideline and a right travel guideline on the left side and the right side of the generated global path; and a step for generating a local path within a travelable range between the left travel guideline and the right travel guideline. Accordingly, the robot operation method may generate a safe and optimal guidance path when providing a guidance service.
Type: Grant
Filed: June 12, 2018
Date of Patent: July 11, 2023
Assignee: LG ELECTRONICS INC.
Inventors: Joongtae Park, Hyoungrock Kim
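A sketch of the guideline-generation step described above: for each segment of the global path, offset perpendicular copies to the left and right; the local path is then constrained to the travelable range between them. The polyline representation and the 1 m half-width are assumptions for illustration.

```python
def travel_guidelines(path, half_width=1.0):
    """Offset each path point perpendicular to its outgoing segment direction."""
    left, right = [], []
    for (x1, y1), (x2, y2) in zip(path, path[1:]):
        dx, dy = x2 - x1, y2 - y1
        norm = (dx * dx + dy * dy) ** 0.5
        nx, ny = -dy / norm, dx / norm  # unit normal pointing left of travel
        left.append((x1 + nx * half_width, y1 + ny * half_width))
        right.append((x1 - nx * half_width, y1 - ny * half_width))
    return left, right

global_path = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
left, right = travel_guidelines(global_path)
print(left)   # guideline 1 m to the left of the path
print(right)  # guideline 1 m to the right
```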
-
Patent number: 11691277
Abstract: Grasping of an object, by an end effector of a robot, based on a grasp strategy that is selected using one or more machine learning models. The grasp strategy utilized for a given grasp is one of a plurality of candidate grasp strategies. Each candidate grasp strategy defines a different group of one or more values that influence performance of a grasp attempt in a manner that is unique relative to the other grasp strategies. For example, value(s) of a grasp strategy can define a grasp direction for grasping the object (e.g., "top", "side"), a grasp type for grasping the object (e.g., "pinch", "power"), grasp force applied in grasping the object, pre-grasp manipulations to be performed on the object, and/or post-grasp manipulations to be performed on the object.
Type: Grant
Filed: July 19, 2021
Date of Patent: July 4, 2023
Assignee: X DEVELOPMENT LLC
Inventors: Umashankar Nagarajan, Bianca Homberg
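A hedged sketch of the "candidate grasp strategy" idea: each strategy bundles the values the abstract lists (direction, grasp type, force, pre-/post-grasp manipulations), and one is selected by a scoring function. The scorer here is a hand-coded stand-in for the machine learning models the patent describes; all values are illustrative.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GraspStrategy:
    direction: str            # e.g. "top", "side"
    grasp_type: str           # e.g. "pinch", "power"
    force: float              # grasp force in newtons
    pre_grasp: tuple = ()     # manipulations before the grasp
    post_grasp: tuple = ()    # manipulations after the grasp

def select_strategy(candidates, score):
    """Pick the candidate the (stand-in) model scores highest."""
    return max(candidates, key=score)

candidates = [
    GraspStrategy("top", "pinch", 5.0),
    GraspStrategy("side", "power", 20.0, pre_grasp=("push_to_edge",)),
]

# Stand-in scorer: pretend the object is heavy, so higher force wins.
chosen = select_strategy(candidates, lambda s: s.force)
print(chosen.direction, chosen.grasp_type)
```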
-
Patent number: 11693415
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating cut-in probabilities of agents surrounding a vehicle. One of the methods includes obtaining agent trajectory data for one or more agents in an environment; obtaining vehicle trajectory data of a vehicle in the environment; and processing a network input generated from the agent trajectory data and vehicle trajectory data using a neural network to generate a cut-in output, wherein the cut-in output comprises respective cut-in probabilities for each of a plurality of locations in the environment, wherein the respective cut-in probability for each location that is a current location of one of the one or more agents characterizes a likelihood that the agent in the current location will intersect with a planned future location of the vehicle within a predetermined amount of time.
Type: Grant
Filed: November 6, 2019
Date of Patent: July 4, 2023
Assignee: Waymo LLC
Inventors: Khaled Refaat, Chi Pang Lam
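A toy stand-in for the model this abstract describes: an input is built from agent and vehicle trajectories, and the output assigns each candidate location a cut-in probability via a softmax. The hand-coded scoring is an assumption standing in for the patent's trained neural network; the grid of locations is also assumed.

```python
import math

def cut_in_probabilities(agent_traj, vehicle_traj, locations):
    """Score each location by how well it matches the agent's heading and the vehicle's path."""
    (ax0, ay0), (ax1, ay1) = agent_traj[-2], agent_traj[-1]
    heading = (ax1 - ax0, ay1 - ay0)
    scores = []
    for (lx, ly) in locations:
        to_loc = (lx - ax1, ly - ay1)
        # Higher score when the location lies along the agent's heading
        # and near the vehicle's planned future position.
        align = heading[0] * to_loc[0] + heading[1] * to_loc[1]
        near = -math.dist((lx, ly), vehicle_traj[-1])
        scores.append(align + near)
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

agent = [(0.0, 3.0), (1.0, 2.0)]     # agent drifting toward the vehicle's lane
vehicle = [(0.0, 0.0), (5.0, 0.0)]   # vehicle's planned future location
probs = cut_in_probabilities(agent, vehicle, locations=[(2.0, 1.0), (2.0, 3.0)])
print(probs)  # the in-lane location gets the larger probability
```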
-
Patent number: 11691291
Abstract: Disclosed herein are an apparatus and method for generating robot interaction behavior. The method for generating robot interaction behavior includes generating a co-speech gesture of a robot corresponding to utterance input of a user, generating a nonverbal behavior of the robot, which is a sequence of next joint positions of the robot, estimated from joint positions of the user and current joint positions of the robot based on a pre-trained neural network model for robot pose estimation, and generating a final behavior using at least one of the co-speech gesture and the nonverbal behavior.
Type: Grant
Filed: November 27, 2020
Date of Patent: July 4, 2023
Assignee: Electronics and Telecommunications Research Institute
Inventors: Woo-Ri Ko, Do-Hyung Kim, Jae-Hong Kim, Young-Woo Yoon, Jae-Yeon Lee, Min-Su Jang
-
Patent number: 11687074
Abstract: The present disclosure relates to technology that controls a remote moving body based on collaboration between the moving body and a human. A method for controlling a moving body includes acquiring a first biosignal indicating an intention to start operation of the moving body from a user, operating the moving body, determining a surrounding situation of the moving body that autonomously controls the driving, providing the user with surrounding information of the moving body for inducing path setting, acquiring a second biosignal evoked by recognition of the surrounding information from the user, setting a driving direction of the moving body, commanding the moving body to automatically perform a driving operation to be carried out in the set driving direction, and acquiring a third biosignal responsive to recognition of a driving error from the user and correcting the driving direction of the moving body to induce driving path resetting.
Type: Grant
Filed: September 18, 2020
Date of Patent: June 27, 2023
Assignee: Korea Institute of Science and Technology
Inventors: Laehyun Kim, Da-Hye Kim, Seung-jun Oh, Eon Jo Hong
-
Patent number: 11685051
Abstract: A robotic assembly for servicing equipment, the robotic assembly including an area configured to receive components associated with a workscope of the equipment; an environmental capture device configured to capture information associated with an environment in which the robotic assembly is disposed; and one or more computing devices configured to: locate the equipment in the environment; autonomously navigate the robotic assembly through the environment to the equipment; and autonomously adjust a position of the robotic assembly in response to the workscope.
Type: Grant
Filed: October 29, 2020
Date of Patent: June 27, 2023
Assignees: General Electric Company, Oliver Crispin Robotics Limited
Inventors: Andrew Crispin Graham, David Scott Diwinsky
-
Patent number: 11687084
Abstract: A computer-implemented method for determining a control trajectory for a robotic device. The method includes: performing information-theoretic model predictive control, applying a control trajectory sample prior in each time step, to obtain a control trajectory for a given time horizon; and determining the control trajectory sample prior depending on a data-driven trajectory prediction model which is trained to output a control trajectory sample as the control trajectory sample prior based on an actual state of the robotic device.
Type: Grant
Filed: July 29, 2021
Date of Patent: June 27, 2023
Assignee: ROBERT BOSCH GMBH
Inventors: Andrey Rudenko, Luigi Palmieri, Kai Oliver Arras
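A much-simplified, illustrative sketch of the scheme this abstract describes: an information-theoretic MPC (MPPI-style) samples control trajectories around a prior, weights them by exponentiated negative cost, and averages; the prior comes from a prediction model conditioned on the current state. Here a hand-coded function stands in for the learned model, and the 1-D dynamics, costs, and noise level are all assumptions.

```python
import math, random

def predicted_prior(state, horizon):
    """Stand-in for the data-driven prediction model: steer toward the origin."""
    return [-0.5 * state for _ in range(horizon)]

def mppi_step(state, horizon=5, samples=64, noise=0.3, temperature=1.0, seed=0):
    rng = random.Random(seed)
    prior = predicted_prior(state, horizon)
    rollouts, costs = [], []
    for _ in range(samples):
        controls = [u + rng.gauss(0.0, noise) for u in prior]  # sample around the prior
        x, cost = state, 0.0
        for u in controls:
            x = x + u            # trivial 1-D integrator dynamics
            cost += x * x        # quadratic state cost
        rollouts.append(controls)
        costs.append(cost)
    best = min(costs)
    weights = [math.exp(-(c - best) / temperature) for c in costs]
    total = sum(weights)
    # Information-theoretic update: cost-weighted average of the sampled controls.
    return [sum(w * r[t] for w, r in zip(weights, rollouts)) / total
            for t in range(horizon)]

controls = mppi_step(state=4.0)
print(controls[0])  # first control pushes the state toward the goal at 0
```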
-
Patent number: 11673269
Abstract: Disclosed herein are a method of identifying a dynamic obstacle and a robot implementing the same, wherein the robot may switch between a mechanism for identifying an obstacle in an image captured by a camera sensor facing a first direction and a mechanism for sensing the obstacle, on the basis of the velocity of movement of the identified obstacle, a degree of congestion based on the distribution of obstacles, and the velocity of movement of the robot, in order to generate a moving path for the robot.
Type: Grant
Filed: May 20, 2020
Date of Patent: June 13, 2023
Assignee: LG ELECTRONICS INC.
Inventors: Minjung Kim, Seungmin Baek, Jungsik Kim
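A minimal sketch of the mechanism-switching idea in this abstract: the robot chooses between a cheap image-based identification mechanism and a heavier sensing mechanism using the obstacle's speed, the local congestion, and its own speed. The risk formula, mechanism names, and cutoff are illustrative assumptions, not the patent's logic.

```python
def choose_mechanism(obstacle_speed, congestion, robot_speed):
    """Return which obstacle-handling mechanism to run when generating a path."""
    risk = obstacle_speed + congestion + 0.5 * robot_speed
    if risk < 1.0:
        return "camera_identification"   # sparse, slow scene: vision alone suffices
    return "full_sensing"                # fast or crowded scene: add active sensing

print(choose_mechanism(obstacle_speed=0.2, congestion=0.3, robot_speed=0.4))
print(choose_mechanism(obstacle_speed=1.5, congestion=0.8, robot_speed=1.0))
```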
-
Patent number: 11667301
Abstract: A system performs modeling and simulation of non-stationary traffic entities for testing and development of modules used in an autonomous vehicle system. The system uses a machine learning based model that predicts hidden context attributes for traffic entities that may be encountered by a vehicle in traffic. The system generates simulation data for testing and development of modules that help navigate autonomous vehicles. The generated simulation data may be image or video data including representations of traffic entities, for example, pedestrians, bicyclists, and other vehicles. The system may generate simulation data using generative adversarial neural networks.
Type: Grant
Filed: December 10, 2019
Date of Patent: June 6, 2023
Assignee: Perceptive Automata, Inc.
Inventors: Kshitij Misra, Samuel English Anthony
-
Patent number: 11662726
Abstract: Apparatuses and methods for movement control of a device are disclosed. The apparatus comprises memory for storing information on patterns covering sectors of the area, indicating whether the patterns have at least one item relevant to movement in the area. The information has been configured based on determination of at least one pattern that has at least one item relevant to movement in the area, division of the determined at least one pattern into smaller patterns, determination of at least one of the smaller patterns with at least one item relevant to movement in the area, and repetition of the division until a predefined smallest pattern size is reached.
Type: Grant
Filed: October 16, 2020
Date of Patent: May 30, 2023
Assignee: NOKIA SOLUTIONS AND NETWORKS OY
Inventors: Vladimir Bashkirov, Arto Kristian Suvitie
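A sketch of the subdivision scheme this abstract describes: start from a pattern covering the area, keep only patterns containing an item relevant to movement, split those into smaller patterns, and repeat until the smallest pattern size is reached. Quadtree-style four-way splitting and the point-in-square test are assumptions about what the patent's "patterns" look like.

```python
def subdivide(x, y, size, items, min_size):
    """Return the smallest-size patterns that contain at least one relevant item."""
    contained = [(ix, iy) for (ix, iy) in items
                 if x <= ix < x + size and y <= iy < y + size]
    if not contained:
        return []                       # pattern has no relevant item: drop it
    if size <= min_size:
        return [(x, y, size)]           # predefined smallest pattern size reached
    half = size / 2
    result = []
    for (qx, qy) in [(x, y), (x + half, y), (x, y + half), (x + half, y + half)]:
        result += subdivide(qx, qy, half, contained, min_size)
    return result

obstacles = [(1.0, 1.0), (6.5, 6.5)]
patterns = subdivide(0.0, 0.0, 8.0, obstacles, min_size=2.0)
print(patterns)  # only the two occupied smallest patterns survive
```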
-
Patent number: 11654564
Abstract: A grasp generation technique for robotic pick-up of parts. A database of solid or surface models is provided for all objects and grippers which are to be evaluated. A gripper is selected and a random initialization is performed, where random objects and poses are selected from the object database. An iterative optimization computation is then performed, where many hundreds of grasps are computed for each part with surface contact between the part and the gripper, and sampling for grasp diversity and global optimization. Finally, a physical environment simulation is performed, where the grasps for each part are mapped to simulated piles of objects in a bin scenario. The grasp points and approach directions from the physical environment simulation are then used to train neural networks for grasp learning in real-world robotic operations, where the simulation results are correlated to camera depth image data to identify a high-quality grasp.
Type: Grant
Filed: September 10, 2020
Date of Patent: May 23, 2023
Assignee: FANUC CORPORATION
Inventor: Yongxiang Fan
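A simplified sketch of the sampling-for-diversity idea in the pipeline above: among many candidate grasps scored for quality, keep a set that is both high quality and mutually spread out on the part. The greedy selection, the quality scores, and the 0.05 m spacing are illustrative assumptions, not the patent's optimization.

```python
def select_diverse_grasps(candidates, k=3, min_spacing=0.05):
    """Greedily pick up to k high-quality grasps spaced apart on the part."""
    ranked = sorted(candidates, key=lambda g: g["quality"], reverse=True)
    chosen = []
    for grasp in ranked:
        gx, gy, gz = grasp["position"]
        # Keep the grasp only if it is far enough from every grasp already chosen.
        if all(((gx - cx) ** 2 + (gy - cy) ** 2 + (gz - cz) ** 2) ** 0.5 >= min_spacing
               for (cx, cy, cz) in (c["position"] for c in chosen)):
            chosen.append(grasp)
        if len(chosen) == k:
            break
    return chosen

candidates = [
    {"position": (0.00, 0.00, 0.10), "quality": 0.9},
    {"position": (0.01, 0.00, 0.10), "quality": 0.8},  # too close to the best grasp
    {"position": (0.10, 0.05, 0.12), "quality": 0.7},
]
picked = select_diverse_grasps(candidates, k=2)
print([g["quality"] for g in picked])  # the near-duplicate grasp is skipped
```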
-
Patent number: 11657506
Abstract: A method of robot autonomous navigation includes capturing an image of the environment, segmenting the captured image to identify one or more foreground objects and one or more background objects, determining a match between one or more of the foreground objects and one or more predefined image files, estimating an object pose for the one or more foreground objects by implementing an iterative estimation loop, determining a robot pose estimate by applying a robot-centric environmental model to the object pose estimate by implementing an iterative refinement loop, associating semantic labels to the matched foreground object, compiling a semantic map containing the semantic labels and segmented object image pose, and providing localization information to the robot based on the semantic map and the robot pose estimate. A system and a non-transitory computer-readable medium are also disclosed.
Type: Grant
Filed: March 6, 2019
Date of Patent: May 23, 2023
Assignee: General Electric Company
Inventors: Huan Tan, Isabella Heukensfeldt Jansen, Gyeong Woo Cheon, Li Zhang
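A hedged sketch of the final steps this abstract lists: matched foreground objects receive semantic labels and poses, these are compiled into a semantic map, and localization queries that map. Matching here is a trivial name lookup standing in for the patent's image segmentation and iterative pose estimation; all names are illustrative.

```python
# Stand-in for the database of predefined image files: detector name -> semantic label.
PREDEFINED_OBJECTS = {"door": "doorway", "extinguisher": "fire_extinguisher"}

def compile_semantic_map(detections):
    """Attach semantic labels to matched foreground objects with estimated poses."""
    semantic_map = []
    for name, pose in detections:
        label = PREDEFINED_OBJECTS.get(name)
        if label is not None:           # keep only objects matched to the database
            semantic_map.append({"label": label, "pose": pose})
    return semantic_map

def localize(semantic_map, label):
    """Return poses of map entries with the given label, as localization landmarks."""
    return [entry["pose"] for entry in semantic_map if entry["label"] == label]

detections = [("door", (2.0, 0.0)), ("unknown_blob", (5.0, 5.0))]
smap = compile_semantic_map(detections)
print(localize(smap, "doorway"))  # the unmatched blob never enters the map
```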