Patents Examined by Sohana Tanju Khayer
- Patent number: 11766779. Abstract: Disclosed is a moving robot capable of recognizing a waiting line and a method for controlling the same. One embodiment provides a method for operating a moving robot, the method comprising: starting moving from a predefined moving start point toward a predefined moving end point; acquiring a waiting line region image by photographing a predefined waiting line region during the moving; searching for an end point of a waiting line formed in the waiting line region using the waiting line region image; terminating the moving when the end point of the waiting line is detected; setting an operation mode based on a length of the waiting line calculated using the end point of the waiting line; and operating in the set operation mode while returning to the moving start point.
  Type: Grant. Filed: May 4, 2020. Date of Patent: September 26, 2023. Assignee: LG ELECTRONICS INC. Inventors: Kang Uk Kim, Minjung Kim, Yeonsoo Kim, Hyoungrock Kim, Hyoung Seok Kim, Dong Ki Noh
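The abstract measures the length of the detected waiting line and picks an operation mode from it. A minimal sketch of that last step, assuming person detections along the queue are already available as 2-D positions; the thresholds and mode names are invented for illustration and are not from the patent:

```python
import math

# Hypothetical thresholds (not taken from the patent).
SHORT_LINE_M = 2.0
LONG_LINE_M = 6.0

def waiting_line_length(start_point, person_positions):
    """Approximate queue length as the path length from the moving start
    point through the detected people, ending at the last person (end point)."""
    length = 0.0
    previous = start_point
    for position in person_positions:
        length += math.dist(previous, position)
        previous = position
    return length

def select_operation_mode(line_length_m):
    """Map the measured queue length to an operation mode for the return trip."""
    if line_length_m < SHORT_LINE_M:
        return "idle_guidance"      # short queue: offer general guidance
    if line_length_m < LONG_LINE_M:
        return "queue_assistance"   # medium queue: assist people in line
    return "call_staff"             # long queue: escalate

if __name__ == "__main__":
    people = [(0.5, 0.0), (1.2, 0.1), (2.0, 0.3), (3.1, 0.2)]
    length = waiting_line_length((0.0, 0.0), people)
    print(f"queue length = {length:.1f} m, mode = {select_operation_mode(length)}")
```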
- Patent number: 11738468. Abstract: A robot system includes a robot configured to perform a work to a workpiece, and a user interface configured to remotely manipulate the robot. The robot includes a robotic arm, a robot hand attached to the robotic arm and configured to perform the work to the workpiece, and an acceleration sensor attached to the robot hand. The robot system further includes a speaker configured to output an acceleration signal from the acceleration sensor as perceptual information.
  Type: Grant. Filed: May 16, 2018. Date of Patent: August 29, 2023. Assignee: KAWASAKI JUKOGYO KABUSHIKI KAISHA. Inventors: Yasuhiko Hashimoto, Nobuyasu Shimomura, Masayuki Kamon, Hideyuki Ryu
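As a rough illustration of turning an acceleration signal into something a speaker can play, the sketch below resamples accelerometer samples to an audio rate, normalises them, and writes a WAV file. The sample rates and scaling are assumptions made for the example, not values from the patent:

```python
import wave
import numpy as np

def acceleration_to_wav(accel, accel_rate_hz, path, audio_rate_hz=44100):
    """Resample a 1-D acceleration signal to audio rate and write it as
    16-bit mono PCM so it can be played through a speaker."""
    accel = np.asarray(accel, dtype=np.float64)
    duration = len(accel) / accel_rate_hz
    t_src = np.linspace(0.0, duration, num=len(accel), endpoint=False)
    t_dst = np.linspace(0.0, duration, num=int(duration * audio_rate_hz), endpoint=False)
    audio = np.interp(t_dst, t_src, accel)
    peak = np.max(np.abs(audio)) or 1.0
    pcm = np.int16(audio / peak * 32767)
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)       # 16-bit samples
        wav.setframerate(audio_rate_hz)
        wav.writeframes(pcm.tobytes())

if __name__ == "__main__":
    t = np.linspace(0, 1, 1000)   # 1 s of 1 kHz accelerometer data (synthetic)
    accel = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(1000)
    acceleration_to_wav(accel, accel_rate_hz=1000, path="contact_feedback.wav")
```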
- Patent number: 11731283. Abstract: A method for checking a safety area of a robot with an augmented reality human machine interface (AR-HMI) that comprises a display and a video camera. The method includes: acquiring, at the AR-HMI, a robot type of the robot, displaying, in the display of the AR-HMI, a virtual robot image of at least part of a robot of the robot type in a manner such that the virtual robot image overlays an actual robot image of the robot of the robot type in the video camera of the AR-HMI, aligning a position of the virtual robot image with a position of the actual robot image by moving the AR-HMI in three-dimensional space, confirming the alignment of the position between the virtual robot image and the actual robot image, and displaying a virtual first safety cell area around the virtual robot image in the confirmed position as an overlay of the actual robot image in the display of the AR-HMI.
  Type: Grant. Filed: January 18, 2021. Date of Patent: August 22, 2023. Assignee: MAGNA STEYR FAHRZEUGTECHNIK GMBH & CO KG. Inventors: Andreas Huber, Tomaz Kukovec, Christoph Monschein
- Patent number: 11697211. Abstract: A mobile robot operation method according to an aspect of the present invention includes: a step for receiving a guidance destination input; a step for generating a global path to the received guidance destination; a step for generating a left travel guideline and a right travel guideline on the left side and the right side of the generated global path; and a step for generating a local path within a travelable range between the left travel guideline and the right travel guideline. Accordingly, the robot operation method may generate a safe and optimal guidance path when providing a guidance service.
  Type: Grant. Filed: June 12, 2018. Date of Patent: July 11, 2023. Assignee: LG ELECTRONICS INC. Inventors: Joongtae Park, Hyoungrock Kim
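A simplified sketch of the corridor idea: offset the global path to get left and right travel guidelines, then keep local waypoints within the corridor between them. The fixed corridor half-width and the nearest-vertex shortcut are assumptions for brevity, not details taken from the patent:

```python
import numpy as np

def travel_guidelines(global_path, half_width):
    """Offset the global path to the left and right by half_width using
    per-vertex normals, yielding the two travel guidelines."""
    path = np.asarray(global_path, dtype=float)
    tangents = np.gradient(path, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    normals = np.stack([-tangents[:, 1], tangents[:, 0]], axis=1)
    return path + half_width * normals, path - half_width * normals

def clamp_local_path(local_path, global_path, half_width):
    """Pull any local waypoint that drifts outside the corridor back onto
    the corridor boundary around its nearest global-path vertex."""
    global_path = np.asarray(global_path, dtype=float)
    clamped = []
    for point in np.asarray(local_path, dtype=float):
        nearest = global_path[np.argmin(np.linalg.norm(global_path - point, axis=1))]
        offset = point - nearest
        distance = np.linalg.norm(offset)
        if distance > half_width:
            point = nearest + offset / distance * half_width
        clamped.append(point)
    return np.array(clamped)

if __name__ == "__main__":
    global_path = [(0, 0), (1, 0), (2, 0.5), (3, 1.0)]
    left, right = travel_guidelines(global_path, half_width=0.5)
    local = clamp_local_path([(0.2, 0.9), (1.5, 0.1)], global_path, half_width=0.5)
    print(local)
```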
- Patent number: 11691277. Abstract: Grasping of an object, by an end effector of a robot, based on a grasp strategy that is selected using one or more machine learning models. The grasp strategy utilized for a given grasp is one of a plurality of candidate grasp strategies. Each candidate grasp strategy defines a different group of one or more values that influence performance of a grasp attempt in a manner that is unique relative to the other grasp strategies. For example, value(s) of a grasp strategy can define a grasp direction for grasping the object (e.g., “top”, “side”), a grasp type for grasping the object (e.g., “pinch”, “power”), grasp force applied in grasping the object, pre-grasp manipulations to be performed on the object, and/or post-grasp manipulations to be performed on the object.
  Type: Grant. Filed: July 19, 2021. Date of Patent: July 4, 2023. Assignee: X DEVELOPMENT LLC. Inventors: Umashankar Nagarajan, Bianca Homberg
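The abstract frames each candidate grasp strategy as a bundle of values (direction, type, force, pre/post-grasp manipulations) from which one is selected by a learned model. A minimal data-structure sketch, with a hand-written heuristic standing in for the machine learning models described in the patent:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class GraspStrategy:
    direction: str                 # e.g. "top", "side"
    grasp_type: str                # e.g. "pinch", "power"
    force_newtons: float
    pre_grasp: tuple = field(default_factory=tuple)   # e.g. ("push_to_edge",)
    post_grasp: tuple = field(default_factory=tuple)  # e.g. ("reorient",)

CANDIDATES = [
    GraspStrategy("top", "pinch", 10.0),
    GraspStrategy("side", "power", 35.0, pre_grasp=("push_to_edge",)),
    GraspStrategy("top", "power", 30.0, post_grasp=("reorient",)),
]

def score_strategy(strategy, object_features):
    """Stand-in for the learned model: prefer pinch grasps for small, light
    objects and power grasps otherwise (purely illustrative heuristic)."""
    small = object_features["width_m"] < 0.05 and object_features["mass_kg"] < 0.3
    matches = (strategy.grasp_type == "pinch") == small
    return (1.0 if matches else 0.0) - 0.01 * strategy.force_newtons

def select_strategy(object_features, candidates=CANDIDATES):
    return max(candidates, key=lambda s: score_strategy(s, object_features))

if __name__ == "__main__":
    print(select_strategy({"width_m": 0.03, "mass_kg": 0.1}))
```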
- Patent number: 11693415. Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating cut-in probabilities of agents surrounding a vehicle. One of the methods includes obtaining agent trajectory data for one or more agents in an environment; obtaining vehicle trajectory data of a vehicle in the environment; and processing a network input generated from the agent trajectory data and vehicle trajectory data using a neural network to generate a cut-in output, wherein the cut-in output comprises respective cut-in probabilities for each of a plurality of locations in the environment, wherein the respective cut-in probability for each location that is a current location of one of the one or more agents characterizes a likelihood that the agent in the current location will intersect with a planned future location of the vehicle within a predetermined amount of time.
  Type: Grant. Filed: November 6, 2019. Date of Patent: July 4, 2023. Assignee: Waymo LLC. Inventors: Khaled Refaat, Chi Pang Lam
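To make the input/output shapes concrete: the sketch below flattens short agent and vehicle trajectory histories into a single network input and produces one cut-in probability per cell of a discretised grid around the vehicle. The tiny randomly initialised MLP is a placeholder for the trained neural network in the abstract; the grid size and history length are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

HISTORY_STEPS = 5        # past (x, y) samples per trajectory (assumed)
GRID_CELLS = 8 * 8       # locations around the vehicle (assumed 8x8 grid)

# Placeholder weights standing in for the trained model.
W1 = rng.normal(scale=0.1, size=(4 * HISTORY_STEPS, 64))
W2 = rng.normal(scale=0.1, size=(64, GRID_CELLS))

def cut_in_probabilities(agent_traj, vehicle_traj):
    """agent_traj, vehicle_traj: arrays of shape (HISTORY_STEPS, 2).
    Returns one probability per grid cell that an agent currently in that
    cell will intersect the vehicle's planned path within the time window."""
    x = np.concatenate([np.ravel(agent_traj), np.ravel(vehicle_traj)])
    hidden = np.tanh(x @ W1)
    logits = hidden @ W2
    return 1.0 / (1.0 + np.exp(-logits))   # independent sigmoid per location

if __name__ == "__main__":
    agent = rng.normal(size=(HISTORY_STEPS, 2))
    vehicle = rng.normal(size=(HISTORY_STEPS, 2))
    probs = cut_in_probabilities(agent, vehicle).reshape(8, 8)
    print(probs.round(2))
```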
- Patent number: 11691291. Abstract: Disclosed herein are an apparatus and method for generating robot interaction behavior. The method for generating robot interaction behavior includes generating co-speech gesture of a robot corresponding to utterance input of a user, generating a nonverbal behavior of the robot, that is a sequence of next joint positions of the robot, which are estimated from joint positions of the user and current joint positions of the robot based on a pre-trained neural network model for robot pose estimation, and generating a final behavior using at least one of the co-speech gesture and the nonverbal behavior.
  Type: Grant. Filed: November 27, 2020. Date of Patent: July 4, 2023. Assignee: Electronics and Telecommunications Research Institute. Inventors: Woo-Ri Ko, Do-Hyung Kim, Jae-Hong Kim, Young-Woo Yoon, Jae-Yeon Lee, Min-Su Jang
- Patent number: 11687074. Abstract: The present disclosure relates to technology that controls a remote moving body based on collaboration between the moving body and human, and a method for controlling a moving body includes acquiring a first biosignal indicating an intention to start operation of the moving body from a user, operating the moving body, determining a surrounding situation of the moving body that autonomously controls the driving, providing the user with surrounding information of the moving body for inducing path setting, acquiring a second biosignal evoked by recognition of the surrounding information from the user, setting a driving direction of the moving body, commanding the moving body to automatically perform a driving operation to be carried out in the set driving direction, and acquiring a third biosignal responsive to recognition of a driving error from the user and correcting the driving direction of the moving body to induce driving path resetting.
  Type: Grant. Filed: September 18, 2020. Date of Patent: June 27, 2023. Assignee: Korea Institute of Science and Technology. Inventors: Laehyun Kim, Da-Hye Kim, Seung-jun Oh, Eon Jo Hong
- Patent number: 11685051. Abstract: A robotic assembly for servicing equipment, the robotic assembly including an area configured to receive components associated with a workscope of the equipment; an environmental capture device configured to capture information associated with an environment in which the robotic assembly is disposed; and one or more computing devices configured to: locate the equipment in the environment; autonomously navigate the robotic assembly through the environment to the equipment; and autonomously adjust a position of the robotic assembly in response to the workscope.
  Type: Grant. Filed: October 29, 2020. Date of Patent: June 27, 2023. Assignees: General Electric Company, Oliver Crispin Robotics Limited. Inventors: Andrew Crispin Graham, David Scott Diwinsky
- Patent number: 11687084. Abstract: A computer-implemented method for determining a control trajectory for a robotic device. The method includes: performing an information theoretic model predictive control applying a control trajectory sample prior in each time step to obtain a control trajectory for a given time horizon; determining the control trajectory sample prior depending on a data-driven trajectory prediction model which is trained to output a control trajectory sample as the control trajectory sample prior based on an actual state of the robotic device.
  Type: Grant. Filed: July 29, 2021. Date of Patent: June 27, 2023. Assignee: ROBERT BOSCH GMBH. Inventors: Andrey Rudenko, Luigi Palmieri, Kai Oliver Arras
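The "information theoretic model predictive control" named in the abstract is commonly realised as MPPI-style sampling: perturb a prior control sequence, roll out each sample, and average the perturbations with exponential weights on cost. The sketch below shows that update for a toy point mass, with the learned trajectory-prediction prior replaced by a trivial stand-in; it illustrates the general technique, not the patented implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def dynamics(state, control, dt=0.1):
    """Toy 2-D point mass: state = (x, y, vx, vy), control = acceleration."""
    x, y, vx, vy = state
    ax, ay = control
    return np.array([x + vx * dt, y + vy * dt, vx + ax * dt, vy + ay * dt])

def cost(state, goal):
    return np.sum((state[:2] - goal) ** 2)

def prior_controls(horizon):
    """Stand-in for the data-driven trajectory prediction model."""
    return np.zeros((horizon, 2))

def mppi_step(state, goal, horizon=20, samples=256, noise_std=0.5, lam=1.0):
    prior = prior_controls(horizon)
    noise = rng.normal(scale=noise_std, size=(samples, horizon, 2))
    costs = np.zeros(samples)
    for k in range(samples):
        s = state.copy()
        for t in range(horizon):
            s = dynamics(s, prior[t] + noise[k, t])
            costs[k] += cost(s, goal)
    weights = np.exp(-(costs - costs.min()) / lam)
    weights /= weights.sum()
    return prior + np.tensordot(weights, noise, axes=1)   # weighted control trajectory

if __name__ == "__main__":
    controls = mppi_step(np.zeros(4), goal=np.array([2.0, 1.0]))
    print("first control:", controls[0])
```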
- Patent number: 11673269. Abstract: Disclosed herein are a method of identifying a dynamic obstacle and a robot implementing the same, wherein the robot configured to identify a dynamic obstacle may change mechanisms for identifying an obstacle in an image captured by a camera sensor in a first direction and for sensing the obstacle on the basis of a velocity of movement of the identified obstacle, a degree of congestion based on distribution of the obstacle and a velocity of movement of the robot, to generate a moving path of the robot.
  Type: Grant. Filed: May 20, 2020. Date of Patent: June 13, 2023. Assignee: LG ELECTRONICS INC. Inventors: Minjung Kim, Seungmin Baek, Jungsik Kim
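A hedged sketch of the switching logic the abstract alludes to: choose between a lighter-weight sensing mode and full camera-based identification from the obstacle speed, local congestion, and robot speed. The mode names and thresholds are invented for illustration; the abstract does not disclose them:

```python
def choose_sensing_mode(obstacle_speed_mps, congestion, robot_speed_mps):
    """congestion: fraction of the local area occupied by detected obstacles (0..1).
    Returns which obstacle-handling mechanism to run for the next planning cycle."""
    # Fast obstacles or a crowded scene justify the heavier camera-based pipeline.
    if obstacle_speed_mps > 1.0 or congestion > 0.4:
        return "camera_identification"
    # A slow robot in a sparse scene can rely on cheap range-sensor tracking.
    if robot_speed_mps < 0.5:
        return "range_tracking"
    return "camera_identification" if congestion > 0.2 else "range_tracking"

if __name__ == "__main__":
    print(choose_sensing_mode(obstacle_speed_mps=0.3, congestion=0.1, robot_speed_mps=0.4))
    print(choose_sensing_mode(obstacle_speed_mps=1.5, congestion=0.1, robot_speed_mps=0.4))
```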
- Patent number: 11667301. Abstract: A system performs modeling and simulation of non-stationary traffic entities for testing and development of modules used in an autonomous vehicle system. The system uses a machine learning based model that predicts hidden context attributes for traffic entities that may be encountered by a vehicle in traffic. The system generates simulation data for testing and development of modules that help navigate autonomous vehicles. The generated simulation data may be image or video data including representations of traffic entities, for example, pedestrians, bicyclists, and other vehicles. The system may generate simulation data using generative adversarial neural networks.
  Type: Grant. Filed: December 10, 2019. Date of Patent: June 6, 2023. Assignee: Perceptive Automata, Inc. Inventors: Kshitij Misra, Samuel English Anthony
- Patent number: 11662726. Abstract: Apparatuses and methods for movement control of a device are disclosed. The apparatus comprises memory for storing information of patterns covering sectors of the area indicating whether the patterns have at least one item relevant to movement in the area. The information has been configured based on determination of at least one pattern that has at least one item relevant to movement in the area, division of the determined at least one pattern into smaller patterns, determination of at least one of the smaller patterns with at least one item relevant to movement in the area, and repeat of the division until predefined smallest pattern size is reached.
  Type: Grant. Filed: October 16, 2020. Date of Patent: May 30, 2023. Assignee: NOKIA SOLUTIONS AND NETWORKS OY. Inventors: Vladimir Bashkirov, Arto Kristian Suvitie
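The "divide and repeat until the smallest pattern size is reached" description reads like a standard quadtree subdivision. A minimal sketch under that reading, which keeps only the smallest patterns (square cells) that contain at least one movement-relevant item:

```python
def subdivide(x, y, size, items, min_size, cells=None):
    """Recursively split the square pattern at (x, y) with side `size` into
    four quadrants wherever it contains an item, stopping at min_size.
    Returns the list of smallest occupied cells as (x, y, size) tuples."""
    if cells is None:
        cells = []
    occupied = any(x <= ix < x + size and y <= iy < y + size for ix, iy in items)
    if not occupied:
        return cells                      # nothing relevant to movement here
    if size <= min_size:
        cells.append((x, y, size))        # smallest pattern size reached
        return cells
    half = size / 2
    for dx in (0, half):
        for dy in (0, half):
            subdivide(x + dx, y + dy, half, items, min_size, cells)
    return cells

if __name__ == "__main__":
    obstacles = [(3.2, 1.1), (12.7, 9.4)]
    print(subdivide(0, 0, 16, obstacles, min_size=1))
```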
- Patent number: 11654564. Abstract: A grasp generation technique for robotic pick-up of parts. A database of solid or surface models is provided for all objects and grippers which are to be evaluated. A gripper is selected and a random initialization is performed, where random objects and poses are selected from the object database. An iterative optimization computation is then performed, where many hundreds of grasps are computed for each part with surface contact between the part and the gripper, and sampling for grasp diversity and global optimization. Finally, a physical environment simulation is performed, where the grasps for each part are mapped to simulated piles of objects in a bin scenario. The grasp points and approach directions from the physical environment simulation are then used to train neural networks for grasp learning in real-world robotic operations, where the simulation results are correlated to camera depth image data to identify a high quality grasp.
  Type: Grant. Filed: September 10, 2020. Date of Patent: May 23, 2023. Assignee: FANUC CORPORATION. Inventor: Yongxiang Fan
- Patent number: 11657506. Abstract: A method of robot autonomous navigation includes capturing an image of the environment, segmenting the captured image to identify one or more foreground objects and one or more background objects, determining a match between one or more of the foreground objects to one or more predefined image files, estimating an object pose for the one or more foreground objects by implementing an iterative estimation loop, determining a robot pose estimate by applying a robot-centric environmental model to the object pose estimate by implementing an iterative refinement loop, associating semantic labels to the matched foreground object, compiling a semantic map containing the semantic labels and segmented object image pose, and providing localization information to the robot based on the semantic map and the robot pose estimate. A system and a non-transitory computer-readable medium are also disclosed.
  Type: Grant. Filed: March 6, 2019. Date of Patent: May 23, 2023. Assignee: General Electric Company. Inventors: Huan Tan, Isabella Heukensfeldt Jansen, Gyeong Woo Cheon, Li Zhang
- Patent number: 11650591. Abstract: This specification describes trajectory planning for robotic devices. A robotic navigation system can obtain, for each of multiple time steps, data representing an environment of a robot at the time step. The system generates a series of occupancy maps for the multiple time steps, and uses the series of occupancy maps to determine occupancy predictions for one or more future time steps. Each occupancy prediction can identify predicted locations of obstacles in the environment of the robot at a different one of the future time steps. A planned trajectory can be determined for the robot using the occupancy predictions, and the robot initiates travel along the planned trajectory.
  Type: Grant. Filed: May 20, 2021. Date of Patent: May 16, 2023. Assignee: X Development LLC. Inventor: David Millard
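One very simple way to read the occupancy-prediction step is to estimate how the occupied cells moved between the last two maps and extrapolate that shift forward. The constant-velocity, whole-map shift below is an assumption made to keep the sketch short; the abstract itself leaves the prediction model unspecified:

```python
import numpy as np

def estimate_shift(prev_grid, curr_grid):
    """Estimate a single (rows, cols) cell shift from the centroids of occupied cells."""
    prev_cells = np.argwhere(prev_grid)
    curr_cells = np.argwhere(curr_grid)
    if len(prev_cells) == 0 or len(curr_cells) == 0:
        return (0, 0)
    delta = curr_cells.mean(axis=0) - prev_cells.mean(axis=0)
    return (int(round(delta[0])), int(round(delta[1])))

def predict_occupancy(grids, future_steps):
    """grids: list of binary occupancy maps (newest last).
    Returns one predicted map per future time step."""
    shift = estimate_shift(grids[-2], grids[-1])
    predictions = []
    grid = grids[-1]
    for _ in range(future_steps):
        grid = np.roll(grid, shift, axis=(0, 1))
        predictions.append(grid.copy())
    return predictions

if __name__ == "__main__":
    g0 = np.zeros((8, 8), dtype=bool); g0[2, 2] = True
    g1 = np.zeros((8, 8), dtype=bool); g1[2, 3] = True
    for step, pred in enumerate(predict_occupancy([g0, g1], future_steps=2), start=1):
        print(f"t+{step}:", np.argwhere(pred).tolist())
```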
- Patent number: 11642787. Abstract: A method for generating a trajectory of a robot from a first configuration to a second configuration within an environment while steering away from obstacles may include obtaining physical workspace information associated with the environment in which the robot is configured to operate; obtaining, using a first neural network, a set of weights of a second neural network that is configured to generate a set of values associated with a set of configurations of the robot with respect to the second configuration; obtaining, by applying the set of weights to the second neural network, the set of values associated with the set of configurations of the robot with respect to the second configuration; and generating the trajectory of the robot from the first configuration to the second configuration within the environment, based on the set of values.
  Type: Grant. Filed: December 10, 2020. Date of Patent: May 9, 2023. Assignee: SAMSUNG ELECTRONICS CO., LTD. Inventors: Jinwook Huh, Galen Kailun Xing, Ziyun Wang, Ibrahim Volkan Isler, Daniel Dongyuel Lee
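The two-network structure in the abstract, a first network that outputs the weights of a second network which then scores robot configurations, is the hypernetwork pattern. A toy numpy sketch of that wiring; the randomly initialised weights stand in for the trained first network, and the layer sizes are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

WORKSPACE_DIM = 16   # assumed size of the encoded workspace information
CONFIG_DIM = 7       # e.g. joint angles of a 7-DoF arm (assumed)
HIDDEN = 32

# First ("hyper") network: maps workspace features to the second network's weights.
N_WEIGHTS = CONFIG_DIM * HIDDEN + HIDDEN + HIDDEN + 1
H1 = rng.normal(scale=0.1, size=(WORKSPACE_DIM, N_WEIGHTS))

def second_network_weights(workspace_features):
    flat = np.tanh(workspace_features @ H1)
    i = 0
    W1 = flat[i:i + CONFIG_DIM * HIDDEN].reshape(CONFIG_DIM, HIDDEN); i += CONFIG_DIM * HIDDEN
    b1 = flat[i:i + HIDDEN]; i += HIDDEN
    W2 = flat[i:i + HIDDEN].reshape(HIDDEN, 1); i += HIDDEN
    b2 = flat[i:i + 1]
    return W1, b1, W2, b2

def configuration_value(config, weights):
    """Second network: scores one robot configuration with respect to the goal configuration."""
    W1, b1, W2, b2 = weights
    return (np.tanh(config @ W1 + b1) @ W2 + b2).item()

if __name__ == "__main__":
    weights = second_network_weights(rng.normal(size=WORKSPACE_DIM))
    print(configuration_value(rng.normal(size=CONFIG_DIM), weights))
```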
- Patent number: 11642798. Abstract: Disclosed are a method and a system for charging a robot. A method for charging a robot according to an embodiment of the present disclosure includes monitoring a battery level of a first robot which is providing a service, determining a charging robot for charging the first robot, from a plurality of second robots, when a battery level of the first robot falls below a first threshold level, and transmitting an instruction to move to a target position to the determined charging robot, in which determining the charging robot comprises determining the charging robot based at least partly on distances between the first robot and the second robots and battery levels of the second robots. Embodiments of the present disclosure may be implemented by executing an artificial intelligence algorithm and/or a machine learning algorithm in a 5G environment connected for Internet of Things.
  Type: Grant. Filed: January 15, 2020. Date of Patent: May 9, 2023. Assignee: LG ELECTRONICS INC. Inventors: Jae Ho Kwak, Won Hong Jeong
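The selection step lends itself to a compact sketch: among the second robots, keep those with enough spare charge and pick the closest. The spare-charge margin and the pure distance tie-break are assumptions, since the abstract only says the decision is based "at least partly" on distances and battery levels:

```python
import math

def choose_charging_robot(first_robot_pos, second_robots, min_donor_battery=0.6):
    """second_robots: list of dicts with 'id', 'pos' (x, y) and 'battery' (0..1).
    Returns the id of the closest robot that still has enough charge to donate,
    or None if no robot qualifies."""
    candidates = [r for r in second_robots if r["battery"] >= min_donor_battery]
    if not candidates:
        return None
    best = min(candidates, key=lambda r: math.dist(first_robot_pos, r["pos"]))
    return best["id"]

if __name__ == "__main__":
    fleet = [
        {"id": "R2", "pos": (4.0, 1.0), "battery": 0.9},
        {"id": "R3", "pos": (1.0, 1.0), "battery": 0.4},   # too low to donate
        {"id": "R4", "pos": (2.0, 3.0), "battery": 0.7},
    ]
    print(choose_charging_robot((0.0, 0.0), fleet))   # -> "R4"
```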
- Patent number: 11633855. Abstract: To provide a robot controller and a robot control method that do not need a logic command to be associated with a teaching position for a robot, and that are thus capable of executing the logic command at a desired position and a desired timing. A robot controller includes: an operation command interpretation unit that interprets an operation command program describing a teaching operation and a teaching position for a robot, and that generates an operation command; a logic command interpretation unit that interprets a logic command program describing a logic command instructing a machining process to be performed by the robot and an execution position for the logic command, independently from the teaching operation and the teaching position, and that generates the logic command that includes the execution position; and a command execution unit that executes the operation command and the logic command.
  Type: Grant. Filed: September 15, 2020. Date of Patent: April 25, 2023. Assignee: FANUC CORPORATION. Inventor: Takahiro Tanaka
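To illustrate how a logic command carrying its own execution position can be decoupled from the teaching positions, the sketch below steps the robot along its taught path and fires each logic command once the travelled distance passes that command's execution position. Encoding the execution position as distance along the path is an assumption made for the example:

```python
import math

def run_program(teaching_positions, logic_commands, step=0.05):
    """teaching_positions: list of (x, y) waypoints from the operation command program.
    logic_commands: list of (distance_along_path, callable) from the logic command program."""
    pending = sorted(logic_commands, key=lambda c: c[0])
    travelled = 0.0
    for start, end in zip(teaching_positions, teaching_positions[1:]):
        segment = math.dist(start, end)
        moved = 0.0
        while moved < segment:
            moved = min(moved + step, segment)
            # Fire every logic command whose execution position has been reached.
            while pending and pending[0][0] <= travelled + moved:
                pending.pop(0)[1]()
        travelled += segment
    for _, command in pending:     # commands placed at or beyond the path end
        command()

if __name__ == "__main__":
    path = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
    commands = [(0.5, lambda: print("laser on")), (1.5, lambda: print("laser off"))]
    run_program(path, commands)
```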
- Patent number: 11629962. Abstract: Methods and systems for improved positioning accuracy relative to a digital map are disclosed, and which are preferably used for highly and fully automated driving applications, and which may use localisation reference data associated with a digital map. The invention further extends to methods and systems for the generation of localisation reference data associated with a digital map.
  Type: Grant. Filed: March 21, 2022. Date of Patent: April 18, 2023. Assignee: TomTom Global Content B.V. Inventors: Krzysztof Kudrynski, Krzysztof Miksa, Rafal Jan Gliszczynski, Blazej Kubiak