Patents Examined by Jonathan L Sample
-
Patent number: 11911917
Abstract: A mobile robot including a vision system, the vision system including a camera and an illumination system; the illumination system including a plurality of light sources arranged to provide a level of illumination to an area surrounding the mobile robot; and a control system for controlling the illumination system. The control system adjusts the level of illumination provided by the plurality of light sources based on an image captured by the camera; an exposure time of the camera at the time the image was captured; and robot rotation information.
Type: Grant
Filed: May 8, 2019
Date of Patent: February 27, 2024
Assignee: Dyson Technology Limited
Inventors: David Finlay Wyatt, David Andrew Richards, Hossein Farid Ghassem Nia, Christopher Andrew Smith
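The feedback rule this abstract describes could be sketched as a simple brightness controller. The function name, target brightness, gain, and rotation-damping term below are all hypothetical illustration, not the patented method:

```python
def adjust_illumination(current_power, image_mean_brightness, exposure_time_s,
                        rotation_rate_rad_s, target_brightness=128.0,
                        max_exposure_s=0.1, gain=0.2):
    """Return a new light-source power level in [0.0, 1.0].

    Brighten the LEDs only when the camera is already at maximum
    exposure and the image is still too dark; dim them when the image
    is too bright. Rapid rotation blurs frames, so changes are damped
    while the robot is turning.
    """
    error = (target_brightness - image_mean_brightness) / target_brightness
    # Only act when exposure adjustment alone cannot compensate.
    if error > 0 and exposure_time_s < max_exposure_s:
        error = 0.0
    # Damp adjustments while rotating to avoid reacting to motion blur.
    damping = 1.0 / (1.0 + abs(rotation_rate_rad_s))
    new_power = current_power + gain * error * damping
    return min(1.0, max(0.0, new_power))
```

A well-exposed image (mean near target) leaves the power unchanged; a dark image at maximum exposure nudges it up.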
-
Patent number: 11904475
Abstract: A robot and an operating system, a control device, a control method, and a storage medium thereof, wherein the robot includes a processor configured to execute the following operation commands: controlling the robot to move to a designated position corresponding to an interaction scenario, wherein the interaction scenario is a scenario in which the robot interacts with a user; and controlling the robot to perform an operation corresponding to an operation of the user in the interaction scenario. This enables the robot to perform complex interactions with the user based on rich interaction scenarios, solving the prior-art problem that the interaction scenario and the interactive content between the user and the robot are excessively monotonous.
Type: Grant
Filed: March 6, 2019
Date of Patent: February 20, 2024
Assignee: BEIJING MAGIC PAL TECHNOLOGY CO., LTD.
Inventor: Peng Liu
-
Patent number: 11898853
Abstract: The subject disclosure relates to ways to determine a data collection surveillance cadence. In some aspects, a process of the disclosed technology includes steps for receiving historic map data for one or more geographic regions, wherein each of the one or more geographic regions comprises one or more map features, calculating a change rate for each of the one or more map features, and determining a surveillance cadence for each of the one or more geographic regions based on the change rate for each of the one or more map features. Systems and machine-readable media are also provided.
Type: Grant
Filed: March 31, 2020
Date of Patent: February 13, 2024
Assignee: GM Cruise Holdings LLC
Inventors: Chen Xie, Matthew Fox, Austin Bae, Brian Joseph Donohue
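One plausible reading of "cadence from change rates" is an inverse mapping: the faster a region's fastest-changing feature changes, the sooner it is resurveyed. The function below is a hypothetical sketch of that idea (the clamping bounds and units are assumptions, not from the patent):

```python
from datetime import timedelta

def surveillance_cadence(feature_change_rates, base_interval_days=30.0):
    """Map per-feature change rates (changes per day) for a region to a
    resurvey interval: regions whose features change faster are visited
    more often. The result is clamped to [1 day, base interval]."""
    worst = max(feature_change_rates) if feature_change_rates else 0.0
    if worst <= 0:
        return timedelta(days=base_interval_days)
    days = min(base_interval_days, max(1.0, 1.0 / worst))
    return timedelta(days=days)
```

A region with a feature changing every two days gets a two-day cadence; a nearly static region falls back to the base interval.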
-
Patent number: 11890757
Abstract: The present disclosure describes a device, computer-readable medium, and method for providing logistical support for robots. In one example, the method includes receiving, at a centralized support center that is in communication with a plurality of robots, a query from a first robot of the plurality of robots that has been deployed to perform a task, wherein the query indicates an error encountered by the first robot and evidence of the error collected by the first robot, formulating, at the centralized support center, a proposed solution to resolve the error, wherein the formulating comprises soliciting an analysis of the evidence by a party other than the first robot, and delivering, by the centralized support center, the proposed solution to the first robot.
Type: Grant
Filed: March 22, 2021
Date of Patent: February 6, 2024
Assignees: HYUNDAI MOTOR COMPANY, KIA CORPORATION
Inventors: Eric Zavesky, David Crawford Gibbon, Bernard S. Renger, Tan Xu
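The query/solicit/deliver flow could be modeled as a small message-handling loop. The data shape, the `analysts` callables, and the escalation fallback below are hypothetical scaffolding, not the patented design:

```python
from dataclasses import dataclass, field

@dataclass
class RobotQuery:
    robot_id: str
    error_code: str
    evidence: list = field(default_factory=list)  # e.g. image or log references

def handle_query(query, analysts):
    """Hypothetical support-center loop: solicit an analysis of the
    evidence from each party other than the reporting robot, and
    deliver the first proposed solution back to that robot."""
    for analyst in analysts:
        solution = analyst(query.error_code, query.evidence)
        if solution is not None:
            return {"robot_id": query.robot_id, "solution": solution}
    # No analyst could resolve it: fall back to human escalation.
    return {"robot_id": query.robot_id, "solution": "escalate to operator"}
```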
-
Patent number: 11883122
Abstract: Methods and devices are provided for robotic surgery, and in particular for controlling various motions of a tool based on visual indicators. In general, a surgical tool can include an elongate shaft and an end effector coupled to a distal end of the elongate shaft and including first and second jaws. The tool can have at least one visual indicator disposed thereon and configured to indicate a size, position, or speed of movement of the tool or components of the tool.
Type: Grant
Filed: June 7, 2021
Date of Patent: January 30, 2024
Assignee: Cilag GmbH International
Inventors: David C. Yates, Jason L. Harris, Frederick E. Shelton, IV, Chad E. Eckert
-
Patent number: 11883966
Abstract: A method and computing system for performing object detection are presented. The computing system may be configured to: receive first image information that represents at least a first portion of an object structure of an object in a camera's field of view, wherein the first image information is associated with a first camera pose; generate or update, based on the first image information, sensed structure information representing the object structure; identify an object corner associated with the object structure; cause the robot arm to move the camera to a second camera pose in which the camera is pointed at the object corner; receive second image information associated with the second camera pose; update the sensed structure information based on the second image information; determine, based on the updated sensed structure information, an object type associated with the object; and determine one or more robot interaction locations based on the object type.
Type: Grant
Filed: December 9, 2020
Date of Patent: January 30, 2024
Assignee: MUJIN, INC.
Inventors: Xutao Ye, Puttichai Lertkultanon, Rosen Nikolaev Diankov
-
Patent number: 11880293
Abstract: Techniques for capturing and recording processor events and scheduler data in a production system on a per processing resource basis are discussed herein. In some examples, a process metric collection component may be associated with the scheduler and the processing resource such that the process metric collection component can capture real time data associated with the processes or threads both executed by the processing resource and waiting to execute on the processing resource. The captured data may be used by the system to monitor operations.
Type: Grant
Filed: November 26, 2019
Date of Patent: January 23, 2024
Assignee: Zoox, Inc.
Inventors: Austin Hendrix, Andrew Lewis King, Thomas Michael Flanagan
-
Patent number: 11878418
Abstract: A method for controlling at least one effector trajectory of an effector of a robot for solving a predefined task is proposed. A sequence of postures is acquired to modify at least one of a contact constraint topology and an object constraint topology. A set of constraint equations is generated based on at least one of the modified contact constraint topology and the modified object constraint topology. On the generated set of constraint equations, a constraint relaxation is performed to generate a task description including a set of relaxed constraint equations. The at least one effector trajectory is generated by applying a trajectory generation algorithm to the task description. An inverse kinematics algorithm is performed to generate a control signal from the at least one effector trajectory. At least one effector is controlled to execute the at least one effector trajectory based on the generated control signal.
Type: Grant
Filed: March 30, 2021
Date of Patent: January 23, 2024
Assignee: Honda Research Institute Europe GmbH
Inventor: Michael Gienger
-
Patent number: 11878415
Abstract: Systems and methods relating to tactile dexterity and control are disclosed. In one embodiment, a method of manipulating an object based on tactile sensing includes sensing an object by receiving signals from a tactile sensor of an end effector of a robotic system in contact with the object, controlling a contact state by operating the end effector to enforce a desired contact condition between the end effector and the object, estimating a pose of the object based on the received signals, and planning at least one trajectory of the object based on the estimated pose of the object and a desired pose of the object.
Type: Grant
Filed: November 13, 2020
Date of Patent: January 23, 2024
Assignee: Massachusetts Institute of Technology
Inventors: Alberto Rodriguez Garcia, Francois Robert Hogan
-
Patent number: 11874133
Abstract: Certain examples described herein enable a robotic device to accurately map a surrounding environment. The robotic device uses an image capture device, and at least one of the image capture device and the robotic device moves within the environment. Measurements associated with movement of at least one of the image capture device and the robotic device are used to determine a state of the robotic device. The state of the robotic device models the image capture device and the robotic device with respect to a model of the environment that is constructed by a mapping engine. By comparing the state of the robotic device with a measured change in the robotic device, an accurate representation of the state of the robotic device may be constructed. This state is used by the mapping engine to update the model of the environment.
Type: Grant
Filed: July 23, 2021
Date of Patent: January 16, 2024
Assignee: Imperial College Innovations Limited
Inventors: Charles Fletcher Houseago, Michael Bloesch, Stefan Leutenegger
-
Patent number: 11864483
Abstract: One or more information maps are obtained by an agricultural work machine. The one or more information maps map one or more agricultural characteristic values at different geographic locations of a field. An in-situ sensor on the agricultural work machine senses an agricultural characteristic as the agricultural work machine moves through the field. A predictive map generator generates a predictive map that predicts a predictive agricultural characteristic at different locations in the field based on a relationship between the values in the one or more information maps and the agricultural characteristic sensed by the in-situ sensor. The predictive map can be output and used in automated machine control.
Type: Grant
Filed: October 9, 2020
Date of Patent: January 9, 2024
Assignee: Deere & Company
Inventors: Nathan R. Vandike, Bhanu Kiran Reddy Palla, Noel W. Anderson, Stephen R. Corban
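The "relationship" between information-map values and in-situ readings could, in the simplest case, be a fitted linear model that is then applied across the whole map. This is a minimal sketch under that assumption; the patent does not specify the model form, and all names here are illustrative:

```python
def fit_relationship(map_values, sensed_values):
    """Least-squares line sensed ~= a * map_value + b, fitted from the
    locations where both an information-map value and an in-situ
    sensor reading exist."""
    n = len(map_values)
    mx = sum(map_values) / n
    my = sum(sensed_values) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(map_values, sensed_values))
    var = sum((x - mx) ** 2 for x in map_values)
    a = cov / var if var else 0.0
    return a, my - a * mx

def predictive_map(info_map, a, b):
    """Apply the fitted relationship to every location in the
    information map to produce the predictive map."""
    return {loc: a * v + b for loc, v in info_map.items()}
```

Fitting on a few sampled locations and extrapolating to the rest of the field is the core of the predictive-map idea.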
-
Patent number: 11858140
Abstract: A robot system includes a robot, state detection sensors, a timekeeping unit, a learning control unit, a determination unit, an operation device, an input unit, and an additional learning unit. The determination unit determines whether or not the work of the robot can be continued under the control of the learning control unit based on the state values detected by the state detection sensors, and outputs a determination result. The additional learning unit performs additional learning based on the determination result indicating that the work of the robot cannot be continued, the operator operation force and work state output by the operation device and the input unit, and the timer signal output by the timekeeping unit.
Type: Grant
Filed: May 24, 2019
Date of Patent: January 2, 2024
Assignee: KAWASAKI JUKOGYO KABUSHIKI KAISHA
Inventors: Hitoshi Hasunuma, Takuya Shitaka, Takeshi Yamamoto, Kazuki Kurashima
-
Patent number: 11858147
Abstract: A method of detecting a substrate hold hand optical axis deviation includes: acquiring a hand reference turning position at which an ideal optical axis extends in a horizontal first direction; performing a first search processing; performing a second search processing similar to the first search processing on a second target body; and detecting an optical axis inclination from the ideal optical axis based on a difference between the positions detected in the first search processing and the second search processing. A distance in the second direction from the turning axis to the first target body is equal to a distance in the second direction from the turning axis to the second target body. On the hand, an intersecting position between the optical axis and the first target body is different from an intersecting position between the optical axis and the second target body.
Type: Grant
Filed: May 10, 2019
Date of Patent: January 2, 2024
Assignee: KAWASAKI JUKOGYO KABUSHIKI KAISHA
Inventors: Hiroyuki Okada, Masaya Yoshida
-
Patent number: 11850749
Abstract: The present invention provides a modular robot. The modular robot includes at least two unit modules, each unit module includes at least one subunit module, and the subunit module includes at least two connected submodules; the two submodules can be controlled by electrical signals to rotate relative to each other, changing the modular robot's configuration. Each unit module provides at least one docking part, and the unit modules are connected through the docking parts; different docking parts of each unit module carry interface identification information, the interface identification information of every docking part can be recognized, and the position information of a unit module is obtained by recognizing the interface identification information of the docking parts of connected unit modules. The present invention further provides a modular robot system and a modular robot control method. The modular robot has the advantages of simple position-information recognition and a high degree of intelligence.
Type: Grant
Filed: December 17, 2018
Date of Patent: December 26, 2023
Assignee: BEIJING KEYI TECHNOLOGY CO., LTD.
Inventor: Jianbo Yang
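Deriving module positions from recognized docking-part IDs amounts to walking the connection graph and recording the chain of interface IDs traversed to each module. The encoding below (a dict of connections, positions as tuples of part IDs, a root module `m0`) is a hypothetical illustration of that idea:

```python
def infer_positions(connections, root="m0"):
    """connections: {(module, docking_part_id): (neighbor, neighbor_part_id)}.
    Walk outward from a root module, recording each module's position as
    the sequence of docking-part IDs traversed to reach it."""
    positions = {root: ()}
    frontier = [root]
    while frontier:
        module = frontier.pop()
        for (m, part), (nbr, _) in connections.items():
            if m == module and nbr not in positions:
                positions[nbr] = positions[module] + (part,)
                frontier.append(nbr)
    return positions
```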
-
Patent number: 11845188
Abstract: Provided are a method of controlling a mobile robot, an apparatus for supporting the method, and a delivery system using the mobile robot. The method, which is performed by a control apparatus, comprises acquiring a first control value for the mobile robot, which is input through a remote control apparatus, acquiring a second control value for the mobile robot, which is generated by an autonomous driving module, determining a weight for each control value based on a delay between the mobile robot and the remote control apparatus, and generating a target control value for the mobile robot by combining the first control value and the second control value based on the determined weights, wherein a first weight for the first control value and a second weight for the second control value are inversely proportional to each other.
Type: Grant
Filed: October 29, 2021
Date of Patent: December 19, 2023
Assignee: dogugonggan Co., Ltd.
Inventor: Jin Hyo Kim
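One way to realize delay-dependent, inversely related weights is a pair of weights that sum to one, with the autonomous weight growing as link delay grows. The specific weighting curve and `delay_scale_s` parameter below are assumptions for illustration, not the patented formula:

```python
def blend_control(remote_value, autonomous_value, delay_s, delay_scale_s=0.5):
    """As the remote-control link delay grows, shift weight from the
    operator's command toward the autonomous driving module's command.
    The two weights always sum to 1, so one falls as the other rises."""
    w_auto = delay_s / (delay_s + delay_scale_s)   # -> 1 as delay grows
    w_remote = 1.0 - w_auto
    return w_remote * remote_value + w_auto * autonomous_value
```

With zero delay the operator has full authority; at very large delays the target control value converges to the autonomous command.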
-
Patent number: 11839983
Abstract: The present disclosure generally relates to the control of robotic end-effectors in order to manipulate objects. An exemplary method includes updating a classifier based on sensor data obtained at a first time and applying the updated classifier to second sensor data obtained at a second time, to assess status of a robotic end-effector with respect to one or more objects. The method further includes determining a robotic action based on the status assessed and causing a robotic device including the robotic end-effector to perform the robotic action.
Type: Grant
Filed: November 26, 2019
Date of Patent: December 12, 2023
Assignee: Ocado Innovation Limited
Inventors: Ryan John Dick, James Sterling Bergstra, Lavi Shpigelman
-
Patent number: 11833691
Abstract: In one embodiment, a method includes accessing a trajectory plan for a task to be executed by a robotic system, determining actions to constrain the trajectory plan based on information associated with an environment associated with the robotic system, wherein pose-based waypoints and joint positions of the robotic system would be constrained by the actions, determining joint-based waypoints for the trajectory plan based on the pose-based waypoints, and executing the task based on the joint-based waypoints for the trajectory plan.
Type: Grant
Filed: March 30, 2021
Date of Patent: December 5, 2023
Assignee: Samsung Electronics Co., Ltd.
Inventors: Kathleen Sofia Hajash, Brian Harms, Philipp Schoessler, Dane Mason
-
Patent number: 11834108
Abstract: A method for generating torque assist includes determining a model yaw rate value based on a vehicle speed, a steering angle, and a road-friction coefficient value, and determining a differential yaw rate value using a difference between the model yaw rate value and a vehicle yaw rate value. The method also includes determining an updated road-friction coefficient value using at least the differential yaw rate value and generating a torque assist value based on the updated road-friction coefficient value and a model rack force value.
Type: Grant
Filed: November 30, 2020
Date of Patent: December 5, 2023
Assignee: Steering Solutions IP Holding Corporation
Inventors: Rangarajan Ramanujam, Mariam S. George, Anthony J. Champagne
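The model-vs-measured yaw-rate comparison can be sketched with a steady-state bicycle model and a friction-circle cap, feeding a slow estimator update. The bicycle model, gains, and clamping bounds here are textbook-style assumptions for illustration, not the patented computation:

```python
def model_yaw_rate(speed_mps, steer_angle_rad, mu, wheelbase_m=2.7, g=9.81):
    """Steady-state bicycle-model yaw rate, limited by the friction
    circle: |yaw_rate| <= mu * g / speed."""
    if speed_mps <= 0.0:
        return 0.0
    ideal = speed_mps * steer_angle_rad / wheelbase_m
    limit = mu * g / speed_mps
    return max(-limit, min(limit, ideal))

def update_friction(mu, model_rate, measured_rate, gain=0.05):
    """Shrink the friction estimate when the vehicle yaws less than the
    model predicts (a slip indication); let it recover otherwise.
    The estimate is clamped to a plausible [0.1, 1.0] range."""
    diff = abs(model_rate) - abs(measured_rate)
    return max(0.1, min(1.0, mu - gain * diff))
```

The updated coefficient would then scale the model rack force when computing the torque assist value.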
-
Patent number: 11833682
Abstract: A robot including a manipulator includes: an image-pickup acquisition unit configured to acquire an image of an environmental space including a target object to be grasped; and a control unit configured to control a motion performed by the robot. The control unit causes the robot to acquire, by the image-pickup acquisition unit, a plurality of information pieces of the target object to be grasped while moving the robot toward the target object to be grasped, calculates, for each of the information pieces, a grasping position of the target object to be grasped and an index of certainty of the grasping position by using a learned model, and attempts to grasp, by moving the manipulator, the target object to be grasped at a grasping position selected based on a result of the calculation.
Type: Grant
Filed: December 11, 2019
Date of Patent: December 5, 2023
Assignee: TOYOTA JIDOSHA KABUSHIKI KAISHA
Inventors: Takuya Ikeda, Koji Terada
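Selecting "based on a result of the calculation" could be as simple as keeping the candidate with the highest certainty index accumulated during the approach. This tiny sketch assumes that argmax selection rule, which the abstract does not spell out:

```python
def select_grasp(candidates):
    """candidates: list of (grasp_position, certainty) pairs computed by
    the learned model from successive observations during the approach.
    Return the position with the highest certainty index, or None if
    nothing was observed."""
    if not candidates:
        return None
    position, _ = max(candidates, key=lambda c: c[1])
    return position
```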
-
Patent number: 11835961
Abstract: Provided is a robot, including: a main brush; a peripheral brush; a first actuator; a first sensor; one or more processors; and memory storing instructions that when executed by at least some of the one or more processors effectuate operations including: determining a first location of the robot; obtaining first data indicative of an environmental characteristic of the first location; adjusting a first operational parameter of the first actuator based on the sensed first data; and forming or updating a debris map of the working environment based on data output by the first sensor or another sensor configured to collect data indicative of an existence of debris on a floor, wherein the debris map at least indicates areas covered by the robot and areas with a high level of debris accumulation; and an application of a communication device paired with the robot and configured to at least display the debris map.
Type: Grant
Filed: August 27, 2020
Date of Patent: December 5, 2023
Assignee: AI Incorporated
Inventors: Ali Ebrahimi Afrouzi, Scott McDonald, Masoud Nasiri Sarvi