Programmed Data (e.g., Path) Modified By Sensed Data Patents (Class 700/253)
  • Patent number: 12251048
    Abstract: A system is disclosed for automatically preparing meals according to a selected recipe.
    Type: Grant
    Filed: August 2, 2018
    Date of Patent: March 18, 2025
    Assignee: CHEF JASPER INC.
    Inventor: Gunnar Grass
  • Patent number: 12240113
    Abstract: Implementations utilize deep reinforcement learning to train a policy neural network that parameterizes a policy for determining a robotic action based on a current state. Some of those implementations collect experience data from multiple robots that operate simultaneously. Each robot generates instances of experience data during iterative performance of episodes that are each explorations of performing a task, and that are each guided based on the policy network and the current policy parameters for the policy network during the episode. The collected experience data is generated during the episodes and is used to train the policy network by iteratively updating policy parameters of the policy network based on a batch of collected experience data. Further, prior to performance of each of a plurality of episodes performed by the robots, the current updated policy parameters can be provided (or retrieved) for utilization in performance of the episode.
    Type: Grant
    Filed: December 1, 2023
    Date of Patent: March 4, 2025
    Assignee: GOOGLE LLC
    Inventors: Sergey Levine, Ethan Holly, Shixiang Gu, Timothy Lillicrap
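A toy sketch of the collect-then-update loop described in the abstract of US 12240113 above. Everything concrete below is an assumption made for illustration: a one-dimensional point-mass "robot", a linear Gaussian policy, and a plain REINFORCE update stand in for the real robots, the policy neural network, and whatever deep RL algorithm the patent's implementations use; only the structure (robots generate episodes with the current parameters, a learner updates the shared parameters from batches of experience) mirrors the abstract.

```python
"""Hedged sketch of collect-then-update policy training (not the patent's algorithm)."""
import numpy as np

rng = np.random.default_rng(0)
STATE_DIM, ACTION_DIM, SIGMA = 2, 1, 0.3
policy_params = np.zeros((ACTION_DIM, STATE_DIM))   # shared, iteratively updated


def run_episode(params, steps=20):
    """One robot performs an episode guided by the current policy parameters."""
    s = rng.normal(size=STATE_DIM)          # [position, velocity] of a toy robot
    experience = []
    for _ in range(steps):
        a = params @ s + SIGMA * rng.normal(size=ACTION_DIM)   # exploratory action
        r = -float(s[0] ** 2)                                   # reward: stay near 0
        experience.append((s.copy(), a.copy(), r))
        s = np.array([s[0] + 0.1 * s[1], s[1] + 0.1 * a[0]])    # toy dynamics
    return experience


def reinforce_update(params, batch, lr=1e-3):
    """Update the shared policy parameters from a batch of collected experience."""
    grad = np.zeros_like(params)
    for episode in batch:
        ret = sum(r for _, _, r in episode)
        for s, a, _ in episode:
            grad += ret * np.outer((a - params @ s) / SIGMA ** 2, s)
    return params + lr * grad / len(batch)


for iteration in range(50):
    # Several robots collect experience (sequentially here, simultaneously in the
    # patent), each retrieving the current policy parameters before its episode.
    batch = [run_episode(policy_params) for _robot in range(4)]
    policy_params = reinforce_update(policy_params, batch)

print("learned gains:", policy_params)
```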
  • Patent number: 12203773
    Abstract: Autonomous ground vehicles that are outfitted with onboard cameras and configured to travel within an area or region are programmed with a previously generated visual map of the area or region. The visual map includes a plurality of map points from which images were previously captured and processed to determine visual features of such images. In the event of a loss of one or more position signals, such as GPS signals, the autonomous ground vehicles capture images using one or more of the cameras, and process the images to determine visual features from such images. The visual features determined from such images are compared to the visual map. Positions of one or more of the map points are identified as most likely corresponding to a position of the autonomous ground vehicle where visual features associated with such map points are similar to the visual features determined from such images.
    Type: Grant
    Filed: June 29, 2022
    Date of Patent: January 21, 2025
    Assignee: Amazon Technologies, Inc.
    Inventors: You-Yi Jau, Aziz Umit Batur, Chen Zhang
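A minimal sketch of the GPS-outage fallback described for US 12203773 above: features extracted from a freshly captured image are compared against features stored for each map point, and the best-matching map point's position is taken as the most likely vehicle position. The 128-dimensional descriptors, the cosine-similarity comparison, and the random "map" are illustrative assumptions, not the patent's feature representation.

```python
"""Hedged sketch of visual-map fallback localization (feature form is assumed)."""
import numpy as np

rng = np.random.default_rng(1)

# Visual map: map-point positions (x, y) plus one feature vector per point.
map_points = rng.uniform(0, 100, size=(50, 2))
map_features = rng.normal(size=(50, 128))
map_features /= np.linalg.norm(map_features, axis=1, keepdims=True)


def localize_from_image(image_features: np.ndarray) -> tuple[np.ndarray, float]:
    """Return the map-point position whose stored features best match the image."""
    f = image_features / np.linalg.norm(image_features)
    similarity = map_features @ f            # cosine similarity to every map point
    best = int(np.argmax(similarity))
    return map_points[best], float(similarity[best])


# Simulate a GPS outage: the vehicle is near map point 17 and observes features
# resembling that point's stored descriptor.
observed = map_features[17] + 0.1 * rng.normal(size=128)
position, score = localize_from_image(observed)
print(f"most likely position {position} (similarity {score:.2f})")
```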
  • Patent number: 12193429
    Abstract: An autonomous ground vehicle for agricultural plant and soil management operations. According to some embodiments, the autonomous ground vehicle includes: a camera unit configured to generate images of agricultural ground soil and plant organisms; a first mechanical arm having an end effector comprising a hoe portion and an electrode portion; a second mechanical arm having an end effector comprising an electrode portion; a high voltage booster electrically connected to the electrode portions; an electronic memory storage medium comprising computer-executable instructions; and one or more processors in electronic communication with the electronic memory storage medium, configured to execute the computer-executable instructions stored in the electronic memory storage medium for implementing a plant species control management operation comprising electrical control and mechanical control options.
    Type: Grant
    Filed: July 14, 2023
    Date of Patent: January 14, 2025
    Assignee: AIGEN INC.
    Inventor: Richard Theodore Wurden
  • Patent number: 12186134
    Abstract: An instrument drive unit includes a housing defining a central longitudinal axis; an inertial measurement unit disposed within the housing and configured to determine a pose of the instrument drive unit; and a controller disposed within the housing, the controller configured to receive the pose of the instrument drive unit from the inertial measurement unit and to generate a corrected output signal which compensates for the pose of the instrument drive unit.
    Type: Grant
    Filed: March 10, 2020
    Date of Patent: January 7, 2025
    Assignee: Covidien LP
    Inventors: Jaimeen V. Kapadia, Richard S. Lech
  • Patent number: 12172667
    Abstract: In various examples, a 3D surface structure such as the 3D surface structure of a road (3D road surface) may be observed and estimated to generate a 3D point cloud or other representation of the 3D surface structure. Since the estimated representation may be sparse, a deep neural network (DNN) may be used to predict values for a dense representation of the 3D surface structure from the sparse representation. For example, a sparse 3D point cloud may be projected to form a sparse projection image (e.g., a sparse 2D height map), which may be fed into the DNN to predict a dense projection image (e.g., a dense 2D height map). The predicted dense representation of the 3D surface structure may be provided to an autonomous vehicle drive stack to enable safe and comfortable planning and control of the autonomous vehicle.
    Type: Grant
    Filed: October 28, 2021
    Date of Patent: December 24, 2024
    Assignee: NVIDIA Corporation
    Inventors: Kang Wang, Yue Wu, Minwoo Park, Gang Pan
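The projection step in US 12172667 above (sparse 3-D points to a sparse 2-D height map) is easy to sketch; the DNN that predicts the dense map is replaced below by a trivial mean-fill placeholder, since the abstract does not specify an architecture. The grid size and cell resolution are assumed values.

```python
"""Sparse point cloud -> sparse 2-D height map; DNN densification is mocked."""
import numpy as np

GRID, CELL = 64, 0.5     # 64x64 cells, 0.5 m per cell (assumed values)


def to_sparse_height_map(points: np.ndarray) -> np.ndarray:
    """Project (x, y, z) points into a top-down height map; empty cells are NaN."""
    hmap = np.full((GRID, GRID), np.nan)
    ix = np.clip((points[:, 0] / CELL).astype(int), 0, GRID - 1)
    iy = np.clip((points[:, 1] / CELL).astype(int), 0, GRID - 1)
    for x, y, z in zip(ix, iy, points[:, 2]):
        hmap[y, x] = z if np.isnan(hmap[y, x]) else max(z, hmap[y, x])
    return hmap


def densify_placeholder(sparse: np.ndarray) -> np.ndarray:
    """Stand-in for the DNN: fill empty cells with the mean of observed heights."""
    dense = sparse.copy()
    dense[np.isnan(dense)] = np.nanmean(sparse)
    return dense


rng = np.random.default_rng(2)
cloud = np.column_stack([rng.uniform(0, 32, 500), rng.uniform(0, 32, 500),
                         0.02 * rng.normal(size=500)])
dense_map = densify_placeholder(to_sparse_height_map(cloud))
print("dense height map shape:", dense_map.shape)
```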
  • Patent number: 12172309
    Abstract: Training and/or using a machine learning model for locomotion control of a robot, where the model is decoupled. In many implementations, the model is decoupled into an open loop component and a feedback component, where a user can provide a desired reference trajectory (e.g., a symmetric sine curve) as input for the open loop component. In additional and/or alternative implementations, the model is decoupled into a pattern generator component and a feedback component, where a user can provide controlled parameter(s) as input for the pattern generator component to generate pattern generator phase data (e.g., an asymmetric sine curve). The neural network model can be used to generate robot control parameters.
    Type: Grant
    Filed: April 22, 2019
    Date of Patent: December 24, 2024
    Assignee: GOOGLE LLC
    Inventors: Jie Tan, Tingnan Zhang, Atil Iscen, Erwin Coumans, Yunfei Bai
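A sketch of the decoupling in US 12172309 above: a user-supplied open-loop reference (a symmetric sine curve, as in the abstract's example) is combined with a feedback component that corrects toward it. The single unit-inertia joint and the PD feedback law are assumptions; in the patent the feedback component is learned.

```python
"""Decoupled open-loop + feedback locomotion control (toy joint, PD feedback assumed)."""
import math

KP, KD, DT = 8.0, 0.5, 0.01


def open_loop(t: float, amplitude: float = 0.4, freq_hz: float = 1.5) -> float:
    """User-specified reference: a symmetric sine curve for the joint angle."""
    return amplitude * math.sin(2.0 * math.pi * freq_hz * t)


def feedback(q: float, dq: float, q_ref: float) -> float:
    """Feedback component (a PD law here; the patent trains this part)."""
    return KP * (q_ref - q) - KD * dq


q, dq = 0.0, 0.0                     # joint angle and velocity
for step in range(300):
    t = step * DT
    q_ref = open_loop(t)
    torque = feedback(q, dq, q_ref)  # feedback drives the joint toward the open-loop reference
    ddq = torque                     # unit-inertia toy joint dynamics
    dq += ddq * DT
    q += dq * DT

print(f"final tracking error: {abs(q - open_loop(300 * DT)):.3f} rad")
```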
  • Patent number: 12147242
    Abstract: Systems and methods are provided for crowdsourcing a sparse map for autonomous vehicle navigation. In one implementation, a non-transitory computer-readable medium may include a sparse map for autonomous vehicle navigation along a road segment. The sparse map may include at least one line representation of a road surface feature extending along the road segment and a plurality of landmarks associated with the road segment, each line representation representing a path along the road segment substantially corresponding with the road surface feature, wherein the road surface feature is identified through image analysis of a plurality of images acquired as one or more vehicles traverse the road segment.
    Type: Grant
    Filed: June 16, 2021
    Date of Patent: November 19, 2024
    Assignee: MOBILEYE VISION TECHNOLOGIES LTD.
    Inventor: Ofer Fridman
  • Patent number: 12138800
    Abstract: A program generation device includes a display; at least one memory configured to store an operation symbol including information in relation to an operation command of a robot, and an auxiliary symbol including information in relation to a control command for adding an operation of the robot or for correcting the operation of the robot defined by at least one operation symbol; and at least one processor configured to obtain information in relation to setting of at least one of the operation symbol or the auxiliary symbol, and cause the display to display the operation symbol and the auxiliary symbol so as to align the operation symbol and the auxiliary symbol in order of operations of the robot based on the obtained information in relation to setting.
    Type: Grant
    Filed: June 26, 2023
    Date of Patent: November 12, 2024
    Assignee: FANUC CORPORATION
    Inventors: Yuusuke Kurihara, Gou Inaba
  • Patent number: 12099371
    Abstract: Embodiments of the disclosure provide methods and systems for continuous regulation of a nonholonomic mobile robot. An exemplary method may include identifying a current pose of the nonholonomic mobile robot in a world frame, where the current pose is represented by a first set of values defining a first set of states of the nonholonomic mobile robot in the world frame; receiving a final goal pose of the nonholonomic mobile robot, where the final goal pose is represented by a second set of values defining a second set of states of nonholonomic mobile robot in the world frame; determining a moving path for moving the nonholonomic mobile robot from the current pose to the final goal pose; and controlling the nonholonomic mobile robot to move from the current pose to the final goal pose according to the moving path, where the nonholonomic mobile robot moves to the final goal pose by converging the nonholonomic mobile robot from the first set of states to the second set of states simultaneously.
    Type: Grant
    Filed: June 4, 2021
    Date of Patent: September 24, 2024
    Assignee: UBKANG (QINGDAO) TECHNOLOGY CO., LTD.
    Inventors: Dejun Guo, Huan Tan
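The abstract of US 12099371 above describes converging all of a nonholonomic robot's states to the goal pose simultaneously. The classic polar-coordinate (rho, alpha, beta) regulator below illustrates that behavior on unicycle kinematics; the specific control law and gains are textbook material used here as an assumption, not the patent's controller.

```python
"""Simultaneous pose convergence for a unicycle (classic polar regulator, assumed)."""
import math

K_RHO, K_ALPHA, K_BETA = 1.0, 3.0, -1.5   # stability needs K_RHO>0, K_BETA<0, K_ALPHA>K_RHO
DT = 0.01


def wrap(angle: float) -> float:
    return (angle + math.pi) % (2.0 * math.pi) - math.pi


def regulate(pose, goal):
    """Drive (x, y, theta) toward the goal pose; all three states converge together."""
    x, y, th = pose
    gx, gy, gth = goal
    for _ in range(5000):
        dx, dy = gx - x, gy - y
        rho = math.hypot(dx, dy)
        if rho < 0.05:                          # position reached; stop before numerical noise
            break
        alpha = wrap(math.atan2(dy, dx) - th)   # heading error toward the goal
        beta = wrap(gth - math.atan2(dy, dx))   # goal-orientation error
        v = K_RHO * rho
        w = K_ALPHA * alpha + K_BETA * beta
        x += v * math.cos(th) * DT              # unicycle (nonholonomic) kinematics
        y += v * math.sin(th) * DT
        th = wrap(th + w * DT)
    return x, y, th


print("final pose:", regulate((0.0, 0.0, 0.0), (4.0, 3.0, math.pi / 2)))
```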
  • Patent number: 12091981
    Abstract: An insertion tool is provided for an engine defining an access opening and including a component defining at least in part a cavity. The insertion tool includes: an insertion tool arm having a plurality of segments, the insertion tool arm configured for insertion through the access opening into the cavity and the plurality of segments configured to be in a fixed position relative to one another within the cavity; and a base coupled to the insertion tool arm and configured to be positioned outside the cavity and to move the insertion tool arm along at least two degrees of freedom.
    Type: Grant
    Filed: June 11, 2020
    Date of Patent: September 17, 2024
    Assignee: General Electric Company
    Inventors: Julian Matthew Foxall, Trevor Owen Hawke, Andrew Crispin Graham, Chiara Mellucci, Todd William Danko, Ambarish Jayant Kulkarni, Michael Dean Fullington, Margeaux Wallace
  • Patent number: 12087102
    Abstract: Systems and techniques for detecting and locating low impact collisions, such as relatively low energy impact collisions involving an autonomous vehicle, are disclosed herein. Sensor data from audio sensors configured on a vehicle may be used to determine whether a threshold number of such sensors have detected sounds associated with a collision. Scores determined for the sensors based on audio energy and confidence values can be used to determine a location estimate for the detected collision.
    Type: Grant
    Filed: December 17, 2021
    Date of Patent: September 10, 2024
    Assignee: Zoox, Inc.
    Inventors: Mark Alan Bates, Venkata Subrahmanyam Chandra Sekhar Chebiyyam, Nam Gook Cho, Subhasis Das, Shaminda Subasingha, Xuan Zhong
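A hedged sketch of the logic described for US 12087102 above: require a threshold number of audio sensors to report collision-like energy, then estimate the impact location from per-sensor scores. The score (energy times confidence) and the score-weighted centroid are illustrative assumptions; the abstract only says scores based on audio energy and confidence values are used.

```python
"""Audio-based low-impact collision detection and location estimate (scoring assumed)."""
import numpy as np

SENSOR_POSITIONS = np.array([[2.0, 1.0], [2.0, -1.0], [-2.0, 1.0], [-2.0, -1.0]])
DETECTION_THRESHOLD = 2      # minimum number of sensors that must agree
ENERGY_FLOOR = 0.2           # below this audio energy a sensor is ignored


def detect_collision(energies: np.ndarray, confidences: np.ndarray):
    """Return (collision?, estimated impact location) from per-sensor readings."""
    triggered = energies > ENERGY_FLOOR
    if int(triggered.sum()) < DETECTION_THRESHOLD:
        return False, None
    scores = energies[triggered] * confidences[triggered]
    location = (scores[:, None] * SENSOR_POSITIONS[triggered]).sum(0) / scores.sum()
    return True, location


# Example: a scrape near the front-right corner excites the two front sensors.
energy = np.array([0.9, 0.6, 0.05, 0.04])
confidence = np.array([0.95, 0.8, 0.3, 0.2])
hit, where = detect_collision(energy, confidence)
print("collision detected:", hit, "estimated location:", where)
```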
  • Patent number: 12078972
    Abstract: A probabilistic feedback controller for controlling an operation of a robotic system using a probabilistic filter subject to a structural constraint on an operation of the robotic system is configured to execute a probabilistic filter that estimates a distribution of a current state of the robotic system given a previous state of the robotic system based on a motion model of the robotic system perturbed by stochastic process noise and a measurement model of the robotic system perturbed by stochastic measurement noise having an uncertainty modeled as a time-varying Gaussian process represented as a weighted combination of time-varying basis functions with weights defined by corresponding Gaussian distributions. The probabilistic filter recursively updates both the distribution of the current state of the robotic system and the Gaussian distributions of the weights of the basis functions selected to satisfy the structural constraint indicated by measurements of the state of a robotic system.
    Type: Grant
    Filed: March 12, 2022
    Date of Patent: September 3, 2024
    Assignee: Mitsubishi Electric Research Laboratories, Inc.
    Inventors: Karl Berntorp, Marcel Menner
  • Patent number: 12061484
    Abstract: A system including an inspection robot having a plurality of sensors, a further sensor, and a controller. The controller having circuitry to receive inspection data with a first resolution from the plurality of sensors, determine a characteristic on the inspection surface based on the inspection data, and provide an inspection operation adjustment in response to the characteristic, wherein the inspection operation adjustment includes a change from the first resolution to a second resolution. The change from the first resolution to the second resolution includes enabling the further sensor where the further sensor is at least one of: horizontally distributed with or vertically displaced from the plurality of sensors relative to a travel path of the plurality of sensors, and at least one of: offset in alignment from the travel path of the plurality of sensors, or operated out of phase with the plurality of sensors.
    Type: Grant
    Filed: April 19, 2023
    Date of Patent: August 13, 2024
    Assignee: Gecko Robotics, Inc.
    Inventors: Mark Loosararian, Joshua Moore, Yizhu Gu, Kevin Low, Edward Bryner, Logan MacKenzie, Ian Miller, Alvin Chou, Todd Joslin
  • Patent number: 12030188
    Abstract: The present invention provides a collaborative robot system in which a robot and a worker share tasks and perform the tasks. The collaborative robot system includes: a work-time-measurement-unit that measures work time for locations assigned to the worker and the robot; a difference-at-increase/decrease-prediction-unit that predicts a difference between the work time of the worker and the robot when the number of locations assigned to the worker is increased or decreased; an assigned-location-adjustment-unit that increases or decreases the number of locations so that the difference in the work time becomes smaller, in a case in which the predicted difference value when the number of locations assigned to the worker is maintained is greater than the predicted difference value when it is increased or decreased; and an assigned-location-indication-unit that indicates the locations assigned to the worker after the number of assigned locations is increased or decreased.
    Type: Grant
    Filed: April 9, 2020
    Date of Patent: July 9, 2024
    Assignee: FANUC CORPORATION
    Inventor: Ichiro Kanno
  • Patent number: 12032381
    Abstract: An accompanying mobile body is provided in which a response to an abrupt change in a traveling direction of a user is kept from being delayed. A movement state recognition section recognizes a moving direction and a moving speed of the user. A predicted location calculation section calculates a predicted location of the user after a first predetermined time period based on the moving direction and the moving speed of the user. An accompanying control section performs accompanying control to cause the accompanying mobile body to move, by a traveling unit, toward a target location of accompanying that is a location apart from the predicted location by a specified distance in a specified direction.
    Type: Grant
    Filed: January 22, 2019
    Date of Patent: July 9, 2024
    Assignee: HONDA MOTOR CO., LTD.
    Inventors: Toru Kawai, Hiroki Mukai, Hiroto Takahashi
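The target computation in US 12032381 above reduces to two steps: predict the user's location after a first predetermined time period from their moving direction and speed, then offset that prediction by a specified distance in a specified direction. The sketch below assumes a 1 s horizon and a 1.2 m offset to the user's left purely for illustration.

```python
"""Accompanying target = predicted user location + fixed offset (values assumed)."""
import numpy as np

HORIZON_S = 1.0              # "first predetermined time period"
OFFSET_DISTANCE = 1.2        # specified distance from the predicted location
OFFSET_ANGLE = np.pi / 2     # specified direction: 90 deg left of the user's heading


def accompany_target(user_pos, user_heading, user_speed):
    """Target location of accompanying, derived from the user's predicted location."""
    heading_vec = np.array([np.cos(user_heading), np.sin(user_heading)])
    predicted = np.asarray(user_pos) + user_speed * HORIZON_S * heading_vec
    offset_dir = np.array([np.cos(user_heading + OFFSET_ANGLE),
                           np.sin(user_heading + OFFSET_ANGLE)])
    return predicted + OFFSET_DISTANCE * offset_dir


# User walking along +x at 1.4 m/s: the robot aims ~1.4 m ahead and 1.2 m to the left.
print(accompany_target(user_pos=(0.0, 0.0), user_heading=0.0, user_speed=1.4))
```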
  • Patent number: 12025710
    Abstract: A machine vision device is provided usable in conjunction with power equipment machines. By way of example, an array of sensors can be deployed to detect proximity of objects to the power equipment machine, and issue an alert in response to detecting an object within a threshold distance. The alert can be utilized by the power equipment machine to take corrective action to mitigate or avoid running over or striking the object. Sensors having respective fields of view can be arranged along an arc to facilitate machine vision of a spatial volume in a proximity of the machine, and ranging determinations can be coupled with low cost processing devices to facilitate a machine vision solution far more cost effective than other technologies in the art.
    Type: Grant
    Filed: November 10, 2021
    Date of Patent: July 2, 2024
    Assignee: MTD PRODUCTS INC
    Inventors: Charles Hart, Jeff Kucera
  • Patent number: 12023814
    Abstract: The present technology relates to a mobile object and a control method that enable improvement of work efficiency. The mobile object controls a drive system that performs maintenance work in accordance with a maintenance plan generated in response to a work cost related to main work according to a work plan, or related to the maintenance work for maintenance of a work mobile object that performs the main work. The present technology can be applied to, for example, a maintenance mobile object that performs maintenance work for maintenance of a work mobile object that performs main work.
    Type: Grant
    Filed: July 12, 2019
    Date of Patent: July 2, 2024
    Assignee: SONY CORPORATION
    Inventor: Norifumi Kikkawa
  • Patent number: 12016519
    Abstract: Disclosed are a robot cleaner and a method for controlling same. The robot cleaner comprises: a travel unit for moving a main body; a communication unit for communicating with a remote control device by using ultra-wideband signals; and a control unit which, in response to a first optical signal emitted from the remote control device to the main body, calculates the position of the remote control device by using the ultra-wideband signals that are output to the communication unit. Also, the control unit recognizes, in response to a second optical signal emitted from the remote control device directly after the first optical signal, the position of a target point corresponding to the second optical signal and calculated on the basis of the determined position of the remote control device, and generates a travel command to move the main body to the recognized position of the target point.
    Type: Grant
    Filed: July 22, 2019
    Date of Patent: June 25, 2024
    Assignee: LG ELECTRONICS INC.
    Inventors: Donghoon Kwak, Jaehwan Ko, Hyukdo Kweon
  • Patent number: 11951625
    Abstract: A control method for a robot includes a first working step of executing first work on a first working object by operating a robot arm by force control based on a predetermined position command value, a first memory step of storing first position information of a trajectory in which a control point set for the robot arm passes at the first working step, and a second working step of updating a position command value for the robot arm based on the first position information stored at the first memory step, and executing second work on a second working object by operating the robot arm by the force control based on an updated value as the updated position command value.
    Type: Grant
    Filed: June 29, 2021
    Date of Patent: April 9, 2024
    Assignee: SEIKO EPSON CORPORATION
    Inventor: Nobuhiro Karito
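A toy sketch of the two working steps in US 11951625 above: run force control against the first workpiece while recording the trajectory the control point actually follows, then use the recorded result as the updated position command for the second workpiece. The one-dimensional "surface", the admittance-style force response, and all constants are assumptions.

```python
"""Record trajectory under force control, then reuse it as the position command (toy model)."""

COMMANDED_DEPTH = 10.0        # nominal position command (mm)
SURFACE_AT = 9.2              # where the first workpiece actually is (unknown a priori)
STIFFNESS, COMPLIANCE, DT = 50.0, 0.002, 0.01


def force_controlled_pass(command: float, surface: float) -> list[float]:
    """First working step: follow the command but yield to the contact force."""
    z, trajectory = 0.0, []
    for _ in range(400):
        contact_force = max(0.0, z - surface) * STIFFNESS       # only when touching
        z += (command - z) * 0.05 - COMPLIANCE * contact_force  # admittance-like step
        trajectory.append(z)                                    # first memory step
    return trajectory


recorded = force_controlled_pass(COMMANDED_DEPTH, SURFACE_AT)
updated_command = recorded[-1]          # updated position command from the first pass
print(f"original command {COMMANDED_DEPTH} mm -> updated command {updated_command:.2f} mm")
# The second working step would now run force control around `updated_command`,
# starting much closer to the real surface of the second workpiece.
```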
  • Patent number: 11953888
    Abstract: A production cell includes: at least one robot arranged to handle products; at least one buffer area for intermediate storage of products inside the production cell; a vision system with cameras arranged to determine, based on images from the cameras, the identity and the location of objects in the production cell; a plurality of production modules, each production module comprising at least one Hardware Module configured to process products; and a plurality of module attachment locations, each module attachment location being configured to connect with an interface section of a production module through at least a physical connection and a power connection.
    Type: Grant
    Filed: October 25, 2018
    Date of Patent: April 9, 2024
    Assignee: Festo SE & Co. KG
    Inventors: Alfons Riek, Curt-Michael Stoll, Hans Klingel, Marcel Aeschlimann, Samuel Malzach, Christian Schmid, Christoph Berger, Judith Wimmer, Ivo Aschwanden, Kilian Iannucci, Alexandra Krause, Markus Andreas Müller, Martin Helmer, Peter Barmettler
  • Patent number: 11934196
    Abstract: A lawn vehicle network includes a charging station having a visual identifier, a lawn vehicle having a battery, a blade system, a drive system whose output effects lawn vehicle forward movement, a processor board connected to both systems, the processor board capable of processing image data and sending commands to both systems, and a vision assembly connected to the processor board and able to transmit image data to the processor board, and the processor board, having received the image data, able to, if the image data represent a first object, maintain the drive system's output at the time of that determination, if the image data represent a second object, change the drive system's output at the time of that determination, and if the image data represent the visual identifier, maintain the drive system's output or send a shutoff command to the vision assembly at the time of that determination.
    Type: Grant
    Filed: December 13, 2019
    Date of Patent: March 19, 2024
    Assignee: Hydro-Gear Limited Partnership
    Inventors: Damon J. Hoyda, Eric S. Phanco, John Tyler Hibbard, David H. Dunten
  • Patent number: 11906966
    Abstract: A robot detects, through a sensor, the location and movement direction of a user and an object near the user, sets a nearby ground area in front at the feet of the user according to the detected location and movement direction of the user, controls an illumination device in the robot to irradiate the nearby ground area with light while driving at least one pair of legs or wheels of the robot to cause the robot to accompany the user, specifies the type and the location of the detected object, and if the object is a dangerous object and is located ahead of the user, controls the illumination device to irradiate a danger area including at least a portion of the dangerous object with light in addition to irradiating the nearby ground area with light.
    Type: Grant
    Filed: October 21, 2022
    Date of Patent: February 20, 2024
    Assignee: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD.
    Inventor: Hiroshi Yahata
  • Patent number: 11886190
    Abstract: A robot detects, through a sensor, the location and movement direction of a user and an object near the user, sets a nearby ground area in front at the feet of the user according to the detected location and movement direction of the user, controls an illumination device in the robot to irradiate the nearby ground area with light while driving at least one pair of legs or wheels of the robot to cause the robot to accompany the user, specifies the type and the location of the detected object, and if the object is a dangerous object and is located ahead of the user, controls the illumination device to irradiate a danger area including at least a portion of the dangerous object with light in addition to irradiating the nearby ground area with light.
    Type: Grant
    Filed: February 3, 2023
    Date of Patent: January 30, 2024
    Assignee: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD.
    Inventor: Hiroshi Yahata
  • Patent number: 11858141
    Abstract: An impedance control method as well as a controller and a robot using the same are provided. The method includes: obtaining joint motion information and joint force information in the joint space of a robotic arm and an actual interaction force acting on an end-effector, and calculating actual motion information of the end-effector in the task space based on the joint motion information; calculating a corrected desired trajectory using environment information and a desired end-effector interaction force, and calculating the impedance control torque based on the joint force information, the actual interaction force, the actual motion information, and desired end-effector information including the corrected desired trajectory and determining a compensation torque based on a nonlinear term in a constructed dynamics equation so as to perform a joint torque control on the robotic arm based on the impedance control torque and the compensation torque.
    Type: Grant
    Filed: August 23, 2021
    Date of Patent: January 2, 2024
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Liqun Huang, Xiaoyu Ren, Mingguo Zhao, Youjun Xiong
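The torque composition described in US 11858141 above (an impedance control torque computed from desired end-effector information plus a compensation torque from a nonlinear term of the dynamics) can be sketched for a planar two-link arm. The arm model, the spring-damper task-space law, and using gravity as the only "nonlinear term" are simplifying assumptions; the patent's formulation also corrects the desired trajectory from environment information, which is omitted here.

```python
"""Task-space impedance torque + gravity compensation for a toy 2-link planar arm."""
import numpy as np

K = np.diag([200.0, 200.0])     # task-space stiffness (assumed)
D = np.diag([30.0, 30.0])       # task-space damping (assumed)
LINK = (0.4, 0.3)               # link lengths of the toy planar arm
MASS, G = (1.0, 1.0), 9.81      # point masses at the link ends


def forward_kinematics(q):
    x = LINK[0] * np.cos(q[0]) + LINK[1] * np.cos(q[0] + q[1])
    y = LINK[0] * np.sin(q[0]) + LINK[1] * np.sin(q[0] + q[1])
    return np.array([x, y])


def jacobian(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-LINK[0] * s1 - LINK[1] * s12, -LINK[1] * s12],
                     [ LINK[0] * c1 + LINK[1] * c12,  LINK[1] * c12]])


def gravity_torque(q):
    """Stands in for the 'nonlinear term' used for the compensation torque."""
    g1 = (MASS[0] + MASS[1]) * G * LINK[0] * np.cos(q[0]) \
         + MASS[1] * G * LINK[1] * np.cos(q[0] + q[1])
    g2 = MASS[1] * G * LINK[1] * np.cos(q[0] + q[1])
    return np.array([g1, g2])


def impedance_torque(q, dq, x_des, dx_des, f_ext):
    """Joint torque = J^T (K e + D de - f_ext) + compensation torque."""
    J = jacobian(q)
    x, dx = forward_kinematics(q), J @ dq
    wrench = K @ (x_des - x) + D @ (dx_des - dx) - f_ext
    return J.T @ wrench + gravity_torque(q)


q, dq = np.array([0.3, 0.6]), np.zeros(2)
tau = impedance_torque(q, dq, x_des=np.array([0.5, 0.3]),
                       dx_des=np.zeros(2), f_ext=np.zeros(2))
print("commanded joint torques:", tau)
```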
  • Patent number: 11825342
    Abstract: Systems, apparatuses, and methods for reducing network bandwidth usage by a fleet of robots. According to at least one non-limiting exemplary embodiment, robots coupled to a server collect and produce a substantial amount of data, only a portion of that data being useful for operators to monitor behavior of the robot. The present disclosure provides for, inter alia, optimized systems, apparatuses, and methods for operators to extract the useful data using only reduced bandwidth of cellular LTE networks or Wi-Fi networks.
    Type: Grant
    Filed: September 2, 2021
    Date of Patent: November 21, 2023
    Assignee: Brain Corporation
    Inventors: Keith Chester, Daniel Sackinger
  • Patent number: 11801937
    Abstract: Systems and methods for autonomously herding birds in accordance with embodiments of the invention are illustrated. One embodiment includes an autonomous flock herding system, including a bird location sensor, a drone; and a control system, including a processor, and a memory, the memory containing a flock herding application, where the application directs the processor to obtain bird position data from the at least one bird location sensor, where the bird position data describes the location of birds in a flock of birds, determine if the flock of birds will enter a protected zone, generate a set of waypoints using a flock dynamics model, instruct the unmanned aerial vehicle to navigate to at least one waypoint in the set of waypoints such that the flock of birds will, in response to the presence of the unmanned aerial vehicle at the at least one waypoint, change trajectory away from the protected zone.
    Type: Grant
    Filed: July 26, 2019
    Date of Patent: October 31, 2023
    Assignees: California Institute of Technology, Imperial College Innovations Limited
    Inventors: Soon-Jo Chung, Aditya Paranjape, Kyunam Kim
  • Patent number: 11789455
    Abstract: Embodiments of the present application disclose a positioning method and apparatus, an autonomous driving vehicle, an electronic device and a storage medium, relating to the field of autonomous driving technologies, comprising: collecting first pose information measured by an inertial measurement unit within a preset time period, and collecting second pose information measured by a wheel tachometer within the time period; generating positioning information according to the first pose information, the second pose information and the adjacent frame images; controlling driving of the autonomous driving vehicle according to the positioning information. The positioning information is estimated by combining the first pose information and the second pose information corresponding to the inertial measurement unit and the wheel tachometer respectively. Compared with the camera, the inertial measurement unit and the wheel tachometer are not prone to interference from the external environment.
    Type: Grant
    Filed: December 29, 2020
    Date of Patent: October 17, 2023
    Assignees: Beijing Baidu Netcom Science Technology Co., Ltd., Apollo Intelligent Driving Technology (Beijing) Co., Ltd.
    Inventors: Wendong Ding, Xiaofei Rui, Gang Wang, Shiyu Song
  • Patent number: 11772265
    Abstract: A method for controlling movement of a robot includes inputting a first linear velocity parameter and a first angular velocity parameter, which are specified as the robot moves, into a first limit model for limiting a centripetal acceleration to correct the first linear velocity parameter and the first angular velocity parameter, thereby calculating a second linear velocity parameter and a second angular velocity parameter, inputting the second linear velocity parameter and the second angular velocity parameter into at least one of a second limit model for limiting a linear velocity and a third limit model for limiting an angular velocity to correct the second linear velocity parameter and the second angular velocity parameter, thereby calculating a third linear velocity parameter and a third angular velocity parameter, and controlling the movement of the robot based on the third linear velocity parameter and the third angular velocity parameter.
    Type: Grant
    Filed: August 20, 2020
    Date of Patent: October 3, 2023
    Assignee: Bear Robotics, Inc.
    Inventors: Bryant Leo Pong, Henry A. Leinhos, Sanghun Jung
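The cascade in US 11772265 above is straightforward to sketch: the commanded pair (v1, w1) first passes a centripetal-acceleration limit to give (v2, w2), which then passes linear- and angular-velocity limits to give (v3, w3). How the patent apportions the correction between v and w is not stated, so the equal square-root scaling below is an assumption.

```python
"""Cascaded velocity limiting: centripetal acceleration first, then speed clamps."""
import math

MAX_CENTRIPETAL = 1.0   # m/s^2 (assumed)
MAX_LINEAR = 1.2        # m/s   (assumed)
MAX_ANGULAR = 1.5       # rad/s (assumed)


def limit_centripetal(v: float, w: float) -> tuple[float, float]:
    """First limit model: centripetal acceleration a_c = v * w."""
    a_c = abs(v * w)
    if a_c <= MAX_CENTRIPETAL:
        return v, w
    scale = math.sqrt(MAX_CENTRIPETAL / a_c)   # scale both so their product is capped
    return v * scale, w * scale


def limit_speeds(v: float, w: float) -> tuple[float, float]:
    """Second/third limit models: clamp linear and angular velocity."""
    v = max(-MAX_LINEAR, min(MAX_LINEAR, v))
    w = max(-MAX_ANGULAR, min(MAX_ANGULAR, w))
    return v, w


def limit_command(v1: float, w1: float) -> tuple[float, float]:
    v2, w2 = limit_centripetal(v1, w1)
    return limit_speeds(v2, w2)


# An aggressive turn gets scaled by the centripetal limit; a fast straight run gets clamped.
print(limit_command(2.0, 2.0), limit_command(3.0, 0.2))
```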
  • Patent number: 11772277
    Abstract: An example robotic tool holder includes an actuator that is disposed within a housing and configured to hold a tool. The housing and the actuator are in contact via dowels to limit movement of the actuator toward a distal end of the housing. The dowels that are in contact make line contact and are arranged in a triangular geometry. A pressure plate is in line contact with the actuator within the housing around a circumference of the pressure plate. Springs are in contact with the pressure plate to bias the actuator toward a proximal end of the housing via the pressure plate. The springs are in contact with a mounting plate opposite the pressure plate. A sensor switch detects a shock force on the actuator and outputs a signal in response to the shock force.
    Type: Grant
    Filed: April 28, 2020
    Date of Patent: October 3, 2023
    Assignee: Illinois Tool Works Inc.
    Inventor: Nauman Basit
  • Patent number: 11747822
    Abstract: A mobile delivery robot has at least one memory component containing at least map data; at least two cameras adapted to take visual images; and at least one processing component. The at least one processing component is adapted to at least extract straight lines from the visual images taken by the at least two cameras and compare them to the map data to at least localize the robot. The mobile robot employs a localization method which involves taking visual images with at least two cameras; extracting straight lines from the individual visual images with at least one processing component; comparing the extracted features with existing map data; and outputting a location hypothesis based on said comparison.
    Type: Grant
    Filed: June 10, 2021
    Date of Patent: September 5, 2023
    Assignee: STARSHIP TECHNOLOGIES OÜ
    Inventors: Ahti Heinla, Kalle-Rasmus Volkov, Lindsay Roberts, Indrek Mandre
  • Patent number: 11720100
    Abstract: Various embodiments include methods for improving navigation by a processor of a robotic device. Such embodiments may include initiating a start of a predetermined time period associated with semantic information extraction, and determining whether an adverse event related to one or more sensors of the robotic device is detected. Such embodiments may also include identifying a current time slot of the predetermined time period, identifying a current estimated position and orientation of the robotic device, and recording updates to semantic information stored for the one or more sensor based on the identified current time slot and the current estimated position and orientation of the robotic device in response to determining that an adverse event related to one or more sensors of the robotic device is detected.
    Type: Grant
    Filed: March 14, 2018
    Date of Patent: August 8, 2023
    Assignee: QUALCOMM Incorporated
    Inventors: Xiaohui Liu, Yibo Jiang, Jiangtao Ren, Lei Xu, Yanming Zou
  • Patent number: 11662735
    Abstract: The invention relates to a method for updating a control model for automatic control of at least one mobile unit. A central control unit generates a detection task and transmits same to the mobile unit. The mobile unit comprises sensors, and the detection task comprises conditions for detecting sensor data sets by means of the sensors. The mobile unit detects the sensor data sets by means of the sensors using the detection task, generates transmission data using the detected sensor data sets, and transmits the transmission data to the central control unit. The central control unit receives the transmission data and generates an updated control model using the received transmission data. The system according to the invention for updating a control model for automatic control of at least one mobile unit comprises a central control unit by means of which a detection task may be generated and transmitted to the mobile unit.
    Type: Grant
    Filed: September 24, 2018
    Date of Patent: May 30, 2023
    Assignee: VOLKSWAGEN AKTIENGESELLSCHAFT
    Inventors: Fabian Hüger, Peter Schlicht
  • Patent number: 11654553
    Abstract: A robot system according to an embodiment includes one or more processors. The processors acquire first input data predetermined as data affecting an operation of a robot. The processors calculate a calculation cost of inference processing using a machine learning model for inferring control data used for controlling the robot, on the basis of the first input data. The processors infer the control data by the machine learning model set according to the calculation cost. The processors control the robot using the inferred control data.
    Type: Grant
    Filed: February 25, 2020
    Date of Patent: May 23, 2023
    Assignee: KABUSHIKI KAISHA TOSHIBA
    Inventors: Shuhei Nitta, Atsushi Yaguchi, Yukinobu Sakata, Akiyuki Tanizawa, Yasutoyo Takeyama, Tomoki Watanabe
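A minimal sketch of the idea in US 11654553 above: estimate the calculation cost that inference would incur for the current input data, then run a machine learning model variant sized to that cost. The cost proxy (input element count) and the three-entry model registry are invented for illustration.

```python
"""Pick a model variant according to an estimated calculation cost (registry assumed)."""

# Available machine learning model variants, from cheapest to most accurate.
MODEL_REGISTRY = {
    "small":  {"max_cost": 1e5,  "latency_ms": 5},
    "medium": {"max_cost": 1e6,  "latency_ms": 20},
    "large":  {"max_cost": float("inf"), "latency_ms": 80},
}


def estimate_cost(first_input_data: list[list[float]]) -> float:
    """Toy cost model: proportional to the number of input elements."""
    return 500.0 * sum(len(row) for row in first_input_data)


def select_model(first_input_data) -> str:
    """Return the cheapest model variant whose cost budget covers this input."""
    cost = estimate_cost(first_input_data)
    for name, spec in MODEL_REGISTRY.items():
        if cost <= spec["max_cost"]:
            return name
    return "large"


point_cloud = [[0.1, 0.2, 0.3]] * 400          # e.g. 400 sensed points
print("selected model:", select_model(point_cloud))
```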
  • Patent number: 11634887
    Abstract: In a method of controlling construction machinery, a bucket of a working device is moved along a first excavation trajectory to perform an excavation operation on the ground of a work area. A digging force exerted on the bucket during the excavation operation is calculated. A new second excavation trajectory is generated based on the calculated digging force. The bucket is moved along the second excavation trajectory.
    Type: Grant
    Filed: November 19, 2020
    Date of Patent: April 25, 2023
    Assignee: DOOSAN INFRACORE CO., LTD.
    Inventors: Changmook Kim, Dongjun Lee, Changu Kim, Bukun Son
  • Patent number: 11613018
    Abstract: Systems and methods for automatic restocking different items in retail store environments having POS locations for the items are disclosed. The method includes, for a first item of the different items, storing at least one first item parameter uniquely identifying the first item. The method includes determining, based on the at least one first item parameter, a first autonomous movement control scheme for manipulation of the first item by a robotic arm. The method includes executing, by the robotic arm, the first control scheme, the executing including shelving the first item on the shelf. The method includes evaluating, by a processor or a user of the robotic arm, the executing for the first item according to at least one predetermined first performance criteria. The method includes determining and storing, based on the evaluating, an updated first control scheme for subsequent executing by the robotic arm for the first item.
    Type: Grant
    Filed: March 8, 2019
    Date of Patent: March 28, 2023
    Assignee: ABB Schweiz AG
    Inventors: Gregory A. Cole, Gregory F. Rossano, Jordi Artigas, Harald Staab, Thomas A. Fuhlbrigge, Carlos Martinez, Sangeun Choi, Jianjun Wang, Xiongzi Li
  • Patent number: 11592829
    Abstract: A control device and a control method can quickly estimate a self-location even when the self-location is unknown. In a case of storing information supplied in a time series detected by LIDAR or a wheel encoder and estimating a self-location by using the stored time-series information, when a position change that cannot be predicted in advance, such as a kidnap state, is detected, the stored time-series information is reset, and then the self-location is estimated again. Example host platforms include a multi-legged robot, a flying object, and an in-vehicle system that autonomously moves in accordance with a mounted computing machine.
    Type: Grant
    Filed: November 21, 2018
    Date of Patent: February 28, 2023
    Assignee: SONY CORPORATION
    Inventors: Dai Kobayashi, Ryo Watanabe
  • Patent number: 11587027
    Abstract: A method of tracking is described. The method may include receiving, from a location device, a location request for an object and identifying a signal from a transponder associated with the object, wherein the transponder comprises at least one or more types of locating technology or sensing technology. The method may also include determining, by a processing device, at least one of the location of the object or information associated with the object based on the identified signal.
    Type: Grant
    Filed: October 29, 2019
    Date of Patent: February 21, 2023
    Assignee: GPS of Things, Inc.
    Inventors: Christopher J. Waters, Brent R. Humphrey
  • Patent number: 11577386
    Abstract: A gripping method relates to a method for gripping an object using a multi-fingered hand provided with a plurality of fingers. The method includes measuring, using a three-dimensional measurement sensor, an area that contains the object and obtaining three-dimensional information for each position within the area, and deciding positions of the plurality of fingers for gripping the object by classifying the area, if it includes both a measured area for which the three-dimensional information could be obtained and an unmeasured area for which it could not, into the measured area and the unmeasured area based on the distance-indicating information, the positions of the plurality of fingers being decided based on positions of the unmeasured area.
    Type: Grant
    Filed: September 19, 2018
    Date of Patent: February 14, 2023
    Assignee: OMRON Corporation
    Inventors: Yuki Nishina, Yoshinori Konishi
  • Patent number: 11577388
    Abstract: Apparatus, systems, methods, and articles of manufacture for automatic robot perception programming by imitation learning are disclosed. An example apparatus includes a percept mapper to identify a first percept and a second percept from data gathered from a demonstration of a task and an entropy encoder to calculate a first saliency of the first percept and a second saliency of the second percept. The example apparatus also includes a trajectory mapper to map a trajectory based on the first percept and the second percept, the first percept skewed based on the first saliency, the second percept skewed based on the second saliency. In addition, the example apparatus includes a probabilistic encoder to determine a plurality of variations of the trajectory and create a collection of trajectories including the trajectory and the variations of the trajectory.
    Type: Grant
    Filed: June 27, 2019
    Date of Patent: February 14, 2023
    Assignee: Intel Corporation
    Inventors: David I. Gonzalez Aguirre, Javier Felip Leon, Javier Sebastián Turek, Luis Carlos Maria Remis, Ignacio Javier Alvarez, Justin Gottschlich
  • Patent number: 11517168
    Abstract: Provided is a robot cleaner using an artificial intelligence (AI) algorithm and/or a machine learning algorithm in a 5G environment connected for Internet of Things (IoT). The robot cleaner includes one or more sensors, a driving wheel, a suction blower, and a controller, and the controller defines a cleaning target area, identifies a user's location and a type of the user's behavior, collects life pattern information of the user including the user's location, the type of the user's behavior, and timestamps each associated therewith during the time period of one day or more, determines a cleaning schedule of the robot cleaner based on the collected life pattern information, and controls the driving wheel and the suction blower so as to perform cleaning in accordance with the determined cleaning schedule.
    Type: Grant
    Filed: December 18, 2019
    Date of Patent: December 6, 2022
    Assignee: LG ELECTRONICS INC.
    Inventors: Eu Gene Kim, Kwan Young Son, Hyun Seob Lee
  • Patent number: 11520571
    Abstract: The present system is a software defined manufacturing (SDM) system that integrates several technologies and methods into a system that automates the process of engineering and operating automated manufacturing systems (aka “automating automation”). In one embodiment, some or all of the below aspects of the “automating automation” system are integrated: modular, configurable, reusable manufacturing cells; computer vision systems; autocalibration systems; a recipe-based programming environment; configuration management system; production analytics; and a marketplace for sharing recipes.
    Type: Grant
    Filed: November 12, 2020
    Date of Patent: December 6, 2022
    Assignee: Bright Machines, Inc.
    Inventors: Brian Philip Mathews, Ronald Poelman
  • Patent number: 11504849
    Abstract: The present teaching relates to a method and system for path planning. A target is tracked via one or more sensors. Information of a desired pose of an end-effector with respect to the target and a current pose of the end-effector is obtained. Also, a minimum distance permitted between an arm including the end-effector and each of at least one obstacle identified between the current pose of the end-effector and the target is obtained. A weighting factor previously learned is retrieved and a cost based on a cost function is computed in accordance with a weighted smallest distance between the arm including the end-effector and the at least one obstacle, wherein the smallest distance is weighted by the weighting factor. A trajectory is computed from the current pose to the desired pose by minimizing the cost function.
    Type: Grant
    Filed: November 22, 2019
    Date of Patent: November 22, 2022
    Assignee: EDDA TECHNOLOGY, INC.
    Inventors: Yuanfeng Mao, Guo-Qing Wei, Firdous Saleheen, Li Fan, Xiaolan Zeng, Jianzhong Qian
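A hedged sketch of the cost-based planning in US 11504849 above: candidate trajectories from the current pose toward the desired pose are scored by a cost that combines path length with a previously learned weight applied to the smallest arm-obstacle distance, rejecting anything below the permitted minimum distance. The 2-D point "arm", the random candidate generation, and the exact cost form are assumptions.

```python
"""Trajectory scoring with a learned weight on the smallest obstacle distance (toy)."""
import numpy as np

rng = np.random.default_rng(3)
OBSTACLES = np.array([[0.5, 0.55], [0.8, 0.2]])
MIN_CLEARANCE = 0.10            # minimum distance permitted to any obstacle
LEARNED_WEIGHT = 2.0            # weighting factor retrieved from prior learning


def cost(trajectory: np.ndarray) -> float:
    """Path length + weighted penalty on the smallest obstacle distance."""
    length = np.linalg.norm(np.diff(trajectory, axis=0), axis=1).sum()
    dists = np.linalg.norm(trajectory[:, None, :] - OBSTACLES[None, :, :], axis=2)
    smallest = dists.min()
    if smallest < MIN_CLEARANCE:
        return np.inf                              # violates the permitted minimum
    return length + LEARNED_WEIGHT / smallest      # closer passes cost more


def plan(start, goal, candidates=200, waypoints=8):
    """Pick the lowest-cost of many randomly perturbed straight-line paths."""
    base = np.linspace(start, goal, waypoints)
    best, best_cost = base.copy(), cost(base)
    for _ in range(candidates):
        noise = 0.15 * rng.normal(size=(waypoints, 2))
        noise[0] = noise[-1] = 0.0                 # keep the endpoints fixed
        c = cost(base + noise)
        if c < best_cost:
            best, best_cost = base + noise, c
    return best, best_cost


path, c = plan(np.array([0.0, 0.0]), np.array([1.0, 1.0]))
clearance = np.linalg.norm(path[:, None, :] - OBSTACLES[None, :, :], axis=2).min()
print(f"best cost {c:.2f}, minimum obstacle distance {clearance:.2f} m")
```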
  • Patent number: 11504846
    Abstract: The present invention relates to a robot teaching system based on image segmentation and surface electromyography, and a robot teaching method thereof, comprising an RGB-D camera, a surface electromyography sensor, a robot and a computer, wherein the RGB-D camera collects video information of robot teaching scenes and sends it to the computer; the surface electromyography sensor acquires surface electromyography signals and inertial acceleration signals of the robot teacher and sends them to the computer; the computer recognizes an articulated arm and a human joint, detects a contact position between the articulated arm and the human joint, further calculates the strength and direction of the forces exerted at the human contact position after the human joint contacts the articulated arm, and sends a signal controlling the contacted articulated arm to move in accordance with that strength and direction of forces, whereby robot teaching is done.
    Type: Grant
    Filed: January 20, 2021
    Date of Patent: November 22, 2022
    Assignee: QINGDAO UNIVERSITY OF TECHNOLOGY
    Inventors: Chengjun Chen, Yong Pan, Dongnian Li, Zhengxu Zhao, Jun Hong
  • Patent number: 11507103
    Abstract: A method of obstacle handling for a mobile automation apparatus includes: obtaining an initial localization of the mobile automation apparatus in a frame of reference; detecting an obstacle by one or more sensors disposed on the mobile automation apparatus; generating and storing an initial location of the obstacle in the frame of reference, based on (i) the initial localization, and (ii) a detected position of the obstacle relative to the mobile automation apparatus; obtaining a correction to the initial localization of the mobile automation apparatus; and applying a positional adjustment, based on the correction, to the initial position of the obstacle to generate and store an updated position of the obstacle.
    Type: Grant
    Filed: December 4, 2019
    Date of Patent: November 22, 2022
    Assignee: Zebra Technologies Corporation
    Inventors: Peter Arandorenko, Sadegh Tajeddin, Zi Cong Guo
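The positional adjustment in US 11507103 above amounts to re-anchoring a stored obstacle location when a localization correction arrives. The SE(2) homogeneous-transform bookkeeping below is one straightforward way to do that and is an assumption about representation, not necessarily the patent's method.

```python
"""Re-anchor a stored obstacle location after a localization correction (SE(2) sketch)."""
import numpy as np


def pose_matrix(x: float, y: float, theta: float) -> np.ndarray:
    """Homogeneous transform for a 2-D pose in the frame of reference."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])


# Initial localization of the apparatus, and the obstacle it detected 2 m ahead
# of itself, stored in the global frame of reference.
initial_pose = pose_matrix(10.0, 5.0, 0.0)
obstacle_in_robot = np.array([2.0, 0.0, 1.0])
obstacle_initial = initial_pose @ obstacle_in_robot          # stored initial location

# A correction later says the apparatus was actually at (10.5, 5.2, 0.1 rad).
corrected_pose = pose_matrix(10.5, 5.2, 0.1)

# Positional adjustment: remove the old pose, apply the corrected one.
adjustment = corrected_pose @ np.linalg.inv(initial_pose)
obstacle_updated = adjustment @ obstacle_initial
print("initial:", obstacle_initial[:2], "updated:", obstacle_updated[:2])
```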
  • Patent number: 11500391
    Abstract: The present invention relates to a method for positioning on the basis of vision information and a robot implementing the method. The method for positioning on the basis of vision information, according to an embodiment of the present invention, comprises the steps of: generating, by a control unit of a robot, first vision information by using image information of an object sensed by controlling a vision sensing unit of a sensor module of the robot; generating, by the control unit of the robot, a vision-based candidate position by matching the first vision information with second vision information stored in a vision information storage unit of a map storage unit; and generating, by the control unit, the vision-based candidate position as the position information of the robot when there is one vision-based candidate position.
    Type: Grant
    Filed: May 14, 2018
    Date of Patent: November 15, 2022
    Assignee: LG ELECTRONICS INC.
    Inventors: Byungkon Sohn, Jungmin Shim, Kyuchun Choi
  • Patent number: 11498214
    Abstract: A teaching device constructs, in a virtual space, a virtual robot system in which a virtual 3D model of a robot and a virtual 3D model of a peripheral structure of the robot are arranged, and teaches a moving path of the robot. The teaching device includes an acquisition unit configured to acquire information about a geometric error between the virtual 3D models, and a correction unit configured to correct the moving path of the robot in accordance with the information acquired by the acquisition unit.
    Type: Grant
    Filed: January 3, 2020
    Date of Patent: November 15, 2022
    Assignee: Canon Kabushiki Kaisha
    Inventors: Yasusato Fujieda, Hisashi Matsumoto
  • Patent number: 11449051
    Abstract: A method for managing movements of a fleet of a plurality of autonomous mobile objects capable of communicating with a management server. An autonomous mobile object is associated with a user terminal with which a user is equipped. The management server performs the following steps, for a given autonomous mobile object, referred to as a first object, associated with a first user terminal with which a user is equipped: obtaining information representing a position to be reached by the first object, referred to as first object destination information; determining information representing a movement to be made by the first object, referred to as first object movement information, the determining taking into account of at least the first object destination information; and transmitting the first object movement information to the first object.
    Type: Grant
    Filed: May 9, 2017
    Date of Patent: September 20, 2022
    Assignee: IFOLLOW
    Inventor: Vincent Jacquemart
  • Patent number: 11425888
    Abstract: An animal farm system includes a barn, animal related structures within the barn, such as a feeding alley and/or a milking system, and an autonomous vehicle arranged to perform an animal related action and move about in the barn. The vehicle includes a control unit to move the vehicle about, a position determining system for determining a position of the vehicle in the barn, a sensor system to determine a value of a parameter related to a position of the vehicle with respect to the barn or an object therein, such as the at least one structure therein, and a vehicle communication device. The control unit further is arranged to contain barn map information, and receive motion control and navigation information via the vehicle communication device.
    Type: Grant
    Filed: October 13, 2017
    Date of Patent: August 30, 2022
    Assignee: LELY PATENT N.V.
    Inventors: Martinus Cornelis Johannes Buijs, Mauro Brenna
  • Patent number: 11413749
    Abstract: A drive unit of an automation component, in particular a gripping, clamping, changing, linear or pivoting unit, includes a drive for driving the movable parts of the automation component and a control unit which controls the drive. The control unit includes at least one computing device, and the drive unit, together with the drive, control unit and computing device, is arranged in or on a base housing of the automation component.
    Type: Grant
    Filed: May 23, 2018
    Date of Patent: August 16, 2022
    Assignee: SCHUNK GmbH & Co. KG Spann-und Greiftechnik
    Inventors: Ralf Becker, Joern Rastetter, Alexander Kupsch, Michael Ohlheiser