Programmed Data (e.g., Path) Modified By Sensed Data Patents (Class 700/253)
  • Patent number: 11520571
    Abstract: The present system is a software defined manufacturing (SDM) system that integrates several technologies and methods into a system that automates the process of engineering and operating automated manufacturing systems (aka “automating automation”). In one embodiment, some or all of the below aspects of the “automating automation” system are integrated: modular, configurable, reusable manufacturing cells; computer vision systems; autocalibration systems; a recipe-based programming environment; configuration management system; production analytics; and a marketplace for sharing recipes.
    Type: Grant
    Filed: November 12, 2020
    Date of Patent: December 6, 2022
    Assignee: Bright Machines, Inc.
    Inventors: Brian Philip Mathews, Ronald Poelman
  • Patent number: 11517168
    Abstract: Provided is a robot cleaner using an artificial intelligence (AI) algorithm and/or a machine learning algorithm in a 5G environment connected for Internet of Things (IoT). The robot cleaner includes one or more sensors, a driving wheel, a suction blower, and a controller. The controller defines a cleaning target area, identifies a user's location and a type of the user's behavior, collects life pattern information of the user, including the user's location, the type of the user's behavior, and a timestamp associated with each, over a period of one day or more, determines a cleaning schedule of the robot cleaner based on the collected life pattern information, and controls the driving wheel and the suction blower so as to perform cleaning in accordance with the determined cleaning schedule.
    Type: Grant
    Filed: December 18, 2019
    Date of Patent: December 6, 2022
    Assignee: LG ELECTRONICS INC.
    Inventors: Eu Gene Kim, Kwan Young Son, Hyun Seob Lee
  • Patent number: 11504846
    Abstract: The present invention relates to a robot teaching system based on image segmentation and surface electromyography, and a robot teaching method thereof, comprising an RGB-D camera, a surface electromyography sensor, a robot, and a computer, wherein the RGB-D camera collects video information of robot teaching scenes and sends it to the computer; the surface electromyography sensor acquires surface electromyography signals and inertial acceleration signals of the robot teacher and sends them to the computer; and the computer recognizes an articulated arm and a human joint, detects the contact position between the articulated arm and the human joint, calculates the strength and direction of the forces applied at the contact position after the human joint contacts the articulated arm, and sends a signal controlling the contacted articulated arm to move according to that strength and direction of forces, completing the robot teaching.
    Type: Grant
    Filed: January 20, 2021
    Date of Patent: November 22, 2022
    Assignee: QINGDAO UNIVERSITY OF TECHNOLOGY
    Inventors: Chengjun Chen, Yong Pan, Dongnian Li, Zhengxu Zhao, Jun Hong
  • Patent number: 11504849
    Abstract: The present teaching relates to a method and system for path planning. A target is tracked via one or more sensors. Information of a desired pose of an end-effector with respect to the target and a current pose of the end-effector is obtained. Also, a minimum distance permitted between an arm including the end-effector and each of at least one obstacle identified between the current pose of the end-effector and the target is obtained. A weighting factor previously learned is retrieved and a cost based on a cost function is computed in accordance with a weighted smallest distance between the arm including the end-effector and the at least one obstacle, wherein the smallest distance is weighted by the weighting factor. A trajectory is computed from the current pose to the desired pose by minimizing the cost function.
    Type: Grant
    Filed: November 22, 2019
    Date of Patent: November 22, 2022
    Assignee: EDDA TECHNOLOGY, INC.
    Inventors: Yuanfeng Mao, Guo-Qing Wei, Firdous Saleheen, Li Fan, Xiaolan Zeng, Jianzhong Qian
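The abstract for patent 11504849 above describes a cost function in which the smallest arm-to-obstacle distance is weighted by a learned factor and the trajectory is found by minimizing that cost. The following is a minimal, hypothetical sketch of such a cost, not the patented implementation; the waypoint stand-in for the full arm geometry, the constants, and all names are illustrative assumptions.

```python
import numpy as np

def trajectory_cost(waypoints, desired_pose, obstacles, weight, min_clearance=0.1):
    """waypoints: (N, 3) end-effector positions standing in for the arm; desired_pose: (3,)
    target position; obstacles: (M, 3) obstacle points; weight: learned clearance weight."""
    goal_error = np.linalg.norm(waypoints[-1] - desired_pose)
    # Smallest distance between any waypoint and any obstacle point.
    dists = np.linalg.norm(waypoints[:, None, :] - obstacles[None, :, :], axis=-1)
    smallest = dists.min()
    # Penalize clearances below min_clearance; the learned weight trades accuracy vs. safety.
    clearance_penalty = weight * max(0.0, min_clearance / max(smallest, 1e-6) - 1.0)
    path_length = np.sum(np.linalg.norm(np.diff(waypoints, axis=0), axis=1))
    return goal_error + 0.1 * path_length + clearance_penalty

# Evaluate two candidate trajectories and keep the cheaper one.
obstacle = np.array([[0.5, 0.0, 0.3]])
target = np.array([1.0, 0.0, 0.3])
a = np.linspace([0.0, 0.0, 0.3], [1.0, 0.0, 0.3], 10)                  # grazes the obstacle
b = np.linspace([0.0, 0.2, 0.3], [1.0, 0.2, 0.3], 10)                  # detours around it
b[-1] = target
best = min((a, b), key=lambda w: trajectory_cost(w, target, obstacle, weight=2.0))
```

In this toy example the detouring trajectory wins because the straight one incurs the weighted clearance penalty.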
  • Patent number: 11507103
    Abstract: A method of obstacle handling for a mobile automation apparatus includes: obtaining an initial localization of the mobile automation apparatus in a frame of reference; detecting an obstacle by one or more sensors disposed on the mobile automation apparatus; generating and storing an initial location of the obstacle in the frame of reference, based on (i) the initial localization, and (ii) a detected position of the obstacle relative to the mobile automation apparatus; obtaining a correction to the initial localization of the mobile automation apparatus; and applying a positional adjustment, based on the correction, to the initial location of the obstacle to generate and store an updated location of the obstacle.
    Type: Grant
    Filed: December 4, 2019
    Date of Patent: November 22, 2022
    Assignee: Zebra Technologies Corporation
    Inventors: Peter Arandorenko, Sadegh Tajeddin, Zi Cong Guo
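The abstract for patent 11507103 above amounts to re-anchoring a stored obstacle location once the robot's own localization is corrected. Below is an illustrative 2-D sketch of that bookkeeping (poses as (x, y, theta)); it is an assumption-laden stand-in, not the patented method.

```python
import math

def to_frame(pose, point_rel):
    """Transform a point sensed relative to the robot into the map frame."""
    x, y, th = pose
    px, py = point_rel
    return (x + px * math.cos(th) - py * math.sin(th),
            y + px * math.sin(th) + py * math.cos(th))

def apply_pose_correction(obstacle_map_xy, initial_pose, corrected_pose):
    """Re-express a stored obstacle location after the robot's localization is corrected."""
    x0, y0, th0 = initial_pose
    # Recover the robot-relative observation implied by the initial (uncorrected) pose...
    dx, dy = obstacle_map_xy[0] - x0, obstacle_map_xy[1] - y0
    rel = (dx * math.cos(-th0) - dy * math.sin(-th0),
           dx * math.sin(-th0) + dy * math.cos(-th0))
    # ...and re-project it from the corrected pose.
    return to_frame(corrected_pose, rel)

initial = (2.0, 1.0, 0.0)
corrected = (2.1, 1.2, 0.05)                 # localization correction
obstacle = to_frame(initial, (1.0, 0.5))     # initial stored obstacle location
updated = apply_pose_correction(obstacle, initial, corrected)
```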
  • Patent number: 11500391
    Abstract: The present invention relates to a method for positioning on the basis of vision information and a robot implementing the method. The method for positioning on the basis of vision information, according to an embodiment of the present invention, comprises the steps of: generating, by a control unit of a robot, first vision information by using image information of an object sensed by controlling a vision sensing unit of a sensor module of the robot; generating, by the control unit of the robot, a vision-based candidate position by matching the first vision information with second vision information stored in a vision information storage unit of a map storage unit; and generating, by the control unit, the vision-based candidate position as the position information of the robot when there is one vision-based candidate position.
    Type: Grant
    Filed: May 14, 2018
    Date of Patent: November 15, 2022
    Assignee: LG ELECTRONICS INC.
    Inventors: Byungkon Sohn, Jungmin Shim, Kyuchun Choi
  • Patent number: 11498214
    Abstract: A teaching device constructs, in a virtual space, a virtual robot system in which a virtual 3D model of a robot and a virtual 3D model of a peripheral structure of the robot are arranged, and teaches a moving path of the robot. The teaching device includes an acquisition unit configured to acquire information about a geometric error between the virtual 3D models, and a correction unit configured to correct the moving path of the robot in accordance with the information acquired by the acquisition unit.
    Type: Grant
    Filed: January 3, 2020
    Date of Patent: November 15, 2022
    Assignee: Canon Kabushiki Kaisha
    Inventors: Yasusato Fujieda, Hisashi Matsumoto
  • Patent number: 11449051
    Abstract: A method for managing movements of a fleet of a plurality of autonomous mobile objects capable of communicating with a management server. An autonomous mobile object is associated with a user terminal with which a user is equipped. The management server performs the following steps, for a given autonomous mobile object, referred to as a first object, associated with a first user terminal with which a user is equipped: obtaining information representing a position to be reached by the first object, referred to as first object destination information; determining information representing a movement to be made by the first object, referred to as first object movement information, the determining taking into account of at least the first object destination information; and transmitting the first object movement information to the first object.
    Type: Grant
    Filed: May 9, 2017
    Date of Patent: September 20, 2022
    Assignee: IFOLLOW
    Inventor: Vincent Jacquemart
  • Patent number: 11425888
    Abstract: An animal farm system includes a barn, animal related structures within the barn, such as a feeding alley and/or a milking system, and an autonomous vehicle arranged to perform an animal related action and move about in the barn. The vehicle includes a control unit to move the vehicle about, a position determining system for determining a position of the vehicle in the barn, a sensor system to determine a value of a parameter related to a position of the vehicle with respect to the barn or an object therein, such as the at least one structure therein, and a vehicle communication device. The control unit is further arranged to contain barn map information and to receive motion control and navigation information via the vehicle communication device.
    Type: Grant
    Filed: October 13, 2017
    Date of Patent: August 30, 2022
    Assignee: LELY PATENT N.V.
    Inventors: Martinus Cornelis Johannes Buijs, Mauro Brenna
  • Patent number: 11413749
    Abstract: A drive unit of an automation component, in particular a gripping, clamping, changing, linear, or pivoting unit, wherein the drive unit includes a drive for driving the movable parts of the automation component and a control unit which controls the drive, the control unit includes at least one computing device, and the drive unit together with the drive, control unit, and computing device is arranged in or on a base housing of the automation component.
    Type: Grant
    Filed: May 23, 2018
    Date of Patent: August 16, 2022
    Assignee: SCHUNK GmbH & Co. KG Spann-und Greiftechnik
    Inventors: Ralf Becker, Joern Rastetter, Alexander Kupsch, Michael Ohlheiser
  • Patent number: 11400607
    Abstract: An image processing device includes an acquirer, a face detector, and a change detector. The acquirer acquires images captured in succession by an image capturer. The face detector executes a first detection processing (face detection processing) that detects a certain target (face of a person) from a central area of the acquired image. The change detector executes second detection processing to detect a change over time in a surrounding area other than the central area in the acquired image. The second detection processing has a smaller processing load required per unit pixel in the image than that of the first detection processing.
    Type: Grant
    Filed: September 13, 2019
    Date of Patent: August 2, 2022
    Assignee: CASIO COMPUTER CO., LTD.
    Inventor: Tetsuji Makino
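The abstract for patent 11400607 above splits the image into a central area that gets a heavyweight detector and a surrounding area that gets a cheap change check. The sketch below assumes grayscale NumPy frames and uses a hypothetical stand-in for the face detector; it only illustrates the load asymmetry, not the patented processing.

```python
import numpy as np

def detect_face(patch):
    # Placeholder for a real (expensive) face detector; here, any bright blob "matches".
    return bool(patch.mean() > 200)

def process_frame(frame, prev_frame, center_frac=0.5, change_thresh=12.0):
    h, w = frame.shape
    ch, cw = int(h * center_frac), int(w * center_frac)
    top, left = (h - ch) // 2, (w - cw) // 2
    center = frame[top:top + ch, left:left + cw]

    face_found = detect_face(center)                       # first (heavy) detection, center only

    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16)).astype(np.float64)
    diff[top:top + ch, left:left + cw] = 0.0               # ignore the central area
    changed = diff.mean() > change_thresh                  # second (cheap) detection, periphery
    return face_found, changed

prev = np.zeros((120, 160), dtype=np.uint8)
cur = prev.copy()
cur[:, 140:] = 255                                         # something moved at the edge
print(process_frame(cur, prev))                            # -> (False, True)
```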
  • Patent number: 11358274
    Abstract: An autonomous mobile robot comprising a body and a display screen coupled to the body and configured to display information. A wheel control system is coupled to the body and configured to move the body in a given direction. At least one of a camera, an onboard UWB device, or a sensor is coupled to the body. A central processing unit is in communication with the wheel control system, the display screen, and the at least one camera, onboard UWB device, or sensor. The central processing unit is configured to adjust the orientation of the display screen relative to a user based on information received from the at least one camera, onboard UWB device, or sensor.
    Type: Grant
    Filed: June 13, 2019
    Date of Patent: June 14, 2022
    Assignee: LINGDONG TECHNOLOGY (BEIJING) CO. LTD
    Inventor: Herrickhui Yaozhang
  • Patent number: 11318606
    Abstract: The invention relates to a method for controlling an automation assembly which has a robot assembly with at least one robot (10) and a detection means assembly with at least one detection means (21-23), said method having the following at least partly automated steps: providing (S10) a first sequence of first ordinate data (q1, q2, dq2/dt, ?1, ?2) assigned to successive first abscissa points (t) on the basis of first training data (q1, q2, ?1, X2); identifying (S20) a first event point (tE) within the first abscissa points of the first sequence; and determining (S30) a first event criterion on the basis of the first sequence and the first event point.
    Type: Grant
    Filed: March 13, 2017
    Date of Patent: May 3, 2022
    Assignee: KUKA Deutschland GmbH
    Inventor: Helmuth Radrich
  • Patent number: 11305427
    Abstract: A robot system includes a robot, an image acquisition part, an image prediction part, and an operation controller. The image acquisition part acquires a current image captured by a robot camera arranged to move with the end effector. The image prediction part predicts a next image to be captured by the robot camera based on a teaching image model and the current image. The teaching image model is constructed by machine learning of teaching images that the robot camera is predicted to capture while the movable part performs an adjustment operation. The operation controller calculates a command value for operating the movable part so that the image captured by the robot camera approaches the next image, and controls the movable part based on the command value.
    Type: Grant
    Filed: November 28, 2018
    Date of Patent: April 19, 2022
    Assignee: KAWASAKI JUKOGYO KABUSHIKI KAISHA
    Inventors: Hitoshi Hasunuma, Masayuki Enomoto
  • Patent number: 11310417
    Abstract: A vision-based state estimation framework to estimate the state of an underwater remotely operated vehicle (ROV) used for inspection of an underwater structure, for example, a nuclear reactor pressure vessel. The framework employs an external overhead, pan-tilt-zoom (PTZ) camera as the primary sensing modality and incorporates prior knowledge of the geometry of the structure.
    Type: Grant
    Filed: May 3, 2018
    Date of Patent: April 19, 2022
    Assignee: CARNEGIE MELLON UNIVERSITY
    Inventors: Nathan Michael, Tabitha Edith Lee, Curtis Boirum
  • Patent number: 11295534
    Abstract: Systems, methods, and devices for color and texture rendering of a space. A method includes receiving an image comprising an object and receiving an indication of a color. The method includes identifying the object within the image and defining a texture of the object. The method includes selecting a stored texture file stored in a database based on the texture of the object. The method includes merging the stored texture file and the color to generate a color and texture placement that can be implemented in a rendered scene.
    Type: Grant
    Filed: June 12, 2020
    Date of Patent: April 5, 2022
    Inventor: Jeremiah Timberline Barker
  • Patent number: 11285613
    Abstract: The present disclosure provides a robot visual image feature extraction method as well as an apparatus and a robot using the same. The method includes: collecting image data through visual sensor(s) of the robot, and collecting angular velocity data through inertial sensor(s) of the robot; calculating a relative pose between image frames in the image data based on the angular velocity data; extracting feature points of the first image frame in the image data; calculating a projection position of each feature point of the k-th image frame in the k+1-th image frame based on a relative pose between the k-th image frame and the k+1-th image frame; and searching for each feature point at its projection position in the k+1-th image frame, and performing simultaneous localization and mapping based on the found feature points. In this manner, the feature points of dynamic objects are eliminated.
    Type: Grant
    Filed: November 29, 2019
    Date of Patent: March 29, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Chenchen Jiang, Youjun Xiong, Longbiao Bai, Simin Zhang, Jianxin Pang
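The abstract for patent 11285613 above projects a feature from frame k into frame k+1 using the gyro-derived relative rotation, so the matcher only searches a small window. The sketch below assumes a pinhole camera, a pure rotation about the pan axis over one frame interval, and illustrative intrinsics; it is not the patented pipeline.

```python
import numpy as np

def rotation_y(pan):
    c, s = np.cos(pan), np.sin(pan)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def predict_projection(pixel_k, K, R_k_to_k1):
    """Rotate the feature's bearing by the gyro-derived relative rotation and reproject it."""
    bearing = np.linalg.inv(K) @ np.array([pixel_k[0], pixel_k[1], 1.0])   # ray in frame k
    uvw = K @ (R_k_to_k1 @ bearing)                                        # ray in frame k+1
    return uvw[:2] / uvw[2]

K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
omega_y, dt = 0.2, 0.05                      # gyro rate (rad/s) about the pan axis, frame gap
predicted = predict_projection((400.0, 240.0), K, rotation_y(omega_y * dt))
# The matcher would now search only a small window around `predicted` in frame k+1.
```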
  • Patent number: 11287824
    Abstract: An example autonomous device is configured to move within a space. The autonomous device includes a first system to detect a first location of the autonomous device within the space, with the first location being based on a first fixed reference; a second system to detect a second location of the autonomous device within the space, with the second location being based on a second fixed reference; and a third system to detect a third location of the autonomous device within the space based on relative movements of the autonomous device. One or more processing devices are configured to select one of the first location or the second location based on reliability of at least one of the first location or the second location, and to control movement of the autonomous device using an estimated location that is based on the third location and the selected one of the first location or the second location.
    Type: Grant
    Filed: November 19, 2018
    Date of Patent: March 29, 2022
    Assignee: MOBILE INDUSTRIAL ROBOTS A/S
    Inventors: Niels Jul Jacobsen, Søren Eriksen Nielsen
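The abstract for patent 11287824 above selects one of two absolute location sources by reliability and combines it with a relative-movement (dead-reckoned) estimate. Here is a minimal sketch under those assumptions; the blend factor and reliability scores are illustrative, not the patented estimator.

```python
from dataclasses import dataclass

@dataclass
class Fix:
    x: float
    y: float
    reliability: float  # 0..1, e.g. beacon coverage or scan-match quality

def estimate_location(fix_a, fix_b, dead_reckoned, blend=0.7):
    """Pick the more reliable absolute fix and blend it with the odometry-only estimate."""
    chosen = fix_a if fix_a.reliability >= fix_b.reliability else fix_b
    x = blend * chosen.x + (1 - blend) * dead_reckoned[0]
    y = blend * chosen.y + (1 - blend) * dead_reckoned[1]
    return (x, y), chosen

loc, used = estimate_location(Fix(4.9, 2.1, 0.4), Fix(5.2, 2.0, 0.9), dead_reckoned=(5.0, 2.05))
```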
  • Patent number: 11281228
    Abstract: A method and a device for determining a position of a transportation vehicle. A preliminary position of the transportation vehicle is determined, objects are detected in an area surrounding the transportation vehicle wherein the objects have at least visible and virtual building corners, the detected objects are transferred into a local coordinate system and are assigned to a map with objects based on the preliminary position of the transportation vehicle, and the position of the transportation vehicle is determined based on an equalization calculus in which the objects in the map which are most suitable for the objects transferred into the local coordinate system are ascertained.
    Type: Grant
    Filed: June 11, 2019
    Date of Patent: March 22, 2022
    Inventors: Christian Merfels, Moritz Schack
  • Patent number: 11256917
    Abstract: A moving body tracks a target in conjunction with at least one sensing device. The moving body includes a sensor, a communication device, a processor, and a memory in which a computer program is stored. The processor executes the computer program to calculate a position of the target by analyzing an output of the sensor, estimate a region in which the target is present using a last position of the target calculated when sight of the target is lost or a position obtained by analyzing the output of the sensor after losing the sight of the target, together with a movement history of the target or of its own device until the sight of the target is lost, and instruct the at least one sensing device selected according to the region to search for the target, and receive a result of the search from the sensing device.
    Type: Grant
    Filed: March 13, 2018
    Date of Patent: February 22, 2022
    Assignee: NIDEC CORPORATION
    Inventor: Mitsuhiro Amo
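The abstract for patent 11256917 above grows a search region from the last known position using the target's movement history, then tasks sensing devices covering that region. Below is an illustrative sketch with hypothetical names and a simple circular region; it is not the patented logic.

```python
import math

def search_region(last_position, movement_history, t_lost, now, margin=1.0):
    """Return (center, radius) of a circular region where the lost target may be."""
    (t0, p0), (t1, p1) = movement_history[-2], movement_history[-1]
    speed = math.dist(p0, p1) / max(t1 - t0, 1e-6)       # recent speed from the history
    radius = speed * (now - t_lost) + margin             # region grows with time since loss
    return last_position, radius

def select_devices(devices, region):
    """devices: list of (name, position, sensing_reach); keep those overlapping the region."""
    center, radius = region
    return [name for name, pos, reach in devices if math.dist(pos, center) <= radius + reach]

history = [(0.0, (0.0, 0.0)), (2.0, (2.0, 0.0))]          # ~1 m/s toward +x before losing sight
region = search_region((2.0, 0.0), history, t_lost=2.0, now=5.0)
cams = select_devices([("cam_a", (3.0, 1.0), 2.0), ("cam_b", (10.0, 0.0), 2.0)], region)
```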
  • Patent number: 11243540
    Abstract: Methods and apparatus related to autonomous vehicles (AVs) are provided. A mapping can be determined that tiles an environment having an AV using a plurality of cells, each cell having an environmental status. While the AV is in the environment: status data can be received relating to a location of the AV and obstacles at that location; environmental status for a cell can be updated based on the status data; a value for each cell can be determined based on the cell's environmental status; a waypoint of a coverage path that covers a region in the environment and is based on the AV's location can be determined; a determination whether the waypoint is reachable from the AV's location can be made; after determining that the waypoint is reachable, a command based on the mapping can be sent directing the AV toward the waypoint; and the waypoint can be updated.
    Type: Grant
    Filed: May 14, 2019
    Date of Patent: February 8, 2022
    Assignee: University of Connecticut
    Inventors: Shalabh Gupta, Junnan Song
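The abstract for patent 11243540 above keeps a per-cell environmental status and only commands waypoints that are reachable from the AV's current cell. The sketch below is a hypothetical illustration of that bookkeeping (BFS reachability over a grid), not the patented planner; cell codes are assumptions.

```python
from collections import deque

def reachable(grid, start, goal):
    """4-connected BFS over the cell grid; 1 marks an obstacle cell."""
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in seen and grid[nr][nc] != 1:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False

def next_waypoint(grid, av_cell):
    """Pick the nearest uncovered cell (status 0) that is actually reachable from the AV."""
    candidates = [(abs(r - av_cell[0]) + abs(c - av_cell[1]), (r, c))
                  for r, row in enumerate(grid) for c, v in enumerate(row) if v == 0]
    for _, cell in sorted(candidates):
        if reachable(grid, av_cell, cell):
            return cell
    return None

grid = [[2, 2, 0],     # 0 = free/uncovered, 1 = obstacle, 2 = covered
        [2, 1, 0],
        [2, 1, 0]]
print(next_waypoint(grid, (0, 0)))   # -> (0, 2): nearest uncovered, reachable cell
```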
  • Patent number: 11243501
    Abstract: Vibration of a machine end and an error of a moving trajectory are suppressed. A machine learning device performs machine learning of optimizing first coefficients of a filter provided in a motor controller that controls a motor and second coefficients of a velocity feedforward unit of a servo control unit provided in the motor controller on the basis of an evaluation function which is a function of measurement information after acceleration and deceleration by an external measuring instrument provided outside the motor controller, a position command input to the motor controller, and a position error which is a difference between the position command value and feedback position detection value from a detector of the servo control unit.
    Type: Grant
    Filed: March 3, 2020
    Date of Patent: February 8, 2022
    Assignee: FANUC CORPORATION
    Inventors: Ryoutarou Tsuneki, Satoshi Ikai
  • Patent number: 11243532
    Abstract: A set of actions corresponding to a particular state of the environment of a vehicle is identified. A respective encoding is generated for different actions of the set, using elements such as distinct colors to distinguish attributes such as target lane segments. Using the encodings as inputs to respective instances of a machine learning model, respective value metrics are estimated for each of the actions. One or more motion-control directives to implement a particular action selected using the value metrics are transmitted to motion-control subsystems of the vehicle.
    Type: Grant
    Filed: September 26, 2018
    Date of Patent: February 8, 2022
    Assignee: Apple Inc.
    Inventors: Martin Levihn, Pekka Tapani Raiko
  • Patent number: 11235461
    Abstract: A machine learning device is provided in a versatile controller capable of inferring command data to be issued to each axis of a robot. The device includes an axis angle conversion unit calculating, from the trajectory data, an amount of change of an axis angle of an axis of the robot, a state observation unit observing axis angle data relating to the amount of change of the axis angle of the axis of the robot as a state variable representing a current state of an environment, a label data acquisition unit acquiring axis angle command data relating to command data for the axis of the robot as label data, and a learning unit learning the amount of change of the axis angle of the axis of the robot and the command data for the axis in association with each other by using the state variable and the label data.
    Type: Grant
    Filed: March 15, 2019
    Date of Patent: February 1, 2022
    Assignee: Fanuc Corporation
    Inventor: Kouichirou Hayashi
  • Patent number: 11231707
    Abstract: A method of mapping an area to be mowed with an autonomous mowing robot comprises receiving mapping data from a robot lawnmower, the mapping data specifying an area to be mowed and a plurality of locations of beacons positioned within the area to be mowed, and receiving at least first and second geographic coordinates for first and second reference points that are within the area and are specified in the mapping data. The mapping data is aligned to a coordinate system of a map image of the area using the first and second geographic coordinates. The map image is displayed based on aligning the mapping data to the coordinate system.
    Type: Grant
    Filed: April 29, 2019
    Date of Patent: January 25, 2022
    Assignee: iRobot Corporation
    Inventors: Paul C. Balutis, Andrew Beaulieu, Brian Yamauchi, Karl Jeffrey Karlson, Dominic Hugh Jones
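The abstract for patent 11231707 above aligns the mower's mapping data to a map image's coordinate system using two reference points known in both frames. A two-point correspondence fixes a 2-D similarity transform (scale, rotation, translation); the sketch below shows that calculation under illustrative coordinates, not the patented alignment code.

```python
def fit_two_point_alignment(src_a, src_b, dst_a, dst_b):
    """Return a function mapping (x, y) in the source frame to the destination frame."""
    s1, s2 = complex(*src_a), complex(*src_b)
    d1, d2 = complex(*dst_a), complex(*dst_b)
    m = (d2 - d1) / (s2 - s1)        # combined scale + rotation
    t = d1 - m * s1                  # translation
    return lambda p: ((m * complex(*p) + t).real, (m * complex(*p) + t).imag)

# Two reference points expressed in both the mapping-data frame and the map-image frame.
to_map_image = fit_two_point_alignment(
    src_a=(0.0, 0.0), src_b=(10.0, 0.0),         # mapping-data coordinates
    dst_a=(100.0, 200.0), dst_b=(100.0, 300.0))  # map-image coordinates (rotated 90°, scaled)
print(to_map_image((5.0, 0.0)))                  # midpoint lands between the two references
```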
  • Patent number: 11227583
    Abstract: The present disclosure includes customizing responses by an Artificial Intelligence (AI) system using a response mode for interaction with a user. A question or command is received at an AI system from an associated AI device which receives the question or command from an initiating user of a plurality of users in a vicinity of the AI device. A preference of an interaction mode for the initiating user is determined, and the preferred interaction mode is determined using a knowledge corpus. An answer to the question or command using the AI system is generated. Using the AI device, a communication to the initiating user which includes the answer is initiated, via a communication mode based on the interaction mode preference of the initiating user.
    Type: Grant
    Filed: November 5, 2019
    Date of Patent: January 18, 2022
    Assignee: International Business Machines Corporation
    Inventors: Sarbajit K. Rakshit, Christian Compton, Jeremy R. Fox, Trudy L. Hewitt
  • Patent number: 11226621
    Abstract: Embodiments included herein are directed towards a robotic system and method. Embodiments may include a transportation mechanism having at least three legs and a computing device configured to receive a plurality of optimization components. Each optimization component may include a plurality of variables and the computing device may be further configured to perform a randomized simulation based upon, at least in part, each of the plurality of optimization components. The computing device may be further configured to provide one or more results of the randomized simulation to the transportation mechanism to enable locomotion via the at least three legs.
    Type: Grant
    Filed: February 14, 2019
    Date of Patent: January 18, 2022
    Assignee: TERADYNE, INC.
    Inventors: Ryan S. Penning, James D. English, Douglas E. Barker, Brett L. Limone, Paul Muench
  • Patent number: 11214437
    Abstract: An autonomous mobile robotic device that may carry and transport one or more items within an environment. The robotic device may comprise a platform on which the one or more items may be placed. The robotic device may pick up, deliver, distribute, and/or transport the one or more items to one or more locations. The robotic device may be provided with scheduling information for task execution or for pick up, delivery, distribution, and/or transportation of one or more items. Once tasks are complete, the robotic device may autonomously navigate to a storage location.
    Type: Grant
    Filed: September 10, 2018
    Date of Patent: January 4, 2022
    Assignee: AI Incorporated
    Inventor: Ali Ebrahimi Afrouzi
  • Patent number: 11207780
    Abstract: A path planning apparatus is provided with a path planning unit that generates a path of a robot using a plurality of different path planning methods that respectively correspond to a plurality of different constraints determined from the posture of the robot and the characteristics of one or more obstacles that obstruct movement of the robot, an acquisition unit that acquires posture information indicating an initial posture of a robot for which a path is to be generated and a target posture of the robot, and obstacle information indicating a target obstacle that obstructs movement of the robot from the initial posture to the target posture, and a controller that controls the path planning unit so as to generate a path of the robot using a path planning method corresponding to a constraint determined from the posture information and the obstacle information acquired by the acquisition unit.
    Type: Grant
    Filed: May 16, 2019
    Date of Patent: December 28, 2021
    Assignee: OMRON Corporation
    Inventors: Toshihiro Moriya, Kennosuke Hayashi, Akane Nakashima, Takeshi Kojima, Haruka Fujii
  • Patent number: 11203121
    Abstract: A control device of a robot includes: an image data acquirer configured to acquire image data taken by a camera, the image data including a teaching substrate and a substrate placing portion of a hand, the teaching substrate being arranged as a teaching target at a substrate target position; a virtual substrate information generator configured to generate information of a virtual substrate virtually arranged at the substrate placing portion of the hand in the image data of the camera; a distance information calculator configured to calculate distance information from the substrate placing portion to the teaching substrate based on the image data of the camera; an operation control unit configured to control operation of the robot arm based on the distance information from the substrate placing portion to the teaching substrate such that the virtual substrate coincides with the teaching substrate; and a storage unit configured to store, as teaching data, the position of the hand when the virtual substrate coincides with the teaching substrate.
    Type: Grant
    Filed: September 26, 2017
    Date of Patent: December 21, 2021
    Assignee: KAWASAKI JUKOGYO KABUSHIKI KAISHA
    Inventors: Hirohiko Goto, Tetsuya Yoshida, Haruhiko Tan, Kazuo Fujimori, Katsuhiro Yamashita, Masahiko Sumitomo
  • Patent number: 11199853
    Abstract: Provided is a tangible, non-transitory, machine readable medium storing instructions that when executed by a processor effectuates operations including: capturing, with at least one exteroceptive sensor, readings of an environment and capturing, with at least one proprioceptive sensor, readings indicative of displacement of a wheeled device; estimating, with the processor using an ensemble of simulated positions of possible new locations of the wheeled device, the readings of the environment, and the readings indicative of displacement, a corrected position of the wheeled device to replace a last known position of the wheeled device; determining, by the processor using the readings of the exteroceptive sensor, a most feasible position of the wheeled device as the corrected position; and, transmitting, by the processor, status information of tasks performed by the wheeled device to an external processor, wherein the status information initiates a second wheeled device to perform a second task.
    Type: Grant
    Filed: July 11, 2019
    Date of Patent: December 14, 2021
    Assignee: AI Incorporated
    Inventors: Ali Ebrahimi Afrouzi, Lukas Fath, Chen Zhang, Brian Highfill, Amin Ebrahimi Afrouzi, Shahin Fathi Djalali, Masih Ebrahimi Afrouzi, Azadeh Afshar Bakooshli
  • Patent number: 11172605
    Abstract: An autonomous lawn mower (100) and a system for navigating the same are disclosed. The autonomous lawn mower (100) comprises a mower body (102) having at least one motor (210) arranged to drive a cutting blade (212b) and to propel the mower body (102) on an operating surface via a wheel arrangement. The mower body (102) includes a navigation system (204) arranged to assist a controller (202) to control the operation of the mower body (102) within the predefined operating area (208). The system comprises a plurality of navigation modules (202), each arranged to obtain individual navigation information associated with the navigation of the autonomous mower (100). The navigation modules (202) operate to generate a virtual representation of the operation area (208) of the lawn mower (100) during an initialization mode, and the virtual representation of the operation area (208) is processed with the obtained navigation information during the operation of the lawn mower (100).
    Type: Grant
    Filed: April 26, 2017
    Date of Patent: November 16, 2021
    Assignee: TTI (MACAO COMMERCIAL OFFSHORE) LIMITED
    Inventors: Klaus Hahn, Todd Brandon Rickey, Benjamin Edgar Montgomery, Nikolai Ensslen, Levent Yildirim
  • Patent number: 11167811
    Abstract: A robotic obstacle-crossing device mainly includes a wheel body and an obstacle-crossing body, wherein the wheel body includes a wheel part, a first obstacle-crossing part, and a second obstacle-crossing part. When the sweeping robot tilts, the plurality of first recessed portions and the plurality of second recessed portions provided on the periphery of the first obstacle-crossing part and the second obstacle-crossing part provide a climbing function. In addition, when the sweeping robot encounters obstacles or steps, the obstacle-crossing body can provide the robot with an obstacle-crossing or climbing function, thereby reducing the number of situations in which the sweeping robot is trapped or unable to climb effectively upon encountering an obstacle or a steep road surface.
    Type: Grant
    Filed: May 20, 2019
    Date of Patent: November 9, 2021
    Inventor: Jason Yan
  • Patent number: 11151281
    Abstract: The disclosure relates to a video monitoring method for a mobile robot. The method includes the following steps: providing one or more private areas on the basis of a map constructed by a mobile robot; determining, on the basis of the position of the mobile robot and an image shot from the shooting angle of a camera, whether the currently shot image contains private contents; if the currently shot image contains private contents, shielding the private contents contained in the image; and if the currently shot image does not contain private contents, continuing the monitoring by the mobile robot. By providing one or more private areas in a map constructed by a mobile robot, the mobile robot can determine whether a shot image contains private contents during the subsequent video monitoring process, so as to shield the private contents in the image, thereby ensuring security in the video monitoring of the mobile robot and avoiding privacy leakage.
    Type: Grant
    Filed: August 6, 2018
    Date of Patent: October 19, 2021
    Assignee: AMICRO SEMICONDUCTOR CO., LTD.
    Inventor: Qinwei Lai
  • Patent number: 11144028
    Abstract: The disclosure provides for transmitting a sensor value from a sensor to an external device with high efficiency. A sensor value of a sensor is acquired, basic data as time-series data is generated with reference to the acquired sensor value, differential data indicating a difference between the basic data and measurement data as time-series data corresponding to the sensor value acquired from the sensor is generated, and the differential data is transmitted to an external device through wireless communication.
    Type: Grant
    Filed: March 13, 2019
    Date of Patent: October 12, 2021
    Assignee: OMRON Corporation
    Inventors: Yoshimitsu Nakano, Hidekatsu Nogami, Yahiro Koezuka, Arata Kataoka
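The abstract for patent 11144028 above describes transmitting only the difference between a reference ("basic") time series and the measured series. Below is an illustrative delta-encoding sketch under that reading; the data and names are made up and it is not taken from the patent.

```python
def encode_differential(basic, measurement):
    """Sender side: transmit only per-sample differences from the shared basic data."""
    return [m - b for m, b in zip(measurement, basic)]

def decode_differential(basic, differential):
    """Receiver side: reconstruct the measurement from the same basic data."""
    return [b + d for b, d in zip(basic, differential)]

basic = [2000, 2001, 2002, 2003]          # reference series known to both ends (raw counts)
measurement = [2000, 2003, 2002, 2005]    # actual readings for this cycle
diff = encode_differential(basic, measurement)   # [0, 2, 0, 2] -> small values, cheap to send
assert decode_differential(basic, diff) == measurement
```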
  • Patent number: 11135718
    Abstract: Robots, users, or a central controller may leverage Geo analytics and/or augmented reality to search for, discover, access and use robots. The robots may perform tasks to provide selective services on-demand within medicine, agriculture, military, entertainment, manufacturing, personal, or public safety, among other things.
    Type: Grant
    Filed: April 13, 2020
    Date of Patent: October 5, 2021
    Assignee: AT&T Intellectual Property I, L.P.
    Inventor: Venson Shaw
  • Patent number: 11112505
    Abstract: A robotic work tool system, comprising a robotic work tool, said robotic work tool comprising a position determining device for determining a current position and at least one deduced reckoning (also known as dead reckoning) navigation sensor, the robotic work tool being configured to determine that a reliable and accurate current position is possible to determine and in response thereto determine an expected navigation parameter, compare the expected navigation parameter to a current navigation parameter to determine a navigation error, determine if the navigation error is negligible, and if the navigation error is not negligible, cause the robotic work tool to change its trajectory to accommodate for the navigation error.
    Type: Grant
    Filed: February 27, 2020
    Date of Patent: September 7, 2021
    Assignee: HUSQVARNA AB
    Inventors: Magnus Öhrlund, Peter Reigo
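The abstract for patent 11112505 above compares an expected navigation parameter (derived when a reliable position is available) with the dead-reckoned one and corrects the trajectory only if the error is not negligible. The sketch below assumes heading as the navigation parameter and uses illustrative thresholds; it is a stand-in, not the patented logic.

```python
import math

NEGLIGIBLE_HEADING_ERROR = math.radians(2.0)   # assumed tolerance

def heading_between(p_from, p_to):
    return math.atan2(p_to[1] - p_from[1], p_to[0] - p_from[0])

def check_navigation(prev_fix, current_fix, dead_reckoned_heading):
    expected = heading_between(prev_fix, current_fix)               # from reliable positions
    error = math.atan2(math.sin(expected - dead_reckoned_heading),
                       math.cos(expected - dead_reckoned_heading))  # wrap to [-pi, pi]
    if abs(error) <= NEGLIGIBLE_HEADING_ERROR:
        return 0.0                    # error negligible: keep the current trajectory
    return error                      # steer correction to accommodate the navigation error

correction = check_navigation((0.0, 0.0), (1.0, 0.1), dead_reckoned_heading=0.0)
```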
  • Patent number: 11109731
    Abstract: A moving robot is provided. The moving robot includes: a main body forming the exterior appearance; a moving means for moving the main body; a bumper configured to protrude on the outer perimeter of the main body; impact sensors placed at an angle on the main body to sense the movement of the bumper; and pressure parts formed in a bent shape at the end of the impact sensors.
    Type: Grant
    Filed: January 19, 2018
    Date of Patent: September 7, 2021
    Assignee: LG Electronics Inc.
    Inventors: Jaewon Jang, Sungho Yoon
  • Patent number: 11104000
    Abstract: Aspects of the present disclosure include methods, apparatuses, and computer readable media of traversing an obstacle including receiving optical data associated with an area near the robot, identifying the obstacle in the area based on the optical data, deflating, in response to identifying the obstacle, a ball of the robot, and applying a downward force through the deflated ball to propel the robot over the obstacle.
    Type: Grant
    Filed: October 17, 2019
    Date of Patent: August 31, 2021
    Assignee: HONDA MOTOR CO., LTD.
    Inventor: Soshi Iba
  • Patent number: 11107344
    Abstract: A rescue system includes: a plurality of movable bodies each equipped with a camera; and a server configured to communicate with the plurality of movable bodies. The rescue system identifies a protection target, based on information acquired by the camera. The server is configured to (a) define a search area to be searched for the protection target, (b) acquire positional information about the plurality of movable bodies and select, from movable bodies located within the search area, at least one movable body to be used for searching for the protection target, the movable body being selected as a selected movable body, and (c) output, to the selected movable body, a search command for searching for the protection target.
    Type: Grant
    Filed: November 13, 2018
    Date of Patent: August 31, 2021
    Assignee: TOYOTA JIDOSHA KABUSHIKI KAISHA
    Inventors: Hiroki Sawada, Masato Tamaoki, Eisuke Ando, Masato Endo, Kuniaki Hasegawa
  • Patent number: 11097742
    Abstract: The system includes an examination controller that performs control relating to medical examination of a client user in the mobile object, and a controller configured to decide whether or not to transport the client user to a medical facility and to create an operation command so as to cause the mobile object to pick up the client user at a first destination, which is a location based on information sent from a client user's terminal used by the client user, and, when the controller decides to transport the client user to the medical facility after the mobile object moves to the first destination, to transport the client user aboard the mobile object to the medical facility as a second destination.
    Type: Grant
    Filed: December 19, 2018
    Date of Patent: August 24, 2021
    Assignee: Toyota Jidosha Kabushiki Kaisha
    Inventors: Isao Kanehara, Kazuhiro Umeda, Hideo Hasegawa, Tsuyoshi Okada, Shinjiro Nagasaki
  • Patent number: 11099568
    Abstract: A self-driving vehicle system comprising a body having one or more motorized wheels, and a sensor head coupled to the body. The sensor head is movable from a retracted position to an extended position relative to the body. The sensor head comprises one or more proximity sensors and one or more cameras.
    Type: Grant
    Filed: September 6, 2018
    Date of Patent: August 24, 2021
    Assignee: LINGDONG TECHNOLOGY (BEIJING) CO. LTD
    Inventors: Yaming Tang, Liang Han, Chiung Lin Chen
  • Patent number: 11086334
    Abstract: Systems and methods are provided for crowdsourcing a sparse map for autonomous vehicle navigation. In one implementation, a non-transitory computer-readable medium may include a sparse map for autonomous vehicle navigation along a road segment. The sparse map may include at least one line representation of a road surface feature extending along the road segment, each line representation representing a path along the road segment substantially corresponding with the road surface feature, and wherein the road surface feature is identified through image analysis of a plurality of images acquired as one or more vehicles traverse the road segment and a plurality of landmarks associated with the road segment.
    Type: Grant
    Filed: July 21, 2017
    Date of Patent: August 10, 2021
    Assignee: MOBILEYE VISION TECHNOLOGIES LTD.
    Inventor: Ofer Fridman
  • Patent number: 11079240
    Abstract: A method for localization of a mobile automation apparatus includes, at a navigational controller: generating a set of candidate poses within an environmental map; updating the candidate poses according to motion data corresponding to movement of the mobile automation apparatus; receiving observational data collected at an actual pose of the mobile automation apparatus; generating (i) respective weights for the candidate poses, each weight indicating a likelihood that the corresponding candidate pose matches the actual pose, and (ii) a localization confidence level based on the weights; responsive to determining whether the localization confidence level exceeds a candidate pose resampling threshold: when the determination is affirmative, (i) increasing the candidate pose resampling threshold and (ii) generating a further set of candidate poses; and when the determination is negative, (i) decreasing the candidate pose resampling threshold without generating the further set; and repeating the updating, the receiving, and the generating.
    Type: Grant
    Filed: December 7, 2018
    Date of Patent: August 3, 2021
    Assignee: Zebra Technologies Corporation
    Inventors: Sadegh Tajeddin, Harsoveet Singh, Jingxing Qian
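The abstract for patent 11079240 above weighs candidate poses against observations, derives a confidence level from the weights, and raises or lowers a resampling threshold depending on whether the confidence exceeds it. Below is a hypothetical particle-filter-style sketch of that gating; the weighting model and step size are illustrative assumptions.

```python
import random

def localization_step(candidates, weigh, threshold, step=0.05):
    """candidates: list of poses; weigh: pose -> likelihood of matching the observations."""
    weights = [weigh(p) for p in candidates]
    total = sum(weights) or 1e-12
    confidence = max(weights) / total            # peaked weights -> high confidence
    if confidence > threshold:
        # Affirmative: raise the threshold and draw a fresh candidate set around good poses.
        threshold = min(1.0, threshold + step)
        candidates = random.choices(candidates, weights=weights, k=len(candidates))
    else:
        # Negative: lower the threshold and keep the existing candidate set.
        threshold = max(0.0, threshold - step)
    return candidates, threshold, confidence

poses = [(x * 0.1, 0.0, 0.0) for x in range(20)]
weigh = lambda p: 1.0 / (1e-3 + abs(p[0] - 1.0))   # observations favour x ~= 1.0
poses, thr, conf = localization_step(poses, weigh, threshold=0.2)
```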
  • Patent number: 11072077
    Abstract: A robot system which includes a robot with a plurality of joints; a voice operation device to which a voice operation command is input by an operator; and a controller which controls an operation of the robot, the controller being configured to change the operation of the robot in a case where, during the operation of the robot, the voice operation device detects a voice with a sound volume equal to or higher than a preset first predetermined sound volume.
    Type: Grant
    Filed: December 2, 2016
    Date of Patent: July 27, 2021
    Assignee: KAWASAKI JUKOGYO KABUSHIKI KAISHA
    Inventor: Masayuki Watanabe
  • Patent number: 11061407
    Abstract: The present disclosure provides an equivalent trajectory generating method for a biped robot and a biped robot using the same. The method includes: obtaining a motion state of the biped robot by a position sensor; determining switching moments in an advancing direction of the biped robot, based on the motion state of the biped robot; finding the mass center position of the biped robot at each switching moment; connecting the mass center positions at the switching moments as an equivalent trajectory of the biped robot; and performing a closed loop control on the biped robot according to the equivalent trajectory. Through the method, the overall real-time position of the robot can be obtained from the equivalent trajectory effectively, which facilitates stable and reliable control of the biped robot according to its equivalent trajectory.
    Type: Grant
    Filed: December 23, 2018
    Date of Patent: July 13, 2021
    Assignee: UBTECH Robotics Corp
    Inventors: Youjun Xiong, Zheng Xie, Chunyu Chen, Yizhang Liu, Ligang Ge
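The abstract for patent 11061407 above builds an "equivalent trajectory" by connecting the center-of-mass positions sampled at each switching moment and then using it for closed-loop control. The sketch below illustrates that construction with linear interpolation and fabricated sample data; it is an assumption, not the patented controller.

```python
def equivalent_trajectory(com_samples, switching_times):
    """com_samples: {t: (x, y)} center-of-mass positions; switching_times: sorted moments."""
    return [(t, com_samples[t]) for t in switching_times]

def interpolate(trajectory, t):
    """Piecewise-linear center-of-mass position at time t along the equivalent trajectory."""
    for (t0, p0), (t1, p1) in zip(trajectory, trajectory[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return (p0[0] + a * (p1[0] - p0[0]), p0[1] + a * (p1[1] - p0[1]))
    return trajectory[-1][1]

com = {0.0: (0.00, 0.00), 0.6: (0.15, 0.02), 1.2: (0.30, -0.02), 1.8: (0.45, 0.02)}
traj = equivalent_trajectory(com, sorted(com))
print(interpolate(traj, 0.9))   # overall body position between two switching moments
```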
  • Patent number: 11059176
    Abstract: A system for facility monitoring and reporting to improve safety using one or more robots includes: a network; a plurality of autonomous mobile robots operating in a facility, the robots configured to monitor facility operation, the robots further configured to detect a predetermined critical condition, the robots operably connected to the network; a server operably connected to the robots over the network; and an individual robot operably connected to the server over the network, the individual robot operating in the facility, the robots not comprising the individual robot, the individual robot configured to monitor facility operation; wherein the robots are configured to regularly produce a regular report under normal operating conditions, the report displaying data received from the server, wherein the robots are further configured to produce to the server a critical condition report upon occurrence of the critical condition.
    Type: Grant
    Filed: December 16, 2020
    Date of Patent: July 13, 2021
    Assignee: Fetch Robotics, Inc.
    Inventors: Niharika Arora, Melonee Wise, Brian Cairl, Carl Saldanha, Robert Chatman, III, Levon Avagyan, Aaron Hoy, Stefan Nusser, David Dymesich, David Robson
  • Patent number: 11003177
    Abstract: An apparatus including a combination possibility calculation unit to calculate a stable orientation in which, from three-dimensional shape data of a part, the part is stabilized on a flat surface, to calculate a grasping method for grasping the part with a hand, and to calculate a combination in which the hand does not interfere from system configuration data including information on a connection destination of the hand and a combination group of the grasping method and the stable orientation; a regrasping path calculation unit to calculate a regrasping path of the part by using the calculated combination; a path group calculation unit to calculate a path having the minimum number of teaching points from the regrasping path as a path group based on orientation data for designating an input orientation and an alignment orientation of the part; and a program generation unit to generate a program of a robot based on the path group.
    Type: Grant
    Filed: September 4, 2017
    Date of Patent: May 11, 2021
    Assignee: MITSUBISHI ELECTRIC CORPORATION
    Inventors: Tatsuya Nagatani, Haruhisa Okuda
  • Patent number: 11000949
    Abstract: A control device includes a learning control part in which a difference is calculated between a target position and an actual position of a portion detected based on a sensor, and an operation-speed change rate is increased or reduced several times within a maximum value of the operation-speed change rate set for increasing or reducing the operation speed of a robot mechanism unit and within allowance conditions of vibrations occurring at the portion to be controlled; meanwhile, learning is repeated to calculate an updated compensation amount based on the difference and a previous compensation amount previously calculated for suppressing vibrations at each operation-speed change rate, and a convergent compensation amount and a convergent operation-speed change rate are stored after convergence of the compensation amount and the operation-speed change rate.
    Type: Grant
    Filed: February 21, 2018
    Date of Patent: May 11, 2021
    Assignee: FANUC CORPORATION
    Inventors: Shinichi Washizu, Hajime Suzuki, Kaimeng Wang
  • Patent number: 10997491
    Abstract: A method, device, and system for prediction of a state of an object in the environment using an action model of a neural network. In accordance with one aspect, a control system for an object comprises a processor, a plurality of sensors coupled to the processor for sensing a current state of the object and an environment in which the object is located, and a first neural network coupled to the processor. One or more predicted subsequent states of the object in the environment are determined using an action model of the neural network, a current state of the object in the environment, and a plurality of action sequences. The action model comprises a mapping of states of the object in the environment and actions performed by the object for each state to predicted subsequent states of the object in the environment.
    Type: Grant
    Filed: October 4, 2017
    Date of Patent: May 4, 2021
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Hengshuai Yao, Seyed Masoud Nosrati, Hao Chen, Peyman Yadmellat, Yunfei Zhang