Programmed Data (e.g., Path) Modified By Sensed Data Patents (Class 700/253)
  • Patent number: 11953888
    Abstract: A production cell includes: at least one robot arranged to handle products; at least one buffer area for intermediate storage of products inside the production cell; a vision system with cameras arranged to determine, based on images from the cameras, the identity and the location of objects in the production cell; a plurality of production modules, each production module comprising at least one Hardware Module configured to process products; and a plurality of module attachment locations, each module attachment location being configured to connect with an interface section of a production module through at least a physical connection and a power connection.
    Type: Grant
    Filed: October 25, 2018
    Date of Patent: April 9, 2024
    Assignee: Festo SE & Co. KG
    Inventors: Alfons Riek, Curt-Michael Stoll, Hans Klingel, Marcel Aeschlimann, Samuel Malzach, Christian Schmid, Christoph Berger, Judith Wimmer, Ivo Aschwanden, Kilian Iannucci, Alexandra Krause, Markus Andreas Müller, Martin Helmer, Peter Barmettler
  • Patent number: 11951625
    Abstract: A control method for a robot includes a first working step of executing first work on a first working object by operating a robot arm by force control based on a predetermined position command value, a first memory step of storing first position information of a trajectory in which a control point set for the robot arm passes at the first working step, and a second working step of updating a position command value for the robot arm based on the first position information stored at the first memory step, and executing second work on a second working object by operating the robot arm by the force control based on an updated value as the updated position command value.
    Type: Grant
    Filed: June 29, 2021
    Date of Patent: April 9, 2024
    Assignee: SEIKO EPSON CORPORATION
    Inventor: Nobuhiro Karito
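A minimal Python sketch of the record-then-replay scheme this abstract describes (class and method names, the 1-D positions, and the compliance model are illustrative assumptions, not taken from the patent):

```python
class TrajectoryReplayController:
    def __init__(self, nominal_commands):
        self.commands = list(nominal_commands)  # predetermined position command values
        self.recorded = []                      # first memory step: logged trajectory

    def first_working_step(self, compliance):
        """Execute the first work under force control; log where the control
        point actually went (force control lets it deviate from the command)."""
        self.recorded = []
        for cmd in self.commands:
            actual = cmd + compliance(cmd)  # deviation modeled as a compliance offset
            self.recorded.append(actual)
        return self.recorded

    def second_working_step(self):
        """Update the position command values from the recorded trajectory,
        then the second work runs force control around the updated values."""
        if self.recorded:
            self.commands = list(self.recorded)
        return self.commands
```

The point of the update is that the second workpiece is approached along the path the first one actually produced, so the force controller starts closer to the contact geometry.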
  • Patent number: 11934196
    Abstract: A lawn vehicle network includes a charging station having a visual identifier, a lawn vehicle having a battery, a blade system, a drive system whose output effects lawn vehicle forward movement, a processor board connected to both systems, the processor board capable of processing image data and sending commands to both systems, and a vision assembly connected to the processor board and able to transmit image data to the processor board, and the processor board, having received the image data, able to, if the image data represent a first object, maintain the drive system's output at the time of that determination, if the image data represent a second object, change the drive system's output at the time of that determination, and if the image data represent the visual identifier, maintain the drive system's output or send a shutoff command to the vision assembly at the time of that determination.
    Type: Grant
    Filed: December 13, 2019
    Date of Patent: March 19, 2024
    Assignee: Hydro-Gear Limited Partnership
    Inventors: Damon J. Hoyda, Eric S. Phanco, John Tyler Hibbard, David H. Dunten
  • Patent number: 11906966
    Abstract: A robot detects, through a sensor, the location and movement direction of a user and an object near the user, sets a nearby ground area in front at the feet of the user according to the detected location and movement direction of the user, controls an illumination device in the robot to irradiate the nearby ground area with light while driving at least one pair of legs or wheels of the robot to cause the robot to accompany the user, specifies the type and the location of the detected object, and if the object is a dangerous object and is located ahead of the user, controls the illumination device to irradiate a danger area including at least a portion of the dangerous object with light in addition to irradiating the nearby ground area with light.
    Type: Grant
    Filed: October 21, 2022
    Date of Patent: February 20, 2024
    Assignee: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD.
    Inventor: Hiroshi Yahata
  • Patent number: 11886190
    Abstract: A robot detects, through a sensor, the location and movement direction of a user and an object near the user, sets a nearby ground area in front at the feet of the user according to the detected location and movement direction of the user, controls an illumination device in the robot to irradiate the nearby ground area with light while driving at least one pair of legs or wheels of the robot to cause the robot to accompany the user, specifies the type and the location of the detected object, and if the object is a dangerous object and is located ahead of the user, controls the illumination device to irradiate a danger area including at least a portion of the dangerous object with light in addition to irradiating the nearby ground area with light.
    Type: Grant
    Filed: February 3, 2023
    Date of Patent: January 30, 2024
    Assignee: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD.
    Inventor: Hiroshi Yahata
  • Patent number: 11858141
    Abstract: An impedance control method as well as a controller and a robot using the same are provided. The method includes: obtaining joint motion information and joint force information in the joint space of a robotic arm and an actual interaction force acting on an end-effector, and calculating actual motion information of the end-effector in the task space based on the joint motion information; calculating a corrected desired trajectory using environment information and a desired end-effector interaction force, and calculating the impedance control torque based on the joint force information, the actual interaction force, the actual motion information, and desired end-effector information including the corrected desired trajectory; and determining a compensation torque based on a nonlinear term in a constructed dynamics equation, so as to perform a joint torque control on the robotic arm based on the impedance control torque and the compensation torque.
    Type: Grant
    Filed: August 23, 2021
    Date of Patent: January 2, 2024
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Liqun Huang, Xiaoyu Ren, Mingguo Zhao, Youjun Xiong
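A one-DOF sketch of the torque computation this abstract outlines: a task-space impedance law mapped to joint space through the Jacobian, plus a compensation torque from the nonlinear term of the dynamics. All gains, the scalar Jacobian, and the exact functional form are assumptions for illustration:

```python
def impedance_control_torque(q, dq, x, dx, x_des, dx_des, f_ext, f_des,
                             jacobian, stiffness, damping, nonlinear_term):
    """Illustrative 1-DOF impedance + compensation torque (not the patent's
    exact formulation)."""
    # Task-space impedance force: spring-damper toward the corrected desired
    # trajectory, plus the error between desired and actual interaction force.
    f_imp = stiffness * (x_des - x) + damping * (dx_des - dx) + (f_des - f_ext)
    impedance_torque = jacobian * f_imp          # J^T * f in the scalar case
    compensation_torque = nonlinear_term(q, dq)  # e.g. gravity/Coriolis terms
    return impedance_torque + compensation_torque
```

The split mirrors the abstract: the impedance term shapes the interaction, while the compensation term cancels the nonlinear dynamics so the joint torque controller sees a simpler plant.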
  • Patent number: 11825342
    Abstract: Systems, apparatuses, and methods for reducing network bandwidth usage by a fleet of robots. According to at least one non-limiting exemplary embodiment, robots coupled to a server collect and produce a substantial amount of data, only a portion of that data being useful for operators to monitor behavior of the robot. The present disclosure provides for, inter alia, optimized systems, apparatuses, and methods for operators to extract the useful data using only reduced bandwidth of cellular LTE networks or Wi-Fi networks.
    Type: Grant
    Filed: September 2, 2021
    Date of Patent: November 21, 2023
    Assignee: Brain Corporation
    Inventors: Keith Chester, Daniel Sackinger
  • Patent number: 11801937
    Abstract: Systems and methods for autonomously herding birds in accordance with embodiments of the invention are illustrated. One embodiment includes an autonomous flock herding system including at least one bird location sensor, an unmanned aerial vehicle (drone), and a control system including a processor and a memory, the memory containing a flock herding application, where the application directs the processor to obtain bird position data from the at least one bird location sensor, where the bird position data describes the location of birds in a flock of birds, determine whether the flock of birds will enter a protected zone, generate a set of waypoints using a flock dynamics model, and instruct the unmanned aerial vehicle to navigate to at least one waypoint in the set of waypoints such that the flock of birds will, in response to the presence of the unmanned aerial vehicle at the at least one waypoint, change trajectory away from the protected zone.
    Type: Grant
    Filed: July 26, 2019
    Date of Patent: October 31, 2023
    Assignees: California Institute of Technology, Imperial College Innovations Limited
    Inventors: Soon-Jo Chung, Aditya Paranjape, Kyunam Kim
  • Patent number: 11789455
    Abstract: Embodiments of the present application disclose a positioning method and apparatus, an autonomous driving vehicle, an electronic device and a storage medium, relating to the field of autonomous driving technologies, comprising: collecting first pose information measured by an inertial measurement unit within a preset time period, and collecting second pose information measured by a wheel tachometer within the time period; generating positioning information according to the first pose information, the second pose information and the adjacent frame images; and controlling driving of the autonomous driving vehicle according to the positioning information. The positioning information is estimated by combining the first pose information and the second pose information corresponding to the inertial measurement unit and the wheel tachometer, respectively. Compared with a camera, the inertial measurement unit and the wheel tachometer are not prone to interference from the external environment.
    Type: Grant
    Filed: December 29, 2020
    Date of Patent: October 17, 2023
    Assignees: Beijing Baidu Netcom Science Technology Co., Ltd., Apollo Intelligent Driving Technology (Beijing) Co., Ltd.
    Inventors: Wendong Ding, Xiaofei Rui, Gang Wang, Shiyu Song
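The combination step of this abstract can be sketched as a simple per-sample blend of the two pose streams; the blend weight `alpha` and the 1-D poses are assumptions for illustration (the patent's actual fusion also uses the camera frames):

```python
def fuse_pose(imu_poses, wheel_poses, alpha=0.5):
    """Blend first pose information (IMU) with second pose information
    (wheel tachometer) collected over the same time period into one
    positioning estimate per sample (illustrative complementary fusion)."""
    return [alpha * a + (1.0 - alpha) * b
            for a, b in zip(imu_poses, wheel_poses)]
```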
  • Patent number: 11772265
    Abstract: A method for controlling movement of a robot includes inputting a first linear velocity parameter and a first angular velocity parameter, which are specified as the robot moves, into a first limit model for limiting a centripetal acceleration to correct the first linear velocity parameter and the first angular velocity parameter, thereby calculating a second linear velocity parameter and a second angular velocity parameter, inputting the second linear velocity parameter and the second angular velocity parameter into at least one of a second limit model for limiting a linear velocity and a third limit model for limiting an angular velocity to correct the second linear velocity parameter and the second angular velocity parameter, thereby calculating a third linear velocity parameter and a third angular velocity parameter, and controlling the movement of the robot based on the third linear velocity parameter and the third angular velocity parameter.
    Type: Grant
    Filed: August 20, 2020
    Date of Patent: October 3, 2023
    Assignee: Bear Robotics, Inc.
    Inventors: Bryant Leo Pong, Henry A. Leinhos, Sanghun Jung
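The cascade of limit models in this abstract is straightforward to sketch: first scale the commanded pair so the centripetal acceleration |v·w| stays within bounds, then clamp the linear and angular velocities. Scaling both components together (an assumption here) preserves the commanded curvature w/v:

```python
def limit_centripetal(v, w, a_max):
    """First limit model: keep the centripetal acceleration |v*w| within a_max."""
    a = abs(v * w)
    if a <= a_max or a == 0.0:
        return v, w
    s = (a_max / a) ** 0.5  # scaling both v and w by s scales v*w by s**2
    return v * s, w * s

def limit_velocity(v, w, v_max, w_max):
    """Second/third limit models: clamp linear and angular velocity,
    scaling the pair together so the path curvature is kept."""
    s = 1.0
    if abs(v) > v_max:
        s = min(s, v_max / abs(v))
    if abs(w) > w_max:
        s = min(s, w_max / abs(w))
    return v * s, w * s

def limited_command(v1, w1, a_max, v_max, w_max):
    """Cascade from the abstract: (v1, w1) -> (v2, w2) -> (v3, w3)."""
    v2, w2 = limit_centripetal(v1, w1, a_max)
    return limit_velocity(v2, w2, v_max, w_max)
```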
  • Patent number: 11772277
    Abstract: An example robotic tool holder includes an actuator that is disposed within a housing and configured to hold a tool. The housing and the actuator are in contact via dowels to limit movement of the actuator toward a distal end of the housing. The dowels that are in contact are in line contact and are arranged in a triangular geometry. The pressure plate is in line contact with the actuator within the housing around a circumference of the pressure plate. The springs are in contact with the pressure plate to bias the actuator toward a proximal end of the housing via the pressure plate. The springs are in contact with the mounting plate opposite the pressure plate. The sensor switch detects a shock force on the actuator and outputs a signal in response to the shock force.
    Type: Grant
    Filed: April 28, 2020
    Date of Patent: October 3, 2023
    Assignee: Illinois Tool Works Inc.
    Inventor: Nauman Basit
  • Patent number: 11747822
    Abstract: A mobile delivery robot has at least one memory component containing at least map data; at least two cameras adapted to take visual images; and at least one processing component. The at least one processing component is adapted to at least extract straight lines from the visual images taken by the at least two cameras and compare them to the map data to at least localize the robot. The mobile robot employs a localization method which involves taking visual images with at least two cameras; extracting straight lines from the individual visual images with at least one processing component; comparing the extracted features with existing map data; and outputting a location hypothesis based on said comparison.
    Type: Grant
    Filed: June 10, 2021
    Date of Patent: September 5, 2023
    Assignee: STARSHIP TECHNOLOGIES OÜ
    Inventors: Ahti Heinla, Kalle-Rasmus Volkov, Lindsay Roberts, Indrek Mandre
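The compare-and-hypothesize step of this abstract can be sketched with lines parameterized as (angle, offset) pairs; the tolerances, scoring rule, and pose-to-map lookup are assumptions for illustration, not the patent's method:

```python
def line_match_score(observed, mapped, angle_tol=0.1, offset_tol=0.5):
    """Count observed lines (angle, offset) matching some map line within tolerance."""
    score = 0
    for ang_o, off_o in observed:
        for ang_m, off_m in mapped:
            if abs(ang_o - ang_m) < angle_tol and abs(off_o - off_m) < offset_tol:
                score += 1
                break
    return score

def localize(observed_lines, candidate_poses, map_lines_for_pose):
    """Output a location hypothesis: the candidate pose whose predicted map
    lines best match the straight lines extracted from the camera images."""
    return max(candidate_poses,
               key=lambda pose: line_match_score(observed_lines,
                                                 map_lines_for_pose(pose)))
```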
  • Patent number: 11720100
    Abstract: Various embodiments include methods for improving navigation by a processor of a robotic device. Such embodiments may include initiating a start of a predetermined time period associated with semantic information extraction, and determining whether an adverse event related to one or more sensors of the robotic device is detected. Such embodiments may also include identifying a current time slot of the predetermined time period, identifying a current estimated position and orientation of the robotic device, and recording updates to semantic information stored for the one or more sensors based on the identified current time slot and the current estimated position and orientation of the robotic device in response to determining that an adverse event related to one or more sensors of the robotic device is detected.
    Type: Grant
    Filed: March 14, 2018
    Date of Patent: August 8, 2023
    Assignee: QUALCOMM Incorporated
    Inventors: Xiaohui Liu, Yibo Jiang, Jiangtao Ren, Lei Xu, Yanming Zou
  • Patent number: 11662735
    Abstract: The invention relates to a method for updating a control model for automatic control of at least one mobile unit. A central control unit generates a detection task and transmits same to the mobile unit. The mobile unit comprises sensors, and the detection task comprises conditions for detecting sensor data sets by means of the sensors. The mobile unit detects the sensor data sets by means of the sensors using the detection task, generates transmission data using the detected sensor data sets, and transmits the transmission data to the central control unit. The central control unit receives the transmission data and generates an updated control model using the received transmission data. The system according to the invention for updating a control model for automatic control of at least one mobile unit comprises a central control unit by means of which a detection task may be generated and transmitted to the mobile unit.
    Type: Grant
    Filed: September 24, 2018
    Date of Patent: May 30, 2023
    Assignee: VOLKSWAGEN AKTIENGESELLSCHAFT
    Inventors: Fabian Hüger, Peter Schlicht
  • Patent number: 11654553
    Abstract: A robot system according to an embodiment includes one or more processors. The processors acquire first input data predetermined as data affecting an operation of a robot. The processors calculate a calculation cost of inference processing using a machine learning model for inferring control data used for controlling the robot, on the basis of the first input data. The processors infer the control data by the machine learning model set according to the calculation cost. The processors control the robot using the inferred control data.
    Type: Grant
    Filed: February 25, 2020
    Date of Patent: May 23, 2023
    Assignee: KABUSHIKI KAISHA TOSHIBA
    Inventors: Shuhei Nitta, Atsushi Yaguchi, Yukinobu Sakata, Akiyuki Tanizawa, Yasutoyo Takeyama, Tomoki Watanabe
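The model-selection loop this abstract describes can be sketched by sizing the inference model to a compute budget derived from the input data; the cost proxy, budget, and model names are illustrative assumptions:

```python
def estimate_inference_cost(input_data):
    """Illustrative cost proxy: more items in the first input data imply a
    heavier inference workload (stand-in for the patent's calculation)."""
    return len(input_data)

def infer_control(input_data, small_model, large_model, cost_budget):
    """Set the machine learning model according to the calculation cost,
    then infer the control data used for controlling the robot."""
    cost = estimate_inference_cost(input_data)
    model = small_model if cost > cost_budget else large_model
    return model(input_data)
```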
  • Patent number: 11634887
    Abstract: In a method of controlling construction machinery, a bucket of a working device is moved along a first excavation trajectory to perform an excavation operation on the ground of a work area. A digging force exerted on the bucket during the excavation operation is calculated. A new second excavation trajectory is generated based on the calculated digging force. The bucket is moved along the second excavation trajectory.
    Type: Grant
    Filed: November 19, 2020
    Date of Patent: April 25, 2023
    Assignee: DOOSAN INFRACORE CO., LTD.
    Inventors: Changmook Kim, Dongjun Lee, Changu Kim, Bukun Son
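The regeneration step of this abstract can be sketched as a per-point depth adjustment driven by the digging force measured along the first trajectory; the proportional rule, positive-down depth convention, and gain are assumptions, not the patent's formulation:

```python
def next_excavation_trajectory(depths, digging_forces, force_target, gain):
    """Generate a second excavation trajectory from forces measured on the
    first: where force exceeded the target, cut shallower; where it fell
    short, dig deeper (depths are positive-down, illustrative 1-D profile)."""
    return [depth - gain * (force - force_target)
            for depth, force in zip(depths, digging_forces)]
```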
  • Patent number: 11613018
    Abstract: Systems and methods for automatic restocking different items in retail store environments having POS locations for the items are disclosed. The method includes, for a first item of the different items, storing at least one first item parameter uniquely identifying the first item. The method includes determining, based on the at least one first item parameter, a first autonomous movement control scheme for manipulation of the first item by a robotic arm. The method includes executing, by the robotic arm, the first control scheme, the executing including shelving the first item on the shelf. The method includes evaluating, by a processor or a user of the robotic arm, the executing for the first item according to at least one predetermined first performance criteria. The method includes determining and storing, based on the evaluating, an updated first control scheme for subsequent executing by the robotic arm for the first item.
    Type: Grant
    Filed: March 8, 2019
    Date of Patent: March 28, 2023
    Assignee: ABB Schweiz AG
    Inventors: Gregory A. Cole, Gregory F. Rossano, Jordi Artigas, Harald Staab, Thomas A. Fuhlbrigge, Carlos Martinez, Sangeun Choi, Jianjun Wang, Xiongzi Li
  • Patent number: 11592829
    Abstract: A control device and a control method can quickly estimate a self-location even when the self-location is unknown. In a case of storing time-series information detected by LIDAR or a wheel encoder and estimating a self-location using the stored time-series information, when an unpredictable position change, such as a kidnap state, is detected, the stored time-series information is reset and the self-location is estimated again. Example host platforms include a multi-legged robot, a flying object, and an in-vehicle system that moves autonomously in accordance with a mounted computing machine.
    Type: Grant
    Filed: November 21, 2018
    Date of Patent: February 28, 2023
    Assignee: SONY CORPORATION
    Inventors: Dai Kobayashi, Ryo Watanabe
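The reset-on-kidnap behavior this abstract describes can be sketched with a bounded buffer of odometry deltas; the 1-D pose, buffer size, and class names are assumptions for illustration:

```python
from collections import deque

class TimeSeriesLocalizer:
    """Accumulates time-series deltas (e.g. from LIDAR matching or a wheel
    encoder) into a pose estimate; a detected kidnap clears the history."""

    def __init__(self, maxlen=1000):
        self.deltas = deque(maxlen=maxlen)  # stored time-series information

    def add_measurement(self, delta):
        self.deltas.append(delta)

    def estimate(self, origin=0.0):
        # Dead-reckoning estimate from the stored time series (1-D for brevity).
        return origin + sum(self.deltas)

    def on_kidnap_detected(self, new_origin_hint=0.0):
        # Unpredictable position change: discard stale history, re-estimate.
        self.deltas.clear()
        return self.estimate(origin=new_origin_hint)
```

Clearing rather than correcting the buffer is the key point: after a kidnap the stored deltas no longer describe the robot's actual motion, so estimation restarts from scratch.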
  • Patent number: 11587027
    Abstract: A method of tracking is described. The method may include receiving, from a location device, a location request for an object and identifying a signal from a transponder associated with the object, wherein the transponder comprises at least one or more types of locating technology or sensing technology. The method may also include determining, by a processing device, at least one of the location of the object or information associated with the object based on the identified signal.
    Type: Grant
    Filed: October 29, 2019
    Date of Patent: February 21, 2023
    Assignee: GPS of Things, Inc.
    Inventors: Christopher J. Waters, Brent R. Humphrey
  • Patent number: 11577386
    Abstract: A gripping method relates to a method for gripping an object using a multi-fingered hand provided with a plurality of fingers. The method includes measuring, using a three-dimensional measurement sensor, an area that contains the object, and obtaining three-dimensional information for each position within the area, and deciding positions of the plurality of fingers for gripping the object, by classifying the area, if the area includes a measured area for which the three-dimensional information could be obtained and an unmeasured area for which the three-dimensional information could not be obtained, into the measured area and the unmeasured area based on the distance-indicating information, the positions of the plurality of fingers being decided based on positions of the unmeasured area.
    Type: Grant
    Filed: September 19, 2018
    Date of Patent: February 14, 2023
    Assignee: OMRON Corporation
    Inventors: Yuki Nishina, Yoshinori Konishi
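The classification step this abstract depends on, splitting the sensed area into measured and unmeasured regions, can be sketched over a depth map in which missing readings are marked; the grid representation and `None` marker are assumptions:

```python
def classify_area(depth_map, invalid=None):
    """Split the measured area (valid 3-D readings) from the unmeasured area
    (sensor returned no value), as the finger-placement step requires."""
    measured, unmeasured = [], []
    for y, row in enumerate(depth_map):
        for x, d in enumerate(row):
            (unmeasured if d is invalid else measured).append((y, x))
    return measured, unmeasured
```

Finger positions would then be chosen relative to the unmeasured cells, which is where the abstract places the gripping decision.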
  • Patent number: 11577388
    Abstract: Apparatus, systems, methods, and articles of manufacture for automatic robot perception programming by imitation learning are disclosed. An example apparatus includes a percept mapper to identify a first percept and a second percept from data gathered from a demonstration of a task and an entropy encoder to calculate a first saliency of the first percept and a second saliency of the second percept. The example apparatus also includes a trajectory mapper to map a trajectory based on the first percept and the second percept, the first percept skewed based on the first saliency, the second percept skewed based on the second saliency. In addition, the example apparatus includes a probabilistic encoder to determine a plurality of variations of the trajectory and create a collection of trajectories including the trajectory and the variations of the trajectory.
    Type: Grant
    Filed: June 27, 2019
    Date of Patent: February 14, 2023
    Assignee: Intel Corporation
    Inventors: David I. Gonzalez Aguirre, Javier Felip Leon, Javier Sebastián Turek, Luis Carlos Maria Remis, Ignacio Javier Alvarez, Justin Gottschlich
  • Patent number: 11517168
    Abstract: Provided is a robot cleaner using an artificial intelligence (AI) algorithm and/or a machine learning algorithm in a 5G environment connected for Internet of Things (IoT). The robot cleaner includes one or more sensors, a driving wheel, a suction blower, and a controller, and the controller defines a cleaning target area, identifies a user's location and a type of the user's behavior, collects life pattern information of the user including the user's location, the type of the user's behavior, and timestamps each associated therewith during the time period of one day or more, determines a cleaning schedule of the robot cleaner based on the collected life pattern information, and controls the driving wheel and the suction blower so as to perform cleaning in accordance with the determined cleaning schedule.
    Type: Grant
    Filed: December 18, 2019
    Date of Patent: December 6, 2022
    Assignee: LG ELECTRONICS INC.
    Inventors: Eu Gene Kim, Kwan Young Son, Hyun Seob Lee
  • Patent number: 11520571
    Abstract: The present system is a software defined manufacturing (SDM) system that integrates several technologies and methods into a system that automates the process of engineering and operating automated manufacturing systems (aka “automating automation”). In one embodiment, some or all of the below aspects of the “automating automation” system are integrated: modular, configurable, reusable manufacturing cells; computer vision systems; autocalibration systems; a recipe-based programming environment; configuration management system; production analytics; and a marketplace for sharing recipes.
    Type: Grant
    Filed: November 12, 2020
    Date of Patent: December 6, 2022
    Assignee: Bright Machines, Inc.
    Inventors: Brian Philip Mathews, Ronald Poelman
  • Patent number: 11504849
    Abstract: The present teaching relates to a method and system for path planning. A target is tracked via one or more sensors. Information of a desired pose of an end-effector with respect to the target and a current pose of the end-effector is obtained. Also, a minimum distance permitted between an arm including the end-effector and each of at least one obstacle identified between the current pose of the end-effector and the target is obtained. A weighting factor previously learned is retrieved and a cost based on a cost function is computed in accordance with a weighted smallest distance between the arm including the end-effector and the at least one obstacle, wherein the smallest distance is weighted by the weighting factor. A trajectory is computed from the current pose to the desired pose by minimizing the cost function.
    Type: Grant
    Filed: November 22, 2019
    Date of Patent: November 22, 2022
    Assignee: EDDA TECHNOLOGY, INC.
    Inventors: Yuanfeng Mao, Guo-Qing Wei, Firdous Saleheen, Li Fan, Xiaolan Zeng, Jianzhong Qian
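The cost structure this abstract outlines, a goal term plus a learned weighting factor applied to the smallest arm-obstacle distance, can be sketched in 1-D; the inverse-distance penalty and scalar poses are assumptions, not the patent's exact cost function:

```python
def trajectory_cost(waypoints, goal_pose, obstacles, weight, min_clearance):
    """Illustrative planning cost: tracking error to the desired pose plus a
    weighted proximity penalty on the smallest distance to any obstacle;
    trajectories inside the minimum permitted distance are infeasible."""
    goal_term = abs(waypoints[-1] - goal_pose)
    smallest = min(abs(w - o) for w in waypoints for o in obstacles)
    if smallest < min_clearance:
        return float("inf")  # violates the minimum distance permitted
    return goal_term + weight / smallest
```

A planner would minimize this cost over candidate trajectories from the current pose to the desired pose, with `weight` playing the role of the previously learned weighting factor.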
  • Patent number: 11504846
    Abstract: The present invention relates to a robot teaching system based on image segmentation and surface electromyography, and a robot teaching method thereof, comprising an RGB-D camera, a surface electromyography sensor, a robot and a computer, wherein the RGB-D camera collects video information of robot teaching scenes and sends it to the computer; the surface electromyography sensor acquires surface electromyography signals and inertial acceleration signals of the robot teacher and sends them to the computer; and the computer recognizes an articulated arm and a human joint, detects a contact position between the articulated arm and the human joint, calculates the strength and direction of forces exerted at the contact position after the human joint contacts the articulated arm, and sends a signal controlling the contacted articulated arm to move in accordance with that strength and direction of forces, whereby robot teaching is done.
    Type: Grant
    Filed: January 20, 2021
    Date of Patent: November 22, 2022
    Assignee: QINGDAO UNIVERSITY OF TECHNOLOGY
    Inventors: Chengjun Chen, Yong Pan, Dongnian Li, Zhengxu Zhao, Jun Hong
  • Patent number: 11507103
    Abstract: A method of obstacle handling for a mobile automation apparatus includes: obtaining an initial localization of the mobile automation apparatus in a frame of reference; detecting an obstacle by one or more sensors disposed on the mobile automation apparatus; generating and storing an initial location of the obstacle in the frame of reference, based on (i) the initial localization, and (ii) a detected position of the obstacle relative to the mobile automation apparatus; obtaining a correction to the initial localization of the mobile automation apparatus; and applying a positional adjustment, based on the correction, to the initial location of the obstacle to generate and store an updated location of the obstacle.
    Type: Grant
    Filed: December 4, 2019
    Date of Patent: November 22, 2022
    Assignee: Zebra Technologies Corporation
    Inventors: Peter Arandorenko, Sadegh Tajeddin, Zi Cong Guo
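The adjustment step this abstract describes can be sketched as applying the localization correction, treated here as a 2-D rigid transform (dx, dy, dtheta), to every stored obstacle position; representing the correction this way is an assumption for illustration:

```python
import math

def correct_obstacle_positions(obstacles, correction):
    """Apply a localization correction (dx, dy, dtheta) to obstacle positions
    stored in the frame of reference, yielding updated positions."""
    dx, dy, dth = correction
    out = []
    for (x, y) in obstacles:
        # Rotate about the origin of the frame of reference, then translate.
        xr = x * math.cos(dth) - y * math.sin(dth)
        yr = x * math.sin(dth) + y * math.cos(dth)
        out.append((xr + dx, yr + dy))
    return out
```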
  • Patent number: 11498214
    Abstract: A teaching device constructs, in a virtual space, a virtual robot system in which a virtual 3D model of a robot and a virtual 3D model of a peripheral structure of the robot are arranged, and teaches a moving path of the robot. The teaching device includes an acquisition unit configured to acquire information about a geometric error between the virtual 3D models, and a correction unit configured to correct the moving path of the robot in accordance with the information acquired by the acquisition unit.
    Type: Grant
    Filed: January 3, 2020
    Date of Patent: November 15, 2022
    Assignee: Canon Kabushiki Kaisha
    Inventors: Yasusato Fujieda, Hisashi Matsumoto
  • Patent number: 11500391
    Abstract: The present invention relates to a method for positioning on the basis of vision information and a robot implementing the method. The method for positioning on the basis of vision information, according to an embodiment of the present invention, comprises the steps of: generating, by a control unit of a robot, first vision information by using image information of an object sensed by controlling a vision sensing unit of a sensor module of the robot; generating, by the control unit of the robot, a vision-based candidate position by matching the first vision information with second vision information stored in a vision information storage unit of a map storage unit; and generating, by the control unit, the vision-based candidate position as the position information of the robot when there is one vision-based candidate position.
    Type: Grant
    Filed: May 14, 2018
    Date of Patent: November 15, 2022
    Assignee: LG ELECTRONICS INC.
    Inventors: Byungkon Sohn, Jungmin Shim, Kyuchun Choi
  • Patent number: 11449051
    Abstract: A method for managing movements of a fleet of a plurality of autonomous mobile objects capable of communicating with a management server. An autonomous mobile object is associated with a user terminal with which a user is equipped. The management server performs the following steps, for a given autonomous mobile object, referred to as a first object, associated with a first user terminal with which a user is equipped: obtaining information representing a position to be reached by the first object, referred to as first object destination information; determining information representing a movement to be made by the first object, referred to as first object movement information, the determining taking into account of at least the first object destination information; and transmitting the first object movement information to the first object.
    Type: Grant
    Filed: May 9, 2017
    Date of Patent: September 20, 2022
    Assignee: IFOLLOW
    Inventor: Vincent Jacquemart
  • Patent number: 11425888
    Abstract: An animal farm system includes a barn, animal related structures within the barn, such as a feeding alley and/or a milking system, and an autonomous vehicle arranged to perform an animal related action and move about in the barn. The vehicle includes a control unit to move the vehicle about, a position determining system for determining a position of the vehicle in the barn, a sensor system to determine a value of a parameter related to a position of the vehicle with respect to the barn or an object therein, such as the at least one structure therein, and a vehicle communication device. The control unit further is arranged to contain barn map information, and receive motion control and navigation information via the vehicle communication device.
    Type: Grant
    Filed: October 13, 2017
    Date of Patent: August 30, 2022
    Assignee: LELY PATENT N.V.
    Inventors: Martinus Cornelis Johannes Buijs, Mauro Brenna
  • Patent number: 11413749
    Abstract: Drive unit of an automation component, in particular a gripping, clamping, changing, linear or pivoting unit, whereby the drive unit includes a drive for driving the movable parts of the automation component and a control unit which controls the drive, whereby the control unit includes at least one computing device, and the drive unit together with the drive, control unit and computing device is arranged in or on a base housing of the automation component.
    Type: Grant
    Filed: May 23, 2018
    Date of Patent: August 16, 2022
    Assignee: SCHUNK GmbH & Co. KG Spann-und Greiftechnik
    Inventors: Ralf Becker, Joern Rastetter, Alexander Kupsch, Michael Ohlheiser
  • Patent number: 11400607
    Abstract: An image processing device includes an acquirer, a face detector, and a change detector. The acquirer acquires images captured in succession by an image capturer. The face detector executes a first detection processing (face detection processing) that detects a certain target (face of a person) from a central area of the acquired image. The change detector executes second detection processing to detect a change over time in a surrounding area other than the central area in the acquired image. The second detection processing has a smaller processing load required per unit pixel in the image than that of the first detection processing.
    Type: Grant
    Filed: September 13, 2019
    Date of Patent: August 2, 2022
    Assignee: CASIO COMPUTER CO., LTD.
    Inventor: Tetsuji Makino
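The two-tier processing this abstract describes, heavy detection in the central area and a cheap change detector elsewhere, can be sketched with a per-pixel frame difference restricted to the surrounding area; the grid frames, margin, and threshold are assumptions for illustration:

```python
def split_frame(frame, margin):
    """Split a grayscale frame (list of rows) into central and surrounding
    pixel coordinates, with the central area inset by `margin` on all sides."""
    h, w = len(frame), len(frame[0])
    central, surround = [], []
    for y in range(h):
        for x in range(w):
            if margin <= y < h - margin and margin <= x < w - margin:
                central.append((y, x))
            else:
                surround.append((y, x))
    return central, surround

def surround_changed(prev, curr, margin, threshold):
    """Second detection processing: a cheap per-pixel frame difference over
    the surrounding area only; the heavier first detection processing (face
    detection) would run on the central area."""
    _, surround = split_frame(curr, margin)
    diff = sum(abs(curr[y][x] - prev[y][x]) for y, x in surround)
    return diff / len(surround) > threshold
```

The asymmetry is the point of the abstract: the change detector costs far less per pixel, so the surrounding area can be monitored continuously without paying the face detector's full cost on every frame.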
  • Patent number: 11358274
    Abstract: An autonomous mobile robot comprising a body and a display screen coupled to the body and configured to display information. A wheel control system is coupled to the body and configured to move the body in a given direction. At least one of a camera, an onboard UWB device, or a sensor is coupled to the body. A central processing unit is in communication with the wheel control system, the display screen, and the at least one camera, onboard UWB device, or sensor. The central processing unit is configured to adjust the orientation of the display screen relative to a user based on information received from the at least one camera, onboard UWB device, or sensor.
    Type: Grant
    Filed: June 13, 2019
    Date of Patent: June 14, 2022
    Assignee: LINGDONG TECHNOLOGY (BEIJING) CO. LTD
    Inventor: Herrickhui Yaozhang
  • Patent number: 11318606
    Abstract: The invention relates to a method for controlling an automation assembly which has a robot assembly with at least one robot (10) and a detection means assembly with at least one detection means (21-23), said method having the following at least partly automated steps: providing (S10) a first sequence of first ordinate data (q1, q2, dq2/dt, τ1, τ2) assigned to successive first abscissa points (t) on the basis of first training data (q1, q2, τ1, X2); identifying (S20) a first event point (tE) within the first abscissa points of the first sequence; and determining (S30) a first event criterion on the basis of the first sequence and the first event point.
    Type: Grant
    Filed: March 13, 2017
    Date of Patent: May 3, 2022
    Assignee: KUKA Deutschland GmbH
    Inventor: Helmuth Radrich
  • Patent number: 11305427
    Abstract: A robot system includes a robot, an image acquisition part, an image prediction part, and an operation controller. The image acquisition part acquires a current image captured by a robot camera arranged to move with the end effector. The image prediction part predicts a next image to be captured by the robot camera based on a teaching image model and the current image. The teaching image model is constructed by machine learning of teaching images which the robot camera is predicted to capture while the movable part performs an adjustment operation. The operation controller calculates a command value for operating the movable part so that the image captured by the robot camera approaches the next image, and controls the movable part based on the command value.
    Type: Grant
    Filed: November 28, 2018
    Date of Patent: April 19, 2022
    Assignee: KAWASAKI JUKOGYO KABUSHIKI KAISHA
    Inventors: Hitoshi Hasunuma, Masayuki Enomoto
  • Patent number: 11310417
    Abstract: A vision-based state estimation framework to estimate the state of an underwater remotely operated vehicle (ROV) used for inspection of an underwater structure, for example, a nuclear reactor pressure vessel. The framework employs an external overhead, pan-tilt-zoom (PTZ) camera as the primary sensing modality and incorporates prior knowledge of the geometry of the structure.
    Type: Grant
    Filed: May 3, 2018
    Date of Patent: April 19, 2022
    Assignee: CARNEGIE MELLON UNIVERSITY
    Inventors: Nathan Michael, Tabitha Edith Lee, Curtis Boirum
  • Patent number: 11295534
    Abstract: Systems, methods, and devices for color and texture rendering of a space. A method includes receiving an image comprising an object and receiving an indication of a color. The method includes identifying the object within the image and defining a texture of the object. The method includes selecting a stored texture file stored in a database based on the texture of the object. The method includes merging the stored texture file and the color to generate a color and texture placement that can be implemented in a rendered scene.
    Type: Grant
    Filed: June 12, 2020
    Date of Patent: April 5, 2022
    Inventor: Jeremiah Timberline Barker
  • Patent number: 11285613
    Abstract: The present disclosure provides a robot visual image feature extraction method as well as an apparatus and a robot using the same. The method includes: collecting image data through visual sensor(s) of the robot, and collecting angular velocity data through inertial sensor(s) of the robot; calculating a relative pose between image frames in the image data based on the angular velocity data; extracting feature points of the first image frame in the image data; calculating a projection position of each feature point of the k-th image frame in the k+1-th image frame based on a relative pose between the k-th image frame and the k+1-th image frame; and searching for each feature point at its projection position in the k+1-th image frame, and performing simultaneous localization and mapping based on the feature points found. In this manner, the feature points of dynamic objects are eliminated.
    Type: Grant
    Filed: November 29, 2019
    Date of Patent: March 29, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Chenchen Jiang, Youjun Xiong, Longbiao Bai, Simin Zhang, Jianxin Pang
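The projection step described in the abstract above — predicting where a feature of frame k lands in frame k+1 from a gyroscope-derived relative pose — can be sketched for the pure-rotation case with the infinite homography H = K·R·K⁻¹. The intrinsic values, the zero-skew camera model, and the helper names are assumptions for illustration, not the patent's method:

```python
# Sketch: project a pixel from frame k into frame k+1 under a pure rotation R
# (e.g. integrated from gyroscope angular velocity) via H = K @ R @ inv(K).

def mat_mul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_inv_K(K):
    """Closed-form inverse of a zero-skew intrinsic matrix [[f,0,cx],[0,f,cy],[0,0,1]]."""
    f, cx, cy = K[0][0], K[0][2], K[1][2]
    return [[1 / f, 0, -cx / f], [0, 1 / f, -cy / f], [0, 0, 1]]

def project_feature(u, v, K, R):
    """Predict where pixel (u, v) of frame k appears in frame k+1."""
    H = mat_mul(mat_mul(K, R), mat_inv_K(K))
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return x / w, y / w
```

The search window around the returned position is where the matcher would look for the feature; translation between frames would need depth and is omitted in this rotation-only sketch.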
  • Patent number: 11287824
    Abstract: An example autonomous device is configured to move within a space. The autonomous device includes a first system to detect a first location of the autonomous device within the space, with the first location being based on a first fixed reference; a second system to detect a second location of the autonomous device within the space, with the second location being based on a second fixed reference; and a third system to detect a third location of the autonomous device within the space based on relative movements of the autonomous device. One or more processing devices are configured to select one of the first location or the second location based on reliability of at least one of the first location or the second location, and to control movement of the autonomous device using an estimated location that is based on the third location and the selected one of the first location or the second location.
    Type: Grant
    Filed: November 19, 2018
    Date of Patent: March 29, 2022
    Assignee: MOBILE INDUSTRIAL ROBOTS A/S
    Inventors: Niels Jul Jacobsen, Søren Eriksen Nielsen
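The selection-plus-dead-reckoning logic in the abstract above can be sketched minimally: pick the more reliable of the two absolute fixes, then combine it with the relative motion reported by the third (odometry-like) system. The scalar reliability scores and tuple representation are assumptions:

```python
# Hedged sketch of selecting one absolute location by reliability and
# combining it with relative movement to form the estimated location.

def fuse_location(loc_a, rel_a, loc_b, rel_b, odom_delta):
    """loc_a/loc_b: absolute (x, y) fixes; rel_a/rel_b: their reliabilities;
    odom_delta: (dx, dy) relative movement since those fixes were taken."""
    x, y = loc_a if rel_a >= rel_b else loc_b   # select the more reliable fix
    dx, dy = odom_delta                          # third system: relative movement
    return (x + dx, y + dy)                      # estimated location
```

A production filter would blend rather than hard-select (e.g. a Kalman update), but the hard selection mirrors the "select one of the first location or the second location" wording of the abstract.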
  • Patent number: 11281228
    Abstract: A method and a device for determining a position of a transportation vehicle. A preliminary position of the transportation vehicle is determined, objects are detected in an area surrounding the transportation vehicle wherein the objects have at least visible and virtual building corners, the detected objects are transferred into a local coordinate system and are assigned to a map with objects based on the preliminary position of the transportation vehicle, and the position of the transportation vehicle is determined based on an equalization calculus in which the objects in the map which are most suitable for the objects transferred into the local coordinate system are ascertained.
    Type: Grant
    Filed: June 11, 2019
    Date of Patent: March 22, 2022
    Inventors: Christian Merfels, Moritz Schack
  • Patent number: 11256917
    Abstract: A moving body tracks a target in conjunction with at least one sensing device. The moving body includes a sensor, a communication device, a processor, and a memory in which a computer program is stored. The processor executes the computer program to calculate a position of the target by analyzing an output of the sensor; estimate a region in which the target is present using a last position of the target calculated when sight of the target is lost, or a position obtained by analyzing the output of the sensor after losing sight of the target, together with a movement history of the target or of its own device until the sight of the target is lost; instruct the at least one sensing device selected according to the region to search for the target; and receive a result of the search from the sensing device.
    Type: Grant
    Filed: March 13, 2018
    Date of Patent: February 22, 2022
    Assignee: NIDEC CORPORATION
    Inventor: Mitsuhiro Amo
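The lost-target step in the abstract above — estimating a search region from the last known position and the movement history, then selecting sensing devices covering that region — can be sketched as follows. The constant-velocity extrapolation and the circular region are illustrative assumptions:

```python
import math

# Sketch: grow a circular search region from the last known position along the
# recent velocity, then pick the sensing devices that can cover that region.

def estimate_search_region(history, dt_lost):
    """history: list of (x, y) target positions; dt_lost: steps since sight was lost."""
    (x0, y0), (x1, y1) = history[-2], history[-1]
    vx, vy = x1 - x0, y1 - y0                    # per-step velocity from history
    center = (x1 + vx * dt_lost, y1 + vy * dt_lost)
    radius = math.hypot(vx, vy) * dt_lost        # uncertainty grows while unseen
    return center, radius

def select_sensors(sensors, center, radius):
    """sensors: {sensor_id: (x, y)}; return ids inside the search region."""
    cx, cy = center
    return [sid for sid, (sx, sy) in sensors.items()
            if math.hypot(sx - cx, sy - cy) <= radius]
```

The selected devices would then be instructed to search, and their results fed back over the communication device, as the abstract describes.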
  • Patent number: 11243532
    Abstract: A set of actions corresponding to a particular state of the environment of a vehicle is identified. A respective encoding is generated for different actions of the set, using elements such as distinct colors to distinguish attributes such as target lane segments. Using the encodings as inputs to respective instances of a machine learning model, respective value metrics are estimated for each of the actions. One or more motion-control directives to implement a particular action selected using the value metrics are transmitted to motion-control subsystems of the vehicle.
    Type: Grant
    Filed: September 26, 2018
    Date of Patent: February 8, 2022
    Assignee: Apple Inc.
    Inventors: Martin Levihn, Pekka Tapani Raiko
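The encode-then-score loop in the abstract above can be sketched as follows; the color table, the stub value model, and the action names are hypothetical placeholders for the learned model and encoding the patent describes:

```python
# Illustrative sketch: each candidate action gets an encoding whose target lane
# is distinguished by a distinct color code; a (stub) value model scores each
# encoding; the highest-value action is selected.

LANE_COLORS = {"keep": (0, 255, 0), "left": (255, 0, 0), "right": (0, 0, 255)}

def encode_action(action):
    """Render the action into a model input; real encodings would be images."""
    return {"action": action, "lane_color": LANE_COLORS[action]}

def value_model(encoding):
    """Stand-in for the learned value network; here a fixed lookup table."""
    return {"keep": 0.7, "left": 0.2, "right": 0.5}[encoding["action"]]

def select_action(actions):
    encodings = [encode_action(a) for a in actions]
    return max(encodings, key=value_model)["action"]
```

The selected action would then be translated into motion-control directives for the vehicle's subsystems.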
  • Patent number: 11243501
    Abstract: Vibration of a machine end and errors in a moving trajectory are suppressed. A machine learning device performs machine learning to optimize first coefficients of a filter provided in a motor controller that controls a motor, and second coefficients of a velocity feedforward unit of a servo control unit provided in the motor controller, on the basis of an evaluation function. The evaluation function is a function of measurement information after acceleration and deceleration from an external measuring instrument provided outside the motor controller, a position command input to the motor controller, and a position error which is the difference between the position command value and a position detection value fed back from a detector of the servo control unit.
    Type: Grant
    Filed: March 3, 2020
    Date of Patent: February 8, 2022
    Assignee: FANUC CORPORATION
    Inventors: Ryoutarou Tsuneki, Satoshi Ikai
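The optimization loop in the abstract above can be sketched minimally: a scalar evaluation function combines trajectory error and measured vibration, and a candidate coefficient is chosen to minimize it. The quadratic form, the weighting, and the exhaustive candidate search are assumptions standing in for the patent's machine learning procedure:

```python
# Sketch: score each candidate filter coefficient by an evaluation function
# over (position errors, vibration measurements) and keep the minimizer.

def evaluation(position_error, vibration):
    """Scalar cost: sum-of-squares of position error plus weighted vibration
    energy (the 0.5 weight is an illustrative assumption)."""
    return sum(e * e for e in position_error) + 0.5 * sum(v * v for v in vibration)

def search_coefficient(candidates, simulate):
    """simulate(c) -> (position_error, vibration) for coefficient c;
    return the candidate with the lowest evaluation."""
    return min(candidates, key=lambda c: evaluation(*simulate(c)))
```

In the patent's setting, `simulate` corresponds to actually running the servo loop and reading the external measuring instrument; here any callable with that signature works.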
  • Patent number: 11243540
    Abstract: Methods and apparatus related to autonomous vehicles (AVs) are provided. A mapping can be determined that tiles an environment having an AV using a plurality of cells; each cell having an environmental status. While the AV is in the environment: status data can be received relating to a location of the AV and obstacles at that location; environmental status for a cell can be updated based on the status data; a value for each cell can be determined based on the cell's environmental status; a waypoint of a coverage path that covers a region in the environment and is based on the AV's location can be determined; a determination whether the waypoint is reachable from the AV's location can be made; after determining that the waypoint is reachable, a command based on the mapping can be sent directing the AV toward the waypoint; and the waypoint can be updated.
    Type: Grant
    Filed: May 14, 2019
    Date of Patent: February 8, 2022
    Assignee: University of Connecticut
    Inventors: Shalabh Gupta, Junnan Song
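The reachability check in the abstract above — deciding whether a coverage waypoint can be reached from the AV's current cell given the cells' environmental status — can be sketched with a breadth-first search over the cell mapping. The 0/1 status encoding and 4-connectivity are assumptions:

```python
from collections import deque

# Sketch: BFS over free cells (0 = free, 1 = obstacle) of the tiled environment
# to decide whether a coverage waypoint is reachable from the AV's cell.

def reachable(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # 4-connected moves
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False
```

Only after this check succeeds would the command directing the AV toward the waypoint be sent; otherwise the waypoint is updated, as the abstract describes.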
  • Patent number: 11235461
    Abstract: A machine learning device is provided in a versatile controller capable of inferring command data to be issued to each axis of a robot. The device includes an axis angle conversion unit calculating, from the trajectory data, an amount of change of an axis angle of an axis of the robot, a state observation unit observing axis angle data relating to the amount of change of the axis angle of the axis of the robot as a state variable representing a current state of an environment, a label data acquisition unit acquiring axis angle command data relating to command data for the axis of the robot as label data, and a learning unit learning the amount of change of the axis angle of the axis of the robot and the command data for the axis in association with each other by using the state variable and the label data.
    Type: Grant
    Filed: March 15, 2019
    Date of Patent: February 1, 2022
    Assignee: Fanuc Corporation
    Inventor: Kouichirou Hayashi
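The two-stage structure in the abstract above — an axis angle conversion unit deriving per-step angle changes from trajectory data, and a learning unit associating those changes with command data — can be sketched as below. The 1-nearest-neighbour lookup is a stand-in for whatever learning unit the patent uses, and all names are illustrative:

```python
# Sketch: convert a joint trajectory into per-step axis-angle changes (state
# variables), pair them with command data (labels), and infer commands for new
# observed changes via nearest-neighbour lookup.

def axis_angle_deltas(trajectory):
    """Per-step change of each joint angle along a trajectory of joint vectors."""
    return [[b - a for a, b in zip(p, q)] for p, q in zip(trajectory, trajectory[1:])]

class DeltaCommandLearner:
    """Associates observed angle changes with the commands that produced them."""
    def __init__(self):
        self.samples = []

    def fit(self, deltas, commands):
        self.samples += list(zip(deltas, commands))

    def infer(self, delta):
        # 1-NN stand-in for the learned mapping from state variable to command.
        return min(self.samples,
                   key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], delta)))[1]
```

A real implementation would use a regression or classification model instead of a lookup, but the state/label pairing mirrors the abstract's state observation and label data acquisition units.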
  • Patent number: 11231707
    Abstract: A method of mapping an area to be mowed with an autonomous mowing robot comprises receiving mapping data from a robot lawnmower, the mapping data specifying an area to be mowed and a plurality of locations of beacons positioned within the area to be mowed, and receiving at least first and second geographic coordinates for first and second reference points that are within the area and are specified in the mapping data. The mapping data is aligned to a coordinate system of a map image of the area using the first and second geographic coordinates. The map image is displayed based on aligning the mapping data to the coordinate system.
    Type: Grant
    Filed: April 29, 2019
    Date of Patent: January 25, 2022
    Assignee: iRobot Corporation
    Inventors: Paul C. Balutis, Andrew Beaulieu, Brian Yamauchi, Karl Jeffrey Karlson, Dominic Hugh Jones
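The alignment step in the abstract above — registering the robot's mapping data to a map image using two reference points with known geographic coordinates — can be sketched as a 2-D similarity transform (scale, rotation, translation) fitted from the two correspondences. Complex arithmetic keeps the algebra short; this is an illustrative construction, not the patent's algorithm, and it degenerates if the two reference points coincide:

```python
# Sketch: fit scale+rotation+translation from two point correspondences and
# re-project other mapped points into the map-image coordinate system.

def fit_similarity(p1, p2, q1, q2):
    """p1, p2: reference points in mapping-data coordinates;
    q1, q2: the same points in map-image coordinates."""
    a, b = complex(*p1), complex(*p2)
    c, d = complex(*q1), complex(*q2)
    s = (d - c) / (b - a)   # combined scale and rotation as one complex factor
    t = c - s * a           # translation
    return s, t

def align(point, s, t):
    """Map a point from mapping-data coordinates into image coordinates."""
    z = s * complex(*point) + t
    return (z.real, z.imag)
```

With the transform fitted, every beacon location and boundary point in the mapping data can be drawn onto the map image in the correct place.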
  • Patent number: 11226621
    Abstract: Embodiments included herein are directed towards a robotic system and method. Embodiments may include a transportation mechanism having at least three legs and a computing device configured to receive a plurality of optimization components. Each optimization component may include a plurality of variables and the computing device may be further configured to perform a randomized simulation based upon, at least in part, each of the plurality of optimization components. The computing device may be further configured to provide one or more results of the randomized simulation to the transportation mechanism to enable locomotion via the at least three legs.
    Type: Grant
    Filed: February 14, 2019
    Date of Patent: January 18, 2022
    Assignee: TERADYNE, INC.
    Inventors: Ryan S. Penning, James D. English, Douglas E. Barker, Brett L. Limone, Paul Muench
  • Patent number: 11227583
    Abstract: The present disclosure includes customizing responses by an Artificial Intelligence (AI) system using a response mode for interaction with a user. A question or command is received at an AI system from an associated AI device, which receives the question or command from an initiating user among a plurality of users in the vicinity of the AI device. A preferred interaction mode for the initiating user is determined using a knowledge corpus. An answer to the question or command is generated using the AI system. Using the AI device, a communication including the answer is initiated to the initiating user via a communication mode based on the interaction mode preference of the initiating user.
    Type: Grant
    Filed: November 5, 2019
    Date of Patent: January 18, 2022
    Assignee: International Business Machines Corporation
    Inventors: Sarbajit K. Rakshit, Christian Compton, Jeremy R. Fox, Trudy L. Hewitt
  • Patent number: 11214437
    Abstract: An autonomous mobile robotic device that may carry and transport one or more items within an environment. The robotic device may comprise a platform on which the one or more items may be placed. The robotic device may pick up, deliver, distribute, and/or transport the one or more items to one or more locations. The robotic device may be provided with scheduling information for task execution or for pick up, delivery, distribution, and/or transportation of one or more items. Once tasks are complete, the robotic device may autonomously navigate to a storage location.
    Type: Grant
    Filed: September 10, 2018
    Date of Patent: January 4, 2022
    Assignee: AI Incorporated
    Inventor: Ali Ebrahimi Afrouzi
  • Patent number: 11207780
    Abstract: A path planning apparatus is provided with a path planning unit that generates a path of a robot using a plurality of different path planning methods that respectively correspond to a plurality of different constraints determined from the posture of the robot and the characteristics of one or more obstacles that obstruct movement of the robot, an acquisition unit that acquires posture information indicating an initial posture of a robot for which a path is to be generated and a target posture of the robot, and obstacle information indicating a target obstacle that obstructs movement of the robot from the initial posture to the target posture, and a controller that controls the path planning unit so as to generate a path of the robot using a path planning method corresponding to a constraint determined from the posture information and the obstacle information acquired by the acquisition unit.
    Type: Grant
    Filed: May 16, 2019
    Date of Patent: December 28, 2021
    Assignee: OMRON Corporation
    Inventors: Toshihiro Moriya, Kennosuke Hayashi, Akane Nakashima, Takeshi Kojima, Haruka Fujii
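The controller logic in the abstract above — deriving a constraint class from posture and obstacle information and dispatching to the matching path planning method — can be sketched as a simple dispatch table. The constraint classes, the clearance threshold, and the placeholder planner bodies are hypothetical:

```python
# Sketch: classify the constraint from obstacle information and dispatch to the
# corresponding path planning method. Planner bodies are placeholders.

def classify_constraint(posture, obstacles):
    """Derive a constraint class; a real classifier would also inspect posture."""
    if not obstacles:
        return "free_space"
    return "narrow" if min(o["clearance"] for o in obstacles) < 0.2 else "cluttered"

PLANNERS = {
    "free_space": lambda s, g: [s, g],                       # straight segment
    "cluttered":  lambda s, g: [s, "sampled_waypoints", g],  # e.g. sampling-based
    "narrow":     lambda s, g: [s, "constrained_steps", g],  # e.g. small-step search
}

def plan_path(posture, start, goal, obstacles):
    return PLANNERS[classify_constraint(posture, obstacles)](start, goal)
```

The dispatch mirrors the abstract's controller: the same acquisition inputs (posture and obstacle information) pick the method, and only that method runs.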