Patents Assigned to AT ROBOTICS INC.
  • Patent number: 11638120
    Abstract: There is described a system for tracking beacon tags comprising a communication component and a processor. The communication component receives a beacon from a beacon tag and transmits an acknowledgment of the beacon. The processor identifies a tag characteristic associated with the beacon tag and generates a tag instruction based on the tag characteristic. The tag instruction includes a beaconing rate for the beacon tag, and the acknowledgment includes the tag instruction. There is also described a beacon tag for operating with a tracking system comprising a communication component and a processor. The communication component transmits a first beacon, receives an acknowledgment associated with the beacon, and transmits a second beacon from the beacon tag based on an adjusted beaconing rate. The processor identifies a tag instruction of the acknowledgment and adjusts the beaconing rate of the beacon tag based on the tag instruction.
    Type: Grant
    Filed: August 26, 2020
    Date of Patent: April 25, 2023
    Assignee: Building Robotics, Inc.
    Inventor: Tanuj Mohan
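    The beacon/acknowledgment handshake described in the abstract above can be sketched roughly as follows. All names (Tracker, BeaconTag), the characteristic-to-rate table, and the message fields are illustrative assumptions, not taken from the patent:

    ```python
    # Tracker side: map a tag characteristic to a beaconing rate (beacons/min).
    # The table values are invented for illustration.
    RATE_BY_CHARACTERISTIC = {"stationary": 1, "mobile": 12, "high-value": 60}

    def build_acknowledgment(tag_characteristic: str) -> dict:
        """Generate an acknowledgment carrying a tag instruction with the new rate."""
        rate = RATE_BY_CHARACTERISTIC.get(tag_characteristic, 6)  # default rate
        return {"ack": True, "tag_instruction": {"beaconing_rate": rate}}

    # Tag side: adjust the local beaconing rate from the instruction in the ack.
    class BeaconTag:
        def __init__(self, beaconing_rate: int = 6):
            self.beaconing_rate = beaconing_rate

        def handle_acknowledgment(self, ack: dict) -> None:
            instruction = ack.get("tag_instruction")
            if instruction and "beaconing_rate" in instruction:
                self.beaconing_rate = instruction["beaconing_rate"]

    tag = BeaconTag()
    ack = build_acknowledgment("stationary")  # tracker knows this tag rarely moves
    tag.handle_acknowledgment(ack)            # tag slows to 1 beacon per minute
    ```

    The point of the scheme is that the rate decision lives on the tracker, so tags can stay simple and conserve battery when the tracker deems frequent beacons unnecessary.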
  • Patent number: 11636648
    Abstract: Systems and methods for identifying a workpiece in a processing environment may utilize one or more sensors for digitally recording visual information and providing that information to an industrial workflow. The sensor(s) may be positioned to record at least one image of the workpiece at a location where a specified position and orientation thereof are required. A processor may determine, from the recorded image(s) and a stored digital model, whether the workpiece conforms to the specified position and orientation.
    Type: Grant
    Filed: April 26, 2021
    Date of Patent: April 25, 2023
    Assignee: VEO ROBOTICS, INC.
    Inventors: Brad C. Mello, Paul Jakob Schroeder, Scott Denenberg, Clara Vu
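    A minimal sketch of the conformance decision described above, assuming the observed pose has already been estimated from the image(s) and the specified pose comes from the stored digital model; the tolerance values and a single-yaw-angle orientation are simplifying assumptions:

    ```python
    import math

    def conforms(observed_pos, observed_yaw_deg, spec_pos, spec_yaw_deg,
                 pos_tol=0.005, yaw_tol_deg=1.0):
        """True if the workpiece position (metres) and yaw (degrees) both fall
        within tolerance of the specified pose."""
        dist = math.dist(observed_pos, spec_pos)
        # Wrap the angular error into [-180, 180] before taking its magnitude.
        yaw_err = abs((observed_yaw_deg - spec_yaw_deg + 180) % 360 - 180)
        return dist <= pos_tol and yaw_err <= yaw_tol_deg
    ```

    A full 6-DOF system would compare rotation matrices or quaternions rather than a single yaw angle, but the accept/reject structure is the same.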
  • Patent number: 11634883
    Abstract: This description provides an autonomous or semi-autonomous excavation vehicle that is capable of navigating through a dig site and carrying out an excavation routine using a system of sensors physically mounted to the excavation vehicle. The sensors collect any one or more of spatial, imaging, measurement, and location data representing the status of the excavation vehicle and its surrounding environment. Based on the collected data, the excavation vehicle executes instructions to carry out an excavation routine. The excavation vehicle is also able to carry out numerous other tasks, such as checking the volume of excavated earth in an excavation tool, and helping prepare a digital terrain model of the site as part of a process for creating the excavation routine.
    Type: Grant
    Filed: November 18, 2020
    Date of Patent: April 25, 2023
    Assignee: BUILT ROBOTICS INC.
    Inventors: Noah Austen Ready-Campbell, Andrew Xiao Liang, Linus Page Chou, Edward Stephen Walker, Christian John Wawrzonek, Cyrus McMann Ready-Campbell
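    The "volume of excavated earth in an excavation tool" check mentioned above can be sketched as a heightmap integration; the grid representation, cell size, and sample values are invented for illustration:

    ```python
    def bucket_fill_volume(heightmap, cell_area_m2=0.01):
        """Estimate material volume (m^3) in the excavation tool from a scanned
        2-D grid of material heights (m) above the bucket floor, summing
        height * cell area over every grid cell."""
        return sum(h for row in heightmap for h in row) * cell_area_m2
    ```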
  • Patent number: 11634126
    Abstract: A motion planner performs motion planning with collision assessment, using a motion planning lattice that represents configuration states of a primary agent (e.g., autonomous vehicle) as nodes and transitions between states as edges. The system may assign cost values to edges, the cost values representing probability or likelihood of collision for the corresponding transition. The cost values may additionally or alternatively represent a severity of collision, for example generated via a parametric function with two or more parameters and one or more weights. A primary agent and/or dynamic obstacles may be represented as respective oriented bounding boxes. Some obstacles (e.g., road markings, edge of road) may be represented as curves.
    Type: Grant
    Filed: May 26, 2020
    Date of Patent: April 25, 2023
    Assignee: REALTIME ROBOTICS, INC.
    Inventors: William Floyd-Jones, Bryce Willey, George Konidaris, Xianchao Long
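    The lattice-with-collision-costs idea above amounts to a shortest-path search over a weighted graph, where each edge weight stands in for the collision probability (or severity-weighted cost) of that transition. The following Dijkstra-style sketch and the toy graph in the usage line are illustrative, not the patented planner:

    ```python
    import heapq

    def least_collision_path(edges, start, goal):
        """edges: list of (from_node, to_node, collision_cost) tuples.
        Returns (total_cost, node_path) minimizing accumulated collision cost."""
        graph = {}
        for u, v, cost in edges:
            graph.setdefault(u, []).append((v, cost))
        best = {start: 0.0}
        queue = [(0.0, start, [start])]
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == goal:
                return cost, path
            for nxt, c in graph.get(node, []):
                new = cost + c
                if new < best.get(nxt, float("inf")):
                    best[nxt] = new
                    heapq.heappush(queue, (new, nxt, path + [nxt]))
        return float("inf"), []
    ```

    In the patent's framing the nodes would be configuration states of the primary agent and the costs would come from collision assessment against oriented bounding boxes and curves, rather than the hand-written numbers used here.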
  • Patent number: 11633856
    Abstract: A robot may include a spatiotemporal controller for controlling the kinematics or movements of the robot via continuous and/or granular adjustments to the actuators that perform the physical operations of the robot. The spatiotemporal controller may continuously and/or granularly adjust the actuators to align completion or execution of different objectives or waypoints from a spatiotemporal plan within time intervals allotted for each objective by the spatiotemporal plan. The spatiotemporal controller may also continuously and/or granularly adjust the actuators to work around unexpected conflicts that may arise during the execution of an objective and delays that result from a workaround while still completing the objective within the allotted time interval. By completing objectives within the allotted time intervals, the spatiotemporal controller may ensure that conflicts do not arise as the robots simultaneously operate in the site using some of the same shared resources.
    Type: Grant
    Filed: March 11, 2021
    Date of Patent: April 25, 2023
    Assignee: inVia Robotics, Inc.
    Inventors: Joseph William Dinius, Brandon Pennington, Randolph Charles Voorhies, Lior Elazary, Daniel Frank Parks, II
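    One way to picture the "granular adjustment to stay inside the allotted interval" behavior above is a speed-scaling rule: when the remaining work cannot finish at the nominal rate before the interval closes, the actuator rate is raised (up to a cap). The rule and its 2x cap are assumptions for illustration:

    ```python
    def speed_scale(time_remaining: float, work_remaining: float,
                    nominal_rate: float) -> float:
        """Return an actuator rate that finishes the remaining work within the
        remaining time, never below nominal and capped at 2x nominal."""
        if work_remaining <= 0:
            return nominal_rate
        required = work_remaining / max(time_remaining, 1e-6)
        return min(max(required, nominal_rate), 2.0 * nominal_rate)
    ```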
  • Patent number: 11634888
    Abstract: When an EMV performs an action comprising moving a tool of the EMV through soil or other material, the EMV can measure a current speed of the tool through the material and a current kinematic pressure exerted on the tool by the material. Using the measured current speed and kinematic pressure, the EMV system can use a machine learned model to determine one or more soil parameters of the material. The EMV can then make decisions based on the soil parameters, such as by selecting a tool speed for the EMV based on the determined soil parameters.
    Type: Grant
    Filed: May 3, 2022
    Date of Patent: April 25, 2023
    Assignee: Built Robotics Inc.
    Inventors: Gaurav Jitendra Kikani, Noah Austen Ready-Campbell, Andrew Xiao Liang, Joonhyun Kim
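    The learned soil model in the abstract above is not public, so the following stand-in uses an invented linear mapping from (tool speed, kinematic pressure) to a single soil-resistance parameter, plus a simple inverse rule for the downstream tool-speed decision; every coefficient is an assumption:

    ```python
    # Invented coefficients standing in for a trained model's parameters.
    WEIGHTS = (0.8, 1.5)
    BIAS = 0.2

    def estimate_soil_resistance(tool_speed: float, kinematic_pressure: float) -> float:
        """Toy 'machine learned model': linear in the two measured inputs."""
        w_s, w_p = WEIGHTS
        return w_s * tool_speed + w_p * kinematic_pressure + BIAS

    def select_tool_speed(resistance: float, max_speed: float = 1.0) -> float:
        """Decision step: slow the tool as estimated resistance rises."""
        return max_speed / (1.0 + resistance)
    ```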
  • Patent number: 11635749
    Abstract: Systems and methods for optimizing factory scheduling, layout or both which represent active factory elements (human and machine) as computational objects and simulate factory operation to optimize a solution. This enables the efficient assembly of customized products, accommodates variable demand, and mitigates unplanned events (floor blockages; machine/IMR/workcell/worker downtime; variable quantity, location, and destination of supply parts).
    Type: Grant
    Filed: January 3, 2022
    Date of Patent: April 25, 2023
    Assignee: VEO ROBOTICS, INC.
    Inventors: Patrick Sobalvarro, Clara Vu, Joshua Downer, Paulo Ferreira, Mehmet Ali Guney, Thomas C. Ferree, Alberto Moel, Richard A. Kelsey
  • Publication number: 20230123736
    Abstract: In one example, a method includes combining multi-sensor inspection (MSI) data from sensors of inspection platform(s); using respective metadata to select one or more tools for translating the MSI data into a common file format; applying the one or more tools to the first and second MSI data to obtain respective common data formatted files; and providing, via a cloud computing device, access to the common data formatted files. Other implementations may be described and claimed.
    Type: Application
    Filed: October 13, 2022
    Publication date: April 20, 2023
    Applicant: RedZone Robotics, Inc.
    Inventors: Tim Renton, Foster J Salotti, Jason Mizgorski, Anthony van Iersel, Cody Steliga, Bill Derkson, Lynn Palmieri, Dimitrije Balanovic
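    The metadata-driven tool selection in the abstract above can be sketched as a dispatch table keyed by the record's source format; the format names, record shape, and converter bodies are invented for illustration:

    ```python
    # Invented source formats mapped to translation "tools".
    CONVERTERS = {
        "lidar-v1": lambda data: {"format": "common", "payload": data},
        "sonar-raw": lambda data: {"format": "common", "payload": data},
    }

    def to_common_format(msi_record: dict) -> dict:
        """Use the record's metadata to pick a translation tool, then apply it."""
        source = msi_record["metadata"]["source_format"]
        tool = CONVERTERS.get(source)
        if tool is None:
            raise ValueError("no converter for " + source)
        return tool(msi_record["data"])
    ```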
  • Patent number: 11632263
    Abstract: A method comprising: accessing a response mapping defining a set of safety-critical functions associated with a safety-critical latency threshold and a set of safety responses, each safety response corresponding to a safety-critical function; executing a time-synchronization protocol with a transmitting system to calculate a clock reference; accessing a safety message schedule indicating an expected arrival time for each safety message in a series of safety messages based on the clock reference; for each safety message in the series of safety messages, calculating a latency of the safety message based on an arrival time of the safety message and the expected arrival time; and in response to a latency of a current safety message in the series of safety messages exceeding the safety-critical latency threshold, initiating the safety response corresponding to the safety-critical function for each safety-critical function in the set of safety-critical functions.
    Type: Grant
    Filed: November 30, 2021
    Date of Patent: April 18, 2023
    Assignee: Fort Robotics, Inc.
    Inventor: Nathan Bivans
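    The latency check at the heart of the method above can be sketched as follows: each safety message has an expected arrival time derived from the shared clock reference, and a latency above the safety-critical threshold triggers the mapped safety responses. The millisecond threshold, the single "safe_stop" response, and the message shape are illustrative assumptions:

    ```python
    SAFETY_CRITICAL_LATENCY_MS = 50.0  # invented threshold

    def check_messages(messages, threshold_ms=SAFETY_CRITICAL_LATENCY_MS):
        """messages: list of (expected_arrival_ms, actual_arrival_ms) pairs,
        both on the time base established by the time-synchronization protocol.
        Returns the safety responses triggered, one per violating message."""
        responses = []
        for expected, actual in messages:
            latency = actual - expected
            if latency > threshold_ms:
                responses.append("safe_stop")
        return responses
    ```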
  • Patent number: 11623356
    Abstract: Systems and methods for determining safe and unsafe zones in a workspace—where safe actions are calculated in real time based on all relevant objects (e.g., some observed by sensors and others computationally generated based on analysis of the sensed workspace) and on the current state of the machinery (e.g., a robot) in the workspace—may utilize a variety of workspace-monitoring approaches as well as dynamic modeling of the robot geometry. The future trajectory of the robot(s) and/or the human(s) may be forecast using, e.g., a model of human movement and other forms of control. Modeling and forecasting of the robot may, in some embodiments, make use of data provided by the robot controller that may or may not include safety guarantees.
    Type: Grant
    Filed: April 4, 2022
    Date of Patent: April 11, 2023
    Assignee: VEO ROBOTICS, INC.
    Inventors: Clara Vu, Scott Denenberg, Abraham K. Feldman
  • Patent number: 11623346
    Abstract: Solutions for multi-robot configurations are co-optimized, to at least some degree, across a set of non-homogenous parameters based on a given set of tasks to be performed by robots in a multi-robot operational environment. Non-homogenous parameters may include two or more of: the respective base position and orientation of the robots, an allocation of tasks to respective robots, respective target sequences and/or trajectories for the robots. Such may be executed pre-runtime. Output may include for each robot: workcell layout, an ordered list or vector of targets, optionally dwell time durations at respective targets, and paths or trajectories between each pair of consecutive targets. Output may provide a complete, executable, solution to the problem, which in the absence of variability in timing, can be used to control the robots without any modification. A genetic algorithm, e.g., Differential Evolution, may optionally be used in generating a population of candidate solutions.
    Type: Grant
    Filed: January 20, 2021
    Date of Patent: April 11, 2023
    Assignee: REALTIME ROBOTICS, INC.
    Inventors: Luca Colasanto, Sean Murray
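    The abstract above names Differential Evolution as one usable genetic algorithm. A minimal, generic DE loop is sketched below, minimizing a toy 2-D quadratic that stands in for a multi-robot layout objective; the population size, mutation factor F, crossover rate CR, and objective are all illustrative:

    ```python
    import random

    def differential_evolution(cost, bounds, pop_size=20, F=0.6, CR=0.9,
                               generations=100, seed=7):
        """Classic DE/rand/1/bin: mutate with a + F*(b - c), binomial crossover,
        greedy selection. bounds: list of (lo, hi) per dimension."""
        rng = random.Random(seed)
        dim = len(bounds)
        pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        for _ in range(generations):
            for i in range(pop_size):
                a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
                trial = [
                    min(max(a[d] + F * (b[d] - c[d]), bounds[d][0]), bounds[d][1])
                    if rng.random() < CR else pop[i][d]
                    for d in range(dim)
                ]
                if cost(trial) <= cost(pop[i]):  # greedy replacement
                    pop[i] = trial
        return min(pop, key=cost)

    # Toy objective with its optimum at (1, -2).
    best = differential_evolution(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2,
                                  bounds=[(-5, 5), (-5, 5)])
    ```

    In the patented setting the candidate vector would encode non-homogeneous parameters (base poses, task allocations, target sequences) and the cost would come from simulating the resulting workcell, rather than an analytic function.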
  • Patent number: 11623305
    Abstract: Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system uses a treatment unit for emitting a laser at agricultural objects. The treatment unit is configured with a treatment head assembly that includes a moveable treatment head with one or more laser emitting tips. A first and second motor assembly are operated by the treatment unit to control the movement of the treatment head. The first motor assembly includes a first motor rotatable about a first rotational axis. A first linkage assembly is connected to the first motor and the treatment head assembly. The first linkage assembly is rotatable by the first motor. The second linkage assembly is rotatable by the second motor.
    Type: Grant
    Filed: October 15, 2021
    Date of Patent: April 11, 2023
    Assignee: Verdant Robotics, Inc.
    Inventors: Gabriel Thurston Sibley, Patrick Christopher Leger, Wisit Jirattigalochote, Curtis Dale Garner, Liang-shian Chen, Jacob R. Stelzriede
  • Patent number: 11625915
    Abstract: Various embodiments relate generally to computer vision and automation to autonomously identify and deliver for application a treatment to an object among other objects, data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence to facilitate identification and treatment of objects, and robotics and mobility technologies to navigate a delivery system, more specifically, to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include, receiving data representing a policy specifying a type of action for an agricultural object, selecting an emitter with which to perform a type of action for the agricultural object as one of one or more classified subsets, and configuring the agricultural projectile delivery system to activate an emitter to propel an agricultural projectile to intercept the agricultural object.
    Type: Grant
    Filed: March 29, 2022
    Date of Patent: April 11, 2023
    Assignee: Verdant Robotics, Inc.
    Inventors: Gabriel Thurston Sibley, Curtis Dale Garner, Andre Robert Daniel Michelin, Lorenzo Ibarria, Patrick Christopher Leger, Benjamin Rewis, Shi Yan
  • Publication number: 20230108602
    Abstract: A prewashing system includes a dish recognizing unit that recognizes a dish that is an object of prewashing executed prior to main washing, a first prewashing unit that performs prewashing of the dish that is the object of prewashing by a flow of water, a second prewashing unit that performs prewashing of the dish that is the object of prewashing by a washing tool, a prewashing method deciding unit that decides a prewashing method using at least one of the first prewashing unit and the second prewashing unit, on the basis of a form of the dish that is the object of prewashing, which is recognized by the dish recognizing unit, and a prewashing executing unit that executes prewashing of the dish that is the object of prewashing, by the prewashing method decided by the prewashing method deciding unit.
    Type: Application
    Filed: September 29, 2022
    Publication date: April 6, 2023
    Applicant: CONNECTED ROBOTICS, INC.
    Inventors: Tetsuya SAWANOBORI, Koichi TSUKAMOTO
  • Patent number: 11618155
    Abstract: An automated kitchen assistant system inspects a food preparation area in the kitchen environment using a novel sensor combination. The combination of sensors includes an Infrared (IR) camera that generates IR image data and at least one secondary sensor that generates secondary image data. The IR image data and secondary image data are processed to obtain combined image data. A trained convolutional neural network is employed to automatically compute an output based on the combined image data. The output includes information about the identity and the location of the food item. The output may further be utilized to command a robotic arm, kitchen worker, or otherwise assist in food preparation. Related methods are also described.
    Type: Grant
    Filed: February 10, 2021
    Date of Patent: April 4, 2023
    Assignee: Miso Robotics, Inc.
    Inventors: Ryan W. Sinnet, Robert Anderson, Zachary Zweig Vinegar, William Werst, David Zito, Sean Olson
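    The sensor-combination step above (IR image data merged with secondary image data before the convolutional network) can be pictured as per-pixel channel stacking of registered images; the tiny grids and pixel values below are invented:

    ```python
    def combine_images(ir_image, secondary_image):
        """Stack a single IR channel onto the secondary sensor's channels,
        pixel by pixel, yielding combined per-pixel feature vectors.
        ir_image: 2-D grid of scalars; secondary_image: 2-D grid of tuples."""
        combined = []
        for ir_row, sec_row in zip(ir_image, secondary_image):
            combined.append([[ir] + list(sec) for ir, sec in zip(ir_row, sec_row)])
        return combined
    ```

    A production system would do this with array libraries and handle registration between the two sensors; the sketch only shows the data layout fed to the network.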
  • Patent number: 11612537
    Abstract: Examples of a device for guiding and detecting a motion of a target joint, and a motion assistance system using such motion guiding devices, are described. The motion guiding and detecting device comprises a motion generator and a motion transfer and target interfacing unit to transfer the motion generated by the motion generator to the target joint. The system further includes a motion detection and feedback unit that interfaces with the target, and a controller that interfaces with both the feedback unit and the motion generator to control and coordinate the motion of the motion generator and the target joint.
    Type: Grant
    Filed: August 20, 2021
    Date of Patent: March 28, 2023
    Assignee: Human In Motion Robotics Inc.
    Inventors: Siamak Arzanpour, Soheil Sadeqi, Shaun Bourgeois, Jung Wook Park
  • Patent number: 11613014
    Abstract: One variation of a method for autonomously scanning and processing a part includes: collecting a set of images depicting a part positioned within a work zone adjacent a robotic system; assembling the set of images into a part model representing the part. The method includes segmenting areas of the part model—delineated by local radii of curvature, edges, or color boundaries—into target zones for processing by the robotic system and exclusion zones avoided by the robotic system. The method includes: projecting a set of keypoints onto the target zone of the part model, defining positions, orientations, and target forces of a sanding head applied at locations on the part model; assembling the set of keypoints into a toolpath and projecting the toolpath onto the target zone of the part model; and transmitting the toolpath to a robotic system to execute the toolpath on the part within the work zone.
    Type: Grant
    Filed: May 31, 2022
    Date of Patent: March 28, 2023
    Assignee: GrayMatter Robotics Inc.
    Inventors: Avadhoot L. Ahire, Rishav Guha, Satyandra K. Gupta, Ariyan M. Kabir, Brual C Shah
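    The "assembling the set of keypoints into a toolpath" step above requires some ordering of the keypoints; the greedy nearest-neighbor chaining below is one simple possibility, chosen purely for illustration (the patent does not specify the ordering strategy), and the keypoint shape is an assumption:

    ```python
    import math

    def assemble_toolpath(keypoints, start=(0.0, 0.0)):
        """keypoints: list of dicts with 'pos' (x, y) and 'force' (N).
        Chains keypoints into an ordered toolpath by repeatedly visiting the
        nearest remaining keypoint from the current tool position."""
        remaining = list(keypoints)
        path, cursor = [], start
        while remaining:
            nxt = min(remaining, key=lambda k: math.dist(cursor, k["pos"]))
            remaining.remove(nxt)
            path.append(nxt)
            cursor = nxt["pos"]
        return path
    ```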
  • Patent number: 11613017
    Abstract: Safety systems in distributed factory workcells intercommunicate or communicate with a central controller so that when a person, robot or vehicle passes from one workcell or space into another on the same factory floor, the new workcell or space need not repeat the tasks of analysis and classification and can instead immediately integrate the new entrant into the existing workcell or space-monitoring schema. The workcell or space can also communicate attributes such as occlusions, unsafe areas, movement speed, and object trajectories, enabling rapid reaction by the monitoring system of the new workcell or space.
    Type: Grant
    Filed: July 14, 2021
    Date of Patent: March 28, 2023
    Assignee: VEO ROBOTICS, INC.
    Inventors: Scott Denenberg, Patrick Sobalvarro, Clara Vu, Alberto Moel, Richard A. Kelsey
  • Publication number: 20230092975
    Abstract: The technology is directed to providing pick and place instructions to a robot. Sensor data including an image feed of a picking container in which at least one product is located may be output. An input indicating a selected product including at least one of the products located in the picking container may be received. A representation of the selected product and at least one image of the order container may be output for display. The representation of the product may be scaled relative to the at least one image of the order container. A place input corresponding to the position of the representation of the product at a packing location within the at least one image of the order container may be received and transmitted to a robot control system.
    Type: Application
    Filed: September 21, 2021
    Publication date: March 23, 2023
    Applicant: Nimble Robotics, Inc.
    Inventors: Simon Kalouche, Aditya Agarwal, Gen Xu, Aleksi Hämäläinen, Rohan Tiwari, Siva Mynepalli, Suyash Nigam
  • Patent number: D985624
    Type: Grant
    Filed: August 31, 2022
    Date of Patent: May 9, 2023
    Assignee: SCYTHE ROBOTICS, INC.
    Inventors: Robert Johnstone McCutcheon, IV, John Gordon Morrison, Isaac Heath Roberts, Kevin Peter McGlade, Davis Thorp Foster, Matthew Alexander Kaplan, Zachary Austin Goins, Matthew G. Quick