Patents Assigned to AT ROBOTICS INC.
  • Publication number: 20240111303
    Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
    Type: Application
    Filed: December 14, 2023
    Publication date: April 4, 2024
    Applicant: Tomahawk Robotics, Inc.
    Inventors: Matthew D. SUMMER, William S. BOWMAN, Andrew D. FALENDYSZ, Kevin M. MAKOVY, Daniel R. HEDMAN, Bradley D. TRUESDELL
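The abstract mentions computing a sliding velocity limit boundary for a spatial controller fed by imprecise sensor data, but does not define the computation. The sketch below is only an illustration of one plausible reading, assuming the limit tightens as sensor uncertainty grows and as the robot nears an obstacle; all names and formulas are hypothetical.

```python
def sliding_velocity_limit(distance_to_obstacle_m: float,
                           sensor_noise_std_m: float,
                           max_velocity_mps: float = 1.0,
                           max_decel_mps2: float = 0.5) -> float:
    """Hypothetical sliding velocity limit: the commanded speed must allow the
    robot to stop within the worst-case remaining distance, i.e. the measured
    distance shrunk by a sensor-noise margin (3-sigma here)."""
    worst_case_distance = max(distance_to_obstacle_m - 3.0 * sensor_noise_std_m, 0.0)
    # v^2 = 2 * a * d  ->  the fastest speed from which a stop still fits.
    stoppable_velocity = (2.0 * max_decel_mps2 * worst_case_distance) ** 0.5
    return min(max_velocity_mps, stoppable_velocity)


if __name__ == "__main__":
    # The limit slides downward as the distance measurement gets noisier.
    for noise in (0.01, 0.05, 0.20):
        print(noise, round(sliding_velocity_limit(1.5, noise), 3))
```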
  • Patent number: 11945443
    Abstract: Methods and systems are provided for traction detection and control of a self-driving vehicle. The self-driving vehicle has drive motors that drive drive-wheels according to a drive-motor speed. Traction detection and control can be obtained by measuring the vehicle speed with a sensor such as a LiDAR or video camera, and measuring the wheel speed of the drive wheels with a sensor such as a rotary encoder. The difference between the measured vehicle speed and the measured wheel speeds can be used to determine if a loss of traction has occurred in any of the wheels. If a loss of traction is detected, then a recovery strategy can be selected from a list of recovery strategies in order to reduce the effects of the loss of traction.
    Type: Grant
    Filed: July 14, 2021
    Date of Patent: April 2, 2024
    Assignee: CLEARPATH ROBOTICS INC.
    Inventors: Ryan Christopher Gariepy, Shahab Kaynama
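The traction-detection idea in this abstract, comparing an externally measured vehicle speed (LiDAR or camera) against encoder-derived wheel speeds and reacting when they diverge, can be illustrated with a short sketch. The slip threshold, wheel radius, and recovery-strategy names below are assumptions, not values from the patent.

```python
WHEEL_RADIUS_M = 0.1  # assumed drive-wheel radius

def wheel_linear_speed(encoder_rad_per_s: float) -> float:
    """Convert an encoder angular rate to a linear speed at the wheel rim."""
    return encoder_rad_per_s * WHEEL_RADIUS_M

def detect_traction_loss(vehicle_speed_mps: float,
                         wheel_rates_rad_per_s: list[float],
                         slip_threshold_mps: float = 0.15) -> list[int]:
    """Return the indices of wheels whose rim speed disagrees with the
    externally measured vehicle speed by more than the threshold."""
    return [i for i, w in enumerate(wheel_rates_rad_per_s)
            if abs(wheel_linear_speed(w) - vehicle_speed_mps) > slip_threshold_mps]

def select_recovery_strategy(slipping_wheels: list[int], num_wheels: int) -> str:
    """Pick a (hypothetical) recovery strategy from a fixed list."""
    if not slipping_wheels:
        return "none"
    if len(slipping_wheels) == num_wheels:
        return "reduce_speed_globally"
    return "limit_torque_on_slipping_wheels"

if __name__ == "__main__":
    # LiDAR says 0.5 m/s, but wheel 1 spins as if the vehicle were doing 0.9 m/s.
    slipping = detect_traction_loss(0.5, [5.0, 9.0])
    print(slipping, select_recovery_strategy(slipping, num_wheels=2))
```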
  • Patent number: 11945119
    Abstract: Crosstalk mitigation among cameras in neighboring monitored workcells is achieved by computationally defining a noninterference scheme that respects the independent monitoring and operation of each workcell. The scheme may involve communication between adjacent cells to adjudicate non-interfering camera operation or system-wide mapping of interference risks and mitigation thereof. Mitigation strategies can involve time-division and/or frequency-division multiplexing.
    Type: Grant
    Filed: May 3, 2023
    Date of Patent: April 2, 2024
    Assignee: Veo Robotics, Inc.
    Inventors: Scott Denenberg, Clara Vu, Gene Malkin, Lev Persits, Valentina Chamorro, Marek Wartenberg, Pratik Devendra Dalvi, Alberto Moel
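One of the mitigation strategies named in the abstract is time-division multiplexing of camera exposures across neighboring workcells. The sketch below shows the general idea under simplifying assumptions (a known interference adjacency map and a fixed frame period); it is not the scheme the patent specifies.

```python
from itertools import count

def assign_time_slots(adjacency: dict[str, set[str]]) -> dict[str, int]:
    """Greedy graph coloring: give each workcell an exposure slot such that no
    two adjacent (interference-prone) workcells share a slot."""
    slots: dict[str, int] = {}
    for cell in sorted(adjacency):
        taken = {slots[n] for n in adjacency[cell] if n in slots}
        slots[cell] = next(s for s in count() if s not in taken)
    return slots

if __name__ == "__main__":
    # Three workcells in a row: A-B and B-C can interfere, A-C cannot.
    adjacency = {"A": {"B"}, "B": {"A", "C"}, "C": {"B"}}
    slots = assign_time_slots(adjacency)
    frame_period_ms = 33.0          # assumed camera frame period
    n_slots = max(slots.values()) + 1
    for cell, slot in slots.items():
        offset = slot * frame_period_ms / n_slots
        print(f"workcell {cell}: slot {slot}, exposure offset {offset:.1f} ms")
```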
  • Patent number: 11945102
    Abstract: An end effector device and system for suction-based grasping of bagged objects that can include a body structure with a vacuum line opening and an object engagement region, the vacuum line opening being configured to couple at least one pressure line of a vacuum pressure system to a defined internal channel of the body structure; the body structure comprising an internal structure that defines a concave inner chamber with a chamber opening at the object engagement region; and the internal structure comprising an array of inlets positioned along at least one wall of the concave inner chamber, wherein each inlet defines an opening in the body to the defined internal channel.
    Type: Grant
    Filed: April 1, 2021
    Date of Patent: April 2, 2024
    Assignee: Ambi Robotics, Inc.
    Inventors: David Gealy, Stephen McKinley, Jeffrey Mahler
  • Patent number: 11947331
    Abstract: Systems and methods for safety-enabled control. Input values provided to a control system can be validated. Command gating can be performed for control values provided by the control system. Validation of input values and command gating for control values can be performed in accordance with respective validation windows. Validation windows can be dynamically adjusted based on data received via a sensor or interface.
    Type: Grant
    Filed: February 22, 2023
    Date of Patent: April 2, 2024
    Assignee: Fort Robotics, Inc.
    Inventor: Nathan Bivans
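The abstract describes validating inputs and gating commands against validation windows that can be adjusted at run time. A minimal sketch of that pattern follows; the window bounds, adjustment rule, and fallback behavior are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class ValidationWindow:
    low: float
    high: float

    def contains(self, value: float) -> bool:
        return self.low <= value <= self.high

def gate_command(command: float,
                 window: ValidationWindow,
                 safe_value: float = 0.0) -> float:
    """Pass the command through only if it lies in the validation window;
    otherwise substitute a safe value (e.g. zero velocity)."""
    return command if window.contains(command) else safe_value

def adjust_window(window: ValidationWindow, measured_speed: float) -> ValidationWindow:
    """Hypothetical dynamic adjustment: shrink the allowed command range as the
    measured speed rises, so large commands are rejected at high speed."""
    scale = max(0.2, 1.0 - 0.1 * measured_speed)
    return ValidationWindow(window.low * scale, window.high * scale)

if __name__ == "__main__":
    window = ValidationWindow(-2.0, 2.0)
    window = adjust_window(window, measured_speed=5.0)   # -> [-1.0, 1.0]
    print(gate_command(1.8, window))  # rejected, prints the safe value 0.0
    print(gate_command(0.6, window))  # accepted, prints 0.6
```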
  • Patent number: 11949989
    Abstract: One embodiment provides a system, including: an inspection platform configured to move through underground infrastructure; an imaging device coupled to the inspection platform; the imaging device comprising a camera housing that arranges an array of four or more cameras in a predetermined configuration; the camera housing comprising a plurality of apertures, wherein each aperture houses a respective camera therein with a viewing axis offset about 30 degrees to about 120 degrees from a viewing axis of an adjacent camera within the array; and circuitry that operates the imaging device to capture a plurality of images using the four or more cameras; where the circuitry captures the plurality of images for a composite image of an interior region of the underground infrastructure, and where the region is larger than a single viewing field of any of the four or more cameras. Other embodiments are described and claimed.
    Type: Grant
    Filed: September 28, 2018
    Date of Patent: April 2, 2024
    Assignee: RedZone Robotics, Inc.
    Inventors: Justin Starr, Galin Konakchiev, Foster J Salotti, Todd Kueny, Thorin Tobiassen, Nate Alford, Mark Jordan
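The imaging-device claim is largely mechanical, but its coverage statement (a composite view larger than any single camera's field, with neighbors offset about 30 to 120 degrees) reduces to simple angular bookkeeping. The sketch below checks whether a ring of cameras with a given per-camera field of view and inter-camera offsets covers the full circumference; the specific numbers are illustrative, not from the patent.

```python
def covers_full_circle(fov_deg: float, offsets_deg: list[float]) -> bool:
    """Cameras sit around a ring with the given axis offsets between neighbors;
    coverage is continuous if every gap is no wider than one field of view."""
    if abs(sum(offsets_deg) - 360.0) > 1e-6:
        return False  # the offsets must close the ring
    return all(gap <= fov_deg for gap in offsets_deg)

if __name__ == "__main__":
    # Four cameras, 100-degree FOV each, 90 degrees apart: continuous coverage.
    print(covers_full_circle(100.0, [90.0, 90.0, 90.0, 90.0]))   # True
    # Four cameras, 80-degree FOV each, 90 degrees apart: gaps remain.
    print(covers_full_circle(80.0, [90.0, 90.0, 90.0, 90.0]))    # False
```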
  • Patent number: 11945101
    Abstract: An end effector device and system for suction-based grasping of bagged objects that can include a body structure with a vacuum line opening and an object engagement region, the vacuum line opening being configured to couple at least one pressure line of a vacuum pressure system to a defined internal channel of the body structure; the body structure comprising an internal structure that defines a concave inner chamber with a chamber opening at the object engagement region; and the internal structure comprising an array of inlets positioned along at least one wall of the concave inner chamber, wherein each inlet defines an opening in the body to the defined internal channel. The body structure may additionally include a suction cup system that comprises a flexible sealing lip at the object engagement region, wherein the chamber opening is positioned within a grasping region of the sealing lip.
    Type: Grant
    Filed: April 1, 2021
    Date of Patent: April 2, 2024
    Assignee: Ambi Robotics, Inc.
    Inventors: David Gealy, Stephen McKinley, Jeffrey Mahler
  • Publication number: 20240104939
    Abstract: A method comprises identifying a set of image data captured by at least one autonomous vehicle when the at least one autonomous vehicle was positioned in a lane of a roadway, and respective ground truth localization data of the at least one autonomous vehicle; determining a total number of lanes for the roadway; labeling the set of image data with the total number of lanes for the roadway; and training, using the labeled set of image data, a machine learning model, such that the machine learning model is configured to predict a new total number of lanes for a new roadway as output.
    Type: Application
    Filed: June 22, 2023
    Publication date: March 28, 2024
    Applicant: TORC Robotics, Inc.
    Inventor: Joseph STAMENKOVICH
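The core of this application is an auto-labeling step: use the vehicle's ground-truth localization (and, implicitly, map data) to attach a total-lane-count label to each captured image, then train a model on the labeled set. The sketch below covers only the labeling half under assumed data structures; the map lookup and any training-framework details are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    image_path: str
    latitude: float
    longitude: float

def lanes_from_map(latitude: float, longitude: float) -> int:
    """Stand-in for a map lookup returning the total number of lanes of the
    roadway at the given position (hypothetical; a real system would query an
    HD map or road-network database)."""
    return 3

def label_dataset(samples: list[Sample]) -> list[tuple[str, int]]:
    """Pair every image with the lane count implied by its ground-truth pose,
    producing (image_path, total_lanes) training examples."""
    return [(s.image_path, lanes_from_map(s.latitude, s.longitude)) for s in samples]

if __name__ == "__main__":
    samples = [Sample("frame_000.png", 37.0, -80.0),
               Sample("frame_001.png", 37.0, -80.1)]
    for path, lanes in label_dataset(samples):
        print(path, "total_lanes =", lanes)
    # The labeled pairs would then train an image classifier that predicts the
    # total lane count for roads it has not seen.
```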
  • Publication number: 20240101110
    Abstract: Disclosed herein are systems and methods for operating a vehicle. In an embodiment, a system can identify a maximum distance bound based on one or more objects around a vehicle; for each of a plurality of candidate trajectories for the vehicle, determine a velocity from the maximum distance bound at an ending time of the candidate trajectory; determine an available distance for the candidate trajectory as a function of the determined velocity at the ending time of the candidate trajectory and a comfort deceleration parameter; determine a target velocity for the candidate trajectory; and determine a velocity difference between the target velocity and a final velocity of the candidate trajectory at the ending time of the candidate trajectory; select a first candidate trajectory based on the velocity difference; and operate the vehicle based on the selected first candidate trajectory.
    Type: Application
    Filed: February 16, 2023
    Publication date: March 28, 2024
    Applicant: TORC Robotics, Inc.
    Inventor: Rikki Valverde
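This abstract walks through an explicit per-trajectory computation: an "available distance" derived from the end velocity and a comfort deceleration, a target velocity, and a velocity difference used to pick a trajectory. The sketch below follows that outline with assumed formulas (the patent abstract does not give them), most simply d = v^2 / (2 * a_comfort) for the available distance and a stopping-distance-based target velocity.

```python
from dataclasses import dataclass

@dataclass
class CandidateTrajectory:
    name: str
    end_velocity_mps: float   # velocity at the trajectory's ending time

def available_distance(end_velocity_mps: float, comfort_decel_mps2: float) -> float:
    """Assumed form: the distance needed to stop from the end velocity while
    decelerating no harder than the comfort parameter."""
    return end_velocity_mps ** 2 / (2.0 * comfort_decel_mps2)

def select_trajectory(candidates: list[CandidateTrajectory],
                      max_distance_bound_m: float,
                      comfort_decel_mps2: float = 1.5) -> CandidateTrajectory:
    """Pick the candidate whose end velocity is closest to a target velocity
    that still respects the distance bound (hypothetical selection rule)."""
    best, best_diff = None, float("inf")
    for c in candidates:
        dist = available_distance(c.end_velocity_mps, comfort_decel_mps2)
        # Target velocity: the fastest end speed whose stopping distance fits
        # inside the bound imposed by surrounding objects.
        target = (2.0 * comfort_decel_mps2 * max_distance_bound_m) ** 0.5
        diff = abs(target - c.end_velocity_mps)
        if dist <= max_distance_bound_m and diff < best_diff:
            best, best_diff = c, diff
    return best

if __name__ == "__main__":
    cands = [CandidateTrajectory("slow", 5.0), CandidateTrajectory("fast", 14.0)]
    chosen = select_trajectory(cands, max_distance_bound_m=40.0)
    print(chosen.name)  # "slow": the fast candidate cannot stop within 40 m
```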
  • Publication number: 20240104757
    Abstract: A method comprises identifying a set of image data captured by at least one autonomous vehicle when the at least one autonomous vehicle was positioned in a lane of a roadway, and respective ground truth localization data of the at least one autonomous vehicle; determining a plurality of lane width values for the set of image data; labeling the set of image data with the plurality of lane width values, the plurality of lane width values representing a width of a lane in which the at least one autonomous vehicle was positioned; and training, using the labeled set of image data, a machine learning model, such that the machine learning model is configured to predict a new lane width value for a new lane as output.
    Type: Application
    Filed: June 22, 2023
    Publication date: March 28, 2024
    Applicant: TORC Robotics, Inc.
    Inventor: Joseph STAMENKOVICH
  • Publication number: 20240104938
    Abstract: Systems and methods for training and executing machine learning models to generate lane index values are disclosed. A method includes identifying a set of image data captured by at least one autonomous vehicle when the at least one autonomous vehicle is positioned in a lane of a roadway and respective ground truth localization data; determining a plurality of lane index values for the set of image data based on the ground truth localization data; labeling the set of image data with the plurality of lane index values, the lane index values representing a number of lanes from a leftmost or rightmost lane to the lane in which the at least one autonomous vehicle was positioned; and training, using the labeled set of image data, a plurality of machine learning models that generate a left lane index value and a right lane index value as output.
    Type: Application
    Filed: April 19, 2023
    Publication date: March 28, 2024
    Applicant: TORC Robotics, Inc.
    Inventors: Ryan CHILTON, Harish PULLAGURLA, Joseph STAMENKOVICH
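Here the auto-generated labels are left and right lane indices, i.e. how many lanes lie between the ego lane and the leftmost and rightmost lanes, derived from ground-truth localization. The sketch below shows one way to compute such a label pair given a known total lane count and the ego lane number; the indexing convention is an assumption.

```python
def lane_index_labels(ego_lane: int, total_lanes: int) -> tuple[int, int]:
    """Return (left_index, right_index), counting lanes from the leftmost and
    rightmost lanes to the lane the vehicle occupies. Lanes are numbered 1..N
    from the left, a convention assumed for this illustration."""
    if not 1 <= ego_lane <= total_lanes:
        raise ValueError("ego lane outside roadway")
    left_index = ego_lane - 1             # lanes between ego and leftmost lane
    right_index = total_lanes - ego_lane  # lanes between ego and rightmost lane
    return left_index, right_index

if __name__ == "__main__":
    # Vehicle in lane 2 of a 4-lane road: one lane to its left, two to its right.
    print(lane_index_labels(ego_lane=2, total_lanes=4))   # (1, 2)
    # These pairs become targets for two models (left index and right index).
```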
  • Publication number: 20240101146
    Abstract: Systems and methods for training and executing machine learning models to generate lane index values are disclosed. A method includes identifying a set of image data captured by at least one autonomous vehicle when the at least one autonomous vehicle is positioned in a lane of a roadway and respective ground truth localization data; determining a plurality of lane index values for the set of image data based on the ground truth localization data; labeling the set of image data with the plurality of lane index values, the lane index values representing a number of lanes from a leftmost or rightmost lane to the lane in which the at least one autonomous vehicle was positioned; and training, using the labeled set of image data, a plurality of machine learning models that generate a left lane index value and a right lane index value as output.
    Type: Application
    Filed: April 19, 2023
    Publication date: March 28, 2024
    Applicant: TORC Robotics, Inc.
    Inventors: Ryan CHILTON, Harish PULLAGURLA, Joseph STAMENKOVICH
  • Publication number: 20240103177
    Abstract: A method of detecting an object in a path of a vehicle using a LiDAR system includes emitting a LiDAR signal with the LiDAR system; receiving the LiDAR signal with the LiDAR system; determining a glancing angle distance; determining the LiDAR signal is received from beyond the glancing angle distance based on receipt of the LiDAR signal; and classifying the LiDAR signal as a return from an object based at least in part on the LiDAR signal coming from a distance beyond the glancing angle distance.
    Type: Application
    Filed: February 23, 2023
    Publication date: March 28, 2024
    Applicant: TORC Robotics, Inc.
    Inventors: Sebastian DINGLER, Jordan STONE
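The method hinges on a "glancing angle distance": beyond it, a low beam would normally graze past the road surface without returning, so a return from farther away is treated as evidence of an object. The abstract does not give the geometry, so the sketch below uses a flat-ground assumption (sensor height and beam elevation angle) purely as an illustration.

```python
import math

def glancing_angle_distance(sensor_height_m: float, beam_elevation_deg: float) -> float:
    """Assumed flat-ground geometry: a downward-pointing beam intersects the
    ground at h / tan(|elevation|); beyond that range the beam no longer meets
    the road surface."""
    elevation = math.radians(abs(beam_elevation_deg))
    if elevation == 0.0:
        return math.inf  # a level beam never reaches the ground
    return sensor_height_m / math.tan(elevation)

def classify_return(range_m: float, sensor_height_m: float, beam_elevation_deg: float) -> str:
    """Classify a LiDAR return as an object if it comes from beyond the
    glancing angle distance for this beam (simplified single-cue rule)."""
    limit = glancing_angle_distance(sensor_height_m, beam_elevation_deg)
    return "object" if range_m > limit else "ground_or_unknown"

if __name__ == "__main__":
    # A beam 1.8 m up, pointing 1.5 degrees down, meets flat ground at ~68.7 m.
    print(round(glancing_angle_distance(1.8, -1.5), 1))
    print(classify_return(90.0, 1.8, -1.5))   # beyond the limit -> "object"
    print(classify_return(40.0, 1.8, -1.5))   # within it -> "ground_or_unknown"
```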
  • Publication number: 20240101147
    Abstract: Systems and methods for training and executing machine learning models to generate lane index values are disclosed. A method includes identifying a set of image data captured by at least one autonomous vehicle when the at least one autonomous vehicle is positioned in a lane of a roadway and respective ground truth localization data; determining a plurality of lane index values for the set of image data based on the ground truth localization data; labeling the set of image data with the plurality of lane index values, the lane index values representing a number of lanes from a leftmost or rightmost lane to the lane in which the at least one autonomous vehicle was positioned; and training, using the labeled set of image data, a plurality of machine learning models that generate a left lane index value and a right lane index value as output.
    Type: Application
    Filed: April 19, 2023
    Publication date: March 28, 2024
    Applicant: TORC Robotics, Inc.
    Inventors: Ryan CHILTON, Harish PULLAGURLA, Joseph STAMENKOVICH
  • Patent number: 11938632
    Abstract: A method includes: compiling lower-resolution images, captured during a global scan cycle executed over a workpiece, into a virtual model; defining a nominal toolpath and a nominal target force for the workpiece based on the virtual model; detecting a defect indicator on the workpiece based on the lower-resolution images; accessing a higher-resolution image captured during a local scan cycle over the defect indicator; characterizing the defect indicator as a defect reparable via material removal based on the higher-resolution image; defining a repair toolpath for the defect based on the virtual model; navigating a sanding head over the workpiece according to the repair toolpath to repair the defect; and, during a processing cycle: navigating the sanding head across the workpiece according to the nominal toolpath and deviating the sanding head from the nominal toolpath to maintain forces of the sanding head on the workpiece proximal the nominal target force.
    Type: Grant
    Filed: May 2, 2023
    Date of Patent: March 26, 2024
    Assignee: GrayMatter Robotics Inc.
    Inventors: Avadhoot Ahire, YiWei Chen, Rishav Guha, Satyandra K. Gupta, Ariyan M. Kabir, Ashish Kulkarni, Caesar Navarro, Sagar Panchal, Brual C. Shah
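The processing-cycle step, following the nominal toolpath while deviating so the contact force stays near the nominal target force, is a force-regulation loop. The sketch below is a one-axis proportional version with made-up gains and units; it only illustrates the deviation idea, not the patented controller.

```python
def force_regulated_offsets(measured_forces_n: list[float],
                            target_force_n: float,
                            gain_m_per_n: float = 0.0005,
                            max_offset_m: float = 0.003) -> list[float]:
    """For each force sample along the toolpath, compute a small normal offset:
    push in when the force is below target, back off when it is above, and
    clamp the deviation so the head never strays far from the nominal path."""
    offsets = []
    for f in measured_forces_n:
        offset = gain_m_per_n * (f - target_force_n)   # positive = retract
        offsets.append(max(-max_offset_m, min(max_offset_m, offset)))
    return offsets

if __name__ == "__main__":
    # Target 20 N; the workpiece bulges mid-pass, so the head backs off there.
    forces = [18.0, 20.0, 27.0, 21.0]
    print(force_regulated_offsets(forces, target_force_n=20.0))
```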
  • Patent number: 11940869
    Abstract: A safety module having a plurality of microcontrollers receives an analog input and determines a value of the analog input. The microcontrollers each determine a respective ternary state of the device by identifying, from three candidate ranges of values, a range of values in which the value falls, wherein at least two of the plurality of microcontrollers use different candidate ranges of values, determining, based on the identified range, a ternary state corresponding to the range, and assigning the determined ternary state as the respective ternary state. The safety module determines whether the ternary states from the two microcontrollers map to a fault state, and, where they do, causes a command to be output to the device to enter a safe state.
    Type: Grant
    Filed: October 17, 2022
    Date of Patent: March 26, 2024
    Assignee: Fort Robotics, Inc.
    Inventor: Kerfegar Khurshed Katrak
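The abstract describes two microcontrollers that each map the same analog value into a ternary state using deliberately different range boundaries, then cross-check the pair of states for a fault. The sketch below mirrors that structure with invented thresholds and an invented fault rule; the real mappings are not given in the abstract.

```python
def ternary_state(value: float, low_cut: float, high_cut: float) -> int:
    """Map an analog value into one of three states (0=low, 1=mid, 2=high)
    using this microcontroller's own cut points."""
    if value < low_cut:
        return 0
    if value < high_cut:
        return 1
    return 2

def is_fault(state_a: int, state_b: int) -> bool:
    """Hypothetical cross-check: the two controllers use offset boundaries, so
    states may legitimately differ by one near a boundary, but a larger
    disagreement maps to a fault."""
    return abs(state_a - state_b) > 1

def evaluate(value: float) -> str:
    # Each microcontroller intentionally uses slightly different candidate ranges.
    a = ternary_state(value, low_cut=1.0, high_cut=3.0)
    b = ternary_state(value, low_cut=1.2, high_cut=2.8)
    return "command safe state" if is_fault(a, b) else f"states ok ({a}, {b})"

if __name__ == "__main__":
    print(evaluate(0.5))   # both controllers read "low"
    print(evaluate(2.9))   # near a boundary: states differ by one, still ok
```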
  • Patent number: 11938240
    Abstract: An autonomous, mobile robotic device (AMR) is configured with one or more UVC radiation sources, and operates to traverse a path while disinfecting an interior space. Each UVC radiation source is connected to the AMR by an articulating arm that is controlled to orient each source towards a feature or surface that is selected for disinfection during the time that the AMR is moving through the space. The location of each feature selected for disinfection can be mapped, and this map information, together with the current AMR location and pose, can be used to generate signals that control the articulating arm to orient each UVC lamp towards a feature that is selected for disinfection.
    Type: Grant
    Filed: November 6, 2021
    Date of Patent: March 26, 2024
    Assignee: AVA ROBOTICS INC.
    Inventors: Alyssa Pierson, Saman Amarasinghe, Daniela Rus, Marcio Macedo, Youssef Saleh
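The control idea here, using the map location of a feature marked for disinfection plus the AMR's current pose to steer the articulating arm so the UVC source points at that feature while the robot moves, comes down to a frame transform and an angle. The sketch below does that in 2D with assumed conventions; it is not the patented control law.

```python
import math

def arm_heading_to_feature(robot_x: float, robot_y: float, robot_yaw_rad: float,
                           feature_x: float, feature_y: float) -> float:
    """Return the pan angle (radians, relative to the robot's forward axis)
    that points the UVC source at a mapped feature, given the robot's pose
    in the map frame (planar case for illustration)."""
    bearing_map = math.atan2(feature_y - robot_y, feature_x - robot_x)
    pan = bearing_map - robot_yaw_rad
    # Normalize to (-pi, pi] so the arm takes the short way around.
    return math.atan2(math.sin(pan), math.cos(pan))

if __name__ == "__main__":
    # Robot at (2, 1) facing +x; door handle mapped at (4, 3): aim 45 deg left.
    pan = arm_heading_to_feature(2.0, 1.0, 0.0, 4.0, 3.0)
    print(round(math.degrees(pan), 1))   # 45.0
```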
  • Patent number: 11941799
    Abstract: Data is received that includes a feed of images of a plurality of objects passing in front of an inspection camera module forming part of a quality assurance inspection system. Within each image, it is detected whether an object is present. Instance identifiers are assigned to each object. For each object, a single image in which the object is optimally represented is identified using the corresponding instance identifier. These identified images are provided to a consuming application or process for quality assurance analysis.
    Type: Grant
    Filed: October 26, 2021
    Date of Patent: March 26, 2024
    Assignee: Elementary Robotics, Inc.
    Inventors: Dat Do, Arye Barnehama
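After objects in the feed are tracked and assigned instance identifiers, the system keeps one "best" image per instance for downstream inspection. The sketch below retains whichever frame scores highest for each instance ID; the scoring function (here, how close the object sits to the frame center) is only an assumed proxy for "optimally represented".

```python
from dataclasses import dataclass

@dataclass
class Detection:
    instance_id: int
    frame_index: int
    center_x: float   # object center, normalized 0..1
    center_y: float

def representation_score(d: Detection) -> float:
    """Assumed quality proxy: objects nearer the image center score higher."""
    return 1.0 - ((d.center_x - 0.5) ** 2 + (d.center_y - 0.5) ** 2) ** 0.5

def best_frame_per_instance(detections: list[Detection]) -> dict[int, int]:
    """For each instance ID, return the frame index in which that object is
    best represented; these frames go to the consuming QA application."""
    best: dict[int, Detection] = {}
    for d in detections:
        if (d.instance_id not in best
                or representation_score(d) > representation_score(best[d.instance_id])):
            best[d.instance_id] = d
    return {i: d.frame_index for i, d in best.items()}

if __name__ == "__main__":
    feed = [Detection(7, 0, 0.9, 0.5), Detection(7, 1, 0.55, 0.5),
            Detection(8, 1, 0.2, 0.8), Detection(8, 2, 0.45, 0.5)]
    print(best_frame_per_instance(feed))   # {7: 1, 8: 2}
```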
  • Patent number: 11938932
    Abstract: Systems and methods for self-driving collision prevention are presented. The system comprises a self-driving vehicle safety system having one or more sensors in communication with a control system. The control system is configured to determine safety fields and instruct the sensors to scan a region corresponding to the safety fields. The control system determines exclusion regions and omits the exclusion regions from the safety fields. The safety system may also include capability reduction parameters that can be used to constrain the drive system of the vehicle, for example, by restricting turning radius and speed in accordance with the safety fields.
    Type: Grant
    Filed: June 15, 2022
    Date of Patent: March 26, 2024
    Assignee: CLEARPATH ROBOTICS INC.
    Inventors: Matthew Lord, Ryan Christopher Gariepy, Peiyi Chen, Michael Irvine, Alex Bencz
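The abstract's safety fields with exclusion regions and capability-reduction parameters can be illustrated with a grid-based sketch: build a speed-dependent field in front of the vehicle, carve out exclusion regions that should not trigger the safety system, and cap speed when something occupies the remaining field. All shapes and thresholds below are assumptions.

```python
def safety_field_cells(speed_mps: float, cell_size_m: float = 0.25) -> set[tuple[int, int]]:
    """Speed-dependent field: a rectangle ahead of the vehicle whose length
    grows with speed (assumed 1 s look-ahead plus a fixed margin)."""
    length_m = speed_mps * 1.0 + 0.5
    cols = int(length_m / cell_size_m)
    rows = 4  # assumed vehicle width of 1 m at 0.25 m cells
    return {(r, c) for r in range(rows) for c in range(cols)}

def apply_exclusions(field: set[tuple[int, int]],
                     exclusions: set[tuple[int, int]]) -> set[tuple[int, int]]:
    """Omit exclusion regions (e.g. a known dock or the vehicle's own payload)
    so they never trigger the safety system."""
    return field - exclusions

def capped_speed(requested_mps: float, field: set[tuple[int, int]],
                 occupied: set[tuple[int, int]]) -> float:
    """Capability reduction: if anything occupies the monitored field, limit
    the drive system to a crawl speed (assumed 0.3 m/s)."""
    return min(requested_mps, 0.3) if field & occupied else requested_mps

if __name__ == "__main__":
    field = apply_exclusions(safety_field_cells(1.5), exclusions={(0, 0), (1, 0)})
    print(capped_speed(1.5, field, occupied={(2, 3)}))   # obstacle in field -> 0.3
    print(capped_speed(1.5, field, occupied={(0, 0)}))   # only in exclusion -> 1.5
```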
  • Patent number: 11937524
    Abstract: A method includes obtaining, by the treatment system configured to implement a machine learning (ML) algorithm, one or more images of a region of an agricultural environment near the treatment system, wherein the one or more images are captured from a region of the real world where agricultural target objects are expected to be present, determining one or more parameters for use with the ML algorithm, wherein at least one of the one or more parameters is based on one or more ML models related to identification of an agricultural object, determining a real-world target in the one or more images using the ML algorithm, wherein the ML algorithm is at least partly implemented using the one or more processors of the treatment system, and applying a treatment to the target by selectively activating the treatment mechanism based on a result of the determining the target.
    Type: Grant
    Filed: September 15, 2022
    Date of Patent: March 26, 2024
    Assignee: Verdant Robotics, Inc.
    Inventors: Gabriel Thurston Sibley, Lorenzo Ibarria, Curtis Dale Garner, Patrick Christopher Leger, Dustin James Webb
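The final step of this abstract, applying a treatment by selectively activating the treatment mechanism based on the model's determination of a target, is a thresholded actuation decision. The sketch below shows that pattern with a stubbed-in detector output; the confidence threshold and class names are assumptions, and no particular ML framework is implied.

```python
from dataclasses import dataclass

@dataclass
class TargetDetection:
    label: str        # e.g. "weed" or "crop" (assumed classes)
    confidence: float  # model confidence in [0, 1]
    x_m: float         # target position in the treatment frame
    y_m: float

def should_treat(det: TargetDetection, min_confidence: float = 0.8) -> bool:
    """Activate the treatment mechanism only for confident detections of the
    target class, so crops and uncertain detections are left untouched."""
    return det.label == "weed" and det.confidence >= min_confidence

if __name__ == "__main__":
    detections = [TargetDetection("weed", 0.93, 0.12, 0.40),
                  TargetDetection("crop", 0.97, 0.30, 0.42),
                  TargetDetection("weed", 0.55, 0.51, 0.39)]
    for d in detections:
        action = "treat" if should_treat(d) else "skip"
        print(f"{d.label:4s} conf={d.confidence:.2f} -> {action} at ({d.x_m}, {d.y_m})")
```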