Patents by Inventor Matthew D. Summer
Matthew D. Summer has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240111303
Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
Type: Application
Filed: December 14, 2023
Publication date: April 4, 2024
Applicant: Tomahawk Robotics, Inc.
Inventors: Matthew D. SUMMER, William S. BOWMAN, Andrew D. FALENDYSZ, Kevin M. MAKOVY, Daniel R. HEDMAN, Bradley D. TRUESDELL
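The abstract above describes a "sliding velocity limit boundary" only at a high level. As an illustration of the general idea (not the patented method), the sketch below clamps a commanded planar velocity to a speed limit that slides toward a confidence-scaled target; the function names, the `sensor_confidence` input, and the smoothing constants are all hypothetical.

```python
import math

def clamp_velocity(v_cmd, v_limit):
    """Clamp a commanded planar velocity (vx, vy) to a speed limit."""
    vx, vy = v_cmd
    speed = math.hypot(vx, vy)
    if speed <= v_limit or speed == 0.0:
        return (vx, vy)
    scale = v_limit / speed
    return (vx * scale, vy * scale)

def sliding_velocity_limit(v_limit, sensor_confidence, v_min=0.1, v_max=2.0, alpha=0.2):
    """Slide the current limit toward a confidence-scaled target.

    When sensor data is low-precision or degraded by the environment,
    sensor_confidence drops and the limit eases toward v_min; as
    confidence recovers, it eases toward v_max (exponential smoothing).
    """
    target = v_min + (v_max - v_min) * sensor_confidence
    return v_limit + alpha * (target - v_limit)
```

The smoothing step keeps the limit from jumping abruptly between control cycles, which matches the "sliding" behavior the abstract alludes to.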
-
Publication number: 20240078763
Abstract: Methods and systems are described herein for determining three-dimensional locations of objects within a video stream and linking those objects with known objects. An image processing system may receive an image and image metadata and detect an object and a location of the object within the image. The estimated location of each object is then determined within the three-dimensional space. In addition, the image processing system may retrieve, for a plurality of known objects, a plurality of known locations within the three-dimensional space and determine, based on estimated location and the known location data, which of the known objects matches the detected object in the image. An indicator for the object is then generated at the location of the object within the image.
Type: Application
Filed: September 7, 2022
Publication date: March 7, 2024
Applicant: Tomahawk Robotics, Inc.
Inventors: Daniel R. HEDMAN, William S. BOWMAN, Matthew D. SUMMER, Bryce KORTE, Andrew D. FALENDYSZ
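The core matching step in the abstract above — linking a detected object's estimated 3-D location to the closest known object — can be illustrated with a simple nearest-neighbor search. This is a minimal sketch under assumed data shapes (a dict of known object IDs to coordinates, and a distance gate `max_dist`), not the claimed implementation.

```python
def match_detection(detected_xyz, known_objects, max_dist=5.0):
    """Link a detected 3-D location to the nearest known object, if close enough.

    known_objects: dict mapping object id -> (x, y, z) known location.
    Returns the matching id, or None when no known object lies within max_dist.
    """
    best_id, best_d2 = None, max_dist ** 2
    for obj_id, (x, y, z) in known_objects.items():
        dx = detected_xyz[0] - x
        dy = detected_xyz[1] - y
        dz = detected_xyz[2] - z
        d2 = dx * dx + dy * dy + dz * dz  # squared distance avoids a sqrt
        if d2 < best_d2:
            best_id, best_d2 = obj_id, d2
    return best_id
```

Once a match is found, the system described in the abstract would render an indicator for that object back at its 2-D location in the image.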
-
Patent number: 11886182
Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
Type: Grant
Filed: December 31, 2019
Date of Patent: January 30, 2024
Assignee: Tomahawk Robotics, Inc.
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Kevin M. Makovy, Daniel R. Hedman, Bradley D. Truesdell
-
Publication number: 20240005801
Abstract: A common command and control architecture (alternatively termed herein as a “universal control architecture”) is disclosed that allows different unmanned systems, including different types of unmanned systems (e.g., air, ground, and/or maritime unmanned systems), to be controlled simultaneously through a common control device (e.g., a controller that can be an input and/or output device). The universal control architecture brings significant efficiency gains in engineering, deployment, training, maintenance, and future upgrades of unmanned systems. In addition, the disclosed common command and control architecture breaks the traditional stovepipe development involving deployment models and thus reducing hardware and software maintenance, creating a streamlined training/proficiency initiative, reducing physical space requirements for transport, and creating a scalable, more connected interoperable approach to control of unmanned systems over existing unmanned systems technology.
Type: Application
Filed: September 19, 2023
Publication date: January 4, 2024
Applicant: Tomahawk Robotics
Inventors: Matthew D. SUMMER, William S. BOWMAN, Andrew D. FALENDYSZ, Daniel R. HEDMAN, Brad TRUESDELL, Jeffrey S. COOPER, Michael E. BOWMAN, Sean WAGONER, Kevin MAKOVY
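A "universal control architecture" of the kind the abstract describes typically means a common command interface with per-platform adapters, so one control device can drive heterogeneous vehicles at once. The sketch below shows that pattern in miniature; the class names and the string-based command protocol are illustrative assumptions, not the patented design.

```python
from abc import ABC, abstractmethod

class UnmannedSystem(ABC):
    """Common command interface; each platform adapts it to its own protocol."""

    @abstractmethod
    def send_command(self, command: str) -> str:
        ...

class AirVehicle(UnmannedSystem):
    def send_command(self, command: str) -> str:
        # In a real system this would translate to the air vehicle's protocol.
        return f"air:{command}"

class GroundVehicle(UnmannedSystem):
    def send_command(self, command: str) -> str:
        return f"ground:{command}"

class CommonController:
    """One control device driving several heterogeneous platforms at once."""

    def __init__(self, systems):
        self.systems = list(systems)

    def broadcast(self, command: str):
        return [s.send_command(command) for s in self.systems]
```

Because new platform types only add an adapter class, the controller itself never changes, which is the interoperability and maintenance benefit the abstract emphasizes.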
-
Patent number: 11854410
Abstract: A common command and control architecture (alternatively termed herein as a “universal control architecture”) is disclosed that allows different unmanned systems, including different types of unmanned systems (e.g., air, ground, and/or maritime unmanned systems), to be controlled simultaneously through a common control device (e.g., a controller that can be an input and/or output device). The universal control architecture brings significant efficiency gains in engineering, deployment, training, maintenance, and future upgrades of unmanned systems. In addition, the disclosed common command and control architecture breaks the traditional stovepipe development involving deployment models and thus reducing hardware and software maintenance, creating a streamlined training/proficiency initiative, reducing physical space requirements for transport, and creating a scalable, more connected interoperable approach to control of unmanned systems over existing unmanned systems technology.
Type: Grant
Filed: January 7, 2022
Date of Patent: December 26, 2023
Assignee: Tomahawk Robotics
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Daniel R. Hedman, Brad Truesdell, Jeffrey S. Cooper, Michael E. Bowman, Sean Wagoner, Kevin Makovy
-
Publication number: 20230394812
Abstract: Methods and systems are described herein for hosting and arbitrating algorithms for the generation of structured frames of data from one or more sources of unstructured input frames. A plurality of frames may be received from a recording device and a plurality of object types to be recognized in the plurality of frames may be determined. A determination may be made of multiple machine learning models for recognizing the object types. The frames may be sequentially input into the machine learning models to obtain a plurality of sets of objects from the plurality of machine learning models and object indicators may be received from those machine learning models. A set of composite frames with the plurality of indicators corresponding to the plurality of objects may be generated, and an output stream may be generated including the set of composite frames to be played back in chronological order.
Type: Application
Filed: August 22, 2023
Publication date: December 7, 2023
Applicant: Tomahawk Robotics
Inventors: William S. BOWMAN, Sean WAGONER, Andrew D. FALENDYSZ, Matthew D. SUMMER, Kevin MAKOVY, Jeffrey S. COOPER, Brad TRUESDELL
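The pipeline the abstract describes — feed each frame to several per-object-type models, gather all returned indicators, and emit composite frames in order — can be sketched as below. The model registry keyed by object type and the `(label, bounding_box)` indicator shape are assumed for illustration; the actual arbitration logic in the patent may be considerably richer.

```python
def build_composite_stream(frames, models):
    """Run each frame through every model and attach the returned indicators.

    frames: iterable of frames in chronological order.
    models: dict mapping object type -> callable(frame) -> list of
            (label, bounding_box) indicators.
    Returns a list of composite frames preserving the input order, so the
    output stream plays back chronologically.
    """
    composite = []
    for frame in frames:
        indicators = []
        for obj_type, model in models.items():
            indicators.extend(model(frame))
        composite.append({"frame": frame, "indicators": indicators})
    return composite
```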
-
Publication number: 20230333671
Abstract: Methods and systems are described herein for detecting motion-induced errors received from inertial-type input devices and for generating accurate vehicle control commands that account for operator movement. These methods and systems may determine, using motion data from inertial sensors, whether the hand/arm of the operator is moving in the same motion as the body of the operator, and if both are moving in the same way, these systems and methods may determine that the motion is not intended to be a motion-induced command. However, if the hand/arm of the operator is moving in a different motion from the body of the operator, these methods and systems may determine that the operator intended the motion to be a motion-induced command to a vehicle.
Type: Application
Filed: May 3, 2023
Publication date: October 19, 2023
Applicant: Tomahawk Robotics
Inventors: Michael E. BOWMAN, William S. BOWMAN, Daniel R. HEDMAN, Matthew D. SUMMER, Andrew D. FALENDYSZ
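The decision rule in the abstract above — suppress the input when the hand moves essentially as the body does, pass it through when it differs — can be illustrated by subtracting a body-mounted IMU reading from a hand-mounted one and gating on the residual. The function names, 3-axis acceleration tuples, and `threshold` value are hypothetical; they show the comparison idea only, not the patented detector.

```python
import math

def relative_motion(hand_accel, body_accel):
    """Magnitude of hand acceleration after subtracting the body's motion."""
    return math.sqrt(sum((h - b) ** 2 for h, b in zip(hand_accel, body_accel)))

def filter_command(hand_accel, body_accel, threshold=0.5):
    """Gate an inertial input on operator intent.

    If the hand's motion barely differs from the body's (residual below
    threshold), treat it as a motion-induced error and return None.
    Otherwise return the body-compensated motion as the command.
    """
    if relative_motion(hand_accel, body_accel) < threshold:
        return None
    return tuple(h - b for h, b in zip(hand_accel, body_accel))
```

Subtracting the body signal before thresholding also means a command issued while the operator is walking or riding in a vehicle is not distorted by that shared motion.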
-
Patent number: 11776247
Abstract: Methods and systems are described herein for hosting and arbitrating algorithms for the generation of structured frames of data from one or more sources of unstructured input frames. A plurality of frames may be received from a recording device and a plurality of object types to be recognized in the plurality of frames may be determined. A determination may be made of multiple machine learning models for recognizing the object types. The frames may be sequentially input into the machine learning models to obtain a plurality of sets of objects from the plurality of machine learning models and object indicators may be received from those machine learning models. A set of composite frames with the plurality of indicators corresponding to the plurality of objects may be generated, and an output stream may be generated including the set of composite frames to be played back in chronological order.
Type: Grant
Filed: January 7, 2022
Date of Patent: October 3, 2023
Assignee: Tomahawk Robotics
Inventors: William S. Bowman, Sean Wagoner, Andrew D. Falendysz, Matthew D. Summer, Kevin Makovy, Jeffrey S. Cooper, Brad Truesdell
-
Publication number: 20230237802
Abstract: Methods and systems are described herein for generating composite data streams. A data stream processing system may receive multiple data streams from, for example, multiple unmanned vehicles and determine, based on the type of data within each data stream, a machine learning model for each data stream for processing the type of data. Each machine learning model may receive the frames of a corresponding data stream and output indications and locations of objects within those data streams. The data stream processing system may then generate a composite data stream with indications of the detected objects.
Type: Application
Filed: March 23, 2022
Publication date: July 27, 2023
Applicant: Tomahawk Robotics
Inventors: Andrew D. Falendysz, William S. Bowman, Matthew D. Summer, Daniel R. Hedman, Sean Wagoner
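Unlike the per-frame arbitration described in 20230394812, this abstract selects one model per stream based on the stream's data type. A minimal sketch of that routing step follows; the stream dicts, the registry keyed by data type, and the detection format are assumptions for illustration.

```python
def process_streams(streams, model_registry):
    """Pick a model per stream by its declared data type, then annotate frames.

    streams: list of dicts like {"type": str, "frames": [...]}.
    model_registry: dict mapping data type -> callable(frame) -> detections.
    Returns one composite list of annotated frames from all streams.
    """
    composite = []
    for stream in streams:
        model = model_registry[stream["type"]]  # type-based model selection
        for frame in stream["frames"]:
            composite.append({
                "source_type": stream["type"],
                "frame": frame,
                "detections": model(frame),
            })
    return composite
```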
-
Publication number: 20230222783
Abstract: Methods and systems are described herein for hosting and arbitrating algorithms for the generation of structured frames of data from one or more sources of unstructured input frames. A plurality of frames may be received from a recording device and a plurality of object types to be recognized in the plurality of frames may be determined. A determination may be made of multiple machine learning models for recognizing the object types. The frames may be sequentially input into the machine learning models to obtain a plurality of sets of objects from the plurality of machine learning models and object indicators may be received from those machine learning models. A set of composite frames with the plurality of indicators corresponding to the plurality of objects may be generated, and an output stream may be generated including the set of composite frames to be played back in chronological order.
Type: Application
Filed: January 7, 2022
Publication date: July 13, 2023
Inventors: William S. Bowman, Sean Wagoner, Andrew D. Falendysz, Matthew D. Summer, Kevin Makovy, Jeffrey S. Cooper, Brad Truesdell
-
Patent number: 11675445
Abstract: Methods and systems are described herein for detecting motion-induced errors received from inertial-type input devices and for generating accurate vehicle control commands that account for operator movement. These methods and systems may determine, using motion data from inertial sensors, whether the hand/arm of the operator is moving in the same motion as the body of the operator, and if both are moving in the same way, these systems and methods may determine that the motion is not intended to be a motion-induced command. However, if the hand/arm of the operator is moving in a different motion from the body of the operator, these methods and systems may determine that the operator intended the motion to be a motion-induced command to a vehicle.
Type: Grant
Filed: April 13, 2022
Date of Patent: June 13, 2023
Assignee: Tomahawk Robotics
Inventors: Michael E. Bowman, William S. Bowman, Daniel R. Hedman, Matthew D. Summer, Andrew D. Falendysz, Kevin Makovy, Michael W. Holt
-
Publication number: 20220415184
Abstract: A common command and control architecture (alternatively termed herein as a “universal control architecture”) is disclosed that allows different unmanned systems, including different types of unmanned systems (e.g., air, ground, and/or maritime unmanned systems), to be controlled simultaneously through a common control device (e.g., a controller that can be an input and/or output device). The universal control architecture brings significant efficiency gains in engineering, deployment, training, maintenance, and future upgrades of unmanned systems. In addition, the disclosed common command and control architecture breaks the traditional stovepipe development involving deployment models and thus reducing hardware and software maintenance, creating a streamlined training/proficiency initiative, reducing physical space requirements for transport, and creating a scalable, more connected interoperable approach to control of unmanned systems over existing unmanned systems technology.
Type: Application
Filed: January 7, 2022
Publication date: December 29, 2022
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Daniel R. Hedman, Brad Truesdell, Jeffrey S. Cooper, Michael E. Bowman, Sean Wagoner, Kevin Makovy
-
Publication number: 20220413490
Abstract: A common command and control architecture (alternatively termed herein as a “universal control architecture”) is disclosed that allows different unmanned systems, including different types of unmanned systems (e.g., air, ground, and/or maritime unmanned systems), to be controlled simultaneously through a common control device (e.g., a controller that can be an input and/or output device). The universal control architecture brings significant efficiency gains in engineering, deployment, training, maintenance, and future upgrades of unmanned systems. In addition, the disclosed common command and control architecture breaks the traditional stovepipe development involving deployment models and thus reducing hardware and software maintenance, creating a streamlined training/proficiency initiative, reducing physical space requirements for transport, and creating a scalable, more connected interoperable approach to control of unmanned systems over existing unmanned systems technology.
Type: Application
Filed: January 7, 2022
Publication date: December 29, 2022
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Daniel R. Hedman, Brad Truesdell, Jeffrey S. Cooper, Michael E. Bowman, Sean Wagoner, Kevin Makovy
-
Publication number: 20220083054
Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
Type: Application
Filed: December 31, 2019
Publication date: March 17, 2022
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Kevin M. Makovy, Daniel R. Hedman, Bradley D. Truesdell
-
Publication number: 20220083069
Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
Type: Application
Filed: December 31, 2019
Publication date: March 17, 2022
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Kevin M. Makovy, Daniel R. Hedman, Bradley D. Truesdell
-
Publication number: 20220075364
Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
Type: Application
Filed: December 31, 2019
Publication date: March 10, 2022
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Kevin M. Makovy, Daniel R. Hedman, Bradley D. Truesdell
-
Patent number: 11247737
Abstract: An unmanned ground vehicle (UGV) includes a rotary joint having an axis of rotation. A rotary joint actuator is responsive to at least one control signal and is configured to cause a rotatable portion of the rotary joint to rotate relative to the vehicle chassis about the rotary joint axis of rotation. A stabilizer flipper having an elongated length is attached to the rotatable portion. Consequently, rotation of the rotatable portion about the rotary joint axis of rotation results in a change of orientation of the stabilizer flipper relative to the chassis. This change in orientation can range between a lateral direction and a longitudinal direction with respect to the vehicle chassis.
Type: Grant
Filed: April 23, 2018
Date of Patent: February 15, 2022
Assignee: EAGLE TECHNOLOGY, LLC
Inventors: Matthew D. Summer, Paul M. Bosscher, Michael E. Bowman, Sean J. Irvin
-
Patent number: 11207649
Abstract: Methods, reactor tubes, reactors, and systems for catalysis are disclosed. A reactor tube includes an outer shell defining a catalyst bed, a catalyst within the catalyst bed, and an inner tube extending through the catalyst bed. An interior of the inner tube is isolated from the catalyst within the catalyst bed. Methods of activating a catalyst are also disclosed herein.
Type: Grant
Filed: March 13, 2019
Date of Patent: December 28, 2021
Assignee: West Biofuels, LLC
Inventors: Matthew B. Hoffman, Matthew D. Summers
-
Patent number: 10955212
Abstract: A recoil managed disruptor includes a disruptor device having a barrel from which a slug of material is fired. A piston is mechanically coupled to the disruptor device. A housing which supports the disruptor on a positioning device includes a deformable recoil absorber (DRA) constraint. The DRA constraint is configured to receive a sacrificial DRA structure comprised of a semi-rigid material. The piston is responsive to a recoil force produced when the disruptor device is fired to travel along an axial length of the housing and permanently deform the DRA structure within the DRA constraint.
Type: Grant
Filed: April 16, 2018
Date of Patent: March 23, 2021
Assignee: Eagle Technology, LLC
Inventors: Michael E. Bowman, Matthew D. Summer, Paul M. Bosscher
-
Patent number: 10882191
Abstract: A robotic system includes a control system and a slave device which is controlled by the control system. The slave device has a robotic grasping device formed of a rigid base and at least one finger which is movable to facilitate grasping of objects. At least one sensor is provided which senses a force applied to the finger. A cutting tool having a cutting jaw is also attached to the base. The cutting jaw is arranged to pivot on a pivot axis responsive to a pivot motion of the finger. The forces exerted on the cutting jaw are sensed with the sensor during a first predetermined range of finger motion associated with a cutting mode of operation.
Type: Grant
Filed: July 29, 2019
Date of Patent: January 5, 2021
Assignee: L3HARRIS TECHNOLOGIES, INC.
Inventors: Paul M. Bosscher, Matthew D. Summer, Michael E. Bowman, Nicholas Murphy-DuBay, Loran J. Wilkinson