Patents by Inventor Andrew D. Falendysz
Andrew D. Falendysz has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240111303
Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
Type: Application
Filed: December 14, 2023
Publication date: April 4, 2024
Applicant: Tomahawk Robotics, Inc.
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Kevin M. Makovy, Daniel R. Hedman, Bradley D. Truesdell
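The abstract above does not define how a "sliding velocity limit boundary" is computed; the sketch below is a minimal, hypothetical interpretation (function names and the confidence-scaling scheme are assumptions, not taken from the patent): the allowed velocity envelope shrinks as sensor confidence drops, so noisy readings from an inexpensive sensor translate into more conservative commands.

```python
def sliding_velocity_limit(v_max, confidence, v_floor=0.1):
    """Scale the velocity ceiling between v_floor and v_max by confidence in [0, 1]."""
    confidence = max(0.0, min(1.0, confidence))
    return v_floor + (v_max - v_floor) * confidence

def clamp_command(v_cmd, v_max, confidence):
    """Clamp a commanded velocity to the current sliding limit."""
    limit = sliding_velocity_limit(v_max, confidence)
    return max(-limit, min(limit, v_cmd))
```

With full confidence the operator gets the full envelope; with zero confidence commands are clamped near the floor regardless of how hard the controller is pushed.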
-
Publication number: 20240078763
Abstract: Methods and systems are described herein for determining three-dimensional locations of objects within a video stream and linking those objects with known objects. An image processing system may receive an image and image metadata and detect an object and a location of the object within the image. The estimated location of each object is then determined within the three-dimensional space. In addition, the image processing system may retrieve, for a plurality of known objects, a plurality of known locations within the three-dimensional space and determine, based on the estimated location and the known location data, which of the known objects matches the detected object in the image. An indicator for the object is then generated at the location of the object within the image.
Type: Application
Filed: September 7, 2022
Publication date: March 7, 2024
Applicant: Tomahawk Robotics, Inc.
Inventors: Daniel R. Hedman, William S. Bowman, Matthew D. Summer, Bryce Korte, Andrew D. Falendysz
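The matching step described above (linking a detected object's estimated 3-D position to a database of known object locations) can be sketched as a nearest-neighbor search with a distance gate. This is an illustrative simplification, not the patent's method; the function name and the `max_dist` threshold are assumptions.

```python
import math

def match_to_known(detected, known, max_dist=5.0):
    """Match each detected 3-D position to the nearest known object within max_dist.

    detected: list of (x, y, z) estimates; known: dict of name -> (x, y, z).
    Returns a list parallel to `detected`: the matched name, or None if no
    known object lies within max_dist of the estimate.
    """
    matches = []
    for d in detected:
        best, best_dist = None, max_dist
        for name, k in known.items():
            dist = math.dist(d, k)
            if dist < best_dist:
                best, best_dist = name, dist
        matches.append(best)
    return matches
```

A detection with no sufficiently close known object stays unmatched, which is where a new indicator would be generated rather than a link.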
-
Patent number: 11886182
Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
Type: Grant
Filed: December 31, 2019
Date of Patent: January 30, 2024
Assignee: Tomahawk Robotics, Inc.
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Kevin M. Makovy, Daniel R. Hedman, Bradley D. Truesdell
-
Publication number: 20240005801
Abstract: A common command and control architecture (alternatively termed herein a "universal control architecture") is disclosed that allows different unmanned systems, including different types of unmanned systems (e.g., air, ground, and/or maritime unmanned systems), to be controlled simultaneously through a common control device (e.g., a controller that can be an input and/or output device). The universal control architecture brings significant efficiency gains in engineering, deployment, training, maintenance, and future upgrades of unmanned systems. In addition, the disclosed common command and control architecture breaks the traditional stovepipe development and deployment model, thereby reducing hardware and software maintenance, streamlining training and proficiency initiatives, reducing physical space requirements for transport, and creating a scalable, more connected, interoperable approach to controlling unmanned systems compared with existing unmanned systems technology.
Type: Application
Filed: September 19, 2023
Publication date: January 4, 2024
Applicant: Tomahawk Robotics
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Daniel R. Hedman, Brad Truesdell, Jeffrey S. Cooper, Michael E. Bowman, Sean Wagoner, Kevin Makovy
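One common way to realize the kind of architecture the abstract describes is an adapter layer: a single generic command vocabulary translated per platform, so one control device can drive air, ground, and maritime systems. The sketch below is purely illustrative; the class names, methods, and "MAVLink-style" wording are assumptions and do not come from the patent.

```python
from abc import ABC, abstractmethod

class VehicleAdapter(ABC):
    """Hypothetical per-platform translator for a common command vocabulary."""
    @abstractmethod
    def send(self, command: str, value: float) -> str: ...

class AirAdapter(VehicleAdapter):
    def send(self, command, value):
        # An aerial platform might speak a MAVLink-style protocol.
        return f"MAVLink-style {command}={value}"

class GroundAdapter(VehicleAdapter):
    def send(self, command, value):
        # A ground robot might take serial commands in its own dialect.
        return f"UGV serial {command}={value}"

def dispatch(adapters, command, value):
    """Broadcast one common command to every registered platform adapter."""
    return [a.send(command, value) for a in adapters]
```

The controller only ever emits the common vocabulary; adding a new vehicle type means adding one adapter, not a new control stack.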
-
Patent number: 11854410
Abstract: A common command and control architecture (alternatively termed herein a "universal control architecture") is disclosed that allows different unmanned systems, including different types of unmanned systems (e.g., air, ground, and/or maritime unmanned systems), to be controlled simultaneously through a common control device (e.g., a controller that can be an input and/or output device). The universal control architecture brings significant efficiency gains in engineering, deployment, training, maintenance, and future upgrades of unmanned systems. In addition, the disclosed common command and control architecture breaks the traditional stovepipe development and deployment model, thereby reducing hardware and software maintenance, streamlining training and proficiency initiatives, reducing physical space requirements for transport, and creating a scalable, more connected, interoperable approach to controlling unmanned systems compared with existing unmanned systems technology.
Type: Grant
Filed: January 7, 2022
Date of Patent: December 26, 2023
Assignee: Tomahawk Robotics
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Daniel R. Hedman, Brad Truesdell, Jeffrey S. Cooper, Michael E. Bowman, Sean Wagoner, Kevin Makovy
-
Publication number: 20230394812
Abstract: Methods and systems are described herein for hosting and arbitrating algorithms for the generation of structured frames of data from one or more sources of unstructured input frames. A plurality of frames may be received from a recording device and a plurality of object types to be recognized in the plurality of frames may be determined. A determination may be made of multiple machine learning models for recognizing the object types. The frames may be sequentially input into the machine learning models to obtain a plurality of sets of objects from the plurality of machine learning models and object indicators may be received from those machine learning models. A set of composite frames with the plurality of indicators corresponding to the plurality of objects may be generated, and an output stream may be generated including the set of composite frames to be played back in chronological order.
Type: Application
Filed: August 22, 2023
Publication date: December 7, 2023
Applicant: Tomahawk Robotics
Inventors: William S. Bowman, Sean Wagoner, Andrew D. Falendysz, Matthew D. Summer, Kevin Makovy, Jeffrey S. Cooper, Brad Truesdell
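The pipeline the abstract outlines (feed each frame through one model per object type, collect the detections, and emit composite frames in order) can be sketched as below. This is a hedged illustration under assumed interfaces: frames are opaque values and each "model" is any callable returning detections, which stands in for the real machine learning models.

```python
def annotate_frames(frames, models):
    """Run every frame through each per-object-type model and collect results.

    frames: iterable of frames in chronological order.
    models: dict of object_type -> callable(frame) -> list of detections.
    Returns a list of (frame, {object_type: detections}) composite records,
    preserving the input order for chronological playback.
    """
    composite = []
    for frame in frames:
        detections = {obj_type: model(frame) for obj_type, model in models.items()}
        composite.append((frame, detections))
    return composite
```

A real system would overlay the indicators onto the frame pixels; here the composite record just pairs each frame with its per-type detections.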
-
Publication number: 20230333671
Abstract: Methods and systems are described herein for detecting motion-induced errors received from inertial-type input devices and for generating accurate vehicle control commands that account for operator movement. These methods and systems may determine, using motion data from inertial sensors, whether the hand/arm of the operator is moving in the same motion as the body of the operator, and if both are moving in the same way, these systems and methods may determine that the motion is not intended to be a motion-induced command. However, if the hand/arm of the operator is moving in a different motion from the body of the operator, these methods and systems may determine that the operator intended the motion to be a motion-induced command to a vehicle.
Type: Application
Filed: May 3, 2023
Publication date: October 19, 2023
Applicant: Tomahawk Robotics
Inventors: Michael E. Bowman, William S. Bowman, Daniel R. Hedman, Matthew D. Summer, Andrew D. Falendysz
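The core test described above (hand moving with the body means no command, hand moving differently from the body means a command) reduces to comparing two motion vectors. The sketch below is a minimal interpretation; the threshold value and function name are assumptions, and a production system would filter over time rather than compare single samples.

```python
import math

def is_intended_command(hand_motion, body_motion, threshold=0.5):
    """Treat hand/arm motion as a command only when it differs from body motion.

    hand_motion, body_motion: (x, y, z) motion vectors from inertial sensors.
    If the controller in the operator's hand reports nearly the same motion as
    the operator's body (e.g., walking, or riding in a vehicle), the movement
    is attributed to the body and filtered out as a motion-induced error.
    """
    return math.dist(hand_motion, body_motion) > threshold
```

So an operator turning their whole torso does not steer the vehicle, while a deliberate wrist gesture on a steady body does.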
-
Patent number: 11776247
Abstract: Methods and systems are described herein for hosting and arbitrating algorithms for the generation of structured frames of data from one or more sources of unstructured input frames. A plurality of frames may be received from a recording device and a plurality of object types to be recognized in the plurality of frames may be determined. A determination may be made of multiple machine learning models for recognizing the object types. The frames may be sequentially input into the machine learning models to obtain a plurality of sets of objects from the plurality of machine learning models and object indicators may be received from those machine learning models. A set of composite frames with the plurality of indicators corresponding to the plurality of objects may be generated, and an output stream may be generated including the set of composite frames to be played back in chronological order.
Type: Grant
Filed: January 7, 2022
Date of Patent: October 3, 2023
Assignee: Tomahawk Robotics
Inventors: William S. Bowman, Sean Wagoner, Andrew D. Falendysz, Matthew D. Summer, Kevin Makovy, Jeffrey S. Cooper, Brad Truesdell
-
Publication number: 20230237802
Abstract: Methods and systems are described herein for generating composite data streams. A data stream processing system may receive multiple data streams from, for example, multiple unmanned vehicles and determine, based on the type of data within each data stream, a machine learning model for each data stream for processing the type of data. Each machine learning model may receive the frames of a corresponding data stream and output indications and locations of objects within those data streams. The data stream processing system may then generate a composite data stream with indications of the detected objects.
Type: Application
Filed: March 23, 2022
Publication date: July 27, 2023
Applicant: Tomahawk Robotics
Inventors: Andrew D. Falendysz, William S. Bowman, Matthew D. Summer, Daniel R. Hedman, Sean Wagoner
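The distinguishing step here, relative to the multi-model entry above, is choosing one model per stream based on that stream's declared data type. A registry keyed by data type is one plausible shape for that dispatch; the names and the callable-as-model interface below are assumptions for illustration only.

```python
def process_streams(streams, model_registry):
    """Select a model per stream by data type, then annotate that stream's frames.

    streams: list of (data_type, frames) pairs, e.g. one per unmanned vehicle.
    model_registry: dict of data_type -> callable(frame) -> detections.
    Returns, per stream, a list of (frame, detections) pairs ready to be
    merged into a composite output stream.
    """
    results = []
    for data_type, frames in streams:
        model = model_registry[data_type]  # one model chosen per stream
        results.append([(frame, model(frame)) for frame in frames])
    return results
```

An unknown data type raises a `KeyError` here; a real system would presumably fall back to a default model or skip the stream.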
-
Publication number: 20230222783
Abstract: Methods and systems are described herein for hosting and arbitrating algorithms for the generation of structured frames of data from one or more sources of unstructured input frames. A plurality of frames may be received from a recording device and a plurality of object types to be recognized in the plurality of frames may be determined. A determination may be made of multiple machine learning models for recognizing the object types. The frames may be sequentially input into the machine learning models to obtain a plurality of sets of objects from the plurality of machine learning models and object indicators may be received from those machine learning models. A set of composite frames with the plurality of indicators corresponding to the plurality of objects may be generated, and an output stream may be generated including the set of composite frames to be played back in chronological order.
Type: Application
Filed: January 7, 2022
Publication date: July 13, 2023
Inventors: William S. Bowman, Sean Wagoner, Andrew D. Falendysz, Matthew D. Summer, Kevin Makovy, Jeffrey S. Cooper, Brad Truesdell
-
Patent number: 11675445
Abstract: Methods and systems are described herein for detecting motion-induced errors received from inertial-type input devices and for generating accurate vehicle control commands that account for operator movement. These methods and systems may determine, using motion data from inertial sensors, whether the hand/arm of the operator is moving in the same motion as the body of the operator, and if both are moving in the same way, these systems and methods may determine that the motion is not intended to be a motion-induced command. However, if the hand/arm of the operator is moving in a different motion from the body of the operator, these methods and systems may determine that the operator intended the motion to be a motion-induced command to a vehicle.
Type: Grant
Filed: April 13, 2022
Date of Patent: June 13, 2023
Assignee: Tomahawk Robotics
Inventors: Michael E. Bowman, William S. Bowman, Daniel R. Hedman, Matthew D. Summer, Andrew D. Falendysz, Kevin Makovy, Michael W. Holt
-
Publication number: 20220413490
Abstract: A common command and control architecture (alternatively termed herein a "universal control architecture") is disclosed that allows different unmanned systems, including different types of unmanned systems (e.g., air, ground, and/or maritime unmanned systems), to be controlled simultaneously through a common control device (e.g., a controller that can be an input and/or output device). The universal control architecture brings significant efficiency gains in engineering, deployment, training, maintenance, and future upgrades of unmanned systems. In addition, the disclosed common command and control architecture breaks the traditional stovepipe development and deployment model, thereby reducing hardware and software maintenance, streamlining training and proficiency initiatives, reducing physical space requirements for transport, and creating a scalable, more connected, interoperable approach to controlling unmanned systems compared with existing unmanned systems technology.
Type: Application
Filed: January 7, 2022
Publication date: December 29, 2022
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Daniel R. Hedman, Brad Truesdell, Jeffrey S. Cooper, Michael E. Bowman, Sean Wagoner, Kevin Makovy
-
Publication number: 20220415184
Abstract: A common command and control architecture (alternatively termed herein a "universal control architecture") is disclosed that allows different unmanned systems, including different types of unmanned systems (e.g., air, ground, and/or maritime unmanned systems), to be controlled simultaneously through a common control device (e.g., a controller that can be an input and/or output device). The universal control architecture brings significant efficiency gains in engineering, deployment, training, maintenance, and future upgrades of unmanned systems. In addition, the disclosed common command and control architecture breaks the traditional stovepipe development and deployment model, thereby reducing hardware and software maintenance, streamlining training and proficiency initiatives, reducing physical space requirements for transport, and creating a scalable, more connected, interoperable approach to controlling unmanned systems compared with existing unmanned systems technology.
Type: Application
Filed: January 7, 2022
Publication date: December 29, 2022
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Daniel R. Hedman, Brad Truesdell, Jeffrey S. Cooper, Michael E. Bowman, Sean Wagoner, Kevin Makovy
-
Publication number: 20220083054
Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
Type: Application
Filed: December 31, 2019
Publication date: March 17, 2022
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Kevin M. Makovy, Daniel R. Hedman, Bradley D. Truesdell
-
Publication number: 20220083069
Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
Type: Application
Filed: December 31, 2019
Publication date: March 17, 2022
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Kevin M. Makovy, Daniel R. Hedman, Bradley D. Truesdell
-
Publication number: 20220075364
Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
Type: Application
Filed: December 31, 2019
Publication date: March 10, 2022
Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Kevin M. Makovy, Daniel R. Hedman, Bradley D. Truesdell