Patents by Inventor William S. Bowman

William S. Bowman has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240111303
    Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
    Type: Application
    Filed: December 14, 2023
    Publication date: April 4, 2024
    Applicant: Tomahawk Robotics, Inc.
    Inventors: Matthew D. SUMMER, William S. BOWMAN, Andrew D. FALENDYSZ, Kevin M. MAKOVY, Daniel R. HEDMAN, Bradley D. TRUESDELL
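The abstract above mentions computing a sliding velocity limit boundary for a spatial controller but gives no implementation detail. A minimal Python sketch of one way such a limit could behave, where the allowed velocity shrinks as sensor uncertainty grows (function names, the scaling rule, and the gain `k` are illustrative assumptions, not taken from the patent):

```python
def velocity_limit(base_limit, noise_std, k=2.0):
    """Shrink the commanded-velocity limit as sensor noise grows.

    base_limit: maximum velocity trusted under a perfect sensor
    noise_std:  estimated standard deviation of the sensor reading
    k:          tuning gain (hypothetical)
    """
    return base_limit / (1.0 + k * noise_std)


def clamp(v_cmd, limit):
    """Clamp a commanded velocity to the current sliding limit."""
    return max(-limit, min(limit, v_cmd))
```

With a noiseless sensor the full `base_limit` is available; as the noise estimate rises, the boundary slides inward and operator commands are clamped more aggressively.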
  • Publication number: 20240078763
    Abstract: Methods and systems are described herein for determining three-dimensional locations of objects within a video stream and linking those objects with known objects. An image processing system may receive an image and image metadata and detect an object and a location of the object within the image. The estimated location of each object is then determined within the three-dimensional space. In addition, the image processing system may retrieve, for a plurality of known objects, a plurality of known locations within the three-dimensional space and determine, based on the estimated location and the known location data, which of the known objects matches the detected object in the image. An indicator for the object is then generated at the location of the object within the image.
    Type: Application
    Filed: September 7, 2022
    Publication date: March 7, 2024
    Applicant: Tomahawk Robotics, Inc.
    Inventors: Daniel R. HEDMAN, William S. BOWMAN, Matthew D. SUMMER, Bryce KORTE, Andrew D. FALENDYSZ
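The matching step described above (comparing an estimated 3-D location against known locations) can be illustrated with a simple nearest-neighbor search. This is a purely illustrative sketch; the identifiers, the distance threshold, and the use of Euclidean distance are assumptions, not details from the patent:

```python
import math


def match_detection(estimated_xyz, known_objects, max_dist=5.0):
    """Return the id of the known object closest to the estimated
    3-D location, or None if nothing is within max_dist (meters)."""
    best_id, best_d = None, max_dist
    for obj_id, xyz in known_objects.items():
        d = math.dist(estimated_xyz, xyz)  # Euclidean distance in 3-D
        if d < best_d:
            best_id, best_d = obj_id, d
    return best_id
```

Once a match is found, an indicator for that known object would be drawn at the detected object's 2-D location in the image.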
  • Patent number: 11886182
    Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
    Type: Grant
    Filed: December 31, 2019
    Date of Patent: January 30, 2024
    Assignee: Tomahawk Robotics, Inc.
    Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Kevin M. Makovy, Daniel R. Hedman, Bradley D. Truesdell
  • Publication number: 20240005801
    Abstract: A common command and control architecture (alternatively termed herein as a “universal control architecture”) is disclosed that allows different unmanned systems, including different types of unmanned systems (e.g., air, ground, and/or maritime unmanned systems), to be controlled simultaneously through a common control device (e.g., a controller that can be an input and/or output device). The universal control architecture brings significant efficiency gains in engineering, deployment, training, maintenance, and future upgrades of unmanned systems. In addition, the disclosed common command and control architecture breaks the traditional stovepipe development and deployment model, thereby reducing hardware and software maintenance, streamlining training and proficiency initiatives, reducing physical space requirements for transport, and creating a scalable, more connected, interoperable approach to controlling unmanned systems compared with existing unmanned systems technology.
    Type: Application
    Filed: September 19, 2023
    Publication date: January 4, 2024
    Applicant: Tomahawk Robotics
    Inventors: Matthew D. SUMMER, William S. BOWMAN, Andrew D. FALENDYSZ, Daniel R. HEDMAN, Brad TRUESDELL, Jeffrey S. COOPER, Michael E. BOWMAN, Sean WAGONER, Kevin MAKOVY
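One common way to realize a "universal control architecture" like the one summarized above is a per-platform adapter layer that translates a single platform-agnostic command for each connected vehicle. The sketch below is an assumption-laden illustration of that pattern only; the class and field names are hypothetical and the patent does not specify this design:

```python
from dataclasses import dataclass


@dataclass
class MoveCommand:
    """A platform-agnostic velocity command (all names hypothetical)."""
    vx: float
    vy: float
    vz: float


class AirAdapter:
    def translate(self, cmd):
        return {"domain": "air", "vel": (cmd.vx, cmd.vy, cmd.vz)}


class GroundAdapter:
    def translate(self, cmd):
        # A ground vehicle has no vertical axis, so vz is dropped.
        return {"domain": "ground", "vel": (cmd.vx, cmd.vy)}


def dispatch(cmd, adapters):
    """Translate one common command for every connected platform."""
    return {name: a.translate(cmd) for name, a in adapters.items()}
```

A single operator input can then drive air and ground platforms simultaneously, with each adapter absorbing the platform-specific differences.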
  • Patent number: 11854410
    Abstract: A common command and control architecture (alternatively termed herein as a “universal control architecture”) is disclosed that allows different unmanned systems, including different types of unmanned systems (e.g., air, ground, and/or maritime unmanned systems), to be controlled simultaneously through a common control device (e.g., a controller that can be an input and/or output device). The universal control architecture brings significant efficiency gains in engineering, deployment, training, maintenance, and future upgrades of unmanned systems. In addition, the disclosed common command and control architecture breaks the traditional stovepipe development and deployment model, thereby reducing hardware and software maintenance, streamlining training and proficiency initiatives, reducing physical space requirements for transport, and creating a scalable, more connected, interoperable approach to controlling unmanned systems compared with existing unmanned systems technology.
    Type: Grant
    Filed: January 7, 2022
    Date of Patent: December 26, 2023
    Assignee: Tomahawk Robotics
    Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Daniel R. Hedman, Brad Truesdell, Jeffrey S. Cooper, Michael E. Bowman, Sean Wagoner, Kevin Makovy
  • Publication number: 20230394812
    Abstract: Methods and systems are described herein for hosting and arbitrating algorithms for the generation of structured frames of data from one or more sources of unstructured input frames. A plurality of frames may be received from a recording device and a plurality of object types to be recognized in the plurality of frames may be determined. A determination may be made of multiple machine learning models for recognizing the object types. The frames may be sequentially input into the machine learning models to obtain a plurality of sets of objects from the plurality of machine learning models and object indicators may be received from those machine learning models. A set of composite frames with the plurality of indicators corresponding to the plurality of objects may be generated, and an output stream may be generated including the set of composite frames to be played back in chronological order.
    Type: Application
    Filed: August 22, 2023
    Publication date: December 7, 2023
    Applicant: Tomahawk Robotics
    Inventors: William S. BOWMAN, Sean WAGONER, Andrew D. FALENDYSZ, Matthew D. SUMMER, Kevin MAKOVY, Jeffrey S. COOPER, Brad TRUESDELL
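The pipeline described in the abstract above (frames in, multiple models run, indicators attached, composite frames out in chronological order) can be sketched in a few lines. This is an illustrative simplification under assumed interfaces; the patent does not define these function signatures:

```python
def build_composite_stream(frames, models):
    """Run every frame through each selected model, attach the returned
    object indicators, and emit composite frames in chronological order.

    frames: list of (timestamp, frame) pairs
    models: list of callables, each mapping a frame to a list of indicators
    """
    composite = []
    for ts, frame in frames:
        indicators = [ind for model in models for ind in model(frame)]
        composite.append((ts, frame, indicators))
    composite.sort(key=lambda item: item[0])  # chronological playback order
    return composite
```

In the real system each "model" would be a hosted machine learning model chosen for a particular object type; here stub callables stand in for them.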
  • Publication number: 20230333671
    Abstract: Methods and systems are described herein for detecting motion-induced errors received from inertial-type input devices and for generating accurate vehicle control commands that account for operator movement. These methods and systems may determine, using motion data from inertial sensors, whether the hand/arm of the operator is moving in the same motion as the body of the operator, and if both are moving in the same way, these systems and methods may determine that the motion is not intended to be a motion-induced command. However, if the hand/arm of the operator is moving in a different motion from the body of the operator, these methods and systems may determine that the operator intended the motion to be a motion-induced command to a vehicle.
    Type: Application
    Filed: May 3, 2023
    Publication date: October 19, 2023
    Applicant: Tomahawk Robotics
    Inventors: Michael E. BOWMAN, William S. BOWMAN, Daniel R. HEDMAN, Matthew D. SUMMER, Andrew D. FALENDYSZ
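The core decision rule in the abstract above (hand and body moving together implies vehicle-induced noise; hand moving differently from body implies an intended command) reduces to a motion comparison. A minimal sketch, assuming per-axis motion readings and a hypothetical tolerance, neither of which comes from the patent:

```python
def is_intended_command(hand_motion, body_motion, tol=0.5):
    """If the hand and body move together (within tol on every axis),
    treat the motion as vehicle-induced rather than an operator command."""
    return any(abs(h - b) > tol for h, b in zip(hand_motion, body_motion))
```

A real implementation would filter noisy inertial data over time rather than compare a single sample, but the branch logic is the same.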
  • Patent number: 11776247
    Abstract: Methods and systems are described herein for hosting and arbitrating algorithms for the generation of structured frames of data from one or more sources of unstructured input frames. A plurality of frames may be received from a recording device and a plurality of object types to be recognized in the plurality of frames may be determined. A determination may be made of multiple machine learning models for recognizing the object types. The frames may be sequentially input into the machine learning models to obtain a plurality of sets of objects from the plurality of machine learning models and object indicators may be received from those machine learning models. A set of composite frames with the plurality of indicators corresponding to the plurality of objects may be generated, and an output stream may be generated including the set of composite frames to be played back in chronological order.
    Type: Grant
    Filed: January 7, 2022
    Date of Patent: October 3, 2023
    Assignee: Tomahawk Robotics
    Inventors: William S. Bowman, Sean Wagoner, Andrew D. Falendysz, Matthew D. Summer, Kevin Makovy, Jeffrey S. Cooper, Brad Truesdell
  • Publication number: 20230237802
    Abstract: Methods and systems are described herein for generating composite data streams. A data stream processing system may receive multiple data streams from, for example, multiple unmanned vehicles and determine, based on the type of data within each data stream, a machine learning model for each data stream for processing the type of data. Each machine learning model may receive the frames of a corresponding data stream and output indications and locations of objects within those data streams. The data stream processing system may then generate a composite data stream with indications of the detected objects.
    Type: Application
    Filed: March 23, 2022
    Publication date: July 27, 2023
    Applicant: Tomahawk Robotics
    Inventors: Andrew D. Falendysz, William S. Bowman, Matthew D. Summer, Daniel R. Hedman, Sean Wagoner
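The key step described above is selecting a machine learning model per data stream based on the type of data it carries. A hedged sketch of that dispatch, with a stub model registry standing in for real models (the dictionary layout and keys are assumptions, not from the patent):

```python
def process_streams(streams, model_registry):
    """Pick a model for each stream by its declared data type and
    annotate every frame with the objects that model detects."""
    annotated = {}
    for stream_id, stream in streams.items():
        model = model_registry[stream["type"]]
        annotated[stream_id] = [(frame, model(frame)) for frame in stream["frames"]]
    return annotated
```

The annotated per-stream results would then be merged into a single composite stream carrying the detected-object indications.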
  • Publication number: 20230222783
    Abstract: Methods and systems are described herein for hosting and arbitrating algorithms for the generation of structured frames of data from one or more sources of unstructured input frames. A plurality of frames may be received from a recording device and a plurality of object types to be recognized in the plurality of frames may be determined. A determination may be made of multiple machine learning models for recognizing the object types. The frames may be sequentially input into the machine learning models to obtain a plurality of sets of objects from the plurality of machine learning models and object indicators may be received from those machine learning models. A set of composite frames with the plurality of indicators corresponding to the plurality of objects may be generated, and an output stream may be generated including the set of composite frames to be played back in chronological order.
    Type: Application
    Filed: January 7, 2022
    Publication date: July 13, 2023
    Inventors: William S. Bowman, Sean Wagoner, Andrew D. Falendysz, Matthew D. Summer, Kevin Makovy, Jeffrey S. Cooper, Brad Truesdell
  • Patent number: 11675445
    Abstract: Methods and systems are described herein for detecting motion-induced errors received from inertial-type input devices and for generating accurate vehicle control commands that account for operator movement. These methods and systems may determine, using motion data from inertial sensors, whether the hand/arm of the operator is moving in the same motion as the body of the operator, and if both are moving in the same way, these systems and methods may determine that the motion is not intended to be a motion-induced command. However, if the hand/arm of the operator is moving in a different motion from the body of the operator, these methods and systems may determine that the operator intended the motion to be a motion-induced command to a vehicle.
    Type: Grant
    Filed: April 13, 2022
    Date of Patent: June 13, 2023
    Assignee: Tomahawk Robotics
    Inventors: Michael E. Bowman, William S. Bowman, Daniel R. Hedman, Matthew D. Summer, Andrew D. Falendysz, Kevin Makovy, Michael W. Holt
  • Publication number: 20220415184
    Abstract: A common command and control architecture (alternatively termed herein as a “universal control architecture”) is disclosed that allows different unmanned systems, including different types of unmanned systems (e.g., air, ground, and/or maritime unmanned systems), to be controlled simultaneously through a common control device (e.g., a controller that can be an input and/or output device). The universal control architecture brings significant efficiency gains in engineering, deployment, training, maintenance, and future upgrades of unmanned systems. In addition, the disclosed common command and control architecture breaks the traditional stovepipe development and deployment model, thereby reducing hardware and software maintenance, streamlining training and proficiency initiatives, reducing physical space requirements for transport, and creating a scalable, more connected, interoperable approach to controlling unmanned systems compared with existing unmanned systems technology.
    Type: Application
    Filed: January 7, 2022
    Publication date: December 29, 2022
    Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Daniel R. Hedman, Brad Truesdell, Jeffrey S. Cooper, Michael E. Bowman, Sean Wagoner, Kevin Makovy
  • Publication number: 20220413490
    Abstract: A common command and control architecture (alternatively termed herein as a “universal control architecture”) is disclosed that allows different unmanned systems, including different types of unmanned systems (e.g., air, ground, and/or maritime unmanned systems), to be controlled simultaneously through a common control device (e.g., a controller that can be an input and/or output device). The universal control architecture brings significant efficiency gains in engineering, deployment, training, maintenance, and future upgrades of unmanned systems. In addition, the disclosed common command and control architecture breaks the traditional stovepipe development and deployment model, thereby reducing hardware and software maintenance, streamlining training and proficiency initiatives, reducing physical space requirements for transport, and creating a scalable, more connected, interoperable approach to controlling unmanned systems compared with existing unmanned systems technology.
    Type: Application
    Filed: January 7, 2022
    Publication date: December 29, 2022
    Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Daniel R. Hedman, Brad Truesdell, Jeffrey S. Cooper, Michael E. Bowman, Sean Wagoner, Kevin Makovy
  • Publication number: 20220083069
    Abstract: Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
    Type: Application
    Filed: December 31, 2019
    Publication date: March 17, 2022
    Inventors: Matthew D. Summer, William S. Bowman, Andrew D. Falendysz, Kevin M. Makovy, Daniel R. Hedman, Bradley D. Truesdell
  • Patent number: 10513412
    Abstract: Systems (100) and methods (1400) for operating a Spool Mechanism (“SM”). The methods comprise: transitioning an operational mode of SM from a first operational mode in which a drag torque is not settable to a second operational mode in which the drag torque is settable; selectively mechanically coupling a rewind motor to a spool (612) of SM by engaging a coupler (1014) in response to the SM's transition into the second operational mode; activating the rewind motor (610) such that the rewind motor applies a motor torque having a pre-defined value selected for facilitating a setting of the drag torque to an optimal value; mechanically gradually adjusting an amount of drag resistance applied to the spool by a drag mechanism (1012); and discontinuing the mechanical adjustment of the drag resistance when the spool's speed is within a threshold percentage range of a zero resistance speed.
    Type: Grant
    Filed: August 3, 2017
    Date of Patent: December 24, 2019
    Assignee: Harris Corporation
    Inventors: Matthew D. Summer, Paul M. Bosscher, Michael E. Bowman, William S. Bowman
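The termination condition in the method above (gradually adjust drag, stop when spool speed is within a threshold percentage of a zero-resistance speed) can be shown as a simple feedback loop. This is an idealized illustration only: `read_speed` is a hypothetical sensor callback, and the linear speed model in the test stands in for real spool dynamics:

```python
def tune_drag(read_speed, zero_resistance_speed, threshold_pct=5.0,
              step=0.05, max_steps=1000):
    """Step the drag up in small increments until the measured spool
    speed is within threshold_pct of the zero-resistance speed."""
    drag = 0.0
    for _ in range(max_steps):
        error = abs(read_speed(drag) - zero_resistance_speed)
        if error <= threshold_pct / 100.0 * zero_resistance_speed:
            break  # speed is inside the threshold band; stop adjusting
        drag += step
    return drag
```

In the patented mechanism the rewind motor supplies a known torque while this adjustment runs, so the stopping point corresponds to the desired drag setting.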
  • Publication number: 20190039856
    Abstract: Systems (100) and methods (1400) for operating a Spool Mechanism (“SM”). The methods comprise: transitioning an operational mode of SM from a first operational mode in which a drag torque is not settable to a second operational mode in which the drag torque is settable; selectively mechanically coupling a rewind motor to a spool (612) of SM by engaging a coupler (1014) in response to the SM's transition into the second operational mode; activating the rewind motor (610) such that the rewind motor applies a motor torque having a pre-defined value selected for facilitating a setting of the drag torque to an optimal value; mechanically gradually adjusting an amount of drag resistance applied to the spool by a drag mechanism (1012); and discontinuing the mechanical adjustment of the drag resistance when the spool's speed is within a threshold percentage range of a zero resistance speed.
    Type: Application
    Filed: August 3, 2017
    Publication date: February 7, 2019
    Inventors: Matthew D. Summer, Paul M. Bosscher, Michael E. Bowman, William S. Bowman
  • Patent number: 9936133
    Abstract: A system for automatically controlling a gimbaled camera system of a vehicle. The system includes a camera positioned relative to a body of the vehicle and one or more sensors configured to sense the pointing direction of the camera. One or more sensors are configured to monitor movement of the vehicle relative to a surface. A processor is configured to receive the sensed camera pointing direction data and vehicle movement data. The processor establishes and stores a target position representative of the position of a target object relative to the vehicle body based on an object independent association and automatically adjusts the camera pointing direction in response to the vehicle movement data such that the camera remains aimed on the target position. A method for automatically controlling the gimbaled camera system is also provided.
    Type: Grant
    Filed: August 19, 2015
    Date of Patent: April 3, 2018
    Assignee: Harris Corporation
    Inventors: Paul M. Bosscher, Matthew D. Summer, William S. Bowman, Jeffrey M. Pollard
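The geometric core of the system above (keep the camera aimed at a stored target position as the vehicle moves) is a bearing computation in the vehicle frame. A 2-D sketch under assumed conventions (radians, x-forward heading, names hypothetical); the patent describes a full gimbaled 3-D system:

```python
import math


def pan_correction(vehicle_xy, vehicle_heading, target_xy):
    """Pan angle (radians, wrapped to [-pi, pi)) that keeps the camera
    aimed at the stored target position as the vehicle moves."""
    dx = target_xy[0] - vehicle_xy[0]
    dy = target_xy[1] - vehicle_xy[1]
    bearing = math.atan2(dy, dx)            # world-frame bearing to target
    # subtract the vehicle heading and wrap into [-pi, pi)
    return (bearing - vehicle_heading + math.pi) % (2.0 * math.pi) - math.pi
```

Re-running this each time new vehicle movement data arrives keeps the camera locked on the target position without the operator steering the gimbal.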
  • Publication number: 20170050563
    Abstract: A system for automatically controlling a gimbaled camera system of a vehicle. The system includes a camera positioned relative to a body of the vehicle and one or more sensors configured to sense the pointing direction of the camera. One or more sensors are configured to monitor movement of the vehicle relative to a surface. A processor is configured to receive the sensed camera pointing direction data and vehicle movement data. The processor establishes and stores a target position representative of the position of a target object relative to the vehicle body based on an object independent association and automatically adjusts the camera pointing direction in response to the vehicle movement data such that the camera remains aimed on the target position. A method for automatically controlling the gimbaled camera system is also provided.
    Type: Application
    Filed: August 19, 2015
    Publication date: February 23, 2017
    Inventors: Paul M. Bosscher, Matthew D. Summer, William S. Bowman, Jeffrey M. Pollard
  • Patent number: 9002517
    Abstract: Method and system for telematic control of a slave device. Displacement of a user interface control is sensed with respect to a control direction. A first directional translation is performed to convert data specifying the control direction to data specifying a slave direction. The slave direction will generally be different from the control direction and defines a direction that the slave device should move in response to the physical displacement of the user interface. A second directional translation is performed to convert data specifying haptic sensor data to a haptic feedback direction. The haptic feedback direction will generally be different from the sensed direction and can define a direction of force to be generated by at least one component of the user interface. The first and second directional translation are determined based on a point-of-view of an imaging sensor.
    Type: Grant
    Filed: September 24, 2014
    Date of Patent: April 7, 2015
    Assignee: Harris Corporation
    Inventors: Paul M. Bosscher, Matthew D. Summer, Loran J. Wilkinson, William S. Bowman
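The "directional translation" in the abstract above maps a control-frame direction into the slave frame based on the imaging sensor's point of view. In 2-D that is a rotation by the camera yaw; a minimal sketch with hypothetical names (the patent covers the general 3-D case and the symmetric haptic-feedback translation):

```python
import math


def control_to_slave(control_xy, camera_yaw):
    """Rotate a control-frame direction into the slave frame using the
    imaging sensor's point of view (camera_yaw, radians)."""
    cx, cy = control_xy
    c, s = math.cos(camera_yaw), math.sin(camera_yaw)
    return (c * cx - s * cy, s * cx + c * cy)
```

The haptic path described in the abstract would apply the inverse rotation, mapping a sensed force direction on the slave back into a feedback direction at the user interface.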
  • Publication number: 20150057803
    Abstract: Method and system for telematic control of a slave device. Displacement of a user interface control is sensed with respect to a control direction. A first directional translation is performed to convert data specifying the control direction to data specifying a slave direction. The slave direction will generally be different from the control direction and defines a direction that the slave device should move in response to the physical displacement of the user interface. A second directional translation is performed to convert data specifying haptic sensor data to a haptic feedback direction. The haptic feedback direction will generally be different from the sensed direction and can define a direction of force to be generated by at least one component of the user interface. The first and second directional translation are determined based on a point-of-view of an imaging sensor.
    Type: Application
    Filed: September 24, 2014
    Publication date: February 26, 2015
    Inventors: Paul M. Bosscher, Matthew D. Summer, Loran J. Wilkinson, William S. Bowman