Patents by Inventor Glenn J. Beach

Glenn J. Beach has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10726544
    Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
    Type: Grant
    Filed: March 14, 2019
    Date of Patent: July 28, 2020
    Assignee: Cybernet Systems Corp.
    Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
  • Publication number: 20200223068
    Abstract: An automated system and method uses a multi-axis robot arm and computer vision system to perform critical demil processes using a plurality of networked workcells monitored and managed by a central processor. An inspection and paint removal cell strips paint or other coatings from the outer surface of the ordnance for disposal of the paint or other coatings. A defusing cell is operative to remove a fuse from the ordnance for disposal of the fuse, and a cutting and definning cell is operative to remove fins from the body of the ordnance and cut into the body of the ordnance to determine if submunitions are present. A multi-axis robot arm and computer vision system removes submunitions from the ordnance, if present, inspects the submunitions, and transfers them to the cutting and definning cell for subsequent processing.
    Type: Application
    Filed: January 10, 2019
    Publication date: July 16, 2020
    Applicant: Cybernet Systems Corp.
    Inventors: Charles J. Jacobus, Glenn J. Beach, James Burkowski, Joseph Long, Gary Moody, Gary Siebert
  • Publication number: 20190333012
    Abstract: Automated inventory management and material (or container) handling removes the requirement to operate fully automatically or all-manual using conventional task dedicated vertical storage and retrieval (S&R) machines. Inventory requests are resolved into missions. Automated vehicles plan their own movements to execute missions over a container yard, warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until potential collision risk is removed. They identify, localize, and either pick up loads (pallets, containers, etc.) or drop them at the correctly defined locations. Systems without full automation can also implement partially automated operations (for instance load pick-up and drop), and can assure inherently safe manually operated vehicles (i.
    Type: Application
    Filed: July 9, 2019
    Publication date: October 31, 2019
    Applicant: Cybernet Systems Corp.
    Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe
  • Publication number: 20190213728
    Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
    Type: Application
    Filed: March 14, 2019
    Publication date: July 11, 2019
    Applicant: Cybernet Systems Corp.
    Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
  • Patent number: 10346797
    Abstract: Automated inventory management and material (or container) handling removes the requirement to operate fully automatically or all-manual using conventional task dedicated vertical storage and retrieval (S&R) machines. Inventory requests are resolved into missions. Automated vehicles plan their own movements to execute missions over a container yard, warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until potential collision risk is removed. They identify, localize, and either pick up loads (pallets, containers, etc.) or drop them at the correctly defined locations. Systems without full automation can also implement partially automated operations (for instance load pick-up and drop), and can assure inherently safe manually operated vehicles (i.
    Type: Grant
    Filed: September 26, 2017
    Date of Patent: July 9, 2019
    Assignee: Cybernet Systems, Inc.
    Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe
  • Publication number: 20190137284
    Abstract: An in-vehicle system for generating precise, lane-level road map data includes a GPS receiver operative to acquire positional information associated with a track along a road path. An inertial sensor provides time-local measurements of acceleration and turn rate along the track, and a camera acquires image data of the road path along the track. A processor is operative to receive the local measurements from the inertial sensor and image data from the camera over time in conjunction with multiple tracks along the road path, and improve the accuracy of the GPS receiver through curve fitting. One or all of the GPS receiver, inertial sensor and camera are disposed in a smartphone. The road map data may be uploaded to a central data repository for post processing when the vehicle passes through a WiFi cloud to generate the precise road map data, which may include data collected from multiple drivers.
    Type: Application
    Filed: November 6, 2017
    Publication date: May 9, 2019
    Inventors: Charles J. Jacobus, Glenn J. Beach, Douglas Haanpaa
  • Patent number: 10275873
    Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
    Type: Grant
    Filed: August 13, 2017
    Date of Patent: April 30, 2019
    Assignee: Cybernet Systems Corp.
    Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
  • Publication number: 20190088148
    Abstract: Autonomous and manually operated vehicles are integrated into a cohesive, interactive environment, with communications to each other and to their surroundings, to improve traffic flow while reducing accidents and other incidents. All vehicles send/receive messages to/from each other, and from infrastructure devices, enabling the vehicles to determine their status, traffic conditions and infrastructure. The vehicles store and operate in accordance with a common set of rules based upon the messages received and other inputs from sensors, databases, and so forth, to avoid obstacles and collisions based upon current and, in some cases, future or predicted behavior. Shared vehicle control interfaces enable the AVs to conform to driving activities that are legal, safe, and allowable on roadways. Such activities enable each AV to drive within safety margins, speed limits, on allowed or legal driving lanes and through allowed turns, intersections, mergers, lane changes, stops/starts, and so forth.
    Type: Application
    Filed: November 5, 2018
    Publication date: March 21, 2019
    Inventors: Charles J. Jacobus, Douglas Haanpaa, Eugene Foulk, Pritpaul Mahal, Steve Rowe, Charles J. Cohen, Glenn J. Beach
  • Publication number: 20180357601
    Abstract: A system for automated inventory management and material handling removes the requirement to operate fully automatically or all-manual using conventional vertical storage and retrieval (S&R) machines. Inventory requests to place palletized material into storage at a specified lot location or retrieve palletized material from a specified lot are resolved into missions for autonomous fork trucks, equivalent mobile platforms, or manual fork truck drivers (and their equipment) that are autonomously or manually executed to effect the request. Automated trucks plan their own movements to execute the mission over the warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until potential collision risk is removed.
    Type: Application
    Filed: August 21, 2018
    Publication date: December 13, 2018
    Applicant: Cybernet Systems Corp.
    Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe
  • Publication number: 20180089616
    Abstract: Automated inventory management and material (or container) handling removes the requirement to operate fully automatically or all-manual using conventional task dedicated vertical storage and retrieval (S&R) machines. Inventory requests are resolved into missions. Automated vehicles plan their own movements to execute missions over a container yard, warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until potential collision risk is removed. They identify, localize, and either pick up loads (pallets, containers, etc.) or drop them at the correctly defined locations. Systems without full automation can also implement partially automated operations (for instance load pick-up and drop), and can assure inherently safe manually operated vehicles (i.
    Type: Application
    Filed: September 26, 2017
    Publication date: March 29, 2018
    Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe
  • Publication number: 20180012346
    Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
    Type: Application
    Filed: August 13, 2017
    Publication date: January 11, 2018
    Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
  • Patent number: 9734569
    Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
    Type: Grant
    Filed: March 4, 2015
    Date of Patent: August 15, 2017
    Assignee: Cybernet Systems Corp.
    Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
  • Publication number: 20170154459
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Application
    Filed: November 1, 2016
    Publication date: June 1, 2017
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
  • Patent number: 9483867
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Grant
    Filed: July 14, 2015
    Date of Patent: November 1, 2016
    Assignee: Cybernet Systems Corporation
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
  • Patent number: 9424634
    Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: August 23, 2016
    Assignee: Cybernet Systems Corporation
    Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
  • Publication number: 20160216772
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Application
    Filed: April 5, 2016
    Publication date: July 28, 2016
    Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
  • Patent number: 9304593
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Grant
    Filed: March 26, 2013
    Date of Patent: April 5, 2016
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
  • Publication number: 20160093097
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Application
    Filed: July 14, 2015
    Publication date: March 31, 2016
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
  • Publication number: 20160023100
    Abstract: A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
    Type: Application
    Filed: July 28, 2015
    Publication date: January 28, 2016
    Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
  • Patent number: RE47108
    Abstract: A system for automated inventory management and material handling removes the requirement to operate fully automatically or all-manual using conventional vertical storage and retrieval (S&R) machines. Inventory requests to place palletized material into storage at a specified lot location or retrieve palletized material from a specified lot are resolved into missions for autonomous fork trucks, equivalent mobile platforms, or manual fork truck drivers (and their equipment) that are autonomously or manually executed to effect the request. Automated trucks plan their own movements to execute the mission over the warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until potential collision risk is removed.
    Type: Grant
    Filed: February 22, 2017
    Date of Patent: October 30, 2018
    Assignee: Cybernet Systems Corp.
    Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe
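
Several of the abstracts above describe concrete algorithms. As one illustration, the real-time tracking method of publication 20160023100 computes separate measures of the target's probable location from color, shape, and motion, then fuses them with a weighting technique to pick the most probable location. The sketch below shows only that fusion step under assumed names, array shapes, and weights; it is not the patented implementation:

```python
import numpy as np

def locate_target(color_score, shape_score, motion_score,
                  weights=(0.5, 0.25, 0.25)):
    """Fuse three per-pixel likelihood maps and return the most
    probable target location as (row, col).

    Each *_score array holds one value in [0, 1] per image pixel.
    The weights are illustrative, not taken from the patent.
    """
    wc, ws, wm = weights
    combined = wc * color_score + ws * shape_score + wm * motion_score
    # Most probable location = pixel with the highest fused score.
    return np.unravel_index(np.argmax(combined), combined.shape)

# Toy 5x5 frame where the color and motion cues agree on pixel (2, 3).
color = np.zeros((5, 5)); color[2, 3] = 0.9
shape = np.full((5, 5), 0.1)
motion = np.zeros((5, 5)); motion[2, 3] = 0.8
print(locate_target(color, shape, motion))  # -> (2, 3)
```

In a live system the three score maps would be recomputed each frame from the camera image, so the returned location follows the moving target in real time.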