Patents by Inventor Glenn J. Beach
Glenn J. Beach has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20180357601
Abstract: A system for automated inventory management and material handling removes the requirement to operate either fully automatically or entirely manually using conventional vertical storage and retrieval (S&R) machines. Inventory requests to place palletized material into storage at a specified lot location, or to retrieve palletized material from a specified lot, are resolved into missions for autonomous fork trucks, equivalent mobile platforms, or manual fork truck drivers (and their equipment) that are autonomously or manually executed to effect the request. Automated trucks plan their own movements to execute the mission over the warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until the potential collision risk is removed.
Type: Application
Filed: August 21, 2018
Publication date: December 13, 2018
Applicant: Cybernet Systems Corp.
Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe
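The request-to-mission dispatch this abstract describes can be sketched in a few lines. This is a minimal illustration only; the function and field names are hypothetical and not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class InventoryRequest:
    action: str     # "store" or "retrieve"
    pallet_id: str  # palletized material identifier
    lot: str        # specified lot location

def resolve_to_mission(request, automated_truck_available):
    """Resolve an inventory request into a mission executed either by an
    autonomous fork truck (or equivalent mobile platform) or by a manual
    fork truck driver, mirroring the dispatch step in the abstract."""
    executor = "autonomous" if automated_truck_available else "manual"
    return {
        "executor": executor,
        "task": f"{request.action} pallet {request.pallet_id} at lot {request.lot}",
    }
```

In this sketch the fleet-availability check stands in for the scheduler that, per the abstract, lets automated and manually driven trucks share the same warehouse aisles and roadways.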
-
Publication number: 20180089616
Abstract: Automated inventory management and material (or container) handling removes the requirement to operate either fully automatically or entirely manually using conventional, task-dedicated vertical storage and retrieval (S&R) machines. Inventory requests are resolved into missions for automated vehicles, which plan their own movements to execute missions over a container yard, warehouse aisles, or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until the potential collision risk is removed. They identify, localize, and either pick up loads (pallets, containers, etc.) or drop them at the correctly determined locations. Systems without full automation can also implement partially automated operations (for instance, load pick-up and drop) and can assure inherently safe manually operated vehicles.
Type: Application
Filed: September 26, 2017
Publication date: March 29, 2018
Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe
-
Publication number: 20180012346
Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm that performs the final determination of object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
Type: Application
Filed: August 13, 2017
Publication date: January 11, 2018
Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
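The "composable recognition modules plus decision algorithm" pattern in this abstract can be sketched as independent scorers whose votes are combined. The module names, labels, and scores below are illustrative assumptions, not details from the patent:

```python
def shape_module(observation):
    # Hypothetical module: scores candidate types from silhouette shape.
    return {"5.56mm": 0.8, "7.62mm": 0.2}

def marking_module(observation):
    # Hypothetical module: scores candidate types from case markings.
    return {"5.56mm": 0.6, "7.62mm": 0.4}

def decide(observation, modules):
    """Combine scores from composable recognition modules and return the
    highest-scoring object type (a simple sum-of-scores decision rule)."""
    totals = {}
    for module in modules:
        for label, score in module(observation).items():
            totals[label] = totals.get(label, 0.0) + score
    return max(totals, key=totals.get)
```

Composability here means new inspection modules can be appended to the list without changing the decision step.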
-
Patent number: 9734569
Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm that performs the final determination of object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
Type: Grant
Filed: March 4, 2015
Date of Patent: August 15, 2017
Assignee: Cybernet Systems Corp.
Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
-
Publication number: 20170154459
Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip, given that the target type is known. A second module identifies the vehicle within a truth chip, given the known distance and elevation angle from camera to target. Image matching is based on comparison of a synthetic image with the truth chip image, where the synthetic image is rotated and moved through a 3-dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
Type: Application
Filed: November 1, 2016
Publication date: June 1, 2017
Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
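The three-dimensional pose search the abstract describes (distance, heading, and pitch, after the flat-ground and zero-roll assumptions) can be sketched as a grid search that maximizes a match score between the truth chip and a synthetic rendering. The grid ranges, `render`, and `score` callables are placeholders, not the patent's actual rendering or matching functions:

```python
import itertools

def match_pose(truth_chip, render, score):
    """Search distance/heading/pitch for the pose whose synthetic
    rendering best matches the truth chip, per the abstract's reduced
    three-DOF search space."""
    best_score, best_pose = None, None
    for dist, heading, pitch in itertools.product(
            range(50, 151, 50),    # distance (m), coarse illustrative grid
            range(0, 360, 90),     # heading (deg)
            range(-10, 11, 10)):   # pitch (deg)
        s = score(truth_chip, render(dist, heading, pitch))
        if best_score is None or s > best_score:
            best_score, best_pose = s, (dist, heading, pitch)
    return best_pose
```

A real system would refine this coarse grid (or let the operator adjust the pose through the GUI front end the abstract mentions), but the structure of the search is the same.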
-
Patent number: 9483867
Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip, given that the target type is known. A second module identifies the vehicle within a truth chip, given the known distance and elevation angle from camera to target. Image matching is based on comparison of a synthetic image with the truth chip image, where the synthetic image is rotated and moved through a 3-dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
Type: Grant
Filed: July 14, 2015
Date of Patent: November 1, 2016
Assignee: Cybernet Systems Corporation
Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
-
Patent number: 9424634
Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm that performs the final determination of object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
Type: Grant
Filed: March 15, 2013
Date of Patent: August 23, 2016
Assignee: Cybernet Systems Corporation
Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
-
Publication number: 20160216772
Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs differ from those generated when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
Type: Application
Filed: April 5, 2016
Publication date: July 28, 2016
Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
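The idea of behaviors as combinations of simultaneously recognized per-body-part gestures can be sketched as a lookup over gesture tuples. The gesture and behavior labels below are invented for illustration; they are not the vocabulary used by the patented system:

```python
def classify_behavior(gestures):
    """Map simultaneously recognized gestures on different body parts
    to a behavior label (e.g., distinguishing running from walking by
    their different leg gestures, as in the abstract)."""
    rules = {
        ("arms_pumping", "legs_fast_cycle"): "running",
        ("arms_swinging", "legs_slow_cycle"): "walking",
    }
    key = (gestures.get("arms"), gestures.get("legs"))
    return rules.get(key, "unknown")
```

Extending the same table-lookup idea across multiple tracked bodies is one way the formation-level behaviors mentioned in the abstract could be layered on top.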
-
Patent number: 9304593
Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs differ from those generated when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
Type: Grant
Filed: March 26, 2013
Date of Patent: April 5, 2016
Assignee: Cybernet Systems Corporation
Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
-
Publication number: 20160093097
Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip, given that the target type is known. A second module identifies the vehicle within a truth chip, given the known distance and elevation angle from camera to target. Image matching is based on comparison of a synthetic image with the truth chip image, where the synthetic image is rotated and moved through a 3-dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
Type: Application
Filed: July 14, 2015
Publication date: March 31, 2016
Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
-
Publication number: 20160023100
Abstract: A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion, and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape, and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real time.
Type: Application
Filed: July 28, 2015
Publication date: January 28, 2016
Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
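The weighting step in this abstract, combining the three location measures from color, shape, and motion cues into a single most probable target location, can be sketched as a weighted average. The weight values here are illustrative assumptions; the patent does not publish specific weights:

```python
def fuse_location(color_est, shape_est, motion_est, weights=(0.5, 0.3, 0.2)):
    """Combine three (x, y) location estimates, one each from the
    color, shape, and motion cues, into the most probable target
    location via a weighted average (weights sum to 1)."""
    estimates = (color_est, shape_est, motion_est)
    x = sum(w * e[0] for w, e in zip(weights, estimates))
    y = sum(w * e[1] for w, e in zip(weights, estimates))
    return (x, y)
```

In a live tracker the weights would typically reflect how reliable each cue is in the current frame, e.g., downweighting the motion cue when the target is stationary.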
-
Publication number: 20150339355
Abstract: An application coherency manager (ACM) implements and manages the interdependencies of simulation, data, and platform information to simplify the task of organizing and executing large simulations composed of numerous models and data files. One or more file systems or repositories store raw data in the form of files, data, or models, and a graphical user interface (GUI) enables a user to enter, and receive results from, a query involving the files, data, or models. One or more coherency checking modules (CCMs) are operative to determine the types and versions of, and compatibility between, the files, data, or models. A database stores processed information about the file systems or repositories and the results of previous queries, and a data aggregator and manager (DAM) manages the flow of information between the file system or repository, the GUI, the CCMs, and the database.
Type: Application
Filed: June 8, 2015
Publication date: November 26, 2015
Inventors: Douglas Haanpaa, Glenn J. Beach, Charles J. Jacobus, Devvan Stokes
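A coherency check of the kind a CCM performs, determining versions of and compatibility between artifacts, can be sketched as a simple rule over artifact metadata. The field names and the version rule are assumptions for illustration, not the ACM's actual schema:

```python
def check_compatibility(producer, consumer):
    """A minimal CCM-style check: two artifacts are compatible when they
    share a data-format type and the consumer's required version does
    not exceed the version the producer provides."""
    return (producer["format"] == consumer["format"]
            and consumer["requires_version"] <= producer["version"])
```

A full ACM would run many such checks across every model and data file in a simulation and cache the results in its database so repeated queries need not re-derive them.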
-
Patent number: 9082219
Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip, given that the target type is known. A second module identifies the vehicle within a truth chip, given the known distance and elevation angle from camera to target. Image matching is based on comparison of a synthetic image with the truth chip image, where the synthetic image is rotated and moved through a 3-dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
Type: Grant
Filed: April 1, 2014
Date of Patent: July 14, 2015
Assignee: Cybernet Systems Corporation
Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
-
Publication number: 20150178913
Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm that performs the final determination of object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
Type: Application
Filed: March 4, 2015
Publication date: June 25, 2015
Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
-
Patent number: 8983173
Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm that performs the final determination of object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
Type: Grant
Filed: January 24, 2014
Date of Patent: March 17, 2015
Assignee: Cybernet Systems Corporation
Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
-
Patent number: 8965561
Abstract: A system for automated inventory management and material handling removes the requirement to operate either fully automatically or entirely manually using conventional vertical storage and retrieval (S&R) machines. Inventory requests to place palletized material into storage at a specified lot location, or to retrieve palletized material from a specified lot, are resolved into missions for autonomous fork trucks, equivalent mobile platforms, or manual fork truck drivers (and their equipment) that are autonomously or manually executed to effect the request. Automated trucks plan their own movements to execute the mission over the warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until the potential collision risk is removed.
Type: Grant
Filed: March 15, 2013
Date of Patent: February 24, 2015
Assignee: Cybernet Systems Corporation
Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe
-
Publication number: 20140320486
Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip, given that the target type is known. A second module identifies the vehicle within a truth chip, given the known distance and elevation angle from camera to target. Image matching is based on comparison of a synthetic image with the truth chip image, where the synthetic image is rotated and moved through a 3-dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
Type: Application
Filed: April 1, 2014
Publication date: October 30, 2014
Applicant: Cybernet Systems Corporation
Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
-
Publication number: 20140260925
Abstract: A fully automated, integrated, end-to-end synchronized and compact manufacturing system produces polymer or metal case ammunition. Manufacturing stations support case assembly, sealing (gluing/welding), final product inspection, cartridge packaging or binning, and loading. Station modularity facilitates rapid changeover to accommodate ammunition of differing calibers. Sensors and apparatus may be provided to place a manufacturing cell in a wait state until all components or materials are received in a preferred orientation for proper assembly. The system may join and use multipart cases, each including a lower portion with a head end attached thereto and at least one upper portion having a necked-down transition to the open top end. Elevator feeders, vibratory bowl feeders, and robotic pick-and-place feeders may be used to deliver components for assembly.
Type: Application
Filed: March 15, 2013
Publication date: September 18, 2014
Applicant: Cybernet Systems Corporation
Inventors: Glenn J. Beach, James Burkowski, Amanda Christiana, Trevor Davey, Charles J. Jacobus, Joseph Long, Gary Moody, Gary Siebert
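The wait-state gating the abstract mentions, holding a manufacturing cell until all components are present in their preferred orientation, can be sketched as a predicate over sensor readings. The reading format is an assumption made for illustration:

```python
def cell_ready(sensor_readings):
    """Gate a manufacturing cell: it leaves the wait state only when
    every component sensor reports the part present and in its
    preferred orientation for assembly."""
    return all(r["present"] and r["oriented"] for r in sensor_readings)
```

Keeping the check per-sensor supports the station modularity the abstract emphasizes: a changed-over station simply contributes a different set of readings.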
-
Publication number: 20140277691
Abstract: A system for automated inventory management and material handling removes the requirement to operate either fully automatically or entirely manually using conventional vertical storage and retrieval (S&R) machines. Inventory requests to place palletized material into storage at a specified lot location, or to retrieve palletized material from a specified lot, are resolved into missions for autonomous fork trucks, equivalent mobile platforms, or manual fork truck drivers (and their equipment) that are autonomously or manually executed to effect the request. Automated trucks plan their own movements to execute the mission over the warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until the potential collision risk is removed.
Type: Application
Filed: March 15, 2013
Publication date: September 18, 2014
Applicant: Cybernet Systems Corporation
Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe
-
Patent number: RE47108
Abstract: A system for automated inventory management and material handling removes the requirement to operate either fully automatically or entirely manually using conventional vertical storage and retrieval (S&R) machines. Inventory requests to place palletized material into storage at a specified lot location, or to retrieve palletized material from a specified lot, are resolved into missions for autonomous fork trucks, equivalent mobile platforms, or manual fork truck drivers (and their equipment) that are autonomously or manually executed to effect the request. Automated trucks plan their own movements to execute the mission over the warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until the potential collision risk is removed.
Type: Grant
Filed: February 22, 2017
Date of Patent: October 30, 2018
Assignee: Cybernet Systems Corp.
Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe