Patents by Inventor Charles J. Jacobus
Charles J. Jacobus has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20190137284
Abstract: An in-vehicle system for generating precise, lane-level road map data includes a GPS receiver operative to acquire positional information associated with a track along a road path. An inertial sensor provides time-local measurements of acceleration and turn rate along the track, and a camera acquires image data of the road path along the track. A processor receives the local measurements from the inertial sensor and the image data from the camera over time, in conjunction with multiple tracks along the road path, and improves the accuracy of the GPS receiver through curve fitting. One or all of the GPS receiver, inertial sensor and camera are disposed in a smartphone. The road map data may be uploaded to a central data repository for post-processing when the vehicle passes through a WiFi cloud to generate the precise road map data, which may include data collected from multiple drivers.
Type: Application
Filed: November 6, 2017
Publication date: May 9, 2019
Inventors: Charles J. Jacobus, Glenn J. Beach, Douglas Haanpaa
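The curve-fitting step described above can be illustrated with a short sketch: repeated noisy GPS tracks over the same road segment are pooled and a smooth curve is fit to them to recover a lane-level centerline. This shows the general idea only; the function names, the cubic-polynomial model, and the noise figures are assumptions, not details from the application.

```python
# Minimal sketch of the curve-fitting idea: several noisy GPS tracks of the same
# road segment are aggregated and a smooth curve is fit to them, yielding a
# lane-level centerline estimate more precise than any single GPS track.
# All names and the cubic-polynomial choice are illustrative assumptions.
import numpy as np

def fit_lane_centerline(tracks, degree=3):
    """tracks: list of (N_i x 2) arrays of (east, north) points from repeated drives."""
    pts = np.vstack(tracks)                      # pool every pass over the road
    # Parameterize by the east coordinate, assuming a roughly monotonic segment;
    # a real system would parameterize by arc length along the road.
    order = np.argsort(pts[:, 0])
    x, y = pts[order, 0], pts[order, 1]
    coeffs = np.polyfit(x, y, degree)            # least-squares fit smooths GPS noise
    return np.poly1d(coeffs)

# Example: three simulated drives over the same gently curving lane
rng = np.random.default_rng(0)
truth = lambda x: 0.002 * x ** 2
xs = np.linspace(0, 100, 200)
tracks = [np.column_stack([xs, truth(xs) + rng.normal(0, 1.5, xs.size)]) for _ in range(3)]
centerline = fit_lane_centerline(tracks)
print(centerline(50.0))   # refined lateral position near the true value truth(50) = 5.0
```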
-
Patent number: 10275873
Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
Type: Grant
Filed: August 13, 2017
Date of Patent: April 30, 2019
Assignee: Cybernet Systems Corp.
Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
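As a rough illustration of how composable recognition modules could feed a final decision step, the sketch below has each module score an image independently and a weighted vote pick the object type. The module names, weights, and voting rule are hypothetical; the patent does not specify this particular decision algorithm.

```python
# Illustrative sketch (not the patented algorithm) of composable vision-based
# recognition modules feeding a final decision step: each module scores an image
# independently and a simple weighted vote picks the object type.
from collections import defaultdict

def decide(image, modules, weights=None):
    """modules: callables returning (label, confidence) for an image."""
    weights = weights or [1.0] * len(modules)
    votes = defaultdict(float)
    for module, w in zip(modules, weights):
        label, confidence = module(image)
        votes[label] += w * confidence           # accumulate weighted evidence per label
    return max(votes, key=votes.get)             # final determination of object type

# Hypothetical modules, e.g. one keyed on silhouette shape and one on surface markings
shape_module   = lambda img: ("5.56mm", 0.80)
marking_module = lambda img: ("7.62mm", 0.30)
print(decide(None, [shape_module, marking_module]))   # -> "5.56mm"
```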
-
Publication number: 20190088148
Abstract: Autonomous and manually operated vehicles are integrated into a cohesive, interactive environment, with communications to each other and to their surroundings, to improve traffic flow while reducing accidents and other incidents. All vehicles send/receive messages to/from each other, and from infrastructure devices, enabling the vehicles to determine their status, traffic conditions and infrastructure. The vehicles store and operate in accordance with a common set of rules based upon the messages received and other inputs from sensors, databases, and so forth, to avoid obstacles and collisions based upon current and, in some cases, future or predicted behavior. Shared vehicle control interfaces enable the AVs to conform to driving activities that are legal, safe, and allowable on roadways. Such activities enable each AV to drive within safety margins, speed limits, on allowed or legal driving lanes and through allowed turns, intersections, mergers, lane changes, stops/starts, and so forth.
Type: Application
Filed: November 5, 2018
Publication date: March 21, 2019
Inventors: Charles J. Jacobus, Douglas Haanpaa, Eugene Foulk, Pritpaul Mahal, Steve Rowe, Charles J. Cohen, Glenn J. Beach
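A minimal sketch of the message-passing idea follows: each vehicle broadcasts a status record and applies a shared rule to the records it receives. The field names and the single following-distance rule are illustrative assumptions, not the message format or rule set claimed in the application.

```python
# Sketch of the shared-rules idea: every vehicle periodically broadcasts its state
# and applies a common rule set to the status messages it receives. Field names
# and the single rule shown are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class VehicleStatus:
    vehicle_id: str
    position_m: tuple      # (x, y) in a shared road frame
    speed_mps: float
    lane: int

def safe_gap_rule(own: VehicleStatus, other: VehicleStatus, min_headway_s: float = 2.0) -> bool:
    """Common rule: keep at least min_headway_s of following distance in the same lane."""
    if own.lane != other.lane:
        return True
    gap_m = other.position_m[0] - own.position_m[0]
    return gap_m < 0 or gap_m > min_headway_s * own.speed_mps

ego   = VehicleStatus("AV-1", (0.0, 0.0), 15.0, lane=2)
ahead = VehicleStatus("CAR-7", (25.0, 0.0), 14.0, lane=2)
print(safe_gap_rule(ego, ahead))   # False -> slow down or change lanes
```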
-
Publication number: 20180357601
Abstract: A system for automated inventory management and material handling removes the requirement to operate either fully automatically or all-manually using conventional vertical storage and retrieval (S&R) machines. Inventory requests to place palletized material into storage at a specified lot location or retrieve palletized material from a specified lot are resolved into missions for autonomous fork trucks, equivalent mobile platforms, or manual fork truck drivers (and their equipment) that are autonomously or manually executed to effect the request. Automated trucks plan their own movements to execute the mission over the warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until potential collision risk is removed.
Type: Application
Filed: August 21, 2018
Publication date: December 13, 2018
Applicant: Cybernet Systems Corp.
Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe
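To illustrate how an inventory request might be resolved into a mission that either an automated or a manually driven fork truck can execute, here is a hedged sketch; the request fields, dispatch rule, and mission steps are invented for illustration and are not taken from the application.

```python
# Sketch of resolving an inventory request into a mission that either an
# autonomous fork truck or a manual driver can execute, as the abstract describes.
# Field names and the dispatch rule are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class InventoryRequest:
    action: str        # "store" or "retrieve"
    pallet_id: str
    lot_location: str  # e.g. aisle/bay/level identifier

def resolve_to_mission(request: InventoryRequest, truck_pool):
    """Pick any available truck (automated or manual) and emit mission waypoints."""
    truck = next(t for t in truck_pool if t["available"])
    return {
        "truck": truck["id"],
        "mode": "autonomous" if truck["automated"] else "manual",
        "steps": [("drive_to", request.lot_location),
                  ("pick" if request.action == "retrieve" else "place", request.pallet_id),
                  ("drive_to", "staging")],
    }

pool = [{"id": "FT-03", "automated": True, "available": True}]
print(resolve_to_mission(InventoryRequest("retrieve", "PAL-991", "A12-B3-L2"), pool))
```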
-
Publication number: 20180253099
Abstract: Autonomously driven vehicles operate in rain, snow and other adverse weather conditions. An on-board vehicle sensor has a beam with a diameter that is only intermittently blocked by rain, snow, dust or other obscurant particles. This allows an obstacle detection processor to tell the difference between obstacles, terrain variations and obscurant particles, thereby enabling the vehicle driving control unit to disregard the presence of obscurant particles along the route taken by the vehicle. The sensor may form part of a LADAR or RADAR system or a video camera. The obstacle detection processor may receive time-spaced frames divided into cells or pixels, whereby groups of connected cells or pixels and/or cells or pixels that persist over longer periods of time are interpreted to be obstacles or terrain variations. The system may further include an input for receiving weather-specific configuration parameters to adjust the operation of the obstacle detection processor.
Type: Application
Filed: May 4, 2018
Publication date: September 6, 2018
Inventors: Charles J. Jacobus, Douglas Haanpaa
-
Patent number: 9989967
Abstract: Autonomously driven vehicles operate in rain, snow and other adverse weather conditions. An on-board vehicle sensor has a beam with a diameter that is only intermittently blocked by rain, snow, dust or other obscurant particles. This allows an obstacle detection processor to tell the difference between obstacles, terrain variations and obscurant particles, thereby enabling the vehicle driving control unit to disregard the presence of obscurant particles along the route taken by the vehicle. The sensor may form part of a LADAR or RADAR system or a video camera. The obstacle detection processor may receive time-spaced frames divided into cells or pixels, whereby groups of connected cells or pixels and/or cells or pixels that persist over longer periods of time are interpreted to be obstacles or terrain variations. The system may further include an input for receiving weather-specific configuration parameters to adjust the operation of the obstacle detection processor.
Type: Grant
Filed: March 4, 2014
Date of Patent: June 5, 2018
Inventors: Charles J. Jacobus, Douglas Haanpaa
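The persistence test at the heart of both the application above and this patent can be sketched briefly: obscurant returns flicker between time-spaced frames, while real obstacles occupy the same cells frame after frame. The grid size, threshold, and function names below are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of the persistence idea: returns from obscurants (rain, snow,
# dust) flicker between time-spaced frames, while real obstacles and terrain
# occupy the same cells frame after frame. Thresholds are illustrative.
import numpy as np

def persistent_obstacles(frames, min_persistence=3):
    """frames: list of equal-shape boolean grids (True = sensor return in that cell).
    A cell is declared an obstacle only if it is occupied in at least
    min_persistence frames; transient hits are treated as obscurants."""
    counts = np.stack(frames).astype(int).sum(axis=0)
    return counts >= min_persistence

# Example: an obstacle column persists in all 4 frames, raindrop hits appear once each
rng = np.random.default_rng(1)
frames = []
for _ in range(4):
    f = rng.random((6, 6)) > 0.9      # sparse, uncorrelated obscurant returns
    f[:, 4] = True                    # a real obstacle column seen every frame
    frames.append(f)
print(persistent_obstacles(frames)[:, 4].all())   # True: the obstacle survives filtering
```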
-
Publication number: 20180129220
Abstract: Autonomous vehicles are capable of executing missions that abide by on-street rules or regulations, while also being able to seamlessly transition to and from “zones,” including off-street zones, with their own set(s) of rules or regulations. An on-board memory stores roadgraph information. An on-board computer is operative to execute commanded driving missions using the roadgraph information, including missions with one or more zones, each zone being defined by a sub-roadgraph with its own set of zone-specific driving rules and parameters. A mission may be coordinated with one or more payload operations, including zones with “free drive paths” as in a warehouse facility with loading and unloading zones to pick up payloads and place them down, or zone staging or entry points to one or more points of payload acquisition or placement. The vehicle may be a warehousing vehicle such as a forklift.
Type: Application
Filed: November 8, 2017
Publication date: May 10, 2018
Inventors: Glenn Beach, Douglas Haanpaa, Charles J. Jacobus, Steven Rowe
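A possible way to represent a roadgraph whose zones carry their own rules is sketched below: a vehicle looks up which rule set applies at its current position, and zone rules override the on-street defaults. The data structures and field names are assumptions for illustration, not the application's roadgraph format.

```python
# Illustrative data-structure sketch of a roadgraph whose zones carry their own
# rule sets, so a vehicle can switch from on-street rules to, e.g., warehouse
# free-drive rules when a mission enters a zone. Names are assumptions.
from dataclasses import dataclass, field

@dataclass
class Zone:
    name: str
    boundary: list                               # polygon vertices (x, y) in the map frame
    rules: dict = field(default_factory=dict)    # zone-specific limits, e.g. speed

@dataclass
class Roadgraph:
    default_rules: dict
    zones: list

    def rules_at(self, point):
        for zone in self.zones:
            if point_in_polygon(point, zone.boundary):
                return {**self.default_rules, **zone.rules}   # zone overrides street rules
        return self.default_rules

def point_in_polygon(p, poly):
    # standard ray-casting point-in-polygon test
    x, y, inside = p[0], p[1], False
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

graph = Roadgraph({"speed_limit_mps": 13.4},
                  [Zone("loading_dock", [(0, 0), (50, 0), (50, 30), (0, 30)],
                        {"speed_limit_mps": 2.0, "free_drive": True})])
print(graph.rules_at((10, 10)))    # dock rules apply inside the zone
```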
-
Publication number: 20180089616
Abstract: Automated inventory management and material (or container) handling removes the requirement to operate either fully automatically or all-manually using conventional task-dedicated vertical storage and retrieval (S&R) machines. Inventory requests are resolved into missions for automated vehicles, which plan their own movements to execute missions over a container yard, warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until potential collision risk is removed. They identify, localize, and either pick up loads (pallets, containers, etc.) or drop them at the correctly determined locations. Systems without full automation can also implement partially automated operations (for instance load pick-up and drop), and can assure inherently safe manually operated vehicles (i.
Type: Application
Filed: September 26, 2017
Publication date: March 29, 2018
Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe
-
Publication number: 20180012346
Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
Type: Application
Filed: August 13, 2017
Publication date: January 11, 2018
Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
-
Patent number: 9734569
Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
Type: Grant
Filed: March 4, 2015
Date of Patent: August 15, 2017
Assignee: Cybernet Systems Corp.
Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
-
Publication number: 20170154459
Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
Type: Application
Filed: November 1, 2016
Publication date: June 1, 2017
Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
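The three-dimensional pose search described in the abstract can be sketched as a brute-force sweep over distance, heading, and pitch, scoring each rendered synthetic image against the truth chip. The render function and the mean-squared-difference metric below are placeholders for illustration; the actual rendering and matching methods are not reproduced here.

```python
# Sketch of the pose search the abstract describes: with roll fixed near zero and
# the target assumed on flat ground, the synthetic view is swept over distance,
# heading, and pitch, and the pose whose rendering best matches the truth chip
# wins. render() and the scoring metric are placeholders.
import itertools
import numpy as np

def best_pose(truth_chip, render, distances, headings, pitches):
    """render(distance, heading, pitch) -> synthetic image the same shape as truth_chip."""
    best, best_score = None, np.inf
    for d, h, p in itertools.product(distances, headings, pitches):
        synthetic = render(d, h, p)
        score = np.mean((synthetic - truth_chip) ** 2)   # simple image-difference metric
        if score < best_score:
            best, best_score = (d, h, p), score
    return best, best_score

# Hypothetical usage with a toy renderer
toy_render = lambda d, h, p: np.full((8, 8), d + h + p, dtype=float)
chip = np.full((8, 8), 42.0)
pose, err = best_pose(chip, toy_render, range(10, 40, 5), range(0, 20, 5), range(0, 10, 5))
print(pose)   # the pose whose rendering most closely matches the chip
```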
-
Patent number: 9483867
Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
Type: Grant
Filed: July 14, 2015
Date of Patent: November 1, 2016
Assignee: Cybernet Systems Corporation
Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
-
Patent number: 9473314
Abstract: A very large number of applications communicate logically through a many-to-many multicast cloud on the common carrier Internet. Three types of systems operate together to implement the method. The first is a network-enabled client application, such as a distributed simulation or game, which joins an application cloud or federation and communicates its internal state changes into the cloud via a communication applications programming interface. The second is a lobby manager or broker, which accepts entry into a communication cloud or federation and provides information to the federation and the client application for establishing communications between them. The third is an application-specific routing system, which provides the normal function of routing packets between Internet hosts (client applications running on these hosts) but also allows the routing functions to be affected by modules in the router which are associated with the distributed application or simulation being implemented.
Type: Grant
Filed: June 11, 2013
Date of Patent: October 18, 2016
Assignee: Cybernet Systems Corporation
Inventor: Charles J. Jacobus
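A toy sketch of the lobby-manager role: a client asks to join a named federation and receives back the endpoints already in that cloud. The class and method names are assumptions; the patent's broker protocol is not specified here.

```python
# Minimal sketch (names are assumptions) of the broker/"lobby manager" role: a
# client asks to join a named federation and gets back the endpoints it needs to
# start exchanging state updates through the multicast cloud.
class LobbyManager:
    def __init__(self):
        self.federations = {}            # federation name -> list of (host, port)

    def join(self, federation, host, port):
        peers = self.federations.setdefault(federation, [])
        existing = list(peers)           # endpoints already in the cloud
        peers.append((host, port))
        return existing                  # the new client connects/routes to these

lobby = LobbyManager()
lobby.join("tank-sim", "10.0.0.5", 5001)
print(lobby.join("tank-sim", "10.0.0.9", 5001))   # -> [('10.0.0.5', 5001)]
```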
-
Patent number: 9424634
Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
Type: Grant
Filed: March 15, 2013
Date of Patent: August 23, 2016
Assignee: Cybernet Systems Corporation
Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
-
Publication number: 20160216772
Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
Type: Application
Filed: April 5, 2016
Publication date: July 28, 2016
Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
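One way to picture "behaviors as combinations of gestures" is a lookup from per-body-part gesture labels to a behavior, as in the sketch below. The gesture and behavior vocabulary is purely illustrative and not taken from the application.

```python
# Sketch of the "behaviors as combinations of gestures" idea: each tracked body
# part reports a gesture label, and a lookup over the combination yields the
# behavior. The gesture/behavior vocabulary here is purely illustrative.
BEHAVIORS = {
    ("arms:pumping", "legs:alternating_fast"): "running",
    ("arms:swinging", "legs:alternating_slow"): "walking",
    ("arms:raised", "legs:stationary"): "signaling",
}

def classify_behavior(part_gestures):
    """part_gestures: dict like {'legs': 'alternating_fast', 'arms': 'pumping'}."""
    key = tuple(f"{part}:{g}" for part, g in sorted(part_gestures.items()))
    return BEHAVIORS.get(key, "unknown")

print(classify_behavior({"legs": "alternating_fast", "arms": "pumping"}))  # -> "running"
```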
-
Patent number: 9304593
Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
Type: Grant
Filed: March 26, 2013
Date of Patent: April 5, 2016
Assignee: Cybernet Systems Corporation
Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
-
Publication number: 20160093097
Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
Type: Application
Filed: July 14, 2015
Publication date: March 31, 2016
Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
-
Publication number: 20160077671
Abstract: A system and method, including software to aid in the generation of panels and control instruments, rapidly generates a station that can support a variety of control interfaces. Rapid-Prototyped Panels, or RP-Panels, replicate existing systems (for simulation, training, gaming, etc.) or realize new designs (for human factors testing, as a functional product, etc.). The controls have tactile and visual characteristics similar or identical to their functional component counterparts such as buttons, knobs, switches, pedals, joysticks, steering wheels, and touch panels, but are modular and use alternative data transfer modes (potentiometers, fiber optics, RFID, machine vision, etc.) to track and analyze the response of the controls. The response is then transmitted to the host programs. With this method a user can design and fabricate a reconfigurable interface to interact with virtual environments for various applications such as simulation, training, virtual instrumentation, gaming, human factors testing, etc.
Type: Application
Filed: November 23, 2015
Publication date: March 17, 2016
Inventors: Christopher R. Wagner, Amanda Christiana, Douglas Haanpaa, Charles J. Jacobus
-
Publication number: 20160023100
Abstract: A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
Type: Application
Filed: July 28, 2015
Publication date: January 28, 2016
Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
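The weighting step mentioned in the abstract can be sketched as a fixed-weight combination of color, shape, and motion scores over candidate locations. The scoring functions and weights below are stand-ins for illustration, not the patented color matching function.

```python
# Sketch of the weighting step the abstract describes: three measures of the
# target's probable location (from color, shape, and motion matching) are combined
# with fixed weights and the best-scoring candidate wins. The scoring functions
# and weights here are illustrative stand-ins.
def most_probable_location(candidates, color_score, shape_score, motion_score,
                           weights=(0.5, 0.3, 0.2)):
    """candidates: list of (x, y) pixel locations; *_score: callables returning [0, 1]."""
    def combined(c):
        wc, ws, wm = weights
        return wc * color_score(c) + ws * shape_score(c) + wm * motion_score(c)
    return max(candidates, key=combined)

# Hypothetical per-pixel scores for a head tracker
target = (120, 80)
dist = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
color  = lambda c: max(0.0, 1 - dist(c, target) / 200)
shape  = lambda c: max(0.0, 1 - dist(c, (118, 82)) / 200)
motion = lambda c: max(0.0, 1 - dist(c, (125, 78)) / 200)
print(most_probable_location([(0, 0), (119, 81), (200, 200)], color, shape, motion))
```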
-
Patent number: RE47108
Abstract: A system for automated inventory management and material handling removes the requirement to operate either fully automatically or all-manually using conventional vertical storage and retrieval (S&R) machines. Inventory requests to place palletized material into storage at a specified lot location or retrieve palletized material from a specified lot are resolved into missions for autonomous fork trucks, equivalent mobile platforms, or manual fork truck drivers (and their equipment) that are autonomously or manually executed to effect the request. Automated trucks plan their own movements to execute the mission over the warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until potential collision risk is removed.
Type: Grant
Filed: February 22, 2017
Date of Patent: October 30, 2018
Assignee: Cybernet Systems Corp.
Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe