Patents by Inventor Charles J. Jacobus

Charles J. Jacobus has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20190137284
    Abstract: An in-vehicle system for generating precise, lane-level road map data includes a GPS receiver operative to acquire positional information associated with a track along a road path. An inertial sensor provides time-local measurements of acceleration and turn rate along the track, and a camera acquires image data of the road path along the track. A processor is operative to receive the local measurements from the inertial sensor and image data from the camera over time, in conjunction with multiple tracks along the road path, and to improve the accuracy of the GPS receiver through curve fitting. One or all of the GPS receiver, inertial sensor, and camera are disposed in a smartphone. The road map data may be uploaded to a central data repository for post-processing when the vehicle passes through a WiFi cloud to generate the precise road map data, which may include data collected from multiple drivers.
    Type: Application
    Filed: November 6, 2017
    Publication date: May 9, 2019
    Inventors: Charles J. Jacobus, Glenn J. Beach, Douglas Haanpaa
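The multi-track curve-fitting idea in the abstract can be illustrated with a minimal sketch: several noisy GPS passes over the same road segment are pooled and a single polynomial centerline is fit, which averages down the per-pass GPS error. This is an assumed toy example, not the patent's method; all names and parameters are invented.

```python
# Illustrative sketch: refining a road centerline by curve-fitting several
# noisy GPS tracks of the same road segment (names are hypothetical).
import numpy as np

def fit_centerline(tracks, degree=3):
    """Pool (x, y) points from multiple passes and fit one polynomial y(x)."""
    pts = np.vstack(tracks)                 # all passes pooled together
    coeffs = np.polyfit(pts[:, 0], pts[:, 1], degree)
    return np.poly1d(coeffs)                # callable centerline model

# Three simulated passes over the true line y = 0.5*x with GPS-like noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 100.0, 50)
tracks = [np.column_stack([x, 0.5 * x + rng.normal(0.0, 2.0, x.size)])
          for _ in range(3)]

centerline = fit_centerline(tracks, degree=1)
# The pooled fit sits much closer to the true line than any single pass.
print(centerline(50.0))
```

The same pooling step extends naturally to data uploaded by multiple drivers, as the abstract suggests.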
  • Patent number: 10275873
    Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
    Type: Grant
    Filed: August 13, 2017
    Date of Patent: April 30, 2019
    Assignee: Cybernet Systems Corp.
    Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
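The "composable vision-based recognition modules" plus a final decision algorithm can be sketched as independent classifiers that each vote with a confidence score, with the decision stage picking the label with the highest combined score. This is a hedged illustration only; the module names and scoring rule are invented, not taken from the patent.

```python
# Hypothetical sketch of composable recognition modules feeding a final
# decision algorithm: each module votes (label, confidence) and the
# decision stage sums the votes per label.
from collections import defaultdict

def decide(modules, image):
    scores = defaultdict(float)
    for module in modules:
        label, confidence = module(image)   # each module: image -> (label, score)
        scores[label] += confidence
    return max(scores, key=scores.get)

# Toy modules standing in for shape/markings/dimension classifiers.
shape_module   = lambda img: ("5.56mm", 0.7)
marking_module = lambda img: ("7.62mm", 0.4)
caliper_module = lambda img: ("5.56mm", 0.6)

print(decide([shape_module, marking_module, caliper_module], image=None))
# → 5.56mm
```

Because the modules share one small interface, new inspection systems can be assembled by swapping in different module sets, which is the composability the abstract emphasizes.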
  • Publication number: 20190088148
    Abstract: Autonomous and manually operated vehicles are integrated into a cohesive, interactive environment, with communications to each other and to their surroundings, to improve traffic flow while reducing accidents and other incidents. All vehicles send/receive messages to/from each other, and from infrastructure devices, enabling the vehicles to assess their status, traffic conditions, and the surrounding infrastructure. The vehicles store and operate in accordance with a common set of rules based upon the messages received and other inputs from sensors, databases, and so forth, to avoid obstacles and collisions based upon current and, in some cases, future or predicted behavior. Shared vehicle control interfaces enable the AVs to conform to driving activities that are legal, safe, and allowable on roadways. Such activities enable each AV to drive within safety margins, speed limits, on allowed or legal driving lanes and through allowed turns, intersections, mergers, lane changes, stops/starts, and so forth.
    Type: Application
    Filed: November 5, 2018
    Publication date: March 21, 2019
    Inventors: Charles J. Jacobus, Douglas Haanpaa, Eugene Foulk, Pritpaul Mahal, Steve Rowe, Charles J. Cohen, Glenn J. Beach
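The "common set of rules" applied to received vehicle messages can be sketched as a status message plus a shared rule check that every vehicle evaluates before acting. This is an invented minimal example (message fields, rule, and thresholds are all assumptions), not the patented protocol.

```python
# Assumed sketch of a shared driving rule evaluated against received
# vehicle-to-vehicle status messages (1-D lane positions for simplicity).
from dataclasses import dataclass

@dataclass
class StatusMessage:
    sender_id: str
    position_m: float      # position along the lane
    speed_mps: float

def safe_to_proceed(own_pos, messages, min_gap_m=30.0):
    """Common rule: keep at least min_gap_m to any vehicle ahead."""
    for msg in messages:
        gap = msg.position_m - own_pos
        if 0 < gap < min_gap_m:
            return False
    return True

msgs = [StatusMessage("veh42", position_m=120.0, speed_mps=10.0)]
print(safe_to_proceed(own_pos=100.0, messages=msgs))   # 20 m gap → False
```

Since every vehicle carries the same rule set, both autonomous and manually driven vehicles can be held to identical safety margins, which is the cohesion the abstract describes.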
  • Publication number: 20180357601
    Abstract: A system for automated inventory management and material handling removes the requirement to operate fully automatically or all-manual using conventional vertical storage and retrieval (S&R) machines. Inventory requests to place palletized material into storage at a specified lot location or retrieve palletized material from a specified lot are resolved into missions for autonomous fork trucks, equivalent mobile platforms, or manual fork truck drivers (and their equipment) that are autonomously or manually executed to effect the request. Automated trucks plan their own movements to execute the mission over the warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until the potential collision risk is removed.
    Type: Application
    Filed: August 21, 2018
    Publication date: December 13, 2018
    Applicant: Cybernet Systems Corp.
    Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe
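The resolution of inventory requests into missions for either autonomous trucks or manual drivers can be sketched as a small dispatch function. The dispatch policy below is invented for illustration; the patent describes the resolution only at a high level.

```python
# Hypothetical sketch of resolving a place/retrieve inventory request into
# a mission assigned to an autonomous truck, a manual driver, or a queue.
def resolve_request(request, autonomous_free, manual_free):
    """Return (mission, assignee kind) for a place/retrieve request."""
    mission = {
        "action": request["action"],      # "place" or "retrieve"
        "lot": request["lot"],            # target lot location
    }
    if autonomous_free:
        return mission, "autonomous"
    if manual_free:
        return mission, "manual"
    return mission, "queued"

mission, who = resolve_request(
    {"action": "retrieve", "lot": "A-17"},
    autonomous_free=True, manual_free=True)
print(who)  # → autonomous
```

The point of the split return is that the same mission description works whether it is executed autonomously or handed to a human driver, mirroring the mixed-fleet operation in the abstract.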
  • Publication number: 20180253099
    Abstract: Autonomously driven vehicles operate in rain, snow and other adverse weather conditions. An on-board vehicle sensor has a beam with a diameter that is only intermittently blocked by rain, snow, dust or other obscurant particles. This allows an obstacle detection processor to tell the difference between obstacles, terrain variations and obscurant particles, thereby enabling the vehicle driving control unit to disregard the presence of obscurant particles along the route taken by the vehicle. The sensor may form part of a LADAR or RADAR system or a video camera. The obstacle detection processor may receive time-spaced frames divided into cells or pixels, whereby groups of connected cells or pixels and/or cells or pixels that persist over longer periods of time are interpreted to be obstacles or terrain variations. The system may further include an input for receiving weather-specific configuration parameters to adjust the operation of the obstacle detection processor.
    Type: Application
    Filed: May 4, 2018
    Publication date: September 6, 2018
    Inventors: Charles J. Jacobus, Douglas Haanpaa
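The persistence test described in the abstract (cells that persist across time-spaced frames are obstacles; cells that blink in and out are obscurants) can be sketched as a simple frame counter. This is an illustrative reduction, not the patented processor; the frame count threshold is an assumption.

```python
# Illustrative persistence filter: keep only cells hit in at least
# `min_frames` of the time-spaced frames. Raindrops and snowflakes appear
# briefly; real obstacles and terrain persist.
from collections import Counter

def persistent_cells(frames, min_frames=3):
    """frames: list of sets of (row, col) cells hit by the sensor beam."""
    counts = Counter(cell for frame in frames for cell in frame)
    return {cell for cell, n in counts.items() if n >= min_frames}

frames = [
    {(4, 7), (1, 2)},          # (4, 7) is a wall; (1, 2) a snowflake
    {(4, 7), (9, 9)},
    {(4, 7)},
    {(4, 7), (0, 5)},
]
print(persistent_cells(frames))  # → {(4, 7)}
```

The abstract's weather-specific configuration input maps naturally onto the `min_frames` threshold, which could be raised in heavy snow and lowered in clear conditions.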
  • Patent number: 9989967
    Abstract: Autonomously driven vehicles operate in rain, snow and other adverse weather conditions. An on-board vehicle sensor has a beam with a diameter that is only intermittently blocked by rain, snow, dust or other obscurant particles. This allows an obstacle detection processor to tell the difference between obstacles, terrain variations and obscurant particles, thereby enabling the vehicle driving control unit to disregard the presence of obscurant particles along the route taken by the vehicle. The sensor may form part of a LADAR or RADAR system or a video camera. The obstacle detection processor may receive time-spaced frames divided into cells or pixels, whereby groups of connected cells or pixels and/or cells or pixels that persist over longer periods of time are interpreted to be obstacles or terrain variations. The system may further include an input for receiving weather-specific configuration parameters to adjust the operation of the obstacle detection processor.
    Type: Grant
    Filed: March 4, 2014
    Date of Patent: June 5, 2018
    Inventors: Charles J. Jacobus, Douglas Haanpaa
  • Publication number: 20180129220
    Abstract: Autonomous vehicles are capable of executing missions that abide by on-street rules or regulations, while also being able to seamlessly transition to and from “zones,” including off-street zones, with their own set(s) of rules or regulations. An on-board memory stores roadgraph information. An on-board computer is operative to execute commanded driving missions using the roadgraph information, including missions with one or more zones, each zone being defined by a sub-roadgraph with its own set of zone-specific driving rules and parameters. A mission may be coordinated with one or more payload operations, including zones with “free drive paths,” as in a warehouse facility with loading and unloading zones to pick up payloads and place them down, or zone staging or entry points to one or more points of payload acquisition or placement. The vehicle may be a warehousing vehicle such as a forklift.
    Type: Application
    Filed: November 8, 2017
    Publication date: May 10, 2018
    Inventors: Glenn Beach, Douglas Haanpaa, Charles J. Jacobus, Steven Rowe
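The zone-specific rule sets attached to sub-roadgraphs can be sketched as a lookup of driving parameters keyed by the vehicle's current zone. The zone names, speed limits, and `free_drive` flag below are invented for illustration and do not come from the patent.

```python
# Assumed sketch: zone-specific driving parameters keyed by roadgraph zone.
ROADGRAPH_ZONES = {
    "street":       {"speed_limit_mps": 13.0, "free_drive": False},
    "loading_dock": {"speed_limit_mps": 2.0,  "free_drive": True},
    "warehouse":    {"speed_limit_mps": 4.0,  "free_drive": True},
}

def params_for(zone):
    """Look up the rule set active in the vehicle's current zone."""
    return ROADGRAPH_ZONES[zone]

print(params_for("loading_dock")["speed_limit_mps"])  # → 2.0
```

Keeping each zone's rules in its own sub-table is what makes the on-street/off-street transition "seamless": crossing a zone boundary is just a change of lookup key.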
  • Publication number: 20180089616
    Abstract: Automated inventory management and material (or container) handling removes the requirement to operate fully automatically or all-manual using conventional task-dedicated vertical storage and retrieval (S&R) machines. Inventory requests are resolved into missions; automated vehicles plan their own movements to execute the missions over a container yard, warehouse aisles or roadways, sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according to human driving rules, use on-board sensors to identify static and dynamic obstacles and human traffic, and either avoid them or stop until the potential collision risk is removed. They identify, localize, and either pick up loads (pallets, containers, etc.) or drop them at the correctly determined locations. Systems without full automation can also implement partially automated operations (for instance, load pick-up and drop), and can assure inherently safe manually operated vehicles.
    Type: Application
    Filed: September 26, 2017
    Publication date: March 29, 2018
    Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe
  • Publication number: 20180012346
    Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
    Type: Application
    Filed: August 13, 2017
    Publication date: January 11, 2018
    Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
  • Patent number: 9734569
    Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
    Type: Grant
    Filed: March 4, 2015
    Date of Patent: August 15, 2017
    Assignee: Cybernet Systems Corp.
    Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
  • Publication number: 20170154459
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Application
    Filed: November 1, 2016
    Publication date: June 1, 2017
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
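The three-parameter search over (distance, heading, pitch) can be sketched as rendering a synthetic view at each candidate pose and keeping the pose whose image best matches the truth chip. The renderer below is a deterministic stand-in, not real graphics code; everything here is an assumed toy version of the approach the abstract outlines.

```python
# Sketch of the (distance, heading, pitch) pose search: render a synthetic
# image at each candidate pose and minimize the image difference against
# the observed "truth chip". The renderer is a hypothetical stand-in.
import itertools
import numpy as np

def render(distance, heading, pitch):
    """Toy stand-in for a graphics renderer: a pose-dependent 8x8 image."""
    seed = hash((round(distance), round(heading), round(pitch))) % 2**32
    return np.random.default_rng(seed).random((8, 8))

def best_pose(chip, distances, headings, pitches):
    def cost(pose):
        return np.abs(render(*pose) - chip).sum()   # pixel-wise difference
    return min(itertools.product(distances, headings, pitches), key=cost)

truth_chip = render(50.0, 30.0, 5.0)                # pretend observation
pose = best_pose(truth_chip, [40.0, 50.0, 60.0], [0.0, 30.0], [0.0, 5.0])
print(pose)  # → (50.0, 30.0, 5.0)
```

The flat-ground and near-zero-roll assumptions in the abstract are exactly what keep this search three-dimensional rather than six-dimensional, which is what makes an exhaustive grid like this feasible.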
  • Patent number: 9483867
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Grant
    Filed: July 14, 2015
    Date of Patent: November 1, 2016
    Assignee: Cybernet Systems Corporation
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
  • Patent number: 9473314
    Abstract: A very large number of applications communicate logically through a many-to-many multicast cloud on the common carrier Internet. Three types of systems operate together to implement the method. The first is a network-enabled client application, such as a distributed simulation or game, which joins an application cloud or federation and communicates its internal state changes into the cloud via a communication applications programming interface. The second is a lobby manager or broker, which accepts entry into a communication cloud or federation and provides information to the federation and the client application for establishing communications between them. The third is an application-specific routing system, which provides the normal function of routing packets between Internet hosts (client applications running on these hosts) but also allows the routing functions to be affected by modules in the router which are associated with the distributed application or simulation being implemented.
    Type: Grant
    Filed: June 11, 2013
    Date of Patent: October 18, 2016
    Assignee: Cybernet Systems Corporation
    Inventor: Charles J. Jacobus
  • Patent number: 9424634
    Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: August 23, 2016
    Assignee: Cybernet Systems Corporation
    Inventors: Glenn J. Beach, Gary Moody, James Burkowski, Charles J. Jacobus
  • Publication number: 20160216772
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Application
    Filed: April 5, 2016
    Publication date: July 28, 2016
    Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
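The abstract's central idea, that a behavior is a combination of gestures identified on different body parts, can be sketched as a lookup from per-part gesture labels to a behavior label. The gesture names and the table below are invented purely for illustration.

```python
# Invented sketch: classify a behavior as a combination of per-body-part
# gesture labels, in the spirit of "behaviors are combinations of gestures".
BEHAVIORS = {
    frozenset({("legs", "fast_stride"), ("arms", "pumping")}): "running",
    frozenset({("legs", "slow_stride"), ("arms", "swinging")}): "walking",
}

def classify_behavior(part_gestures):
    """part_gestures: dict mapping body part -> recognized gesture label."""
    return BEHAVIORS.get(frozenset(part_gestures.items()), "unknown")

print(classify_behavior({"legs": "fast_stride", "arms": "pumping"}))
# → running
```

Running the same classification over several tracked bodies at once would give the multi-body formation analysis the abstract mentions.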
  • Patent number: 9304593
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Grant
    Filed: March 26, 2013
    Date of Patent: April 5, 2016
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
  • Publication number: 20160093097
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Application
    Filed: July 14, 2015
    Publication date: March 31, 2016
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
  • Publication number: 20160077671
    Abstract: A system and method, including software to aid in the generation of panels and control instruments, rapidly generates a station that can support a variety of control interfaces. Rapid-Prototyped Panels, or RP-Panels, replicate existing systems (for simulation, training, gaming, etc.) or realize new designs (for human factors testing, as functional product, etc.). The controls have tactile and visual characteristics similar or identical to their functional component counterparts such as buttons, knobs, switches, pedals, joysticks, steering wheels, and touch panels, but are modular and use alternative data transfer modes (potentiometers, fiber optics, RFID, machine vision, etc.) to track and analyze the response of the controls. The response is then transmitted to the host programs. With this method a user can design and fabricate a reconfigurable interface to interact with virtual environments for various applications such as simulation, training, virtual instrumentation, gaming, and human factors testing.
    Type: Application
    Filed: November 23, 2015
    Publication date: March 17, 2016
    Inventors: Christopher R. Wagner, Amanda Christiana, Douglas Haanpaa, Charles J. Jacobus
  • Publication number: 20160023100
    Abstract: A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
    Type: Application
    Filed: July 28, 2015
    Publication date: January 28, 2016
    Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
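The weighting step the abstract describes, combining color, shape, and motion likelihoods into one most-probable target location, can be sketched as a per-pixel weighted sum followed by an argmax. The maps and weights below are invented toy data; the real system computes these likelihoods from camera frames.

```python
# Assumed sketch of the weighted-combination step: blend three per-pixel
# likelihood maps (color, shape, motion) and take the peak as the most
# probable target location.
import numpy as np

def most_probable_location(color, shape, motion, weights=(0.5, 0.25, 0.25)):
    combined = weights[0] * color + weights[1] * shape + weights[2] * motion
    return np.unravel_index(np.argmax(combined), combined.shape)

h, w = 24, 32
color  = np.zeros((h, w)); color[10, 20] = 1.0    # strong color match
shape  = np.zeros((h, w)); shape[10, 20] = 0.8    # shape agrees
motion = np.zeros((h, w)); motion[5, 5]  = 0.6    # a moving distractor

print(most_probable_location(color, shape, motion))  # → (10, 20)
```

Because the three cues are blended rather than used singly, a moving distractor with no color or shape support cannot capture the tracker, which is the robustness the abstract claims for the combined measure.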
  • Patent number: RE47108
    Abstract: A system for automated inventory management and material handling removes the requirement to operate fully automatically or all-manual using conventional vertical storage and retrieval (S&R) machines. Inventory requests to place palletized material into storage at a specified lot location or retrieve palletized material from a specified lot are resolved into missions for autonomous fork trucks, equivalent mobile platforms, or manual fork truck drivers (and their equipment) that are autonomously or manually executed to effect the request. Automated trucks plan their own movements to execute the mission over the warehouse aisles or roadways sharing this space with manually driven trucks. Automated units drive to planned speed limits, manage their loads (stability control), stop, go, and merge at intersections according human driving rules, use on-board sensors to identify static and dynamic obstacles, and human traffic, and either avoid them or stop until potential collision risk is removed.
    Type: Grant
    Filed: February 22, 2017
    Date of Patent: October 30, 2018
    Assignee: Cybernet Systems Corp.
    Inventors: Charles J. Jacobus, Glenn J. Beach, Steve Rowe