Patents by Inventor Eugene Foulk
Eugene Foulk has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20210248915
Abstract: Autonomous and manually operated vehicles are integrated into a cohesive, interactive environment, with communications to each other and to their surroundings, to improve traffic flow while reducing accidents and other incidents. All vehicles send/receive messages to/from each other, and from infrastructure devices, enabling the vehicles to determine their status, traffic conditions and infrastructure. The vehicles store and operate in accordance with a common set of rules based upon the messages received and other inputs from sensors, databases, and so forth, to avoid obstacles and collisions based upon current and, in some cases, future or predicted behavior. Shared vehicle control interfaces enable the AVs to conform to driving activities that are legal, safe, and allowable on roadways. Such activities enable each AV to drive within safety margins, speed limits, on allowed or legal driving lanes and through allowed turns, intersections, mergers, lane changes, stops/starts, and so forth.
Type: Application
Filed: November 25, 2020
Publication date: August 12, 2021
Applicant: Cybernet Systems Corp.
Inventors: Charles J. Jacobus, Douglas Haanpaa, Eugene Foulk, Pritpaul Mahal, Steve Rowe, Charles J. Cohen, Glenn J. Beach
-
Publication number: 20210110726
Abstract: Autonomous and manually operated vehicles are integrated into a cohesive, interactive environment, with communications to each other and to their surroundings, to improve traffic flow while reducing accidents and other incidents. All vehicles send/receive messages to/from each other, and from infrastructure devices, enabling the vehicles to determine their status, traffic conditions and infrastructure. The vehicles store and operate in accordance with a common set of rules based upon the messages received and other inputs from sensors, databases, and so forth, to avoid obstacles and collisions based upon current and, in some cases, future or predicted behavior. Shared vehicle control interfaces enable the AVs to conform to driving activities that are legal, safe, and allowable on roadways. Such activities enable each AV to drive within safety margins, speed limits, on allowed or legal driving lanes and through allowed turns, intersections, mergers, lane changes, stops/starts, and so forth.
Type: Application
Filed: November 25, 2020
Publication date: April 15, 2021
Applicant: Cybernet Systems Corp.
Inventors: Charles J. Jacobus, Douglas Haanpaa, Eugene Foulk, Pritpaul Mahal, Steve Rowe, Charles J. Cohen, Glenn J. Beach
-
Publication number: 20210104165
Abstract: Autonomous and manually operated vehicles are integrated into a cohesive, interactive environment, with communications to each other and to their surroundings, to improve traffic flow while reducing accidents and other incidents. All vehicles send/receive messages to/from each other, and from infrastructure devices, enabling the vehicles to determine their status, traffic conditions and infrastructure. The vehicles store and operate in accordance with a common set of rules based upon the messages received and other inputs from sensors, databases, and so forth, to avoid obstacles and collisions based upon current and, in some cases, future or predicted behavior. Shared vehicle control interfaces enable the AVs to conform to driving activities that are legal, safe, and allowable on roadways. Such activities enable each AV to drive within safety margins, speed limits, on allowed or legal driving lanes and through allowed turns, intersections, mergers, lane changes, stops/starts, and so forth.
Type: Application
Filed: November 25, 2020
Publication date: April 8, 2021
Applicant: Cybernet Systems Corp.
Inventors: Charles J. Jacobus, Douglas Haanpaa, Eugene Foulk, Pritpaul Mahal, Steve Rowe, Charles J. Cohen, Glenn J. Beach
-
Publication number: 20210082297
Abstract: Autonomous and manually operated vehicles are integrated into a cohesive, interactive environment, with communications to each other and to their surroundings, to improve traffic flow while reducing accidents and other incidents. All vehicles send/receive messages to/from each other, and from infrastructure devices, enabling the vehicles to determine their status, traffic conditions and infrastructure. The vehicles store and operate in accordance with a common set of rules based upon the messages received and other inputs from sensors, databases, and so forth, to avoid obstacles and collisions based upon current and, in some cases, future or predicted behavior. Shared vehicle control interfaces enable the AVs to conform to driving activities that are legal, safe, and allowable on roadways. Such activities enable each AV to drive within safety margins, speed limits, on allowed or legal driving lanes and through allowed turns, intersections, mergers, lane changes, stops/starts, and so forth.
Type: Application
Filed: November 25, 2020
Publication date: March 18, 2021
Applicant: Cybernet Systems Corp.
Inventors: Charles J. Jacobus, Douglas Haanpaa, Eugene Foulk, Pritpaul Mahal, Steve Rowe, Charles J. Cohen, Glenn J. Beach
-
Publication number: 20210082296
Abstract: Autonomous and manually operated vehicles are integrated into a cohesive, interactive environment, with communications to each other and to their surroundings, to improve traffic flow while reducing accidents and other incidents. All vehicles send/receive messages to/from each other, and from infrastructure devices, enabling the vehicles to determine their status, traffic conditions and infrastructure. The vehicles store and operate in accordance with a common set of rules based upon the messages received and other inputs from sensors, databases, and so forth, to avoid obstacles and collisions based upon current and, in some cases, future or predicted behavior. Shared vehicle control interfaces enable the AVs to conform to driving activities that are legal, safe, and allowable on roadways. Such activities enable each AV to drive within safety margins, speed limits, on allowed or legal driving lanes and through allowed turns, intersections, mergers, lane changes, stops/starts, and so forth.
Type: Application
Filed: November 25, 2020
Publication date: March 18, 2021
Applicant: Cybernet Systems Corp.
Inventors: Charles J. Jacobus, Douglas Haanpaa, Eugene Foulk, Pritpaul Mahal, Steve Rowe, Charles J. Cohen, Glenn J. Beach
-
Patent number: 10909866
Abstract: Autonomous and manually operated vehicles are integrated into a cohesive, interactive environment, with communications to each other and to their surroundings, to improve traffic flow while reducing accidents and other incidents. All vehicles send/receive messages to/from each other, and from infrastructure devices, enabling the vehicles to determine their status, traffic conditions and infrastructure. The vehicles store and operate in accordance with a common set of rules based upon the messages received and other inputs from sensors, databases, and so forth, to avoid obstacles and collisions based upon current and, in some cases, future or predicted behavior. Shared vehicle control interfaces enable the AVs to conform to driving activities that are legal, safe, and allowable on roadways. Such activities enable each AV to drive within safety margins, speed limits, on allowed or legal driving lanes and through allowed turns, intersections, mergers, lane changes, stops/starts, and so forth.
Type: Grant
Filed: November 5, 2018
Date of Patent: February 2, 2021
Assignee: Cybernet Systems Corp.
Inventors: Charles J. Jacobus, Douglas Haanpaa, Eugene Foulk, Pritpaul Mahal, Steve Rowe, Charles J. Cohen, Glenn J. Beach
-
Publication number: 20190088148
Abstract: Autonomous and manually operated vehicles are integrated into a cohesive, interactive environment, with communications to each other and to their surroundings, to improve traffic flow while reducing accidents and other incidents. All vehicles send/receive messages to/from each other, and from infrastructure devices, enabling the vehicles to determine their status, traffic conditions and infrastructure. The vehicles store and operate in accordance with a common set of rules based upon the messages received and other inputs from sensors, databases, and so forth, to avoid obstacles and collisions based upon current and, in some cases, future or predicted behavior. Shared vehicle control interfaces enable the AVs to conform to driving activities that are legal, safe, and allowable on roadways. Such activities enable each AV to drive within safety margins, speed limits, on allowed or legal driving lanes and through allowed turns, intersections, mergers, lane changes, stops/starts, and so forth.
Type: Application
Filed: November 5, 2018
Publication date: March 21, 2019
Inventors: Charles J. Jacobus, Douglas Haanpaa, Eugene Foulk, Pritpaul Mahal, Steve Rowe, Charles J. Cohen, Glenn J. Beach
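The entries above describe vehicles that exchange status messages and all apply a common, shared rule set to decide safe driving actions. A minimal sketch of that idea follows; the message fields, the safe-gap rule, and the 1.5 s reaction time are illustrative assumptions, not details taken from the patents:

```python
from dataclasses import dataclass

@dataclass
class StatusMessage:
    # Hypothetical V2V status fields (illustrative only)
    vehicle_id: str
    position_m: float   # 1-D position along the lane, in metres
    speed_mps: float    # speed, in metres per second

def safe_following_gap(speed_mps: float, reaction_s: float = 1.5) -> float:
    """Common rule shared by every vehicle: keep at least a
    reaction-time gap to the vehicle ahead (assumed rule)."""
    return speed_mps * reaction_s

def advise(ego: StatusMessage, ahead: StatusMessage) -> str:
    """Compare the gap implied by the broadcast positions against
    the shared rule and advise an action for the ego vehicle."""
    gap = ahead.position_m - ego.position_m
    if gap < safe_following_gap(ego.speed_mps):
        return "slow_down"
    return "maintain_speed"
```

Because every vehicle evaluates the same rule against the same broadcast messages, manually driven and autonomous vehicles reach consistent conclusions about who must yield, which is the cohesion the abstracts describe.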
-
Publication number: 20160216772
Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
Type: Application
Filed: April 5, 2016
Publication date: July 28, 2016
Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
-
Patent number: 9304593
Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
Type: Grant
Filed: March 26, 2013
Date of Patent: April 5, 2016
Assignee: Cybernet Systems Corporation
Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
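The two gait/behavior entries above define a behavior as a combination of gestures recognized simultaneously on different body parts. The combination step can be sketched as a lookup over per-limb gesture labels; the labels and the behavior table below are entirely hypothetical:

```python
# Hypothetical mapping from simultaneously recognized per-limb
# gesture labels to a behavior category. The patent defines
# behaviors as such combinations; these names are illustrative.
BEHAVIORS = {
    ("leg_fast_oscillation", "arm_fast_swing"): "running",
    ("leg_slow_oscillation", "arm_slow_swing"): "walking",
}

def classify_behavior(leg_gesture: str, arm_gesture: str) -> str:
    """Combine the gesture recognized on each body part into a
    single behavior label; unmatched combinations are unknown."""
    return BEHAVIORS.get((leg_gesture, arm_gesture), "unknown")
```

The same idea extends to multiple tracked bodies: each body's behavior label becomes an input to a higher-level table over formations, as the abstract suggests.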
-
Publication number: 20140132729
Abstract: Systems and methods facilitate long-range, accurate fiducial tracking using a mixture of pan-tilt and camera devices enabling generalized 3D tracking of fiducials with the automatic mapping of flaw data to component models within standard CAD packages. The invention is suitable to many various tracking applications, particularly large inspection sites such as aircraft surfaces which require vast coverage with a medium-degree of accuracy. A method of surface inspection comprises the steps of moving a fiducial target over a surface under inspection, and tracking the fiducial as it is moved by capturing and storing the coordinates of the fiducial in a database for subsequent retrieval. Machine vision is used to acquire surface inspection data associated with the coordinates of the fiducial as it is moved. The inspection data is integrated into a CAD model, enabling the use of finite element analysis (FEA) to determine or predict flaw and material behavior over time.
Type: Application
Filed: November 15, 2013
Publication date: May 15, 2014
Applicant: CYBERNET SYSTEMS CORPORATION
Inventors: Eugene Foulk, Kevin Tang, Glenn J. Beach
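The core bookkeeping described above, capturing fiducial coordinates together with the machine-vision reading at each point so flaw locations can later be mapped onto a CAD model, can be sketched as follows. The field names and threshold rule are assumptions for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class InspectionLog:
    """Stores each tracked fiducial position alongside the
    machine-vision reading captured there (names are assumed)."""
    samples: list = field(default_factory=list)

    def record(self, xyz: tuple, flaw_score: float) -> None:
        # One sample per tracked fiducial position
        self.samples.append({"xyz": xyz, "flaw_score": flaw_score})

    def flaw_locations(self, threshold: float) -> list:
        """Coordinates whose reading exceeds a flaw threshold,
        ready to be mapped onto the CAD surface model."""
        return [s["xyz"] for s in self.samples
                if s["flaw_score"] > threshold]
```

Keeping coordinates and readings paired in one database record is what allows the later step the abstract mentions: projecting each flagged point onto the component model for finite element analysis.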
-
Publication number: 20110314331
Abstract: An intelligent system for automatically monitoring, diagnosing, and repairing complex hardware and software systems is presented. A number of functional modules enable the system to collect relevant data from both hardware and software components, analyze the incoming data to detect faults, further monitor sensor data and historical knowledge to predict potential faults, determine an appropriate response to fix the faults, and finally automatically repair the faults when appropriate. The system leverages both software and hardware modules to interact with the complex system being monitored. Additionally, the lessons learned on one system can be applied to better understand events occurring on the same or similar systems.
Type: Application
Filed: October 29, 2010
Publication date: December 22, 2011
Applicant: Cybernet Systems Corporation
Inventors: Glenn J. Beach, Kevin Tang, Chris C. Lomont, Ryan O'Grady, Gary Moody, Eugene Foulk, Charles J. Jacobus
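The monitor/detect/respond loop described above can be sketched in a few lines. The sensor names, limit bands, and repair table below are invented for illustration; the abstract only specifies the overall pipeline:

```python
def detect_faults(readings: dict, limits: dict) -> list:
    """Flag any sensor whose reading falls outside its limit band
    (sensor names and limit bands are illustrative)."""
    return [name for name, value in readings.items()
            if not (limits[name][0] <= value <= limits[name][1])]

def respond(faults: list) -> dict:
    """Map each detected fault to a canned repair action; faults
    with no known repair are escalated rather than auto-repaired,
    matching the abstract's 'repair when appropriate' caveat."""
    repairs = {"cpu_temp": "throttle_cpu", "disk_free": "purge_logs"}
    return {f: repairs.get(f, "escalate_to_operator") for f in faults}
```

Predictive monitoring, the "historical knowledge" element of the abstract, would slot in between these two steps, flagging sensors that are trending toward their limits before they cross them.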
-
Patent number: 7668340
Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
Type: Grant
Filed: December 2, 2008
Date of Patent: February 23, 2010
Assignee: Cybernet Systems Corporation
Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
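The abstract above models a motion gesture as a linear-in-parameters dynamic system, fits its parameters by least squares, and matches observed motion against a bank of predictor bins seeded with those parameters. A toy 1-D version of the predictor-bin idea, using a simple harmonic oscillator model x'' = -w² x (the model form, finite-difference scheme, and bin values are illustrative assumptions, not the patent's actual formulation):

```python
import numpy as np

def residual(x: np.ndarray, dt: float, w2: float) -> float:
    """Prediction error of the oscillator model x'' = -w2 * x
    against an observed trajectory, using a second-order finite
    difference for the observed acceleration."""
    accel = np.diff(x, 2) / dt**2          # observed acceleration
    return float(np.sum((accel + w2 * x[1:-1]) ** 2))

def best_bin(x: np.ndarray, dt: float, bins: dict) -> str:
    """Bank of predictor bins: each bin holds the parameter fitted
    for one gesture; the bin whose model best predicts the observed
    motion (smallest residual) identifies the gesture."""
    return min(bins, key=lambda name: residual(x, dt, bins[name]))
```

Seeding each bin with a parameter fitted offline, then comparing residuals online, is what makes the scheme cheap at recognition time: no fitting is done per frame, only a handful of residual evaluations.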
-
Publication number: 20090027334
Abstract: A method for controlling a graphical user interface (GUI) for touchscreen-enabled computer systems provides a variety of software methods (tools) for high-fidelity control of the user interface. The TrackScreen tool provides finger-friendly mouse functions such as scrolling, dragging, and clicking. The Magnifier application continuously captures the current screen image and displays a magnified subset of it. Selecting within this magnified area with a pointing device (mouse, touchscreen, digitizer, etc.) causes the application to simulate the action on the portion of the screen corresponding to the point in the magnified image that was selected. In a KeyBoard application, a keyboard is rendered on screen with sufficient size that the individual keys are easily selectable with an unaided finger. The Common Tasks Tool (CTT) allows common keyboard shortcuts, mouse events, and other user interface events to be specified in a configuration file and represented on screen as a large, easy-to-click button.
Type: Application
Filed: June 2, 2008
Publication date: January 29, 2009
Applicant: Cybernet Systems Corporation
Inventors: Eugene Foulk, Ronald Hay, Katherine Scott, Merrill D. Squiers, Joseph Tesar, Charles J. Cohen, Charles J. Jacobus
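The Magnifier behavior described above hinges on mapping a selection made in the magnified view back to the underlying screen pixel so the action can be replayed there. A sketch of that coordinate mapping, with hypothetical parameter names:

```python
def magnified_to_screen(click_xy: tuple, region_origin: tuple,
                        zoom: float) -> tuple:
    """Map a click inside the magnified view back to the screen
    pixel it represents. The magnifier shows a screen region whose
    top-left corner is region_origin, scaled up by zoom, so the
    inverse mapping divides by zoom and re-adds the offset.
    (Parameter names are assumptions for illustration.)"""
    cx, cy = click_xy
    ox, oy = region_origin
    return (ox + cx / zoom, oy + cy / zoom)
```

With the screen coordinate recovered, the tool can synthesize the same pointer event at that location, which is the "simulate the action" step in the abstract.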