Patents by Inventor Glenn Beach

Glenn Beach has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11914395
    Abstract: Autonomous vehicles are capable of executing missions that abide by on-street rules or regulations, while also being able to seamlessly transition to and from “zones,” including off-street zones, with their own set(s) of rules or regulations. An on-board memory stores roadgraph information. An on-board computer is operative to execute commanded driving missions using the roadgraph information, including missions with one or more zones, each zone being defined by a sub-roadgraph with its own set of zone-specific driving rules and parameters. A mission may be coordinated with one or more payload operations, including zones with “free drive paths” as in a warehouse facility with loading and unloading zones to pick up payloads and place them down, or zone staging or entry points to one or more points of payload acquisition or placement. The vehicle may be a warehousing vehicle such as a forklift.
    Type: Grant
    Filed: June 26, 2022
    Date of Patent: February 27, 2024
    Assignee: Cybernet Systems Corp.
    Inventors: Glenn Beach, Douglas Haanpaa, Charles J. Jacobus, Steven Rowe
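The zone-transition idea in the abstract above (zone-specific rules layered over on-street defaults via a sub-roadgraph) can be sketched roughly as follows. All names here (`Zone`, `Roadgraph`, the rule keys) are illustrative assumptions, not terms from the patent itself.

```python
from dataclasses import dataclass, field

@dataclass
class Zone:
    # A zone's sub-roadgraph carries its own driving rules and parameters.
    name: str
    rules: dict

@dataclass
class Roadgraph:
    default_rules: dict              # on-street rules in effect outside any zone
    zones: dict = field(default_factory=dict)  # zone name -> Zone

    def rules_at(self, zone_name=None):
        """Return the rule set in effect: zone rules override street defaults."""
        if zone_name and zone_name in self.zones:
            return {**self.default_rules, **self.zones[zone_name].rules}
        return self.default_rules

graph = Roadgraph(default_rules={"speed_limit_mph": 25, "free_drive": False})
graph.zones["loading_dock"] = Zone("loading_dock",
                                   {"speed_limit_mph": 5, "free_drive": True})

print(graph.rules_at())                # on-street defaults apply
print(graph.rules_at("loading_dock"))  # zone-specific overrides apply
```

The merge order in `rules_at` is the whole trick: entering a zone swaps in that zone's rules while anything the zone does not override falls back to the on-street defaults, so the transition is seamless in both directions.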
  • Publication number: 20230219597
    Abstract: An autonomous vehicle (an AV, or manual vehicle in an autonomous or semi-autonomous mode) includes the ability to sense a command from a source external to the vehicle and modify the behavior of the vehicle in accordance with the command. For example, the vehicle may visualize a police officer or other person associated with traffic control and interpret gestures made by the person, causing the vehicle to stop, slow down, pull over, change lanes, back up, or take a different route due to unplanned traffic patterns such as accidents, harsh weather, road closings, or other situations. The system and method may also be used for non-emergency purposes, including external guidance for load pick-up/placement, hailing a vehicle used as a cab, and so forth. The command may further be spoken or may include a radio frequency (RF), light, or other energy component.
    Type: Application
    Filed: November 22, 2022
    Publication date: July 13, 2023
    Inventors: Charles J. Cohen, Charles Jacobus, Glenn Beach, George Paul
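The abstract above describes mapping an externally sensed command (e.g. a traffic officer's gesture) to a change in vehicle behavior. A minimal dispatch sketch, with gesture labels and behavior names invented for illustration:

```python
# Hypothetical gesture-to-behavior table; labels are assumptions, not the
# patent's vocabulary.
COMMAND_ACTIONS = {
    "palm_out": "stop",
    "wave_through": "proceed",
    "point_to_side": "pull_over",
    "circular_wave": "detour",
}

def apply_external_command(gesture: str, current_plan: dict) -> dict:
    """Modify the driving plan according to a recognized external gesture."""
    action = COMMAND_ACTIONS.get(gesture)
    if action is None:
        return current_plan  # unrecognized command: keep the planned behavior
    plan = dict(current_plan)  # copy, so the original plan is untouched
    plan["behavior"] = action
    return plan

plan = apply_external_command("palm_out", {"behavior": "cruise"})
print(plan["behavior"])  # stop
```

A real system would sit a perception stage (gesture recognition, speech, RF) in front of this table; the point of the sketch is only that the command channel overrides, rather than replaces, the planned mission.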
  • Publication number: 20220326717
    Abstract: Autonomous vehicles are capable of executing missions that abide by on-street rules or regulations, while also being able to seamlessly transition to and from “zones,” including off-street zones, with their own set(s) of rules or regulations. An on-board memory stores roadgraph information. An on-board computer is operative to execute commanded driving missions using the roadgraph information, including missions with one or more zones, each zone being defined by a sub-roadgraph with its own set of zone-specific driving rules and parameters. A mission may be coordinated with one or more payload operations, including zones with “free drive paths” as in a warehouse facility with loading and unloading zones to pick up payloads and place them down, or zone staging or entry points to one or more points of payload acquisition or placement. The vehicle may be a warehousing vehicle such as a forklift.
    Type: Application
    Filed: June 26, 2022
    Publication date: October 13, 2022
    Applicant: Cybernet Systems Corp.
    Inventors: Glenn Beach, Douglas Haanpaa, Charles J. Jacobus, Steven Rowe
  • Patent number: 11372425
    Abstract: Autonomous vehicles are capable of executing missions that abide by on-street rules or regulations, while also being able to seamlessly transition to and from “zones,” including off-street zones, with their own set(s) of rules or regulations. An on-board memory stores roadgraph information. An on-board computer is operative to execute commanded driving missions using the roadgraph information, including missions with one or more zones, each zone being defined by a sub-roadgraph with its own set of zone-specific driving rules and parameters. A mission may be coordinated with one or more payload operations, including zones with “free drive paths” as in a warehouse facility with loading and unloading zones to pick up payloads and place them down, or zone staging or entry points to one or more points of payload acquisition or placement. The vehicle may be a warehousing vehicle such as a forklift.
    Type: Grant
    Filed: October 10, 2019
    Date of Patent: June 28, 2022
    Assignee: Cybernet Systems Corp.
    Inventors: Glenn Beach, Douglas Haanpaa, Charles J. Jacobus, Steven Rowe
  • Publication number: 20200057452
    Abstract: Autonomous vehicles are capable of executing missions that abide by on-street rules or regulations, while also being able to seamlessly transition to and from “zones,” including off-street zones, with their own set(s) of rules or regulations. An on-board memory stores roadgraph information. An on-board computer is operative to execute commanded driving missions using the roadgraph information, including missions with one or more zones, each zone being defined by a sub-roadgraph with its own set of zone-specific driving rules and parameters. A mission may be coordinated with one or more payload operations, including zones with “free drive paths” as in a warehouse facility with loading and unloading zones to pick up payloads and place them down, or zone staging or entry points to one or more points of payload acquisition or placement. The vehicle may be a warehousing vehicle such as a forklift.
    Type: Application
    Filed: October 10, 2019
    Publication date: February 20, 2020
    Applicant: Cybernet Systems Corp.
    Inventors: Glenn Beach, Douglas Haanpaa, Charles J. Jacobus, Steven Rowe
  • Patent number: 10459453
    Abstract: Autonomous vehicles are capable of executing missions that abide by on-street rules or regulations, while also being able to seamlessly transition to and from “zones,” including off-street zones, with their own set(s) of rules or regulations. An on-board memory stores roadgraph information. An on-board computer is operative to execute commanded driving missions using the roadgraph information, including missions with one or more zones, each zone being defined by a sub-roadgraph with its own set of zone-specific driving rules and parameters. A mission may be coordinated with one or more payload operations, including zones with “free drive paths” as in a warehouse facility with loading and unloading zones to pick up payloads and place them down, or zone staging or entry points to one or more points of payload acquisition or placement. The vehicle may be a warehousing vehicle such as a forklift.
    Type: Grant
    Filed: November 8, 2017
    Date of Patent: October 29, 2019
    Inventors: Glenn Beach, Douglas Haanpaa, Charles J. Jacobus, Steven Rowe
  • Publication number: 20180129220
    Abstract: Autonomous vehicles are capable of executing missions that abide by on-street rules or regulations, while also being able to seamlessly transition to and from “zones,” including off-street zones, with their own set(s) of rules or regulations. An on-board memory stores roadgraph information. An on-board computer is operative to execute commanded driving missions using the roadgraph information, including missions with one or more zones, each zone being defined by a sub-roadgraph with its own set of zone-specific driving rules and parameters. A mission may be coordinated with one or more payload operations, including zones with “free drive paths” as in a warehouse facility with loading and unloading zones to pick up payloads and place them down, or zone staging or entry points to one or more points of payload acquisition or placement. The vehicle may be a warehousing vehicle such as a forklift.
    Type: Application
    Filed: November 8, 2017
    Publication date: May 10, 2018
    Inventors: Glenn Beach, Douglas Haanpaa, Charles J. Jacobus, Steven Rowe
  • Patent number: 8407625
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Grant
    Filed: April 25, 2006
    Date of Patent: March 26, 2013
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
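The abstract above defines a behavior as a combination of gestures recognized simultaneously on different body parts (e.g. leg gestures distinguish running from walking). A toy sketch of that combination step, with all gesture and behavior labels invented for illustration:

```python
# Hypothetical lookup from simultaneously recognized per-body-part gestures
# to a behavior label; the labels are assumptions, not the patent's terms.
BEHAVIORS = {
    frozenset({("legs", "fast_oscillation"), ("arms", "pumping")}): "running",
    frozenset({("legs", "slow_oscillation"), ("arms", "swinging")}): "walking",
}

def classify_behavior(part_gestures: dict) -> str:
    """Map the set of (body part, gesture) pairs to a known behavior."""
    key = frozenset(part_gestures.items())
    return BEHAVIORS.get(key, "unknown")

print(classify_behavior({"legs": "fast_oscillation", "arms": "pumping"}))
```

The same idea extends upward: treating each tracked body's behavior as an element of a larger set would let a group-level lookup recognize formations, as the abstract suggests for multiple bodies.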
  • Publication number: 20090274339
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Application
    Filed: April 25, 2006
    Publication date: November 5, 2009
    Inventors: Charles Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles Jacobus, Jay Obermark, George Paul
  • Publication number: 20090074248
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Application
    Filed: December 2, 2008
    Publication date: March 19, 2009
    Applicant: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
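The abstract above describes motion gestures modeled as linear-in-parameters dynamic systems, fitted by linear least squares, with recognition done by a bank of predictor bins: the bin whose parameters best predict the observed motion wins. A rough sketch of that idea, using a simple harmonic model x'' = a·x as the assumed linear-in-parameters form (the patent's actual model family is richer):

```python
import numpy as np

def make_oscillation(freq, n=200, dt=0.02):
    """Synthetic 1-D feature-position trace: a sinusoidal motion gesture."""
    t = np.arange(n) * dt
    return np.sin(2 * np.pi * freq * t)

def fit_bin(signal, dt=0.02):
    """Least-squares fit of x'' = a*x; linear in the single parameter a."""
    accel = np.diff(signal, 2) / dt**2       # finite-difference acceleration
    x = signal[1:-1]                          # positions aligned with accel
    a, *_ = np.linalg.lstsq(x[:, None], accel, rcond=None)
    return float(a[0])

def bin_error(a, signal, dt=0.02):
    """Accumulated prediction error of one bin's model on an observed motion."""
    accel = np.diff(signal, 2) / dt**2
    return float(np.sum((accel - a * signal[1:-1]) ** 2))

# Seed two predictor bins from example gestures (slow vs. fast oscillation).
bins = {name: fit_bin(make_oscillation(f))
        for name, f in [("slow_wave", 1.0), ("fast_wave", 3.0)]}

observed = make_oscillation(3.0)  # incoming motion to recognize
best = min(bins, key=lambda name: bin_error(bins[name], observed))
print(best)  # fast_wave
```

Because the model is linear in its parameters, the fit is a single `lstsq` call and the per-frame prediction error is cheap, which matches the abstract's claim of real-time recognition with little memory and processing time.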
  • Patent number: 7460690
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Grant
    Filed: September 14, 2005
    Date of Patent: December 2, 2008
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
  • Publication number: 20070195997
    Abstract: A system and method tracks the movements of a driver or passenger in a vehicle (ground, water, air, or other) and controls devices in accordance with position, motion, and/or body or hand gestures or movements. According to one embodiment, an operator or passenger uses the invention to control comfort or entertainment features such as the heater, air conditioner, lights, mirror positions, or the radio/CD player using hand gestures. An alternative embodiment facilitates the automatic adjustment of car seating restraints based on head position. Yet another embodiment is used to determine when to fire an airbag (and at what velocity or orientation) based on the position of a person in a vehicle seat. The invention may also be used to control systems outside of the vehicle.
    Type: Application
    Filed: May 23, 2006
    Publication date: August 23, 2007
    Inventors: George Paul, Glenn Beach, Charles Cohen, Charles Jacobus
  • Publication number: 20070066393
    Abstract: A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
    Type: Application
    Filed: October 17, 2006
    Publication date: March 22, 2007
    Applicant: Cybernet Systems Corporation
    Inventors: George Paul, Glenn Beach, Charles Cohen, Charles Jacobus
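The abstract above combines three measures of the target's probable location (color, shape, motion) using a weighting technique to pick the most probable location. A minimal sketch of that combination step, with the candidate scores and weights as illustrative assumptions:

```python
def combine_measures(candidates, weights):
    """Pick the candidate location with the highest weighted score.

    candidates: list of (location, {"color": s, "shape": s, "motion": s})
    weights:    {"color": w, "shape": w, "motion": w}
    """
    def score(entry):
        _, measures = entry
        return sum(weights[k] * measures[k] for k in weights)
    return max(candidates, key=score)[0]

# Two candidate target locations with per-measure match scores (illustrative).
candidates = [
    ((120, 80), {"color": 0.9, "shape": 0.7, "motion": 0.4}),
    ((300, 60), {"color": 0.3, "shape": 0.8, "motion": 0.9}),
]
best = combine_measures(candidates, {"color": 0.5, "shape": 0.3, "motion": 0.2})
print(best)  # (120, 80)
```

Weighting color most heavily, as here, reflects the abstract's emphasis on color matching as the primary cue, with shape and motion refining the estimate.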
  • Publication number: 20060210112
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Application
    Filed: April 27, 2006
    Publication date: September 21, 2006
    Inventors: Charles Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles Jacobus, Jay Obermark, George Paul
  • Patent number: 7036094
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Whereas in the previous patent only one gesture was recognized at a time, in this system multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Grant
    Filed: March 31, 2000
    Date of Patent: April 25, 2006
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
  • Publication number: 20060013440
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Application
    Filed: September 14, 2005
    Publication date: January 19, 2006
    Inventors: Charles Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles Jacobus, Jay Obermark, George Paul
  • Publication number: 20050226489
    Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
    Type: Application
    Filed: March 4, 2005
    Publication date: October 13, 2005
    Inventors: Glenn Beach, Gary Moody, James Burkowski, Charles Jacobus
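The abstract above pairs composable vision-based recognition modules with a decision algorithm that makes the final call on object type and quality. A toy sketch of that composition, where the module logic, feature names, and thresholds are all invented for illustration:

```python
# Two hypothetical recognition modules; each independently votes on a type.
def length_module(obj):
    return "type_A" if obj["length_mm"] > 50 else "type_B"

def profile_module(obj):
    return "type_A" if obj["diameter_mm"] > 10 else "type_B"

def decide(obj, modules):
    """Decision step: majority vote across the composed modules."""
    votes = [m(obj) for m in modules]
    return max(set(votes), key=votes.count)

obj = {"length_mm": 62, "diameter_mm": 12}
print(decide(obj, [length_module, profile_module]))  # type_A
```

Because `decide` only sees a list of callables, new recognition modules can be composed in without touching the decision step, which is the sense in which such a system could be retargeted from projectile identification to other inspection tasks.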
  • Patent number: 6950534
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Grant
    Filed: January 16, 2004
    Date of Patent: September 27, 2005
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
  • Publication number: 20040161132
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Application
    Filed: January 16, 2004
    Publication date: August 19, 2004
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
  • Patent number: 6681031
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Grant
    Filed: August 10, 1999
    Date of Patent: January 20, 2004
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul