Patents by Inventor Glenn Beach

Glenn Beach has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20200342748
    Abstract: A monitoring system that is configured to monitor a property is disclosed. The monitoring system includes a passive infrared (PIR) sensor configured to generate reference PIR data that represents motion within an area of the property; an auxiliary sensor configured to generate auxiliary sensor data that represents an attribute of the area of the property; and a motion sensor device. The motion sensor device is configured to: obtain the reference PIR data; determine that a first set of motion detection criteria is satisfied by the reference PIR data; in response to determining that the first set of motion detection criteria is satisfied by the reference PIR data, obtain the auxiliary sensor data; obtain a second set of motion detection criteria based on the reference PIR data and the auxiliary sensor data; and determine whether the second set of motion detection criteria is satisfied by additional PIR data.
    Type: Application
    Filed: April 28, 2020
    Publication date: October 29, 2020
    Inventors: Glenn Tournier, Alexander Lawrence Reeder, Donald Gerard Madden, Allison Beach, David James Hutz
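The two-stage flow in this abstract (a coarse check on reference PIR data, then a refined check whose criteria depend on an auxiliary sensor reading) can be sketched as follows. This is an illustrative assumption, not the patented implementation: the thresholds, the ambient-temperature auxiliary attribute, and all function names are invented for the sketch.

```python
def first_criteria_met(pir_sample: float, threshold: float = 0.5) -> bool:
    # Stage 1: coarse motion-detection check against the reference PIR data.
    return abs(pir_sample) > threshold

def derive_second_criteria(reference_pir: float, ambient_temp_c: float) -> float:
    # Stage 2: derive a second threshold from the reference PIR data and the
    # auxiliary sensor data (here, an assumed ambient-temperature attribute).
    # Warmer rooms reduce PIR contrast, so the required threshold is lowered.
    base = 0.8
    return base - 0.01 * max(0.0, ambient_temp_c - 20.0)

def motion_detected(reference_pir: float, ambient_temp_c: float,
                    additional_pir: float) -> bool:
    # Only consult the auxiliary sensor once the first criteria are satisfied.
    if not first_criteria_met(reference_pir):
        return False
    threshold = derive_second_criteria(reference_pir, ambient_temp_c)
    return abs(additional_pir) > threshold
```

The design point is that the (presumably costlier) auxiliary sensor is read only after the cheap PIR check fires, and the second criteria adapt to conditions in the area.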
  • Publication number: 20200057452
    Abstract: Autonomous vehicles are capable of executing missions that abide by on-street rules or regulations, while also being able to seamlessly transition to and from “zones,” including off-street zones, with their own set(s) of rules or regulations. An on-board memory stores roadgraph information. An on-board computer is operative to execute commanded driving missions using the roadgraph information, including missions with one or more zones, each zone being defined by a sub-roadgraph with its own set of zone-specific driving rules and parameters. A mission may be coordinated with one or more payload operations, including zones with “free drive paths” as in a warehouse facility with loading and unloading zones to pick up payloads and place them down, or zone staging or entry points to one or more points of payload acquisition or placement. The vehicle may be a warehousing vehicle such as a forklift.
    Type: Application
    Filed: October 10, 2019
    Publication date: February 20, 2020
    Applicant: Cybernet Systems Corp.
    Inventors: Glenn Beach, Douglas Haanpaa, Charles J. Jacobus, Steven Rowe
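A minimal sketch of the data model this abstract suggests: a top-level roadgraph for on-street driving, plus named zones, each defined by a sub-roadgraph carrying its own zone-specific rules that swap in when the vehicle transitions into the zone. All field names, the speed values, and the `free_drive` flag are assumptions made for illustration.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Rules:
    max_speed_mps: float
    free_drive: bool = False  # True inside assumed "free drive path" zones

@dataclass
class Zone:
    name: str
    sub_roadgraph: dict       # node -> list of neighbour nodes within the zone
    rules: Rules

@dataclass
class Roadgraph:
    edges: dict               # on-street connectivity
    street_rules: Rules = field(default_factory=lambda: Rules(13.4))
    zones: dict = field(default_factory=dict)

    def rules_at(self, zone_name: Optional[str]) -> Rules:
        # Transitioning into a zone swaps in that zone's rule set.
        return self.zones[zone_name].rules if zone_name else self.street_rules

rg = Roadgraph(edges={"A": ["B"]})
rg.zones["warehouse"] = Zone("warehouse", {"dock1": ["dock2"]},
                             Rules(2.0, free_drive=True))
```

Under this sketch, `rg.rules_at(None)` yields the on-street rules, while `rg.rules_at("warehouse")` yields the slower, free-drive warehouse rules.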
  • Patent number: 10459453
    Abstract: Autonomous vehicles are capable of executing missions that abide by on-street rules or regulations, while also being able to seamlessly transition to and from “zones,” including off-street zones, with their own set(s) of rules or regulations. An on-board memory stores roadgraph information. An on-board computer is operative to execute commanded driving missions using the roadgraph information, including missions with one or more zones, each zone being defined by a sub-roadgraph with its own set of zone-specific driving rules and parameters. A mission may be coordinated with one or more payload operations, including zones with “free drive paths” as in a warehouse facility with loading and unloading zones to pick up payloads and place them down, or zone staging or entry points to one or more points of payload acquisition or placement. The vehicle may be a warehousing vehicle such as a forklift.
    Type: Grant
    Filed: November 8, 2017
    Date of Patent: October 29, 2019
    Inventors: Glenn Beach, Douglas Haanpaa, Charles J. Jacobus, Steven Rowe
  • Publication number: 20180129220
    Abstract: Autonomous vehicles are capable of executing missions that abide by on-street rules or regulations, while also being able to seamlessly transition to and from “zones,” including off-street zones, with their own set(s) of rules or regulations. An on-board memory stores roadgraph information. An on-board computer is operative to execute commanded driving missions using the roadgraph information, including missions with one or more zones, each zone being defined by a sub-roadgraph with its own set of zone-specific driving rules and parameters. A mission may be coordinated with one or more payload operations, including zones with “free drive paths” as in a warehouse facility with loading and unloading zones to pick up payloads and place them down, or zone staging or entry points to one or more points of payload acquisition or placement. The vehicle may be a warehousing vehicle such as a forklift.
    Type: Application
    Filed: November 8, 2017
    Publication date: May 10, 2018
    Inventors: Glenn Beach, Douglas Haanpaa, Charles J. Jacobus, Steven Rowe
  • Patent number: 8407625
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Grant
    Filed: April 25, 2006
    Date of Patent: March 26, 2013
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
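The abstract's core idea, that a behavior is a combination of gestures recognized simultaneously on different body parts, can be illustrated with a toy lookup. This is an assumption-laden sketch, not the patented method: the behavior table, part names, and gesture labels are all invented for illustration.

```python
# Each behavior is defined as a combination of per-body-part gestures.
BEHAVIORS = {
    "run":  {"legs": "fast_stride", "arms": "pump"},
    "walk": {"legs": "slow_stride", "arms": "swing"},
}

def classify_behavior(observed):
    # observed: body part -> gesture label recognized on that part.
    # A behavior matches when every gesture in its pattern is present.
    for name, pattern in BEHAVIORS.items():
        if all(observed.get(part) == g for part, g in pattern.items()):
            return name
    return None
```

The same scheme extends upward: if multiple bodies are tracked, their individual behaviors become the "gestures" of a formation-level table.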
  • Publication number: 20090274339
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Application
    Filed: April 25, 2006
    Publication date: November 5, 2009
    Inventors: Charles Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles Jacobus, Jay Obermark, George Paul
  • Publication number: 20090074248
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Application
    Filed: December 2, 2008
    Publication date: March 19, 2009
    Applicant: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
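The predictor-bin scheme this abstract describes, fitting a linear-in-parameters dynamic model to an observed feature trajectory by least squares and selecting the pre-seeded bin it best matches, can be sketched as below. This is a simplified stand-in, not the patented system: the oscillator model `x'' = a·x + b·x'`, the bin contents, and the nearest-parameter matching rule are assumptions for illustration.

```python
import numpy as np

def fit_parameters(x: np.ndarray, dt: float) -> np.ndarray:
    # Fit the linear-in-parameters oscillator x'' = a*x + b*x'
    # to the observed trajectory by linear least squares.
    v = np.gradient(x, dt)
    acc = np.gradient(v, dt)
    A = np.column_stack([x, v])
    params, *_ = np.linalg.lstsq(A, acc, rcond=None)
    return params

def best_bin(x: np.ndarray, dt: float, bins: dict) -> str:
    # Compare fitted parameters against each seeded predictor bin and
    # return the gesture whose parameters fit the observed motion best.
    p = fit_parameters(x, dt)
    return min(bins, key=lambda name: np.linalg.norm(p - bins[name]))

t = np.arange(0, 4, 0.01)
slow = np.sin(2 * np.pi * 0.5 * t)  # an observed ~0.5 Hz oscillatory gesture
bins = {"slow_wave": np.array([-(2 * np.pi * 0.5) ** 2, 0.0]),
        "fast_wave": np.array([-(2 * np.pi * 2.0) ** 2, 0.0])}
```

For a pure sinusoid of angular frequency ω, the true parameters are a = −ω², b = 0, so `best_bin(slow, 0.01, bins)` selects `"slow_wave"`; the least-squares fit keeps memory and computation small, matching the real-time claim.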
  • Patent number: 7460690
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Grant
    Filed: September 14, 2005
    Date of Patent: December 2, 2008
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
  • Publication number: 20070195997
    Abstract: A system and method tracks the movements of a driver or passenger in a vehicle (ground, water, air, or other) and controls devices in accordance with position, motion, and/or body or hand gestures or movements. According to one embodiment, an operator or passenger uses the invention to control comfort or entertainment features such as the heater, air conditioner, lights, mirror positions or the radio/CD player using hand gestures. An alternative embodiment facilitates the automatic adjustment of car seating restraints based on head position. Yet another embodiment is used to determine when to fire an airbag (and at what velocity or orientation) based on the position of a person in a vehicle seat. The invention may also be used to control systems outside of the vehicle.
    Type: Application
    Filed: May 23, 2006
    Publication date: August 23, 2007
    Inventors: George Paul, Glenn Beach, Charles Cohen, Charles Jacobus
  • Publication number: 20070066393
    Abstract: A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
    Type: Application
    Filed: October 17, 2006
    Publication date: March 22, 2007
    Applicant: Cybernet Systems Corporation
    Inventors: George Paul, Glenn Beach, Charles Cohen, Charles Jacobus
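The fusion step this abstract describes, blending color, shape, and motion measures of the target's probable location with a weighting technique and taking the most probable point, can be sketched as follows. The weights, the tiny likelihood grids, and the function name are illustrative assumptions, not the patented values.

```python
import numpy as np

def most_probable_location(color: np.ndarray, shape: np.ndarray,
                           motion: np.ndarray,
                           weights=(0.5, 0.3, 0.2)) -> tuple:
    # Weighted combination of the three per-pixel likelihood maps;
    # the target's most probable location is the combined map's argmax.
    wc, ws, wm = weights
    combined = wc * color + ws * shape + wm * motion
    return np.unravel_index(np.argmax(combined), combined.shape)

# Toy 4x4 likelihood maps: color and shape agree on (2, 3); motion disagrees.
color = np.zeros((4, 4)); color[2, 3] = 1.0
shape = np.zeros((4, 4)); shape[2, 3] = 0.8
motion = np.zeros((4, 4)); motion[1, 1] = 0.6
loc = most_probable_location(color, shape, motion)
```

Because two of the three weighted measures agree, the fused estimate lands at (2, 3), which illustrates why combining cues makes the tracker robust to any single cue failing.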
  • Publication number: 20060210112
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Application
    Filed: April 27, 2006
    Publication date: September 21, 2006
    Inventors: Charles Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles Jacobus, Jay Obermark, George Paul
  • Patent number: 7036094
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Whereas in the previous patent only one gesture was recognized at a time, in this system multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Grant
    Filed: March 31, 2000
    Date of Patent: April 25, 2006
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
  • Publication number: 20060013440
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Application
    Filed: September 14, 2005
    Publication date: January 19, 2006
    Inventors: Charles Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles Jacobus, Jay Obermark, George Paul
  • Publication number: 20050226489
    Abstract: A machine vision system for automatically identifying and inspecting objects is disclosed, including composable vision-based recognition modules and a decision algorithm to perform the final determination on object type and quality. This vision system has been used to develop a Projectile Identification System and an Automated Tactical Ammunition Classification System. The technology can be used to create numerous other inspection and automated identification systems.
    Type: Application
    Filed: March 4, 2005
    Publication date: October 13, 2005
    Inventors: Glenn Beach, Gary Moody, James Burkowski, Charles Jacobus
  • Patent number: 6950534
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Grant
    Filed: January 16, 2004
    Date of Patent: September 27, 2005
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
  • Publication number: 20040161132
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Application
    Filed: January 16, 2004
    Publication date: August 19, 2004
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
  • Patent number: 6681031
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Grant
    Filed: August 10, 1999
    Date of Patent: January 20, 2004
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
  • Publication number: 20030138130
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Application
    Filed: August 10, 1999
    Publication date: July 24, 2003
    Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
  • Patent number: 5509756
    Abstract: An end connector for use with oil containment booms, wherein the connector design utilizes a stress distribution technique which avoids the need for proximal end flanges, internal cavity traction surfaces, and further avoids bolts, rivets, or other fasteners through the boom fin end. The end connector design has a smaller size than conventional end connector designs with comparable tensile strength capabilities. Further, the end connector may be fabricated faster than conventional end connectors, reduces connector cost, and reduces final boom assembly time.
    Type: Grant
    Filed: March 4, 1994
    Date of Patent: April 23, 1996
    Assignee: TCOM, L.P.
    Inventors: Chun C. Chou, Joe Falk, Robert Ferguson, Glenn Beach