Patents by Inventor George V. Paul

George V. Paul has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents that have been granted by the United States Patent and Trademark Office (USPTO). Brief, illustrative code sketches of several techniques that recur in these filings follow the listing.

  • Patent number: 9592609
    Abstract: An intelligent mobile robot has a robot base controller and an onboard navigation system. In response to receiving a job assignment that specifies a job location associated with one or more job operations, the robot activates the onboard navigation system to automatically determine a path to drive to the job location, automatically determines that an initially selected path could cause it to run into stationary or non-stationary obstacles in the physical environment, such as people or other mobile robots, automatically determines a new path that avoids those obstacles, and automatically drives to the job location along the new path, thereby avoiding contact or collisions. After the mobile robot arrives at the job location, it automatically performs the one or more job operations associated with that location.
    Type: Grant
    Filed: January 25, 2013
    Date of Patent: March 14, 2017
    Assignee: Omron Adept Technologies, Inc.
    Inventors: Matthew LaFary, Matthew Vestal, George V. Paul
  • Publication number: 20160216772
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Application
    Filed: April 5, 2016
    Publication date: July 28, 2016
    Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
  • Patent number: 9304593
    Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
    Type: Grant
    Filed: March 26, 2013
    Date of Patent: April 5, 2016
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
  • Publication number: 20160023100
    Abstract: A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
    Type: Application
    Filed: July 28, 2015
    Publication date: January 28, 2016
    Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
  • Publication number: 20140350725
    Abstract: An intelligent mobile robot has a robot base controller and an onboard navigation system. In response to receiving a job assignment that specifies a job location associated with one or more job operations, the robot activates the onboard navigation system to automatically determine a path to drive to the job location, automatically determines that an initially selected path could cause it to run into stationary or non-stationary obstacles in the physical environment, such as people or other mobile robots, automatically determines a new path that avoids those obstacles, and automatically drives to the job location along the new path, thereby avoiding contact or collisions. After the mobile robot arrives at the job location, it automatically performs the one or more job operations associated with that location.
    Type: Application
    Filed: January 25, 2013
    Publication date: November 27, 2014
    Applicant: ADEPT TECHNOLOGY, INC.
    Inventors: Matthew LaFary, Matthew Vestal, George V. Paul
  • Patent number: 7684592
    Abstract: A real-time computer vision system tracks one or more objects moving in a scene using a target location technique which does not involve searching. The imaging hardware includes a color camera, frame grabber and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
    Type: Grant
    Filed: January 14, 2008
    Date of Patent: March 23, 2010
    Assignee: Cybernet Systems Corporation
    Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
  • Patent number: 7668340
    Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
    Type: Grant
    Filed: December 2, 2008
    Date of Patent: February 23, 2010
    Assignee: Cybernet Systems Corporation
    Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
  • Publication number: 20090116692
    Abstract: A real-time computer vision system tracks one or more objects moving in a scene using a target location technique which does not involve searching. The imaging hardware includes a color camera, frame grabber and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
    Type: Application
    Filed: January 14, 2008
    Publication date: May 7, 2009
    Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
  • Patent number: 7121946
    Abstract: A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
    Type: Grant
    Filed: June 29, 2001
    Date of Patent: October 17, 2006
    Assignee: Cybernet Systems Corporation
    Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
  • Patent number: 7050606
    Abstract: A system and method tracks the movements of a driver or passenger in a vehicle (ground, water, air, or other) and controls devices in accordance with position, motion, and/or body or hand gestures or movements. According to one embodiment, an operator or passenger uses the invention to control comfort or entertainment features such as the heater, air conditioner, lights, mirror positions or the radio/CD player using hand gestures. An alternative embodiment facilitates the automatic adjustment of car seating restraints based on head position. Yet another embodiment is used to determine when to fire an airbag (and at what velocity or orientation) based on the position of a person in a vehicle seat. The invention may also be used to control systems outside of the vehicle.
    Type: Grant
    Filed: November 1, 2001
    Date of Patent: May 23, 2006
    Assignee: Cybernet Systems Corporation
    Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
  • Publication number: 20020126876
    Abstract: A system and method tracks the movements of a driver or passenger in a vehicle (ground, water, air, or other) and controls devices in accordance with position, motion, and/or body or hand gestures or movements. According to one embodiment, an operator or passenger uses the invention to control comfort or entertainment features such as the heater, air conditioner, lights, mirror positions or the radio/CD player using hand gestures. An alternative embodiment facilitates the automatic adjustment of car seating restraints based on head position. Yet another embodiment is used to determine when to fire an airbag (and at what velocity or orientation) based on the position of a person in a vehicle seat. The invention may also be used to control systems outside of the vehicle.
    Type: Application
    Filed: November 1, 2001
    Publication date: September 12, 2002
    Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
  • Publication number: 20020037770
    Abstract: A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
    Type: Application
    Filed: June 29, 2001
    Publication date: March 28, 2002
    Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
  • Publication number: 20010008561
    Abstract: A real-time computer vision system tracks one or more objects moving in a scene using a target location technique which does not involve searching. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
    Type: Application
    Filed: March 2, 2001
    Publication date: July 19, 2001
    Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
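The tracking patents and publications above (7684592, 7121946, 20090116692, 20160023100, 20020037770, 20010008561) all describe combining color, shape, and motion measures of the target's probable location with a weighting technique. The sketch below illustrates only that fusion step, under assumptions not taken from the patents: NumPy arrays stand in for the frame grabber output, the color-matching rule is a simple per-pixel threshold, and the weights are arbitrary.

```python
# Minimal sketch of a weighted color/shape/motion fusion step.
# NumPy usage, the thresholded color match, and the fixed weights are
# illustrative assumptions, not the patented implementation.
import numpy as np

def color_match_score(frame, target_color, tol=60):
    """Per-pixel score: 1.0 where the pixel is within `tol` of the target color."""
    diff = np.abs(frame.astype(int) - np.asarray(target_color)).sum(axis=2)
    return (diff < tol).astype(float)

def most_probable_location(color_map, shape_map, motion_map,
                           weights=(0.5, 0.25, 0.25)):
    """Fuse the three probability maps and return the peak location (row, col)."""
    fused = (weights[0] * color_map +
             weights[1] * shape_map +
             weights[2] * motion_map)
    r, c = np.unravel_index(np.argmax(fused), fused.shape)
    return int(r), int(c)

# Toy example: a 100x100 RGB frame with a red patch, and flat shape/motion maps.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:50, 60:70] = (250, 10, 10)
color_map = color_match_score(frame, target_color=(255, 0, 0))
shape_map = np.ones_like(color_map)
motion_map = np.ones_like(color_map)
print(most_probable_location(color_map, shape_map, motion_map))  # near (40, 60)
```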
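Patent 7668340 describes motion gestures modeled as linear-in-parameters dynamic systems, fit by linear least squares and recognized with a bank of predictor bins. The following rough sketch uses a simplified oscillator form (x'' = a·x + b) and invented gesture names; it is meant only to show the fit-then-pick-the-best-bin idea, not the patented model.

```python
# Hedged sketch of predictor-bin gesture recognition with a least-squares fit.
# The oscillator form and gesture names below are simplifying assumptions.
import numpy as np

def fit_gesture_params(x, dt):
    """Least-squares fit of x'' = a*x + b from a 1-D feature trajectory."""
    acc = np.diff(x, 2) / dt**2                      # finite-difference acceleration
    A = np.column_stack([x[1:-1], np.ones(len(x) - 2)])
    params, *_ = np.linalg.lstsq(A, acc, rcond=None)
    return params                                    # (a, b)

def prediction_error(x, dt, params):
    """Residual of an observed trajectory against one predictor bin."""
    acc = np.diff(x, 2) / dt**2
    A = np.column_stack([x[1:-1], np.ones(len(x) - 2)])
    return np.mean((A @ params - acc) ** 2)

def recognize(x, dt, bins):
    """Return the gesture whose bin parameters predict the motion best."""
    return min(bins, key=lambda name: prediction_error(x, dt, bins[name]))

# Seed two bins from example oscillations, then classify a new observation.
dt = 0.05
t = np.arange(0, 2, dt)
bins = {"slow_wave": fit_gesture_params(np.sin(2 * np.pi * 1.0 * t), dt),
        "fast_wave": fit_gesture_params(np.sin(2 * np.pi * 3.0 * t), dt)}
observed = np.sin(2 * np.pi * 2.9 * t + 0.3)
print(recognize(observed, dt, bins))                 # -> fast_wave
```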
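Patent 9304593 and publication 20160216772 define behaviors as combinations of gestures recognized simultaneously on different body parts (for example, leg gestures that differ between walking and running). The sketch below shows that combination step only; the gesture labels, body-part names, and behavior table are placeholders invented for illustration.

```python
# Rough sketch: behaviors as combinations of per-body-part gestures.
# The labels and the behavior table are assumed, not from the patent.

# behavior -> required gesture per tracked body part
BEHAVIORS = {
    "walking": {"left_leg": "swing_slow", "right_leg": "swing_slow"},
    "running": {"left_leg": "swing_fast", "right_leg": "swing_fast"},
    "waving":  {"right_arm": "oscillate"},
}

def classify_behaviors(part_gestures):
    """Return every behavior whose required per-part gestures are all present."""
    matches = []
    for behavior, required in BEHAVIORS.items():
        if all(part_gestures.get(part) == gesture
               for part, gesture in required.items()):
            matches.append(behavior)
    return matches

# Example: gestures recognized simultaneously on one tracked body.
print(classify_behaviors({"left_leg": "swing_fast",
                          "right_leg": "swing_fast",
                          "right_arm": "oscillate"}))
# -> ['running', 'waving']
```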
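Patent 7050606 and publication 20020126876 map tracked occupant position and hand gestures to vehicle device controls, including airbag deployment decisions. The mapping below is a loose sketch of that idea; the gesture names, distance thresholds, and command strings are assumptions made up for illustration, not values from the patent.

```python
# Loose sketch: tracked occupant state -> device commands.
# Thresholds, gesture names, and commands are illustrative assumptions.

def airbag_command(head_distance_to_dash_m):
    """Pick an airbag deployment mode from the tracked head position."""
    if head_distance_to_dash_m < 0.25:
        return "suppress"            # occupant too close to deploy safely
    if head_distance_to_dash_m < 0.45:
        return "deploy_low_power"
    return "deploy_full_power"

def comfort_command(hand_gesture):
    """Map a recognized hand gesture to a comfort/entertainment action."""
    return {
        "swipe_up": "radio_volume_up",
        "swipe_down": "radio_volume_down",
        "circle": "toggle_air_conditioner",
    }.get(hand_gesture, "no_action")

print(airbag_command(0.30))       # -> deploy_low_power
print(comfort_command("circle"))  # -> toggle_air_conditioner
```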
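Patent 9592609 and publication 20140350725 describe a mobile robot that plans a path to an assigned job location, re-plans when the path would hit stationary or non-stationary obstacles, and performs the job operations on arrival. The sketch below shows that plan/re-plan/execute loop under stated assumptions: a 4-connected occupancy grid, a breadth-first planner, and caller-supplied sensing and job callbacks, none of which come from the patent.

```python
# Minimal sketch of a job-driven navigate-and-re-plan loop.
# Grid map, BFS planner, and obstacle model are illustrative assumptions.
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 4-connected occupancy grid (0 = free)."""
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in parents:
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no collision-free path found

def drive_to_job(grid, start, job_location, sense_obstacles, do_job_operations):
    """Plan, re-plan around newly sensed obstacles, then run the job operations."""
    pose = start
    path = plan_path(grid, pose, job_location)
    while path:
        blocked = sense_obstacles() & set(path)       # people, other robots, ...
        if blocked:
            for r, c in blocked:
                grid[r][c] = 1                        # mark as occupied
            path = plan_path(grid, pose, job_location)  # new path avoiding them
            continue
        pose = path[1] if len(path) > 1 else path[0]  # take one step along path
        path = path[1:]
        if pose == job_location:
            do_job_operations(job_location)
            return True
    return False  # job location unreachable

# Toy run: an obstacle appears on the initial path and the robot detours.
grid = [[0] * 5 for _ in range(2)]
drive_to_job(grid, (0, 0), (0, 4),
             sense_obstacles=lambda: {(0, 2)},        # e.g. a person in the aisle
             do_job_operations=lambda loc: print("performing job at", loc))
```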