Patents by Inventor George V. Paul
George V. Paul has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 9592609
Abstract: An intelligent mobile robot having a robot base controller and an onboard navigation system that, in response to receiving a job assignment specifying a job location that is associated with one or more job operations, activates the onboard navigation system to automatically determine a path the mobile robot should use to drive to the job location, automatically determines that using an initially-selected path could cause the mobile robot to run into stationary or non-stationary obstacles, such as people or other mobile robots, in the physical environment, automatically determines a new path to avoid the stationary and non-stationary obstacles, and automatically drives the mobile robot to the job location using the new path, thereby avoiding contact or collisions with those obstacles. After the mobile robot arrives at the job location, it automatically performs said one or more job operations associated with that job location.
Type: Grant
Filed: January 25, 2013
Date of Patent: March 14, 2017
Assignee: Omron Adept Technologies, Inc.
Inventors: Matthew LaFary, Matthew Vestal, George V. Paul
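The abstract above describes the navigation loop only at a high level (plan a path, detect that the current path is blocked, replan, drive). The patent does not disclose a specific planning algorithm, so the following is a minimal hypothetical sketch using breadth-first search over a 2D occupancy grid; the function names, grid representation, and `detect_obstacle` callback are all illustrative assumptions, not the patented method.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid. Returns a list of
    (row, col) cells from start to goal, or None if no path exists.
    grid[r][c] is True where an obstacle occupies the cell."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:      # walk back through predecessors
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc] and nxt not in prev:
                prev[nxt] = cell
                queue.append(nxt)
    return None

def drive_to_job(grid, start, goal, detect_obstacle):
    """Follow a planned path one cell at a time, replanning whenever a
    newly detected obstacle (e.g. a person or another robot) blocks the
    next cell, as in the abstract's replan-and-drive behavior."""
    pos = start
    path = plan_path(grid, pos, goal)
    while path and pos != goal:
        nxt = path[1]
        if detect_obstacle(nxt):
            grid[nxt[0]][nxt[1]] = True          # mark the cell occupied
            path = plan_path(grid, pos, goal)    # determine a new path
            continue
        pos = nxt
        path = path[1:]
    return pos
```

A real system would use a continuous planner with costmaps rather than a toy grid, but the block-detect-replan structure is the same.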
-
Publication number: 20160216772
Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
Type: Application
Filed: April 5, 2016
Publication date: July 28, 2016
Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
-
Patent number: 9304593
Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of “gestures” identified on various parts of a body in motion. For example, the leg gestures generated when a person runs are different than when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
Type: Grant
Filed: March 26, 2013
Date of Patent: April 5, 2016
Assignee: Cybernet Systems Corporation
Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
-
Publication number: 20160023100
Abstract: A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
Type: Application
Filed: July 28, 2015
Publication date: January 28, 2016
Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
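The tracking abstract above (shared by several of the filings in this list) describes computing three per-cue measures of the target's probable location — color, shape, and motion — and combining them with a weighting technique. A minimal sketch of that fusion step, assuming each cue is already available as a per-pixel likelihood map; the cue weights and function name are illustrative, not values from the patent:

```python
import numpy as np

def fuse_location(color_score, shape_score, motion_score, weights=(0.5, 0.3, 0.2)):
    """Combine three per-pixel likelihood maps (one per cue) into a
    single map by weighted sum, and return the (row, col) of the most
    probable target location."""
    combined = (weights[0] * color_score
                + weights[1] * shape_score
                + weights[2] * motion_score)
    # argmax over the flattened map, converted back to 2-D coordinates
    return np.unravel_index(np.argmax(combined), combined.shape)
```

In a live tracker this would run once per frame, with the previous frame's location seeding the motion cue.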
-
Publication number: 20140350725
Abstract: An intelligent mobile robot having a robot base controller and an onboard navigation system that, in response to receiving a job assignment specifying a job location that is associated with one or more job operations, activates the onboard navigation system to automatically determine a path the mobile robot should use to drive to the job location, automatically determines that using an initially-selected path could cause the mobile robot to run into stationary or non-stationary obstacles, such as people or other mobile robots, in the physical environment, automatically determines a new path to avoid the stationary and non-stationary obstacles, and automatically drives the mobile robot to the job location using the new path, thereby avoiding contact or collisions with those obstacles. After the mobile robot arrives at the job location, it automatically performs said one or more job operations associated with that job location.
Type: Application
Filed: January 25, 2013
Publication date: November 27, 2014
Applicant: ADEPT TECHNOLOGY, INC.
Inventors: Matthew LaFary, Matthew Vestal, George V. Paul
-
Patent number: 7684592
Abstract: A real-time computer vision system tracks one or more objects moving in a scene using a target location technique which does not involve searching. The imaging hardware includes a color camera, frame grabber and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
Type: Grant
Filed: January 14, 2008
Date of Patent: March 23, 2010
Assignee: Cybernet Systems Corporation
Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
-
Patent number: 7668340
Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
Type: Grant
Filed: December 2, 2008
Date of Patent: February 23, 2010
Assignee: Cybernet Systems Corporation
Inventors: Charles J. Cohen, Glenn J. Beach, Brook Cavell, Eugene Foulk, Charles J. Jacobus, Jay Obermark, George V. Paul
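The abstract above names two concrete ingredients: a linear-in-parameters oscillatory motion model fitted by linear least squares, and a bank of predictor bins seeded with per-gesture parameters, where the best-fitting bin identifies the gesture. A minimal sketch of both steps for a 1-D feature trajectory, assuming a simple model of the form x'' = a·x + b; the model form, function names, and bin contents are illustrative assumptions, not the patent's exact formulation:

```python
import numpy as np

def fit_gesture_params(x, dt):
    """Fit the linear-in-parameters model  x'' = a*x + b  to a sampled
    1-D feature trajectory by linear least squares. For a pure
    oscillation x(t) = A*sin(w*t + p), the fitted 'a' is about -w**2."""
    accel = (x[2:] - 2 * x[1:-1] + x[:-2]) / dt**2   # finite-difference x''
    design = np.column_stack([x[1:-1], np.ones(len(x) - 2)])
    params, *_ = np.linalg.lstsq(design, accel, rcond=None)
    return params  # (a, b)

def classify_gesture(x, dt, bins):
    """Pick the predictor bin whose seeded parameters (a, b) best
    predict the observed accelerations, i.e. the smallest residual."""
    accel = (x[2:] - 2 * x[1:-1] + x[:-2]) / dt**2
    best, best_err = None, float("inf")
    for name, (a, b) in bins.items():
        err = np.sum((accel - (a * x[1:-1] + b)) ** 2)
        if err < best_err:
            best, best_err = name, err
    return best
```

Because the model is linear in (a, b), both fitting and bin evaluation are cheap, which matches the abstract's claim of real-time recognition with little memory and processing time.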
-
Publication number: 20090116692
Abstract: A real-time computer vision system tracks one or more objects moving in a scene using a target location technique which does not involve searching. The imaging hardware includes a color camera, frame grabber and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
Type: Application
Filed: January 14, 2008
Publication date: May 7, 2009
Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
-
Patent number: 7121946
Abstract: A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
Type: Grant
Filed: June 29, 2001
Date of Patent: October 17, 2006
Assignee: Cybernet Systems Corporation
Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
-
Patent number: 7050606
Abstract: A system and method tracks the movements of a driver or passenger in a vehicle (ground, water, air, or other) and controls devices in accordance with position, motion, and/or body or hand gestures or movements. According to one embodiment, an operator or passenger uses the invention to control comfort or entertainment features such as the heater, air conditioner, lights, mirror positions or the radio/CD player using hand gestures. An alternative embodiment facilitates the automatic adjustment of car seating restraints based on head position. Yet another embodiment is used to determine when to fire an airbag (and at what velocity or orientation) based on the position of a person in a vehicle seat. The invention may also be used to control systems outside of the vehicle.
Type: Grant
Filed: November 1, 2001
Date of Patent: May 23, 2006
Assignee: Cybernet Systems Corporation
Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
-
Publication number: 20020126876
Abstract: A system and method tracks the movements of a driver or passenger in a vehicle (ground, water, air, or other) and controls devices in accordance with position, motion, and/or body or hand gestures or movements. According to one embodiment, an operator or passenger uses the invention to control comfort or entertainment features such as the heater, air conditioner, lights, mirror positions or the radio/CD player using hand gestures. An alternative embodiment facilitates the automatic adjustment of car seating restraints based on head position. Yet another embodiment is used to determine when to fire an airbag (and at what velocity or orientation) based on the position of a person in a vehicle seat. The invention may also be used to control systems outside of the vehicle.
Type: Application
Filed: November 1, 2001
Publication date: September 12, 2002
Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
-
Publication number: 20020037770
Abstract: A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
Type: Application
Filed: June 29, 2001
Publication date: March 28, 2002
Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
-
Publication number: 20010008561
Abstract: A real-time computer vision system tracks one or more objects moving in a scene using a target location technique which does not involve searching. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
Type: Application
Filed: March 2, 2001
Publication date: July 19, 2001
Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus