Patents by Inventor Paul Alex Dow

Paul Alex Dow has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 8693729
    Abstract: The present invention creates and stores target representations in several coordinate frames based on biologically inspired models of the human vision system. By using biologically inspired target representations, a computer can be programmed for robot control without using kinematics to relate a target position in camera-eye coordinates to a target position in body or head coordinates. The robot's sensors and appendages are open-loop controlled to focus on the target. In addition, the invention teaches a scenario and method for learning the mappings between these coordinate frames using existing machine learning techniques such as Locally Weighted Projection Regression. (A minimal sketch of such a learned mapping follows this entry.)
    Type: Grant
    Filed: August 14, 2012
    Date of Patent: April 8, 2014
    Assignee: HRL Laboratories, LLC
    Inventors: Paul Alex Dow, Deepak Khosla, David J Huber
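    The abstract above describes learning, rather than deriving, the eye-to-head coordinate mapping. Below is a minimal sketch of that idea, using scikit-learn's KNeighborsRegressor as a stand-in for Locally Weighted Projection Regression; the frame definitions, training data, and toy "ground truth" relation are illustrative assumptions, not the patented method.

    ```python
    # Sketch: learn an eye-to-head coordinate mapping from sampled pairs,
    # with no explicit kinematic model of the camera/head geometry.
    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor  # stand-in for LWPR

    rng = np.random.default_rng(0)

    # Hypothetical training data: each sample pairs a target position in
    # camera/eye coordinates (plus current eye pan/tilt) with the same target
    # in head coordinates, collected while the robot looks around.
    eye_coords = rng.uniform(-1.0, 1.0, size=(500, 4))   # (x, y, pan, tilt)
    head_coords = np.column_stack([                      # toy "true" relation
        eye_coords[:, 0] + 0.5 * eye_coords[:, 2],
        eye_coords[:, 1] + 0.5 * eye_coords[:, 3],
    ])

    # A local, nonparametric regressor learns the mapping directly from data.
    mapping = KNeighborsRegressor(n_neighbors=10, weights="distance")
    mapping.fit(eye_coords, head_coords)

    # Query: where is a newly seen target in head coordinates?
    new_target_in_eye = np.array([[0.2, -0.1, 0.3, 0.0]])
    print(mapping.predict(new_target_in_eye))
    ```

    In this toy setup the regressor recovers the hidden relation from examples alone, which is the point of the learned-mapping approach: no camera or neck kinematics appear anywhere in the code.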
  • Patent number: 8396249
    Abstract: A method and apparatus for controlling robots based on prioritized targets extracted from fused visual and auditory saliency maps. The fused map may extend beyond the robot's immediate visual range, allowing the robot to maintain awareness of targets outside that range. The fused saliency map may be derived using a bottom-up or top-down approach and may be feature-based or object-based. (A toy fusion-and-prioritization sketch follows this entry.)
    Type: Grant
    Filed: December 23, 2008
    Date of Patent: March 12, 2013
    Assignee: HRL Laboratories, LLC
    Inventors: Deepak Khosla, Paul Alex Dow, David Huber
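    As a companion to the abstract above, here is a toy sketch of fusing visual and auditory saliency maps on a shared azimuth/elevation grid and extracting prioritized targets. The grid resolution, equal weighting, and normalization are illustrative assumptions rather than the claimed approach.

    ```python
    # Sketch: fuse visual and auditory saliency maps and prioritize targets.
    import numpy as np

    def normalize(m):
        """Scale a saliency map to [0, 1] so the two modalities are comparable."""
        m = m - m.min()
        return m / m.max() if m.max() > 0 else m

    # Hypothetical maps: rows = elevation bins, columns = azimuth bins covering
    # a full 360 degrees, so the auditory map can flag targets the cameras
    # cannot currently see.
    visual = np.zeros((36, 72))
    auditory = np.zeros((36, 72))
    visual[18, 40] = 1.0      # bright object inside the field of view
    auditory[20, 5] = 0.8     # loud source behind the robot

    fused = 0.5 * normalize(visual) + 0.5 * normalize(auditory)

    # Prioritize: rank grid cells by fused saliency and keep the strongest few.
    top = np.argsort(fused, axis=None)[::-1][:3]
    targets = [np.unravel_index(i, fused.shape) for i in top]
    print("prioritized (elevation_bin, azimuth_bin):", targets)
    ```

    Because the auditory map spans the full azimuth range, the off-camera source still appears in the prioritized list, mirroring the "awareness beyond the visual range" point in the abstract.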
  • Patent number: 8396282
    Abstract: The present disclosure describes a fused saliency map built from visual and auditory saliency maps, both expressed in azimuth and elevation coordinates. The auditory saliency map is based on intensity, frequency, and temporal conspicuity maps. Once the auditory saliency map is determined, it is converted into azimuth and elevation coordinates by processing selected snippets of sound from each of four microphones arranged on a robot head to locate the sound source generating the saliencies. (A sketch of microphone-pair sound localization follows this entry.)
    Type: Grant
    Filed: October 31, 2008
    Date of Patent: March 12, 2013
    Assignee: HRL Laboratories, LLC
    Inventors: David J Huber, Deepak Khosla, Paul Alex Dow
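    One building block mentioned in the abstract above is locating the sound source from microphone signals. The sketch below estimates source azimuth from the time difference of arrival (TDOA) between a single pair of microphones via cross-correlation; the sample rate, microphone spacing, and synthetic snippet are assumptions, and the patent itself describes four head-mounted microphones combined with conspicuity-based auditory saliency.

    ```python
    # Sketch: estimate source azimuth from one microphone pair using TDOA.
    import numpy as np

    fs = 16_000             # sample rate in Hz (assumed)
    mic_spacing = 0.15      # distance between the two microphones in m (assumed)
    speed_of_sound = 343.0  # m/s

    # Synthesize a noise burst that reaches the right microphone 5 samples
    # before the left one.
    rng = np.random.default_rng(1)
    snippet = rng.standard_normal(1024)
    delay = 5
    left = np.concatenate([np.zeros(delay), snippet])
    right = np.concatenate([snippet, np.zeros(delay)])

    # Cross-correlate the snippets; the lag of the peak is the TDOA in samples.
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    tdoa = lag / fs

    # Far-field approximation: convert the TDOA into an azimuth angle.
    azimuth = np.degrees(
        np.arcsin(np.clip(speed_of_sound * tdoa / mic_spacing, -1.0, 1.0))
    )
    print(f"estimated azimuth: {azimuth:.1f} degrees")
    ```

    In principle, repeating this across microphone pairs and adding an elevation estimate would place the auditory saliencies on the same azimuth/elevation grid as the visual ones.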
  • Patent number: 8311317
    Abstract: The present invention creates and stores target representations in several coordinate frames based on biologically inspired models of the human vision system. By using biologically inspired target representations, a computer can be programmed for robot control without using kinematics to relate a target position in camera-eye coordinates to a target position in body or head coordinates. The robot's sensors and appendages are open-loop controlled to focus on the target. In addition, the invention teaches a scenario and method for learning the mappings between these coordinate frames using existing machine learning techniques such as Locally Weighted Projection Regression. (A sketch of the open-loop command step follows this entry.)
    Type: Grant
    Filed: August 15, 2008
    Date of Patent: November 13, 2012
    Assignee: HRL Laboratories, LLC
    Inventors: Paul Alex Dow, Deepak Khosla, David J Huber
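    As a companion to the abstract above, the snippet below illustrates the open-loop control aspect: a single gaze command is computed from the learned mapping and issued without a feedback correction loop. The function names, gains, and hand-coded stand-in for the learned regressor are hypothetical.

    ```python
    # Sketch: compute one open-loop gaze command from a learned mapping.
    import numpy as np

    def learned_eye_to_head(target_in_eye, eye_pan, eye_tilt):
        """Hand-coded stand-in for a learned eye-to-head regressor (e.g., LWPR)."""
        x, y = target_in_eye
        return np.array([x + 0.5 * eye_pan, y + 0.5 * eye_tilt])

    def open_loop_gaze_command(target_in_eye, eye_pan, eye_tilt):
        # Map the target into head coordinates, then turn that directly into
        # pan/tilt commands; no visual error is measured after the move.
        head = learned_eye_to_head(target_in_eye, eye_pan, eye_tilt)
        return {"pan": float(head[0]), "tilt": float(head[1])}

    print(open_loop_gaze_command((0.2, -0.1), eye_pan=0.3, eye_tilt=0.0))
    ```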