Patents by Inventor Douglas Haanpaa

Douglas Haanpaa has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20190137284
    Abstract: An in-vehicle system for generating precise, lane-level road map data includes a GPS receiver operative to acquire positional information associated with a track along a road path. An inertial sensor provides time-local measurements of acceleration and turn rate along the track, and a camera acquires image data of the road path along the track. A processor is operative to receive the local measurements from the inertial sensor and image data from the camera over time in conjunction with multiple tracks along the road path, and to improve the accuracy of the GPS receiver through curve fitting. One or all of the GPS receiver, inertial sensor and camera are disposed in a smartphone. The road map data may be uploaded to a central data repository for post processing when the vehicle passes through a WiFi cloud to generate the precise road map data, which may include data collected from multiple drivers.
    Type: Application
    Filed: November 6, 2017
    Publication date: May 9, 2019
    Inventors: Charles J. Jacobus, Glenn J. Beach, Douglas Haanpaa
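
The abstract for publication 20190137284 above describes improving GPS accuracy by curve fitting across multiple traversals of the same road. The patent text here does not spell out the fitting method; the sketch below is only a minimal illustration of the general idea, assuming an ordinary least-squares polynomial fit over several simulated noisy tracks. All function names, the polynomial model, and the test data are illustrative assumptions, not the patented algorithm.

```python
# Minimal sketch: combining several noisy GPS traversals of the same road
# segment with a least-squares polynomial fit, as one simple stand-in for
# the curve-fitting idea in the abstract above.
import numpy as np

def fit_lane_centerline(tracks, degree=3):
    """Fit a single smooth curve (lat as a function of lon) to multiple
    noisy GPS tracks recorded over the same stretch of road."""
    lons = np.concatenate([t[:, 0] for t in tracks])
    lats = np.concatenate([t[:, 1] for t in tracks])
    coeffs = np.polyfit(lons, lats, degree)   # least-squares fit over all samples
    return np.poly1d(coeffs)

# Example: three simulated drives over the same gentle curve, each with noise.
rng = np.random.default_rng(0)
lon = np.linspace(0.0, 1.0, 200)
true_lat = 0.5 * lon ** 2
tracks = [np.column_stack([lon, true_lat + rng.normal(0, 0.01, lon.size)])
          for _ in range(3)]

centerline = fit_lane_centerline(tracks)
print("estimated lat at lon=0.5:", centerline(0.5), "true:", 0.5 * 0.25)
```
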
  • Publication number: 20190088148
    Abstract: Autonomous and manually operated vehicles are integrated into a cohesive, interactive environment, with communications to each other and to their surroundings, to improve traffic flow while reducing accidents and other incidents. All vehicles send/receive messages to/from each other, and from infrastructure devices, enabling the vehicles to determine their status, traffic conditions and infrastructure. The vehicles store and operate in accordance with a common set of rules based upon the messages received and other inputs from sensors, databases, and so forth, to avoid obstacles and collisions based upon current and, in some cases, future or predicted behavior. Shared vehicle control interfaces enable the AVs to conform to driving activities that are legal, safe, and allowable on roadways. Such activities enable each AV to drive within safety margins, speed limits, on allowed or legal driving lanes and through allowed turns, intersections, mergers, lane changes, stops/starts, and so forth.
    Type: Application
    Filed: November 5, 2018
    Publication date: March 21, 2019
    Inventors: Charles J. Jacobus, Douglas Haanpaa, Eugene Foulk, Pritpaul Mahal, Steve Rowe, Charles J. Cohen, Glenn J. Beach
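
The abstract for publication 20190088148 above describes vehicles exchanging status messages and applying a common rule set to them. The sketch below is a minimal, hedged illustration of that pattern: the message fields and the two example rules (speed limit, minimum following gap) are assumptions for illustration only, not the patented message format or rule set.

```python
# Minimal sketch: each vehicle broadcasts a status message and applies a
# shared rule set to the messages it receives from nearby vehicles.
from dataclasses import dataclass

@dataclass
class StatusMessage:
    vehicle_id: str
    position_m: float      # 1-D position along the lane, metres
    speed_mps: float
    lane: int

def safe_following_speed(own: StatusMessage, others: list,
                         speed_limit_mps: float, min_gap_m: float = 30.0) -> float:
    """Apply two shared rules: obey the speed limit, and slow to the speed of
    any vehicle ahead in the same lane that is closer than min_gap_m."""
    allowed = speed_limit_mps
    for msg in others:
        ahead = msg.lane == own.lane and 0.0 < msg.position_m - own.position_m < min_gap_m
        if ahead:
            allowed = min(allowed, msg.speed_mps)
    return min(allowed, speed_limit_mps)

me = StatusMessage("AV-1", position_m=100.0, speed_mps=20.0, lane=1)
traffic = [StatusMessage("CAR-7", position_m=120.0, speed_mps=12.0, lane=1)]
print(safe_following_speed(me, traffic, speed_limit_mps=25.0))  # -> 12.0
```
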
  • Publication number: 20180253099
    Abstract: Autonomously driven vehicles operate in rain, snow and other adverse weather conditions. An on-board vehicle sensor has a beam with a diameter that is only intermittently blocked by rain, snow, dust or other obscurant particles. This allows an obstacle detection processor to tell the difference between obstacles, terrain variations and obscurant particles, thereby enabling the vehicle driving control unit to disregard the presence of obscurant particles along the route taken by the vehicle. The sensor may form part of a LADAR or RADAR system or a video camera. The obstacle detection processor may receive time-spaced frames divided into cells or pixels, whereby groups of connected cells or pixels and/or cells or pixels that persist over longer periods of time are interpreted to be obstacles or terrain variations. The system may further include an input for receiving weather-specific configuration parameters to adjust the operation of the obstacle detection processor.
    Type: Application
    Filed: May 4, 2018
    Publication date: September 6, 2018
    Inventors: Charles J. Jacobus, Douglas Haanpaa
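
The abstract for publication 20180253099 above (and the related entries below) distinguishes obscurant particles from real obstacles by how long detections persist across time-spaced frames. The sketch below illustrates only that persistence test on boolean occupancy grids; the grid representation and thresholds are assumptions for illustration, not the patented processing chain.

```python
# Minimal sketch: detections that occupy the same grid cells across several
# time-spaced frames count as obstacles/terrain, while cells occupied only
# briefly are treated as obscurants (rain, snow, dust) and ignored.
import numpy as np

def persistent_cells(frames, min_frames=3):
    """frames: list of equal-shaped boolean occupancy grids (one per scan).
    A cell counts as an obstacle only if it is occupied in at least
    min_frames of the frames in the window."""
    counts = np.sum(np.stack(frames), axis=0)
    return counts >= min_frames

# A snowflake blocks one cell in a single frame; a wall occupies the same
# cells in every frame of the window.
frame = np.zeros((4, 4), dtype=bool)
frame[:, 3] = True                       # wall along the right edge
flake = frame.copy()
flake[1, 1] = True                       # transient obscurant return
window = [frame, flake, frame, frame]

obstacles = persistent_cells(window, min_frames=3)
print(obstacles.astype(int))             # the wall survives, the flake does not
```
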
  • Patent number: 9989967
    Abstract: Autonomously driven vehicles operate in rain, snow and other adverse weather conditions. An on-board vehicle sensor has a beam with a diameter that is only intermittently blocked by rain, snow, dust or other obscurant particles. This allows an obstacle detection processor to tell the difference between obstacles, terrain variations and obscurant particles, thereby enabling the vehicle driving control unit to disregard the presence of obscurant particles along the route taken by the vehicle. The sensor may form part of a LADAR or RADAR system or a video camera. The obstacle detection processor may receive time-spaced frames divided into cells or pixels, whereby groups of connected cells or pixels and/or cells or pixels that persist over longer periods of time are interpreted to be obstacles or terrain variations. The system may further include an input for receiving weather-specific configuration parameters to adjust the operation of the obstacle detection processor.
    Type: Grant
    Filed: March 4, 2014
    Date of Patent: June 5, 2018
    Inventors: Charles J. Jacobus, Douglas Haanpaa
  • Publication number: 20180129220
    Abstract: Autonomous vehicles are capable of executing missions that abide by on-street rules or regulations, while also being able to seamlessly transition to and from “zones,” including off-street zones, with their own set(s) of rules or regulations. An on-board memory stores roadgraph information. An on-board computer is operative to execute commanded driving missions using the roadgraph information, including missions with one or more zones, each zone being defined by a sub-roadgraph with its own set of zone-specific driving rules and parameters. A mission may be coordinated with one or more payload operations, including zones with “free drive paths,” as in a warehouse facility with loading and unloading zones where payloads are picked up and set down, or zone staging or entry points leading to one or more points of payload acquisition or placement. The vehicle may be a warehousing vehicle such as a forklift.
    Type: Application
    Filed: November 8, 2017
    Publication date: May 10, 2018
    Inventors: Glenn Beach, Douglas Haanpaa, Charles J. Jacobus, Steven Rowe
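
The abstract for publication 20180129220 above describes a roadgraph whose zones carry their own driving rules and parameters. The sketch below shows one minimal way such zone-specific overrides could be represented and looked up; the zone names, parameter fields, and override rule are all illustrative assumptions, not the patented roadgraph format.

```python
# Minimal sketch: on-street defaults plus per-zone overrides, selected by the
# zone the vehicle is currently operating in.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class DrivingRules:
    max_speed_mps: float
    free_drive_paths: bool     # True where the vehicle may plan paths off the roadgraph
    pedestrian_buffer_m: float

ON_STREET = DrivingRules(max_speed_mps=13.0, free_drive_paths=False,
                         pedestrian_buffer_m=3.0)

ZONE_OVERRIDES = {
    "loading_dock": dict(max_speed_mps=2.0, free_drive_paths=True),
    "warehouse_aisle": dict(max_speed_mps=1.5, pedestrian_buffer_m=1.0),
}

def active_rules(zone=None):
    """Return the rule set in force: the on-street defaults, with any
    zone-specific overrides applied while inside that zone."""
    if zone is None:
        return ON_STREET
    return replace(ON_STREET, **ZONE_OVERRIDES.get(zone, {}))

print(active_rules())                 # on-street defaults
print(active_rules("loading_dock"))   # slower, free drive paths allowed
```
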
  • Publication number: 20170154459
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Application
    Filed: November 1, 2016
    Publication date: June 1, 2017
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
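
The abstract for publication 20170154459 above (shared by several related entries in this list) reduces the pose search to three dimensions, distance, heading, and pitch, and scores synthetic renderings against the truth chip. The sketch below illustrates only that constrained grid search; the toy renderer and the normalized-correlation score are stand-in assumptions, not the actual rendering or matching used in the patent.

```python
# Minimal sketch: exhaustive search over (distance, heading, pitch), keeping
# the pose whose synthetic rendering best matches the observed truth chip.
import itertools
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equal-sized images."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def best_pose(truth_chip, render, distances, headings, pitches):
    """Score every pose on the 3-D grid and return the best (score, pose)."""
    scored = ((ncc(truth_chip, render(d, h, p)), (d, h, p))
              for d, h, p in itertools.product(distances, headings, pitches))
    return max(scored, key=lambda s: s[0])

def toy_render(distance, heading, pitch, size=32):
    """Toy 'renderer': a bright square that shrinks with distance and shifts
    with heading (pitch is ignored here); enough to exercise the search."""
    img = np.zeros((size, size))
    half = max(1, int(8 * 10.0 / distance))
    cx = size // 2 + int(heading)
    img[size // 2 - half:size // 2 + half, cx - half:cx + half] = 1.0
    return img

chip = toy_render(10.0, 3, 0)            # pretend observation
score, pose = best_pose(chip, toy_render,
                        distances=[5.0, 10.0, 20.0],
                        headings=[-3, 0, 3],
                        pitches=[0])
print(score, pose)                       # recovers pose (10.0, 3, 0)
```
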
  • Patent number: 9483867
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Grant
    Filed: July 14, 2015
    Date of Patent: November 1, 2016
    Assignee: Cybernet Systems Corporation
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
  • Publication number: 20160093097
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Application
    Filed: July 14, 2015
    Publication date: March 31, 2016
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
  • Publication number: 20160077671
    Abstract: A system and method including software to aid in generation of panels and control instruments rapidly generates a station that can support a variety of control interfaces. Rapid-Prototyped Panels, or RP-Panels, replicate existing systems (for simulation, training, gaming, etc.) or realize new designs (for human factors testing, as a functional product, etc.). The controls have tactile and visual characteristics similar or identical to their functional component counterparts such as buttons, knobs, switches, pedals, joysticks, steering wheels, and touch panels but are modular and use alternative data transfer modes (potentiometers, fiber optics, RFID, machine vision, etc.) to track and analyze the response of the controls. The response is then transmitted to the host programs. With this method a user can design and fabricate a reconfigurable interface to interact with virtual environments for various applications such as simulation, training, virtual instrumentation, gaming, human factors testing, etc.
    Type: Application
    Filed: November 23, 2015
    Publication date: March 17, 2016
    Inventors: Christopher R. Wagner, Amanda Christiana, Douglas Haanpaa, Charles J. Jacobus
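
The abstract for publication 20160077671 above describes tracking the responses of panel controls and transmitting them to host programs. The sketch below is only a minimal illustration of that flow, turning raw readings into normalized control events; the calibration values, control names, and event format are assumptions for illustration, not the patented design or transport.

```python
# Minimal sketch: raw readings from rapid-prototyped panel controls (a
# potentiometer-backed knob and a toggle switch) converted into normalized
# events a host simulation could consume.
from dataclasses import dataclass

@dataclass
class KnobCalibration:
    raw_min: int = 0        # ADC count at full counter-clockwise
    raw_max: int = 1023     # ADC count at full clockwise

def knob_event(name: str, raw: int, cal: KnobCalibration) -> dict:
    """Convert a raw potentiometer reading to a 0.0-1.0 control event."""
    span = cal.raw_max - cal.raw_min
    value = min(max((raw - cal.raw_min) / span, 0.0), 1.0)
    return {"control": name, "type": "knob", "value": value}

def switch_event(name: str, closed: bool) -> dict:
    return {"control": name, "type": "switch", "value": 1.0 if closed else 0.0}

# A host program would receive these over whatever transport the panel uses.
print(knob_event("throttle", raw=512, cal=KnobCalibration()))
print(switch_event("landing_gear", closed=True))
```
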
  • Publication number: 20150339355
    Abstract: An application coherency manager (ACM) implements and manages the interdependencies of simulation, data, and platform information to simplify the task of organizing and executing large simulations composed of numerous models and data files. One or more file systems or repositories store raw data in the form of files, data, or models, and a graphical user interface (GUI) enables a user to enter queries involving the files, data, or models and receive results. One or more coherency checking modules (CCMs) are operative to determine the types and versions of, and compatibility between, the files, data, or models. A database stores processed information about the file systems or repositories and the results of previous queries, and a data aggregator and manager (DAM) manages the flow of information between the file system or repository, the GUI, the CCMs, and the database.
    Type: Application
    Filed: June 8, 2015
    Publication date: November 26, 2015
    Inventors: Douglas Haanpaa, Glenn J. Beach, Charles J. Jacobus, Devvan Stokes
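
The abstract for publication 20150339355 above describes coherency checking modules that determine types, versions, and compatibility between models and data files. The sketch below shows one minimal form such a check could take; the metadata fields and the compatibility rule are illustrative assumptions only, not the patented CCM logic.

```python
# Minimal sketch: a model declares the data format and version range it
# expects, and the check flags data files that a run would combine with it
# incompatibly.
from dataclasses import dataclass

@dataclass
class ModelInfo:
    name: str
    needs_format: str
    min_version: int
    max_version: int

@dataclass
class DataFileInfo:
    name: str
    format: str
    version: int

def check_coherency(model: ModelInfo, data: DataFileInfo) -> list:
    """Return a list of human-readable incompatibilities (empty if coherent)."""
    problems = []
    if data.format != model.needs_format:
        problems.append(f"{model.name} expects {model.needs_format}, "
                        f"{data.name} is {data.format}")
    elif not (model.min_version <= data.version <= model.max_version):
        problems.append(f"{data.name} v{data.version} is outside {model.name}'s "
                        f"supported range {model.min_version}-{model.max_version}")
    return problems

terrain_model = ModelInfo("terrain_sim", needs_format="DEM", min_version=2, max_version=3)
old_terrain = DataFileInfo("site_alpha.dem", format="DEM", version=1)
print(check_coherency(terrain_model, old_terrain))
```
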
  • Patent number: 9195886
    Abstract: A system and method including software to aid in generation of panels and control instruments rapidly generates a station that can support a variety of control interfaces. Rapid-Prototyped Panels, or RP-Panels, replicate existing systems (for simulation, training, gaming, etc.) or realize new designs (for human factors testing, as a functional product, etc.). The controls have tactile and visual characteristics similar or identical to their functional component counterparts such as buttons, knobs, switches, pedals, joysticks, steering wheels, and touch panels but are modular and use alternative data transfer modes (potentiometers, fiber optics, RFID, machine vision, etc.) to track and analyze the response of the controls. The response is then transmitted to the host programs. With this method a user can design and fabricate a reconfigurable interface to interact with virtual environments for various applications such as simulation, training, virtual instrumentation, gaming, human factors testing, etc.
    Type: Grant
    Filed: January 9, 2009
    Date of Patent: November 24, 2015
    Assignee: Cybernet Systems Corporation
    Inventors: Christopher R. Wagner, Amanda Christiana, Douglas Haanpaa, Charles J. Jacobus
  • Publication number: 20150253775
    Abstract: Autonomously driven vehicles operate in rain, snow and other adverse weather conditions. An on-board vehicle sensor has a beam with a diameter that is only intermittently blocked by rain, snow, dust or other obscurant particles. This allows an obstacle detection processor to tell the difference between obstacles, terrain variations and obscurant particles, thereby enabling the vehicle driving control unit to disregard the presence of obscurant particles along the route taken by the vehicle. The sensor may form part of a LADAR or RADAR system or a video camera. The obstacle detection processor may receive time-spaced frames divided into cells or pixels, whereby groups of connected cells or pixels and/or cells or pixels that persist over longer periods of time are interpreted to be obstacles or terrain variations. The system may further include an input for receiving weather-specific configuration parameters to adjust the operation of the obstacle detection processor.
    Type: Application
    Filed: March 4, 2014
    Publication date: September 10, 2015
    Applicant: Cybernet Systems Corporation
    Inventors: Charles J. Jacobus, Douglas Haanpaa
  • Patent number: 9082219
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Grant
    Filed: April 1, 2014
    Date of Patent: July 14, 2015
    Assignee: Cybernet Systems Corporation
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
  • Publication number: 20140320486
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Application
    Filed: April 1, 2014
    Publication date: October 30, 2014
    Applicant: Cybernet Systems Corporation
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
  • Patent number: 8687849
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. It is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Grant
    Filed: April 3, 2012
    Date of Patent: April 1, 2014
    Assignee: Cybernet Systems Corporation
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
  • Publication number: 20120263348
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. It is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Application
    Filed: April 3, 2012
    Publication date: October 18, 2012
    Applicant: Cybernet Systems Corporation
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
  • Patent number: 8150101
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Grant
    Filed: November 12, 2007
    Date of Patent: April 3, 2012
    Assignee: Cybernet Systems Corporation
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
  • Publication number: 20110102419
    Abstract: A system for performing object identification combines pose determination, EO/IR sensor data, and novel computer graphics rendering techniques. A first module extracts the orientation and distance of a target in a truth chip given that the target type is known. A second module identifies the vehicle within a truth chip given the known distance and elevation angle from camera to target. Image matching is based on synthetic image and truth chip image comparison, where the synthetic image is rotated and moved through a 3-Dimensional space. To limit the search space, it is assumed that the object is positioned on relatively flat ground and that the camera roll angle stays near zero. This leaves three dimensions of motion (distance, heading, and pitch angle) to define the space in which the synthetic target is moved. A graphical user interface (GUI) front end allows the user to manually adjust the orientation of the target within the synthetic images.
    Type: Application
    Filed: November 12, 2007
    Publication date: May 5, 2011
    Applicant: Cybernet Systems Corporation
    Inventors: Douglas Haanpaa, Charles J. Cohen, Glenn J. Beach, Charles J. Jacobus
  • Patent number: 7765182
    Abstract: Methods are presented for authoring geometrical databases which incorporate touch or haptic feedback. In particular, a database of geometrical elements incorporates attributes necessary to support haptic interactions such as stiffness, hardness, friction, and so forth. Users may instantiate objects designed through CAD/CAM environments or attach haptic or touch attributes to subcomponents such as surfaces or solid sub-objects. The resulting haptic/visual databases or world-describing models can then be viewed and touched using a haptic browser or other appropriate user interface.
    Type: Grant
    Filed: January 29, 2007
    Date of Patent: July 27, 2010
    Assignee: Immersion Corporation
    Inventors: Thomas M. Peurach, Todd Yocum, Douglas Haanpaa
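
The abstract for patent 7765182 above describes geometrical database elements carrying haptic attributes such as stiffness and friction. The sketch below is a minimal illustration of attaching such attributes to geometry and computing a simple penalty-style contact force; the attribute set, the flat-plane geometry, and the spring force model are assumptions for illustration, not the patented representation or haptic rendering method.

```python
# Minimal sketch: a surface carries touch attributes, and an upward penalty
# force is produced when a probe point penetrates it.
from dataclasses import dataclass

@dataclass
class HapticSurface:
    name: str
    height: float          # a horizontal plane at this height, for simplicity
    stiffness: float       # newtons per metre of penetration
    friction: float        # Coulomb friction coefficient (unused in this sketch)

def contact_force(surface: HapticSurface, probe_height: float) -> float:
    """Spring-like force when the probe dips below the surface, else zero."""
    penetration = surface.height - probe_height
    return surface.stiffness * penetration if penetration > 0 else 0.0

desk = HapticSurface("desk top", height=0.75, stiffness=800.0, friction=0.4)
print(contact_force(desk, probe_height=0.74))   # 8.0 N for 1 cm of penetration
```
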
  • Publication number: 20100045701
    Abstract: Systems and methods expedite and improve the process of configuring an augmented reality environment. A method of pose determination according to the invention includes the step of placing at least one synthetic fiducial in a real environment to be augmented. A camera, which may include apparatus for obtaining directly measured camera location and orientation (DLMO) information, is used to acquire an image of the environment. The natural and synthetic fiducials are detected, and the pose of the camera is determined using a combination of the natural fiducials, the synthetic fiducial if visible in the image, and the DLMO information if determined to be reliable or necessary. The invention is not limited to architectural environments, and may be used with instrumented persons, animals, vehicles, and any other augmented or mixed reality applications.
    Type: Application
    Filed: August 24, 2009
    Publication date: February 25, 2010
    Applicant: Cybernet Systems Corporation
    Inventors: Katherine Scott, Douglas Haanpaa, Charles J. Jacobus
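
The abstract for publication 20100045701 above describes determining camera pose from natural fiducials, a synthetic fiducial when visible, and DLMO information when reliable. The sketch below illustrates only a simple selection/fallback among those sources; the 2-D pose tuple, the disagreement threshold, and the selection rules are illustrative assumptions, not the patented fusion method.

```python
# Minimal sketch: pick a camera pose per frame from the available sources,
# preferring a synthetic-fiducial estimate and cross-checking against DLMO.
from typing import Optional, Tuple

Pose = Tuple[float, float, float]   # (x metres, y metres, heading degrees)

def fuse_pose(natural: Pose,
              synthetic: Optional[Pose],
              dlmo: Optional[Pose],
              dlmo_reliable: bool,
              max_disagreement_m: float = 1.0) -> Pose:
    """Use the synthetic-fiducial pose when available, otherwise the
    natural-feature pose; fall back to DLMO if it is reliable and the chosen
    estimate disagrees with it by more than max_disagreement_m."""
    pose = synthetic if synthetic is not None else natural
    if dlmo is not None and dlmo_reliable:
        dx, dy = pose[0] - dlmo[0], pose[1] - dlmo[1]
        if (dx * dx + dy * dy) ** 0.5 > max_disagreement_m:
            pose = dlmo
    return pose

print(fuse_pose(natural=(5.2, 3.1, 90.0), synthetic=(5.0, 3.0, 88.0),
                dlmo=(9.0, 3.0, 90.0), dlmo_reliable=True))
```
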