Patents by Inventor Javier Castellar

Javier Castellar has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 8706719
    Abstract: A user's physical location and the time the user is at that location are sampled by a network-enabled mobile computing device at certain intervals, and used to construct a representation of the comparative relevance of the different physical locations where the user lives. This representation is then used to optimize and prioritize results returned by a local network search operation, informing the user which search result would be optimal at her intended time of use.
    Type: Grant
    Filed: August 10, 2010
    Date of Patent: April 22, 2014
    Assignee: Aechelon Technology, Inc.
    Inventors: Ignacio Sanz-Pastor, David L. Morgan, Javier Castellar, Luis A. Barcena, Christopher E. Blumenthal
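The ranking idea in the abstract above can be sketched roughly: log (location, hour) samples, score each candidate result by how often the user was near it at the query hour, and sort. This is an illustrative sketch only; the coordinates, radius, and scoring function are invented here, not taken from the patent:

```python
from math import hypot

# Hypothetical sample log: (x, y, hour) tuples recorded at intervals.
samples = [
    (0.0, 0.0, 9), (0.0, 0.0, 10), (0.0, 0.0, 11),   # "office", mornings
    (5.0, 5.0, 20), (5.0, 5.0, 21),                   # "home", evenings
]

def relevance(place, hour, samples, radius=1.0):
    """Score a place by how often the user was near it at this hour."""
    px, py = place
    return sum(1 for x, y, h in samples
               if h == hour and hypot(x - px, y - py) <= radius)

def rank_results(results, hour, samples):
    """Order candidate search results by time-of-day relevance, best first."""
    return sorted(results, key=lambda r: relevance(r, hour, samples),
                  reverse=True)

# At 10am the "office" location outranks "home".
print(rank_results([(5.0, 5.0), (0.0, 0.0)], 10, samples))
```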
  • Patent number: 8436855
    Abstract: Efficient determination of illumination over large 3D environments, including shadowing, is provided. Illumination, including shadows, is generated using a raster elevation map by a lighting solver. The lighting solver fetches the raster elevation map for an illumination area of interest at the paging rate and produces an illumination map that is applied to terrain and features by a 3D renderer. The lighting solver updates subsets of the illumination map as necessary to reflect changing illumination or movement of the visual area of interest.
    Type: Grant
    Filed: February 19, 2008
    Date of Patent: May 7, 2013
    Assignee: Aechelon Technology, Inc.
    Inventors: David L. Morgan, Ignacio Sanz-Pastor, Javier Castellar
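The lighting solver's shadow decision can be illustrated with a toy 1-D horizon check over an elevation profile. The patent's solver works on a 2-D raster elevation map and updates incrementally at the paging rate; this sketch, with invented parameters, only shows the core lit/shadowed test:

```python
import math

def shade_row(elev, sun_alt_deg, cell_size=1.0):
    """Mark cells of a 1-D elevation profile as lit (1.0) or shadowed (0.0),
    with the sun shining along the row from index 0."""
    tan_alt = math.tan(math.radians(sun_alt_deg))
    lit = []
    for i, h in enumerate(elev):
        # A cell is shadowed if any cell toward the sun rises above the
        # sun ray passing through this cell.
        shadowed = any(elev[j] - h > (i - j) * cell_size * tan_alt
                       for j in range(i))
        lit.append(0.0 if shadowed else 1.0)
    return lit

# A ridge at index 1 shadows the cells behind it at a low sun angle.
print(shade_row([0, 10, 0, 0, 0], sun_alt_deg=30))
```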
  • Patent number: 8280405
    Abstract: A wireless networked device incorporating a display, a video camera and a geo-location system receives geo-located data messages from a server system. Messages can be viewed by panning the device, revealing the message's real world location as icons and text overlaid on top of the camera input on the display. The user can reply to the message from her location, add data to an existing message at its original location, send new messages to other users of the system or place a message at a location for other users. World Wide Web geo-located data can be explored using the system's user interface as a browser. The server system uses the physical location of the receiving device to limit messages and data sent to each device according to range and filtering criteria, and can determine line of sight between the device and each actual message to simulate occlusion effects.
    Type: Grant
    Filed: December 29, 2006
    Date of Patent: October 2, 2012
    Assignee: Aechelon Technology, Inc.
    Inventors: Ignacio Sanz-Pastor, David L. Morgan, III, Javier Castellar
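The server-side range filtering described above might be sketched as follows, using a flat planar coordinate system and invented message data; the actual system would use geodetic coordinates and also apply the line-of-sight occlusion test, which is omitted here:

```python
from math import hypot

# Hypothetical message store: (x, y, text) in some planar coordinate system.
messages = [
    (0.0, 0.0, "hello from the plaza"),
    (3.0, 4.0, "cafe review"),
    (40.0, 0.0, "far-away note"),
]

def visible_messages(device_pos, messages, max_range):
    """Return the messages within max_range of the device, nearest first,
    mimicking the server-side range filtering the abstract describes."""
    dx, dy = device_pos
    in_range = [(hypot(x - dx, y - dy), text) for x, y, text in messages
                if hypot(x - dx, y - dy) <= max_range]
    return [text for _, text in sorted(in_range)]

print(visible_messages((0.0, 0.0), messages, 10.0))
```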
  • Patent number: 8203503
    Abstract: A sensor-independent display characterization system spectrally characterizes a display system to measure the radiant power emitted by the display as it shows a video image to a trainee pilot during sensor stimulation. A sensor spectral response for each wavelength produced by the stimulated sensor is determined. A stimulated luminance is computed for each color level of the displayed image, or for a range of color levels. A color look-up table that maps computed stimulated luminance to a set of stimulating color values is generated. When a trainee pilot views the displayed image through a sensor whose response was used in computing the stimulated luminance, the pilot sees an image created by simulated spectral rendering: an accurate, display- and sensor-independent image of what the pilot would see during real flight.
    Type: Grant
    Filed: September 8, 2005
    Date of Patent: June 19, 2012
    Assignee: Aechelon Technology, Inc.
    Inventors: Javier Castellar, David Lloyd Morgan, III
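The core computation, integrating the display's measured spectrum against the sensor's spectral response to build a color look-up table, can be sketched like this. The spectra and response values are made up for illustration; a real characterization would use dense wavelength sampling per color channel:

```python
# Hypothetical spectral data, sampled at a few wavelengths (nm).
wavelengths = [450, 550, 650]
sensor_response = {450: 0.1, 550: 0.9, 650: 0.4}   # normalized response

# Measured radiant power per wavelength at each display color level (0-2).
display_spectra = {
    0: {450: 0.0, 550: 0.0, 650: 0.0},
    1: {450: 0.5, 550: 0.5, 650: 0.5},
    2: {450: 1.0, 550: 1.0, 650: 1.0},
}

def stimulated_luminance(level):
    """Integrate display radiance against the sensor's spectral response."""
    return sum(display_spectra[level][w] * sensor_response[w]
               for w in wavelengths)

# Look-up table from display color level to stimulated luminance.
lut = {level: stimulated_luminance(level) for level in display_spectra}
print(lut)
```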
  • Publication number: 20070242131
    Abstract: A wireless networked device incorporating a display, a video camera and a geo-location system receives geo-located data messages from a server system. Messages can be viewed by panning the device, revealing the message's real world location as icons and text overlaid on top of the camera input on the display. The user can reply to the message from her location, add data to an existing message at its original location, send new messages to other users of the system or place a message at a location for other users. World Wide Web geo-located data can be explored using the system's user interface as a browser. The server system uses the physical location of the receiving device to limit messages and data sent to each device according to range and filtering criteria, and can determine line of sight between the device and each actual message to simulate occlusion effects.
    Type: Application
    Filed: December 29, 2006
    Publication date: October 18, 2007
    Inventors: Ignacio Sanz-Pastor, David Morgan, Javier Castellar
  • Publication number: 20070236516
    Abstract: A sensor-independent display characterization system spectrally characterizes a display system to measure the radiant power emitted by the display as it shows a video image to a trainee pilot during sensor stimulation. A sensor spectral response for each wavelength produced by the stimulated sensor is determined. A stimulated luminance is computed for each color level of the displayed image, or for a range of color levels. A color look-up table that maps computed stimulated luminance to a set of stimulating color values is generated. When a trainee pilot views the displayed image through a sensor whose response was used in computing the stimulated luminance, the pilot sees an image created by simulated spectral rendering: an accurate, display- and sensor-independent image of what the pilot would see during real flight.
    Type: Application
    Filed: September 8, 2005
    Publication date: October 11, 2007
    Inventors: Javier Castellar, David Morgan
  • Patent number: 6735557
    Abstract: A set of specially configured look-up tables (LUTs) is used in the rasterizing portion of a graphics system for simulating Sensor-assisted Perception of Terrain (SaPOT), so that simulation of the image produced by a given sensor can proceed rapidly and with good accuracy at a per-texel level of resolution. More specifically, a terrain texel-defining memory is provided with a plurality of addressable texel records, where each record contains: (a) one or more material identification fields (MIDs); (b) one or more mixture fields (MIXes) defining mixture proportions for the materials; and (c) slope-defining data for the surface slope or normal of the corresponding texel. A sky-map LUT is provided for simulating the act of looking up at the sky along the surface normal of a given texel, thereby obtaining the sky's contribution of illumination to that terrain texel.
    Type: Grant
    Filed: October 15, 1999
    Date of Patent: May 11, 2004
    Assignee: Aechelon Technology
    Inventors: Javier Castellar, Luis A. Barcena, Ignacio Sanz-Pastor, William P. McGovern
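A rough sketch of the per-texel shading the abstract describes: each texel record carries material IDs, mixture weights, and a surface normal, and a sky-map lookup along the normal supplies the sky's illumination contribution. The reflectance values and the sky function here are invented stand-ins, not the patent's LUT contents:

```python
from dataclasses import dataclass

# Hypothetical per-material reflectance for one simulated sensor band.
MATERIAL_REFLECTANCE = {1: 0.2, 2: 0.8}   # e.g. 1 = soil, 2 = snow

@dataclass
class Texel:
    material_ids: tuple   # MID fields
    mix: tuple            # MIX fields, proportions summing to 1.0
    normal_z: float       # z component of the surface normal (0..1)

def sky_illumination(normal_z):
    """Toy stand-in for the sky-map LUT: more upward-facing texels
    receive more sky light."""
    return 0.2 + 0.8 * normal_z

def shade_texel(texel):
    """Blend material reflectances by the mixture proportions, then apply
    the sky contribution looked up along the texel's normal."""
    reflectance = sum(MATERIAL_REFLECTANCE[m] * w
                      for m, w in zip(texel.material_ids, texel.mix))
    return reflectance * sky_illumination(texel.normal_z)

t = Texel(material_ids=(1, 2), mix=(0.5, 0.5), normal_z=1.0)
print(shade_texel(t))
```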
  • Publication number: 20020190997
    Abstract: A method and apparatus for rendering lightpoints is provided. For the method of the present invention, a programmer creates a series of texture maps. Each texture map approximates the lobe of a lightpoint at a respective distance from the lightpoint. Each texture map includes transparency texture information. This allows the lightpoint to correctly model fog and other atmospheric conditions. The series of texture maps are encoded in a mipmap associated with the lightpoint. During use, a simulation environment renders the lightpoint using a billboarding technique. The billboarding technique keeps the lobe of the lightpoint oriented towards the eye point. The simulation environment dynamically tracks the distance from the lightpoint to the eye point. Each time the distance changes, the simulation environment selects an appropriate texture map from the mipmap. The appropriate texture map is the texture map that correctly depicts the lightpoint at the distance between the eye point and the lightpoint.
    Type: Application
    Filed: August 15, 2002
    Publication date: December 19, 2002
    Inventors: Luis A. Barcena, Nacho Sanz-Pastor, Javier Castellar
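The distance-driven selection from the mipmapped series of lobe textures can be sketched as a nearest-calibration-distance lookup. The distances and texture names are placeholders; a real renderer would store images and blend adjacent mip levels on the GPU:

```python
import bisect

# Hypothetical series of texture maps, one per calibration distance.
# In a real renderer each entry would be an image; strings stand in here.
distances = [10.0, 50.0, 200.0]          # must stay sorted ascending
texture_maps = ["lobe_near", "lobe_mid", "lobe_far"]

def select_lobe_texture(eye_distance):
    """Pick the texture map whose calibration distance is closest to the
    current eye-to-lightpoint distance, as in the abstract's selection step."""
    i = bisect.bisect_left(distances, eye_distance)
    # Compare the neighbors bracketing eye_distance and keep the closer one.
    candidates = distances[max(0, i - 1):i + 1]
    best = min(candidates, key=lambda d: abs(d - eye_distance))
    return texture_maps[distances.index(best)]

print(select_lobe_texture(25.0))
```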
  • Patent number: 6445395
    Abstract: A method and apparatus for rendering lightpoints is provided. For the method of the present invention, a programmer creates a series of texture maps. Each texture map approximates the lobe of a lightpoint at a respective distance from the lightpoint. Each texture map includes transparency texture information. This allows the lightpoint to correctly model fog and other atmospheric conditions. The series of texture maps are encoded in a mipmap associated with the lightpoint. During use, a simulation environment renders the lightpoint using a billboarding technique. The billboarding technique keeps the lobe of the lightpoint oriented towards the eye point. The simulation environment dynamically tracks the distance from the lightpoint to the eye point. Each time the distance changes, the simulation environment selects an appropriate texture map from the mipmap. The appropriate texture map is the texture map that correctly depicts the lightpoint at the distance between the eye point and the lightpoint.
    Type: Grant
    Filed: November 6, 2000
    Date of Patent: September 3, 2002
    Assignee: Microsoft Corporation
    Inventors: Luis A. Barcena, Nacho Sanz-Pastor, Javier Castellar
  • Patent number: 6249289
    Abstract: A high resolution distortion correction system is provided for an arbitrary projection system. First, a field of view is subdivided into multiple viewports. The multiple subdivided viewports provide a first approximation of the distortion. Polygons that are projected onto a particular subdivided viewport are rendered in a frame buffer and stored in texture memory as an intermediate texture image. The intermediate texture images are subsequently applied to a rendered distortion mesh to generate an output image.
    Type: Grant
    Filed: November 27, 1996
    Date of Patent: June 19, 2001
    Assignee: Silicon Graphics, Inc.
    Inventors: Remi Arnaud, Javier Castellar, Michael Timothy Jones
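The two-pass idea, render into an intermediate texture and then resample it through a pre-computed distortion mesh, can be illustrated with a toy radial distortion model and a nearest-vertex mesh lookup. The distortion function, grid size, and sampling are invented for illustration and are not the patent's projector model:

```python
def distort(x, y, k=0.1):
    """Simple radial distortion model over normalized coordinates."""
    r2 = x * x + y * y
    s = 1.0 + k * r2
    return x * s, y * s

GRID = 5  # mesh resolution per axis, over [-1, 1] x [-1, 1]

# Distortion mesh: for each vertex, precompute where the intermediate
# texture should be sampled so the distortion is approximated per-vertex.
mesh = [[distort(-1 + 2 * i / (GRID - 1), -1 + 2 * j / (GRID - 1))
         for j in range(GRID)] for i in range(GRID)]

def sample_mesh(u, v):
    """Nearest-vertex lookup into the distortion mesh (a real renderer
    would interpolate across mesh triangles and texture-map the
    intermediate image)."""
    i = round((u + 1) / 2 * (GRID - 1))
    j = round((v + 1) / 2 * (GRID - 1))
    return mesh[i][j]

print(sample_mesh(0.0, 0.0))
```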
  • Patent number: 6163320
    Abstract: A method and apparatus for rendering lightpoints is provided. For the method of the present invention, a programmer creates a series of texture maps. Each texture map approximates the lobe of a lightpoint at a respective distance from the lightpoint. Each texture map includes transparency texture information. This allows the lightpoint to correctly model fog and other atmospheric conditions. The series of texture maps are encoded in a mipmap associated with the lightpoint. During use, a simulation environment renders the lightpoint using a billboarding technique. The billboarding technique keeps the lobe of the lightpoint oriented towards the eye point. The simulation environment dynamically tracks the distance from the lightpoint to the eye point. Each time the distance changes, the simulation environment selects an appropriate texture map from the mipmap. The appropriate texture map is the texture map that correctly depicts the lightpoint at the distance between the eye point and the lightpoint.
    Type: Grant
    Filed: May 29, 1998
    Date of Patent: December 19, 2000
    Assignee: Silicon Graphics, Inc.
    Inventors: Luis A. Barcena, Nacho Sanz-Pastor, Javier Castellar