Patents by Inventor Luis A. Barcena

Luis A. Barcena has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 8706719
    Abstract: A user's physical location and the time the user is at that location are sampled by a network-enabled mobile computing device at certain intervals, and used to construct a representation of the comparative relevance of the different physical locations where the user spends time. This representation is then used to optimize and prioritize results returned by a local network search operation, informing the user about which search result would be optimal at her intended time of use.
    Type: Grant
    Filed: August 10, 2010
    Date of Patent: April 22, 2014
    Assignee: Aechelon Technology, Inc.
    Inventors: Ignacio Sanz-Pastor, David L. Morgan, Javier Castellar, Luis A. Barcena, Christopher E. Blumenthal
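    The relevance-weighting idea in the abstract above can be sketched as follows. This is a minimal illustration, not the patented method: the `(location, hour)` sampling keys and the frequency-based weighting are assumptions chosen for clarity.

    ```python
    from collections import Counter

    def build_relevance(samples):
        """Build a relevance weight per (location, hour) from sampled visits.

        `samples` is a list of (location_id, hour) tuples collected at
        intervals by the mobile device; simple visit frequency stands in
        for the patent's representation of comparative relevance.
        """
        counts = Counter(samples)
        total = sum(counts.values())
        return {key: n / total for key, n in counts.items()}

    def rank_results(results, relevance, intended_hour):
        """Order local-search results by relevance at the intended time of use."""
        return sorted(
            results,
            key=lambda loc: relevance.get((loc, intended_hour), 0.0),
            reverse=True,
        )
    ```

    For example, a user sampled mostly at "home" in the evening would see "home"-adjacent results ranked first for an evening query, and "work"-adjacent results first for a morning one.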
  • Patent number: 6735557
    Abstract: A set of specially-configured LUTs is used in a rasterizing portion of a graphics system for simulating Sensor-assisted Perception of Terrain (SaPOT) so that simulation of the image produced by a given sensor can proceed rapidly and with good accuracy at a per-texel level of resolution. More specifically, terrain texel-defining memory is provided with a plurality of addressable texel records, where each record contains: (a) one or more material identification fields (MIDs); (b) one or more mixture fields (MIXs) for defining mixture proportions for the materials; and (c) slope-defining data for defining a surface slope or normal of the corresponding texel. A sky-map LUT is provided for simulating the act of looking up to the sky along the surface normal vector of a given texel, to thereby obtain a reading of the sky's contribution of illumination to that terrain texel.
    Type: Grant
    Filed: October 15, 1999
    Date of Patent: May 11, 2004
    Assignee: Aechelon Technology
    Inventors: Javier Castellar, Luis A. Barcena, Ignacio Sanz-Pastor, William P. McGovern
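    The texel-record layout described above can be sketched in a few lines. This is a simplified model under stated assumptions: `TexelRecord`, `material_response`, and the dict-based `sky_map` are hypothetical stand-ins for the patent's hardware LUTs and memory layout.

    ```python
    from dataclasses import dataclass

    @dataclass
    class TexelRecord:
        material_ids: list  # material identification fields (MIDs)
        mixtures: list      # mixture proportions (MIXs), summing to 1.0
        normal: tuple       # surface normal from the slope-defining data

    def sky_illumination(sky_map, normal):
        """Look up the sky's illumination contribution along a texel's normal.

        `sky_map` is keyed here by a coarsely quantized (nx, ny, nz)
        direction; a real system would index a sky-map LUT in hardware.
        """
        key = tuple(round(c, 1) for c in normal)
        return sky_map.get(key, 0.0)

    def texel_radiance(record, material_response, sky_map):
        """Blend per-material sensor responses by mixture, then add sky light."""
        base = sum(material_response[mid] * mix
                   for mid, mix in zip(record.material_ids, record.mixtures))
        return base + sky_illumination(sky_map, record.normal)
    ```

    The per-texel mixture fields are what let a single texel represent, say, 75% grass and 25% rock with one blended sensor response rather than a hard material boundary.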
  • Publication number: 20020190997
    Abstract: A method and apparatus for rendering lightpoints is provided. For the method of the present invention, a programmer creates a series of texture maps. Each texture map approximates the lobe of a lightpoint at a respective distance from the lightpoint. Each texture map includes transparency texture information. This allows the lightpoint to correctly model fog and other atmospheric conditions. The series of texture maps are encoded in a mipmap associated with the lightpoint. During use, a simulation environment renders the lightpoint using a billboarding technique. The billboarding technique keeps the lobe of the lightpoint oriented towards the eye point. The simulation environment dynamically tracks the distance from the lightpoint to the eye point. Each time the distance changes, the simulation environment selects an appropriate texture map from the mipmap. The appropriate texture map is the texture map that correctly depicts the lightpoint at the distance between the eye point and the lightpoint.
    Type: Application
    Filed: August 15, 2002
    Publication date: December 19, 2002
    Inventors: Luis A. Barcena, Nacho Sanz-Pastor, Javier Castellar
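    The distance-driven texture selection and billboarding described in the abstract above can be sketched as follows. The list-based `mipmap` and per-level `distances` are illustrative assumptions; the patent encodes the per-distance lobe maps in the lightpoint's mipmap chain and lets the hardware select among them.

    ```python
    import math

    def select_lobe_texture(mipmap, distances, eye_to_lightpoint):
        """Pick the precomputed lobe texture for the current eye distance.

        Each level of `mipmap` approximates the lightpoint's lobe (including
        transparency, so fog attenuates it correctly) at the distance it was
        authored for; we choose the level authored closest to the actual one.
        """
        best = min(range(len(distances)),
                   key=lambda i: abs(distances[i] - eye_to_lightpoint))
        return mipmap[best]

    def billboard_normal(lightpoint_pos, eye_pos):
        """Unit vector from the lightpoint toward the eye point, so the
        billboarded lobe stays oriented toward the viewer."""
        d = [e - p for e, p in zip(eye_pos, lightpoint_pos)]
        n = math.sqrt(sum(c * c for c in d))
        return tuple(c / n for c in d)
    ```

    Re-running the selection whenever the tracked eye distance changes is what keeps the rendered lobe consistent as the viewer approaches or recedes.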
  • Patent number: 6445395
    Abstract: A method and apparatus for rendering lightpoints is provided. For the method of the present invention, a programmer creates a series of texture maps. Each texture map approximates the lobe of a lightpoint at a respective distance from the lightpoint. Each texture map includes transparency texture information. This allows the lightpoint to correctly model fog and other atmospheric conditions. The series of texture maps are encoded in a mipmap associated with the lightpoint. During use, a simulation environment renders the lightpoint using a billboarding technique. The billboarding technique keeps the lobe of the lightpoint oriented towards the eye point. The simulation environment dynamically tracks the distance from the lightpoint to the eye point. Each time the distance changes, the simulation environment selects an appropriate texture map from the mipmap. The appropriate texture map is the texture map that correctly depicts the lightpoint at the distance between the eye point and the lightpoint.
    Type: Grant
    Filed: November 6, 2000
    Date of Patent: September 3, 2002
    Assignee: Microsoft Corporation
    Inventors: Luis A. Barcena, Nacho Sanz-Pastor, Javier Castellar
  • Patent number: 6268861
    Abstract: A method and apparatus for volumetric three-dimensional fog rendering is provided. To add fog effects to an image, a host processor computes the location of the eye-point relative to the image to be fogged. Using the eye-point location, the host processor generates a three-dimensional fog texture and a blending function. The three-dimensional fog texture and blending function are downloaded or otherwise passed by the host processor to the graphics processor. The graphics processor then renders the primitives that make up the image. When rendering is complete, the graphics processor applies the three-dimensional fog texture in an additional rendering pass. The method may then be repeated to create animated fog effects such as swirling or wind-driven fog.
    Type: Grant
    Filed: August 25, 1998
    Date of Patent: July 31, 2001
    Assignee: Silicon Graphics, Incorporated
    Inventors: Nacho Sanz-Pastor, Luis A. Barcena
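    The host-processor / graphics-processor split described above can be sketched as below. The procedural `fog_density` function and the nested-list texture are illustrative assumptions; a real implementation would fill a hardware 3D texture and perform the blend in the fogging pass on the GPU.

    ```python
    import math

    def fog_density(x, y, z, eye, t):
        """Hypothetical per-voxel fog density.

        Distance from the eye drives baseline thickness; the time parameter
        `t` varies the texture between frames for swirling, animated fog.
        """
        dist = math.dist((x, y, z), eye)
        swirl = 0.5 * (1.0 + math.sin(x + t))
        return min(1.0, dist * 0.01 * swirl)

    def build_fog_texture(size, eye, t):
        """Host-processor step: fill the 3D fog texture relative to the eye."""
        return [[[fog_density(x, y, z, eye, t)
                  for z in range(size)]
                 for y in range(size)]
                for x in range(size)]

    def blend(scene_color, fog_color, density):
        """Graphics-processor step: blend fog over the already-rendered image."""
        return tuple((1 - density) * s + density * f
                     for s, f in zip(scene_color, fog_color))
    ```

    Regenerating the texture with a new `t` each frame and re-applying the fogging pass yields the animated, wind-driven effects the abstract mentions.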
  • Patent number: 6163320
    Abstract: A method and apparatus for rendering lightpoints is provided. For the method of the present invention, a programmer creates a series of texture maps. Each texture map approximates the lobe of a lightpoint at a respective distance from the lightpoint. Each texture map includes transparency texture information. This allows the lightpoint to correctly model fog and other atmospheric conditions. The series of texture maps are encoded in a mipmap associated with the lightpoint. During use, a simulation environment renders the lightpoint using a billboarding technique. The billboarding technique keeps the lobe of the lightpoint oriented towards the eye point. The simulation environment dynamically tracks the distance from the lightpoint to the eye point. Each time the distance changes, the simulation environment selects an appropriate texture map from the mipmap. The appropriate texture map is the texture map that correctly depicts the lightpoint at the distance between the eye point and the lightpoint.
    Type: Grant
    Filed: May 29, 1998
    Date of Patent: December 19, 2000
    Assignee: Silicon Graphics, Inc.
    Inventors: Luis A. Barcena, Nacho Sanz-Pastor, Javier Castellar