Patents by Inventor Javier Castellar
Javier Castellar has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 8706719
Abstract: A user's physical location and the time the user spends at that location are sampled by a network-enabled mobile computing device at certain intervals, and used to construct a representation of the comparative relevance of the different physical locations where the user lives. This representation is then used to optimize and prioritize results returned by a local network search operation, informing the user which search result would be optimal at her intended time of use.
Type: Grant
Filed: August 10, 2010
Date of Patent: April 22, 2014
Assignee: Aechelon Technology, Inc.
Inventors: Ignacio Sanz-Pastor, David L. Morgan, Javier Castellar
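The pipeline this abstract describes — sample dwell time per location, derive comparative relevance, then reorder local search results — can be sketched roughly as below. The function names and the simple dwell-time weighting are illustrative assumptions, not the patented method, and the time-of-day dimension the abstract mentions is omitted for brevity:

```python
from collections import defaultdict

def location_relevance(samples):
    """Accumulate dwell time per sampled location into a normalized
    relevance map. `samples` is a list of (location, minutes) pairs,
    e.g. from periodic position fixes on a mobile device."""
    dwell = defaultdict(float)
    for loc, minutes in samples:
        dwell[loc] += minutes
    total = sum(dwell.values()) or 1.0
    return {loc: t / total for loc, t in dwell.items()}

def rank_results(results, relevance):
    """Reorder search results so those tied to the user's most relevant
    locations come first. Each result is a (name, location) pair."""
    return sorted(results, key=lambda r: relevance.get(r[1], 0.0), reverse=True)
```

A fuller implementation would bucket samples by time of day so relevance can vary with the user's intended time of use.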
-
Patent number: 8436855
Abstract: Efficient determination of illumination over large 3D environments, including shadowing, is provided. A lighting solver generates illumination, including shadows, from a raster elevation map. The lighting solver fetches the raster elevation map for an illumination area of interest at the paging rate and produces an illumination map that is applied to terrain and features by a 3D renderer. The lighting solver updates subsets of the illumination map as necessary to reflect changing illumination or movement of the visual area of interest.
Type: Grant
Filed: February 19, 2008
Date of Patent: May 7, 2013
Assignee: Aechelon Technology, Inc.
Inventors: David L. Morgan, Ignacio Sanz-Pastor, Javier Castellar
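The core of shadowing from a raster elevation map can be illustrated with a one-dimensional ray-march over a height profile; this is a generic terrain-shadow sketch under assumed conventions (sun to the left, rising `sun_slope` units of height per cell), not the solver claimed in the patent:

```python
def shadow_mask(heights, sun_slope):
    """Return a lit/shadowed flag per terrain sample. A sample is in
    shadow if any sample toward the sun rises above the sun ray traced
    back from it."""
    lit = []
    for i, h in enumerate(heights):
        ray = h          # ray height at the current sample
        blocked = False
        for j in range(i - 1, -1, -1):
            ray += sun_slope  # ray climbs one cell further toward the sun
            if heights[j] > ray:
                blocked = True
                break
        lit.append(not blocked)
    return lit
```

A production solver would march in 2-D, cache horizon angles, and update only the subregions of the illumination map that the abstract says change between frames.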
-
Patent number: 8280405
Abstract: A wireless networked device incorporating a display, a video camera and a geo-location system receives geo-located data messages from a server system. Messages can be viewed by panning the device, revealing the message's real world location as icons and text overlaid on top of the camera input on the display. The user can reply to the message from her location, add data to an existing message at its original location, send new messages to other users of the system or place a message at a location for other users. World Wide Web geo-located data can be explored using the system's user interface as a browser. The server system uses the physical location of the receiving device to limit messages and data sent to each device according to range and filtering criteria, and can determine line of sight between the device and each actual message to simulate occlusion effects.
Type: Grant
Filed: December 29, 2006
Date of Patent: October 2, 2012
Assignee: Aechelon Technology, Inc.
Inventors: Ignacio Sanz-Pastor, David L. Morgan, III, Javier Castellar
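The server-side range filtering the abstract describes reduces to a distance test per message; a minimal sketch, with illustrative names and the line-of-sight occlusion test omitted:

```python
import math

def visible_messages(device_pos, messages, max_range):
    """Keep only geo-located messages within `max_range` of the device.
    `messages` is a list of (text, (x, y)) pairs; positions are assumed
    to be in a flat local coordinate frame for simplicity."""
    return [(text, pos) for text, pos in messages
            if math.dist(device_pos, pos) <= max_range]
```

The patented system additionally applies filtering criteria and a terrain line-of-sight check between device and message to simulate occlusion.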
-
Patent number: 8203503
Abstract: A sensor-independent display characterization system spectrally characterizes a display system by measuring the radiant power emitted by the display system that displays a video image to a trainee pilot during sensor stimulation. A sensor spectral response for each wavelength produced by the stimulated sensor is determined. A stimulated luminance for each color level of the displayed image, or for a range of color levels, is computed. A color look-up table that maps computed stimulated luminance to a set of stimulating color values is generated. When a trainee pilot looks at the displayed image using a sensor whose response was used in computing the stimulated luminance, the pilot sees an image created by simulated spectral rendering. The displayed image is an accurate, display- and sensor-independent image that the pilot could see during a real flight.
Type: Grant
Filed: September 8, 2005
Date of Patent: June 19, 2012
Assignee: Aechelon Technology, Inc.
Inventors: Javier Castellar, David Lloyd Morgan, III
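The stimulated-luminance computation amounts to integrating the display's measured spectral radiance against the sensor's spectral response, then tabulating that per drive level. A minimal sketch, assuming both spectra are sampled at the same wavelengths (the function names are illustrative, not from the patent):

```python
def stimulated_luminance(display_spectrum, sensor_response):
    """Discrete integral of measured display radiance against the
    sensor's spectral response, sampled at matching wavelengths."""
    return sum(p * s for p, s in zip(display_spectrum, sensor_response))

def build_color_lut(spectra_by_level, sensor_response):
    """Map each display drive level to its stimulated luminance. A real
    system would invert this table to find the stimulating color values
    that produce a target luminance through the sensor."""
    return {level: stimulated_luminance(spec, sensor_response)
            for level, spec in spectra_by_level.items()}
```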
-
Publication number: 20070242131
Abstract: A wireless networked device incorporating a display, a video camera and a geo-location system receives geo-located data messages from a server system. Messages can be viewed by panning the device, revealing the message's real world location as icons and text overlaid on top of the camera input on the display. The user can reply to the message from her location, add data to an existing message at its original location, send new messages to other users of the system or place a message at a location for other users. World Wide Web geo-located data can be explored using the system's user interface as a browser. The server system uses the physical location of the receiving device to limit messages and data sent to each device according to range and filtering criteria, and can determine line of sight between the device and each actual message to simulate occlusion effects.
Type: Application
Filed: December 29, 2006
Publication date: October 18, 2007
Inventors: Ignacio Sanz-Pastor, David Morgan, Javier Castellar
-
Publication number: 20070236516
Abstract: A sensor-independent display characterization system spectrally characterizes a display system by measuring the radiant power emitted by the display system that displays a video image to a trainee pilot during sensor stimulation. A sensor spectral response for each wavelength produced by the stimulated sensor is determined. A stimulated luminance for each color level of the displayed image, or for a range of color levels, is computed. A color look-up table that maps computed stimulated luminance to a set of stimulating color values is generated. When a trainee pilot looks at the displayed image using a sensor whose response was used in computing the stimulated luminance, the pilot sees an image created by simulated spectral rendering. The displayed image is an accurate, display- and sensor-independent image that the pilot could see during a real flight.
Type: Application
Filed: September 8, 2005
Publication date: October 11, 2007
Inventors: Javier Castellar, David Morgan
-
Patent number: 6735557
Abstract: A set of specially configured LUTs is used in a rasterizing portion of a graphics system for simulating Sensor-assisted Perception of Terrain (SaPOT), so that simulation of the image produced by a given sensor can proceed rapidly and with good accuracy at a per-texel level of resolution. More specifically, terrain texel-defining memory is provided with a plurality of addressable texel records, where each record contains: (a) one or more material identification fields (MIDs); (b) one or more mixture fields (MIXs) for defining mixture proportions for the materials; and (c) slope-defining data for defining a surface slope or normal of the corresponding texel. A sky-map LUT is provided for simulating the act of looking up to the sky along the normal surface vector of a given texel, to thereby obtain a reading of the sky's contribution of illumination to that terrain texel.
Type: Grant
Filed: October 15, 1999
Date of Patent: May 11, 2004
Assignee: Aechelon Technology
Inventors: Javier Castellar, Luis A. Barcena, Ignacio Sanz-Pastor, William P. McGovern
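The per-texel evaluation the abstract describes — blend material properties by the mixture fields, then scale by sky illumination looked up along the surface normal — can be sketched as below. The dictionary-based "LUTs", material names, and single scalar reflectance per material are all simplifying assumptions for illustration:

```python
def texel_radiance(materials, mix, normal, sky_map, material_reflectance):
    """Blend per-material reflectance by the texel's mixture weights
    (from its MID and MIX fields), then multiply by the sky-map LUT
    entry for the texel's quantized surface normal."""
    reflectance = sum(material_reflectance[m] * w
                      for m, w in zip(materials, mix))
    return reflectance * sky_map[normal]
```

In the patented system these lookups happen per texel in the rasterizer, with the sky map indexed by the actual surface normal rather than a symbolic key.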
-
Publication number: 20020190997
Abstract: A method and apparatus for rendering lightpoints is provided. For the method of the present invention, a programmer creates a series of texture maps. Each texture map approximates the lobe of a lightpoint at a respective distance from the lightpoint. Each texture map includes transparency texture information. This allows the lightpoint to correctly model fog and other atmospheric conditions. The series of texture maps are encoded in a mipmap associated with the lightpoint. During use, a simulation environment renders the lightpoint using a billboarding technique. The billboarding technique keeps the lobe of the lightpoint oriented towards the eye point. The simulation environment dynamically tracks the distance from the lightpoint to the eye point. Each time the distance changes, the simulation environment selects an appropriate texture map from the mipmap. The appropriate texture map is the texture map that correctly depicts the lightpoint at the distance between the eye point and the lightpoint.
Type: Application
Filed: August 15, 2002
Publication date: December 19, 2002
Inventors: Luis A. Barcena, Nacho Sanz-Pastor, Javier Castellar
-
Patent number: 6445395
Abstract: A method and apparatus for rendering lightpoints is provided. For the method of the present invention, a programmer creates a series of texture maps. Each texture map approximates the lobe of a lightpoint at a respective distance from the lightpoint. Each texture map includes transparency texture information. This allows the lightpoint to correctly model fog and other atmospheric conditions. The series of texture maps are encoded in a mipmap associated with the lightpoint. During use, a simulation environment renders the lightpoint using a billboarding technique. The billboarding technique keeps the lobe of the lightpoint oriented towards the eye point. The simulation environment dynamically tracks the distance from the lightpoint to the eye point. Each time the distance changes, the simulation environment selects an appropriate texture map from the mipmap. The appropriate texture map is the texture map that correctly depicts the lightpoint at the distance between the eye point and the lightpoint.
Type: Grant
Filed: November 6, 2000
Date of Patent: September 3, 2002
Assignee: Microsoft Corporation
Inventors: Luis A. Barcena, Nacho Sanz-Pastor, Javier Castellar
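The distance-driven texture selection at the heart of this method can be sketched with a standard mipmap-style level pick; the doubling-per-level convention and `base_distance` parameter are assumptions for illustration, not details from the patent:

```python
import math

def select_lobe_texture(mipmap, distance, base_distance=1.0):
    """Pick the mipmap level whose lobe texture was authored for the
    current eye-to-lightpoint distance, assuming each level covers
    double the distance of the previous one; clamp to the last level."""
    level = int(math.log2(max(distance, base_distance) / base_distance))
    return mipmap[min(level, len(mipmap) - 1)]
```

In the rendered scene this texture is applied to a billboard kept facing the eye point, re-selecting a level each time the tracked distance changes.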
-
Patent number: 6249289
Abstract: A high resolution distortion correction system is provided for an arbitrary projection system. First, a field of view is subdivided into multiple viewports. The multiple subdivided viewports provide a first approximation of the distortion. Polygons that are projected onto a particular subdivided viewport are rendered in a frame buffer and stored in texture memory as an intermediate texture image. The intermediate texture images are subsequently applied to a rendered distortion mesh to generate an output image.
Type: Grant
Filed: November 27, 1996
Date of Patent: June 19, 2001
Assignee: Silicon Graphics, Inc.
Inventors: Remi Arnaud, Javier Castellar, Michael Timothy Jones
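The final step — resampling an intermediate texture through a distortion mesh — can be illustrated with a nearest-neighbour warp over a small grid; a GPU implementation would instead draw the mesh with texture coordinates and bilinear filtering, and this sketch ignores the viewport subdivision stage entirely:

```python
def apply_distortion_mesh(intermediate, mesh):
    """Resample the intermediate texture through a distortion mesh.
    `mesh[y][x]` holds the (row, col) of the intermediate texel that
    output pixel (y, x) should sample (nearest-neighbour for brevity)."""
    return [[intermediate[sy][sx] for (sy, sx) in row] for row in mesh]
```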
-
Patent number: 6163320
Abstract: A method and apparatus for rendering lightpoints is provided. For the method of the present invention, a programmer creates a series of texture maps. Each texture map approximates the lobe of a lightpoint at a respective distance from the lightpoint. Each texture map includes transparency texture information. This allows the lightpoint to correctly model fog and other atmospheric conditions. The series of texture maps are encoded in a mipmap associated with the lightpoint. During use, a simulation environment renders the lightpoint using a billboarding technique. The billboarding technique keeps the lobe of the lightpoint oriented towards the eye point. The simulation environment dynamically tracks the distance from the lightpoint to the eye point. Each time the distance changes, the simulation environment selects an appropriate texture map from the mipmap. The appropriate texture map is the texture map that correctly depicts the lightpoint at the distance between the eye point and the lightpoint.
Type: Grant
Filed: May 29, 1998
Date of Patent: December 19, 2000
Assignee: Silicon Graphics, Inc.
Inventors: Luis A. Barcena, Nacho Sanz-Pastor, Javier Castellar