Patents by Inventor Walter J. LUH

Walter J. LUH has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240371107
    Abstract: Aspects of the present disclosure are directed to a host renderer for artificial reality system(s) that provides dynamic rendering for application(s). Implementations of the host renderer decouple rendering of content from content source(s) to improve compatibility, extensibility, processing efficiency, and other aspects of content rendering. An artificial reality application can generate a scene graph with scene components, or renderable/drawable elements of the scene graph. The host renderer is configured to receive an encoded version of the artificial reality application's scene graph and issue processor rendering calls to render the drawable/renderable components of the scene graph. The host renderer abstracts the hardware-level rendering calls and provides the artificial reality application access to hardware rendering via the host renderer. Implementations of the host renderer can perform rendering optimizations and issue a diverse set of processor rendering calls to diverse hardware. (See the first sketch after this listing.)
    Type: Application
    Filed: May 4, 2023
    Publication date: November 7, 2024
    Inventors: Walter J. LUH, Cameron SYLVIA, Alexey MEDVEDEV, Eric GRIFFITH
  • Publication number: 20240371069
    Abstract: Aspects of the present disclosure are directed to a host renderer for artificial reality system(s) that provides dynamic rendering for application(s). Implementations of the host renderer decouple rendering of content from content source(s) to improve compatibility, extensibility, processing efficiency, and other aspects of content rendering. An artificial reality application can generate a scene graph with scene components, or renderable/drawable elements of the scene graph. The host renderer is configured to receive an encoded version of the artificial reality application's scene graph and issue processor rendering calls to render the drawable/renderable components of the scene graph. The host renderer abstracts the hardware-level rendering calls and provides the artificial reality application access to hardware rendering via the host renderer. Implementations of the host renderer can perform rendering optimizations and issue a diverse set of processor rendering calls to diverse hardware.
    Type: Application
    Filed: May 4, 2023
    Publication date: November 7, 2024
    Inventors: Walter J. LUH, Cameron SYLVIA, Alexey MEDVEDEV, Eric GRIFFITH
  • Publication number: 20240371109
    Abstract: Aspects of the present disclosure are directed to a host renderer for artificial reality system(s) that provides dynamic rendering for application(s). Implementations of the host renderer decouple rendering of content from content source(s) to improve compatibility, extensibility, processing efficiency, and other aspects of content rendering. An artificial reality application can generate a scene graph with scene components, or renderable/drawable elements of the scene graph. The host renderer is configured to receive an encoded version of the artificial reality application's scene graph and issue processor rendering calls to render the drawable/renderable components of the scene graph. The host renderer abstracts the hardware-level rendering calls and provides the artificial reality application access to hardware rendering via the host renderer. Implementations of the host renderer can perform rendering optimizations and issue a diverse set of processor rendering calls to diverse hardware.
    Type: Application
    Filed: May 4, 2023
    Publication date: November 7, 2024
    Inventors: Walter J. LUH, Cameron SYLVIA, Alexey MEDVEDEV, Eric GRIFFITH
  • Publication number: 20240371108
    Abstract: Aspects of the present disclosure are directed to a host renderer for artificial reality system(s) that provides dynamic rendering for application(s). Implementations of the host renderer decouple rendering of content from content source(s) to improve compatibility, extensibility, processing efficiency, and other aspects of content rendering. An artificial reality application can generate a scene graph with scene components, or renderable/drawable elements of the scene graph. The host renderer is configured to receive an encoded version of the artificial reality application's scene graph and issue processor rendering calls to render the drawable/renderable components of the scene graph. The host renderer abstracts the hardware-level rendering calls and provides the artificial reality application access to hardware rendering via the host renderer. Implementations of the host renderer can perform rendering optimizations and issue a diverse set of processor rendering calls to diverse hardware.
    Type: Application
    Filed: May 4, 2023
    Publication date: November 7, 2024
    Inventors: Walter J. LUH, Cameron SYLVIA, Alexey MEDVEDEV, Eric GRIFFITH
  • Publication number: 20240273824
    Abstract: Aspects of the present disclosure are directed to an integration framework for two-dimensional (2D) and three-dimensional (3D) elements in an artificial reality (XR) environment. The 2D and 3D integration framework can implement a two-layered application programming interface (API) system, where a developer can use a declarative API to define nodes by executing pre-defined functions, and an imperative API to define nodes by specifying one or more functions for those nodes. The framework can traverse a component tree of such nodes to extract and add the 2D elements onto a 2D panel in a first pass. In a second pass, the framework can extract the 3D elements and determine how the 2D and 3D elements translate into a 3D world view. Based on this determination, the framework can draw selected 2D and 3D elements into the 3D world view, which can be rendered in the XR environment on an XR device. (See the second sketch after this listing.)
    Type: Application
    Filed: February 10, 2023
    Publication date: August 15, 2024
    Inventors: Rohan MEHTA, Walter J. LUH, Eric GRIFFITH, Zeya PENG, Lucas SWITZER, Dalton Thorn FLANAGAN
  • Publication number: 20240192973
    Abstract: In some implementations, the disclosed systems and methods can compute the position of the user interface elements in three-dimensional space, project the position into two-dimensional camera space, and dynamically determine the scale of the user interface elements per frame as either static or dynamic based on the distance between the user and the virtual object. In some implementations, the disclosed systems and methods can simulate a three-dimensional (3D) XR experience on a 2D interface (such as a laptop, mobile phone, tablet, etc.) by placing a virtual XR head-mounted display (HMD) into the XR environment and providing feeds of the field of view of the virtual XR HMD to the 2D interface. (See the third sketch after this listing.)
    Type: Application
    Filed: February 23, 2024
    Publication date: June 13, 2024
    Applicant: Meta Platforms Technologies, LLC
    Inventors: Moisés Ferrer SERRA, Mykyta LUTSENKO, Matthew James GALLOWAY, Walter J. LUH, Christopher LAW, Artur KUSHKA, Jean-Francois MULE, David TEITLEBAUM, Roman LESHCHINSKIY, Dalton Thorn FLANAGAN, Dony GEORGE, David Michael WOODWARD
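
The abstracts above describe three recurring mechanisms: a host renderer that consumes an encoded scene graph, a two-pass traversal that separates 2D and 3D elements, and per-frame, distance-based scaling of user interface elements. The sketches below are minimal illustrations of how such mechanisms could look in code; every class, function, and parameter name in them is a hypothetical placeholder chosen for this listing, not a name taken from the filings.

Sketch 1: a host renderer that receives an encoded scene graph from an application and issues rendering calls through a hardware abstraction. The JSON encoding, the depth-first walk, and the SceneNode/HostRenderer/GpuBackend names are assumptions made for illustration.

# Hedged sketch of the host-renderer pattern described in the abstracts above.
# All names here (SceneNode, HostRenderer, GpuBackend) are hypothetical.
from dataclasses import dataclass, field
import json


@dataclass
class SceneNode:
    """A renderable/drawable element of an application's scene graph."""
    kind: str                          # e.g. "mesh", "panel", "text"
    params: dict = field(default_factory=dict)
    children: list["SceneNode"] = field(default_factory=list)

    def encode(self) -> str:
        """The application serializes its scene graph before handing it off."""
        def to_dict(node: "SceneNode") -> dict:
            return {"kind": node.kind, "params": node.params,
                    "children": [to_dict(c) for c in node.children]}
        return json.dumps(to_dict(self))


class GpuBackend:
    """Stand-in for a hardware-specific rendering API; the host renderer hides this."""
    def draw(self, kind: str, params: dict) -> None:
        print(f"draw {kind}: {params}")


class HostRenderer:
    """Receives an encoded scene graph and issues rendering calls for its elements."""
    def __init__(self, backend: GpuBackend):
        self.backend = backend

    def render(self, encoded_graph: str) -> None:
        self._render_node(json.loads(encoded_graph))

    def _render_node(self, node: dict) -> None:
        # A real host renderer could batch, cull, or reorder calls here.
        self.backend.draw(node["kind"], node["params"])
        for child in node["children"]:
            self._render_node(child)


# The application builds and encodes its scene graph; it never touches GpuBackend.
scene = SceneNode("panel", {"w": 2.0, "h": 1.0},
                  [SceneNode("text", {"value": "hello"})])
HostRenderer(GpuBackend()).render(scene.encode())

In this shape the application-facing contract is only the encoded graph, so the backend can be swapped or optimized without changing the application, which is the decoupling the abstracts describe.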
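Sketch 2: the two-pass component-tree traversal described in publication 20240273824, in which a first pass gathers the 2D elements onto a panel and a second pass gathers the 3D elements and decides how both map into the world view. The Node/collect/compose names and the dims field are illustrative assumptions.

# Hedged sketch of a two-pass traversal that separates 2D and 3D elements.
from dataclasses import dataclass, field


@dataclass
class Node:
    name: str
    dims: int                          # 2 for a flat UI element, 3 for a spatial one
    children: list["Node"] = field(default_factory=list)


def collect(root: Node, dims: int) -> list[Node]:
    """Depth-first pass that keeps only nodes of the requested dimensionality."""
    found = [root] if root.dims == dims else []
    for child in root.children:
        found += collect(child, dims)
    return found


def compose(root: Node) -> dict:
    # Pass 1: extract the 2D elements and add them onto a 2D panel.
    panel_2d = [n.name for n in collect(root, dims=2)]
    # Pass 2: extract the 3D elements and decide how everything maps into the
    # world view; here the panel itself simply becomes one more placed object.
    world_3d = [n.name for n in collect(root, dims=3)] + ["ui panel"]
    return {"panel": panel_2d, "world": world_3d}


tree = Node("root", 3, [Node("button", 2), Node("label", 2), Node("model", 3)])
print(compose(tree))   # the panel holds the 2D UI; the world view holds the 3D elements plus the panel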
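Sketch 3: per-frame, distance-based scaling as described in publication 20240192973. Projecting the element's 3D position into 2D camera space and choosing a static or dynamic scale based on distance follows the abstract; the pinhole projection, the specific threshold, and the rule that the dynamic scale grows with distance (to keep the projected size roughly constant) are assumptions.

# Hedged sketch of distance-based per-frame scaling of a UI element's anchor point.
import math


def scale_for_frame(element_pos, user_pos, focal_length=1.0, near_threshold=1.5):
    """Return (screen_xy, scale) for one frame."""
    dx, dy, dz = (e - u for e, u in zip(element_pos, user_pos))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)

    # Project the element's 3D position into 2D camera space (simple pinhole model).
    screen_xy = (focal_length * dx / dz, focal_length * dy / dz)

    if distance <= near_threshold:
        scale = 1.0                          # static: keep the modeled size up close
    else:
        # dynamic: grow with distance so the projected size stays roughly constant
        scale = distance / near_threshold
    return screen_xy, scale


# Element two meters in front of the user: projected near screen center, scaled up slightly.
print(scale_for_frame(element_pos=(0.2, 0.1, 2.0), user_pos=(0.0, 0.0, 0.0)))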