Patents by Inventor Pal-Kristian Engstad
Pal-Kristian Engstad has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11823343
Abstract: A method and device for modifying virtual content according to various simulation characteristics includes obtaining first virtual content; obtaining one or more simulation characteristics; generating second virtual content by modifying the first virtual content according to the one or more simulation characteristics; and presenting the second virtual content.
Type: Grant
Filed: March 22, 2021
Date of Patent: November 21, 2023
Assignee: Apple Inc.
Inventors: Dhruv Aditya Govil, Sabine Webel, Olivier Denis Roger Gutknecht, Shruti Singhal, Tobias Eble, Pal Kristian Engstad, Ivan Gavrenkov
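The claimed method has a simple pipeline shape: obtain content, obtain simulation characteristics, modify, present. As a hypothetical sketch only (the function name, dictionary fields, and transforms are invented for illustration and are not taken from the patent):

```python
# Hypothetical sketch of the claimed pipeline; each simulation
# characteristic is modeled as a function mapping content to
# modified content. None of these names come from the patent.

def apply_simulation_characteristics(first_content, characteristics):
    """Generate second virtual content by applying each
    characteristic's modification to the first content in turn."""
    content = first_content
    for modify in characteristics:
        content = modify(content)
    return content

# Example characteristics: dim the content and shrink its scale.
dim = lambda c: {**c, "brightness": c["brightness"] * 0.5}
shrink = lambda c: {**c, "scale": c["scale"] * 0.9}

second = apply_simulation_characteristics(
    {"brightness": 1.0, "scale": 2.0}, [dim, shrink]
)
# The second content is then presented in place of the first.
```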
-
Patent number: 11789688
Abstract: Various implementations disclosed herein include devices, systems, and methods for presenting content based on environmental data. In various implementations, a device includes a display, a non-transitory memory, and one or more processors coupled with the display and the non-transitory memory. In some implementations, a method includes obtaining environmental data associated with a physical environment in which the device is located. In some implementations, the method includes selecting, from a plurality of presentation modes, a first presentation mode for content based on the environmental data. In some implementations, the method includes presenting the content in accordance with the first presentation mode.
Type: Grant
Filed: February 10, 2021
Date of Patent: October 17, 2023
Assignee: Apple Inc.
Inventors: Shruti Singhal, Pal Kristian Engstad, Tobias Eble, Norman Nuo Wang, Ivan Gavrenkov, Olivier Denis Roger Gutknecht, Sabine Webel
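The mode-selection step can be sketched as a pure function from environmental data to one of a plurality of presentation modes. The field names, thresholds, and mode names below are illustrative assumptions, not details from the patent:

```python
# Hypothetical sketch: pick a presentation mode from environmental
# data. Field names, thresholds, and mode names are invented.

def select_presentation_mode(environment):
    """Select one of a plurality of presentation modes based on
    environmental data from the physical environment."""
    if environment.get("ambient_lux", 100.0) < 10.0:
        return "high_contrast"   # dark room: boost legibility
    if environment.get("in_motion", False):
        return "minimal"         # user moving: reduce clutter
    return "full"                # default: present everything

mode = select_presentation_mode({"ambient_lux": 5.0})
```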
-
Patent number: 11694379
Abstract: In one implementation, a method of displaying an animation is performed at a device including an optical see-through display, one or more processors, and a non-transitory memory. The method includes receiving a request to display a first animation of an object exhibiting a response characteristic. The method includes determining a metric characterizing an amount of processing power for the device to display the first animation on the optical see-through display. The method includes, in response to a determination that the metric exceeds a threshold associated with the device, selecting a second animation of the object exhibiting the response characteristic. The method includes displaying the second animation.
Type: Grant
Filed: January 28, 2021
Date of Patent: July 4, 2023
Assignee: Apple Inc.
Inventors: Sabine Webel, Olivier Denis Roger Gutknecht, Pal Kristian Engstad, Ivan Gavrenkov, Tobias Eble, Shruti Singhal
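The core decision is a threshold test: fall back to a cheaper animation when the processing-power metric for the requested one exceeds a device-specific limit. A minimal sketch, assuming an invented cost model and animation names:

```python
# Hypothetical sketch: substitute a cheaper animation when the cost
# metric for the requested one exceeds the device's threshold. The
# names and numbers are invented for illustration.

def choose_animation(first, second, cost_metric, device_threshold):
    """Both animations exhibit the same response characteristic;
    the second is assumed cheaper to display."""
    if cost_metric > device_threshold:
        return second
    return first

shown = choose_animation("fluid_bounce", "simple_bounce",
                         cost_metric=8.5, device_threshold=5.0)
```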
-
Patent number: 11087430
Abstract: Systems, methods, and computer readable media to data drive a render graph are described. A render graph system defines one or more nodes for a render graph and one or more render targets associated with the nodes. The nodes include one or more functions to define and resolve target handles for identifying render targets. The render graph system defines one or more connections between the nodes and render targets. The connections between the nodes and render targets form the render graph. The render graph system stores the render graph as a data file and converts, with a render graphics API, the data file into a render graph data object. The render graph system performs a frame setup phase that sets up the render graph for a frame based on the render graph data object.
Type: Grant
Filed: September 27, 2019
Date of Patent: August 10, 2021
Assignee: Apple Inc.
Inventors: Cody J. White, Randal W. Lamore, Pal-Kristian Engstad, Ivan Gavrenkov, Matthew Stoll, Yang Zhou
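A minimal sketch of the data-driven idea: the graph of nodes, render targets, and connections lives in a data file, is loaded into a graph object, and a frame setup phase resolves each node's target handles. The node names, target names, and JSON layout are invented; the patent does not specify a file format:

```python
import json

# Hypothetical sketch of a data-driven render graph. Names and the
# JSON encoding are invented for illustration.

class RenderGraph:
    def __init__(self, data):
        self.nodes = [n["name"] for n in data["nodes"]]
        self.targets = set(data["targets"])
        self.connections = [(c["node"], c["target"])
                            for c in data["connections"]]

    def frame_setup(self):
        """Resolve each node's target handles for the coming frame."""
        resolved = {name: [] for name in self.nodes}
        for node, target in self.connections:
            assert target in self.targets, f"unknown target: {target}"
            resolved[node].append(target)
        return resolved

# The graph is stored as data rather than hard-coded in the renderer.
data_file = json.dumps({
    "nodes": [{"name": "shadow_pass"}, {"name": "lighting_pass"}],
    "targets": ["shadow_map", "hdr_color"],
    "connections": [
        {"node": "shadow_pass", "target": "shadow_map"},
        {"node": "lighting_pass", "target": "shadow_map"},
        {"node": "lighting_pass", "target": "hdr_color"},
    ],
})

graph = RenderGraph(json.loads(data_file))
frame_targets = graph.frame_setup()
```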
-
Publication number: 20200104970
Abstract: Systems, methods, and computer readable media to data drive a render graph are described. A render graph system defines one or more nodes for a render graph and one or more render targets associated with the nodes. The nodes include one or more functions to define and resolve target handles for identifying render targets. The render graph system defines one or more connections between the nodes and render targets. The connections between the nodes and render targets form the render graph. The render graph system stores the render graph as a data file and converts, with a render graphics API, the data file into a render graph data object. The render graph system performs a frame setup phase that sets up the render graph for a frame based on the render graph data object.
Type: Application
Filed: September 27, 2019
Publication date: April 2, 2020
Inventors: Cody J. White, Randal W. Lamore, Pal-Kristian Engstad, Ivan Gavrenkov, Matthew Stoll, Yang Zhou
-
Patent number: 8174527
Abstract: A system and method for environment mapping determines a computer-generated object's reflective appearance based upon the position and orientation of a camera with respect to the object's location. An embodiment of the present invention is implemented as a real-time environment mapping for polygon rendering; however, the scope of the invention covers other rendering schemes. According to one embodiment of the present invention, a vector processing unit (VPU) uses a modified reflection formula, r = e - (e · (n + e0))(n + e0)/(1 - nz) = e - (e · [nx, ny, nz - 1])[nx, ny, nz - 1]/(1 - nz), wherein e0 = [0, 0, -1], and nx, ny, and nz are the components of the surface normal vector n, to compute reflective properties of an object.
Type: Grant
Filed: July 23, 2010
Date of Patent: May 8, 2012
Assignee: Sony Computer Entertainment America LLC
Inventors: Mark Evan Cerny, Pal-Kristian Engstad
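With the mis-encoded characters restored, the formula reads r = e - (e · (n + e0))(n + e0)/(1 - nz) with e0 = [0, 0, -1]. Since |n + e0|² = 2(1 - nz) for a unit normal n, this is an ordinary mirror reflection of e about the normalized n + e0, which is why it preserves the length of e. A small sketch (the vector layout and function name are my own, not from the patent):

```python
import math

def modified_reflection(e, n):
    """r = e - (e . (n + e0))(n + e0) / (1 - nz), with e0 = [0, 0, -1].
    Assumes n is a unit surface normal with nz != 1."""
    m = (n[0], n[1], n[2] - 1.0)   # n + e0 = [nx, ny, nz - 1]
    scale = (e[0] * m[0] + e[1] * m[1] + e[2] * m[2]) / (1.0 - n[2])
    return tuple(e[i] - scale * m[i] for i in range(3))

# Reflection preserves length, so |r| == |e| for any unit normal.
e = (1.0, 2.0, 3.0)
n = (0.0, 0.6, -0.8)               # a unit surface normal
r = modified_reflection(e, n)
```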
-
Patent number: 8149242
Abstract: There is provided a graphics processing system that includes a main processing unit and a graphics processing unit (GPU). The main processing unit puts rendering commands generated using a graphics library into the queue of a command buffer in main memory. In this process, the library functions offered by the graphics library are converted into rendering commands without any rendering attributes being retained in the library. The GPU reads and executes the rendering commands stacked in the command buffer and generates rendering data in a frame buffer.
Type: Grant
Filed: October 30, 2007
Date of Patent: April 3, 2012
Assignee: Sony Computer Entertainment Inc.
Inventors: Eric Langyel, Pal-Kristian Engstad, Mark Evan Cerny, Nathaniel Hoffman, Jon Olick, Motoi Kaneko, Yoshinori Washizu
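The split between the main processing unit and the GPU can be sketched as a producer/consumer pair around a command queue. The command encoding and names below are invented; a real implementation writes hardware-specific packets, not tuples:

```python
from collections import deque

# Hypothetical sketch: the "main processing unit" converts library
# calls directly into queued commands (retaining no rendering
# attributes in the library), and a simulated "GPU" drains the queue
# into a frame buffer. Command names and encoding are invented.

command_buffer = deque()

def draw(command, *args):
    """Library entry point: convert a call straight into a queued
    rendering command, keeping no state behind in the library."""
    command_buffer.append((command, args))

def gpu_execute(frame_buffer):
    """GPU side: read and execute stacked commands in order."""
    while command_buffer:
        command, args = command_buffer.popleft()
        frame_buffer.append((command, args))   # stand-in for rendering

draw("clear", 0x000000)
draw("triangle", (0, 0), (1, 0), (0, 1))
frame = []
gpu_execute(frame)
```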
-
Publication number: 20100283783
Abstract: A system and method for environment mapping determines a computer-generated object's reflective appearance based upon the position and orientation of a camera with respect to the object's location. An embodiment of the present invention is implemented as a real-time environment mapping for polygon rendering; however, the scope of the invention covers other rendering schemes. According to one embodiment of the present invention, a vector processing unit (VPU) uses a modified reflection formula, r = e - (e · (n + e0))(n + e0)/(1 - nz) = e - (e · [nx, ny, nz - 1])[nx, ny, nz - 1]/(1 - nz), wherein e0 = [0, 0, -1], and nx, ny, and nz are the components of the surface normal vector n, to compute reflective properties of an object.
Type: Application
Filed: July 23, 2010
Publication date: November 11, 2010
Inventors: Mark Evan Cerny, Pal-Kristian Engstad
-
Patent number: 7786993
Abstract: A system and method for environment mapping determines a computer-generated object's reflective appearance based upon the position and orientation of a camera with respect to the object's location. An embodiment of the present invention is implemented as a real-time environment mapping for polygon rendering; however, the scope of the invention covers other rendering schemes. According to one embodiment of the present invention, a vector processing unit (VPU) uses a modified reflection formula, r = e - (e · (n + e0))(n + e0)/(1 - nz) = e - (e · [nx, ny, nz - 1])[nx, ny, nz - 1]/(1 - nz), wherein e0 = [0, 0, -1], and nx, ny, and nz are the components of the surface normal vector n, to compute reflective properties of an object.
Type: Grant
Filed: September 7, 2005
Date of Patent: August 31, 2010
Assignee: Sony Computer Entertainment America LLC
Inventors: Mark Evan Cerny, Pal-Kristian Engstad
-
Publication number: 20090002380
Abstract: There is provided a graphics processing system that includes a main processing unit and a graphics processing unit (GPU). The main processing unit puts rendering commands generated using a graphics library into the queue of a command buffer in main memory. In this process, the library functions offered by the graphics library are converted into rendering commands without any rendering attributes being retained in the library. The GPU reads and executes the rendering commands stacked in the command buffer and generates rendering data in a frame buffer.
Type: Application
Filed: October 30, 2007
Publication date: January 1, 2009
Applicant: Sony Computer Entertainment Inc.
Inventors: Eric Langyel, Pal-Kristian Engstad, Mark Evan Cerny, Nathaniel Hoffman, Jon Olick, Motoi Kaneko, Yoshinori Washizu
-
Patent number: 7046245
Abstract: A system and method for environment mapping determines a computer-generated object's reflective appearance based upon the position and orientation of a camera with respect to the object's location. The present invention is implemented as a real-time environment mapping for polygon rendering; however, the scope of the invention covers other rendering schemes. According to one embodiment of the present invention, a vector processing unit (VPU) uses a modified reflection formula to compute reflective properties of an object. The modified reflection formula is r = e - (e · (n + e0))(n + e0)/(1 - nz) = e - (e · [nx, ny, nz - 1])[nx, ny, nz - 1]/(1 - nz), where e0 = [0, 0, -1], and nx, ny, and nz are the components of the surface normal vector n.
Type: Grant
Filed: October 8, 2002
Date of Patent: May 16, 2006
Assignee: Sony Computer Entertainment America Inc.
Inventors: Mark Evan Cerny, Pal-Kristian Engstad
-
Publication number: 20060001674
Abstract: A system and method for environment mapping determines a computer-generated object's reflective appearance based upon the position and orientation of a camera with respect to the object's location. An embodiment of the present invention is implemented as a real-time environment mapping for polygon rendering; however, the scope of the invention covers other rendering schemes. According to one embodiment of the present invention, a vector processing unit (VPU) uses a modified reflection formula, r = e - (e · (n + e0))(n + e0)/(1 - nz) = e - (e · [nx, ny, nz - 1])[nx, ny, nz - 1]/(1 - nz), wherein e0 = [0, 0, -1], and nx, ny, and nz are the components of the surface normal vector n, to compute reflective properties of an object.
Type: Application
Filed: September 7, 2005
Publication date: January 5, 2006
Inventors: Mark Cerny, Pal-Kristian Engstad
-
Publication number: 20030112238
Abstract: A system and method for environment mapping determines a computer-generated object's reflective appearance based upon the position and orientation of a camera with respect to the object's location. The present invention is implemented as a real-time environment mapping for polygon rendering; however, the scope of the invention covers other rendering schemes. According to one embodiment of the present invention, a vector processing unit (VPU) uses a modified reflection formula to compute reflective properties of an object.
Type: Application
Filed: October 8, 2002
Publication date: June 19, 2003
Inventors: Mark Evan Cerny, Pal-Kristian Engstad