Patents by Inventor Marko Palviainen
Marko Palviainen has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).
Patent number: 11954789
Abstract: Systems and methods are described for rendering a 3D synthetic scene. A display client receives point cloud samples of the scene, where the point cloud samples include a point location, one or more viewing directions, and color information for each of the viewing directions. The point cloud samples may be generated by a server using ray tracing. The display client combines information from the point cloud samples with a locally generated rendering of at least a portion of the scene to generate a combined rendering of the scene, and the client displays the combined rendering. The number of point cloud samples may be adjusted adaptively based on performance metrics at the client.
Type: Grant
Filed: July 2, 2020
Date of Patent: April 9, 2024
Assignee: InterDigital VC Holdings, Inc.
Inventors: Tatu V. J. Harviainen, Marko Palviainen
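As an illustration of the hybrid rendering this abstract describes, the sketch below combines server-supplied point cloud samples with a locally rasterised image and adapts the requested sample count to the client's frame time. The data layout, the nearest-direction colour selection, and the frame-budget thresholds are assumptions for illustration, not details taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class PointCloudSample:
    pixel: tuple      # (x, y) screen location the point projects to
    view_dirs: list   # candidate viewing directions, as unit vectors
    colors: list      # one RGB colour per viewing direction

def nearest_dir_color(sample, cam_dir):
    # Pick the colour whose stored viewing direction best matches the camera.
    dots = [sum(a * b for a, b in zip(d, cam_dir)) for d in sample.view_dirs]
    return sample.colors[dots.index(max(dots))]

def combine(local_render, samples, cam_dir):
    # Overlay ray-traced point-cloud colours onto the locally rendered image,
    # keyed by pixel; untouched pixels keep the local rasterised colour.
    out = dict(local_render)
    for s in samples:
        out[s.pixel] = nearest_dir_color(s, cam_dir)
    return out

def adjust_sample_count(current, frame_ms, target_ms=16.6):
    # Request fewer samples when the client misses its frame budget,
    # and ramp back up when there is headroom.
    if frame_ms > target_ms:
        return max(1, int(current * 0.8))
    return int(current * 1.1)
```

A real client would project 3D point locations into screen space; here each sample already carries its target pixel to keep the sketch short.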
Publication number: 20230377273
Abstract: Systems and methods are described for mirroring content from a 2D display device to a 3D display device. Techniques are provided that include receiving information identifying a virtual 3D object and first scale information, where the virtual 3D object is displayed on a 2D display device to a user using the first scale information. A second scale is then determined based on the first scale information, and a 3D representation of the virtual 3D object is displayed on a 3D display device using the second scale. Provided techniques further include receiving first orientation information, where the virtual 3D object is further displayed using the first orientation information. A second orientation is determined based on the first orientation information and on a received location of the user relative to the 3D display device, and the 3D representation is further displayed using the second orientation.
Type: Application
Filed: July 25, 2023
Publication date: November 23, 2023
Inventors: Tatu V. J. Harviainen, Marko Palviainen
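A minimal sketch of the scale and orientation mapping this application describes, under two assumptions not stated in the abstract: that the second scale is proportional to the physical size of the target display, and that the second orientation turns the mirrored object toward the user's bearing from the 3D display.

```python
import math

def mirror_scale(first_scale, src_display_m, dst_display_m):
    # Assumed mapping: keep the object's size proportional to the
    # physical size of the display it appears on.
    return first_scale * (dst_display_m / src_display_m)

def mirror_orientation(first_yaw_deg, user_pos, display_pos):
    # Assumed mapping: rotate the mirrored object so the side the user
    # saw on the 2D device faces the user's location relative to the
    # 3D display. Positions are (x, z) in the display's ground plane.
    dx = user_pos[0] - display_pos[0]
    dz = user_pos[1] - display_pos[1]
    toward_user = math.degrees(math.atan2(dx, dz))
    return (first_yaw_deg + toward_user) % 360.0
```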
Patent number: 11741673
Abstract: Systems and methods are described for mirroring 3D content from a first display (which may be a handheld 2D display) to a second display (which may be a 3D display, such as a light field display). The 3D content is initially displayed (e.g. as a 2D projection) on the first display. The relative positions and/or orientations of the first and second displays are determined. The position of a user viewing the content may also be determined, or may be inferred from the position and orientation of the first display. The second display is provided with information used to display the 3D content with a size and/or orientation that preserves the original apparent size and/or apparent orientation of that content from the perspective of the user.
Type: Grant
Filed: November 26, 2019
Date of Patent: August 29, 2023
Assignee: InterDigital Madison Patent Holdings, SAS
Inventors: Tatu V. J. Harviainen, Marko Palviainen
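The "preserves the original apparent size" property can be illustrated with elementary geometry: if the user's distance to the second display differs from the distance to the first, scaling the content by the ratio of distances keeps its angular size constant. This is one sufficient condition sketched under assumed distance inputs, not the claimed method.

```python
import math

def preserved_scale(obj_size, d_first, d_second):
    # An object of height h at distance d subtends 2*atan(h / (2*d)).
    # Scaling h by d_second / d_first leaves that angle unchanged.
    return obj_size * (d_second / d_first)

def angular_size(h, d):
    # Angle (radians) subtended by an object of height h seen from distance d.
    return 2.0 * math.atan(h / (2.0 * d))
```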
Publication number: 20230032824
Abstract: Some embodiments of a method may include tracking a distance between a local VR device and a remote VR device; selecting an experience-sharing mode from a plurality of experience-sharing modes based on the tracked distance; and providing a different degree of immersive user experience to a user at the local VR device depending on the selected experience-sharing mode, wherein as the distance between the local VR device and the remote VR device decreases, the experience-sharing mode changes and the degree of immersive user experience increases. Some embodiments of another method may include detecting a gesture made by a user; selecting an immersiveness mode from a plurality of immersiveness modes based on the detected gesture; and providing a degree of immersive user experience to a local VR device user depending on the selected immersiveness mode.
Type: Application
Filed: October 14, 2022
Publication date: February 2, 2023
Inventors: Tatu V. J. Harviainen, Marko Palviainen
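The distance-to-mode mapping in this abstract could look like the following sketch, where immersion increases as the tracked distance shrinks; the thresholds and mode names are hypothetical.

```python
def select_sharing_mode(distance_m):
    # Hypothetical thresholds: closer devices unlock a more immersive
    # experience-sharing mode, as the abstract describes.
    if distance_m < 2.0:
        return "full-immersion"
    if distance_m < 10.0:
        return "partial-view"
    return "notification-only"
```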
Publication number: 20220394355
Abstract: Systems and methods are described for displaying 360-degree video using a viewing direction and/or a viewpoint position that is determined based on a user-selected level of automation. If full automation is selected by the user, the viewing direction and/or viewport position is determined by a defined viewing path. If partial automation is selected, the viewing direction and/or viewport position is determined by user input in combination with the defined viewing path. For example, a high-pass-filtered version of the user input may be added to the defined viewing path to obtain a partially-automated viewing direction and/or viewpoint position.
Type: Application
Filed: August 12, 2022
Publication date: December 8, 2022
Inventors: Tatu V. J. Harviainen, Marko Palviainen
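The partial-automation example in this abstract (a high-pass-filtered version of the user input added to the defined viewing path) can be sketched with a first-order high-pass filter; the filter structure and coefficient are assumptions, since the abstract does not specify them.

```python
def high_pass(samples, alpha=0.9):
    # First-order high-pass: passes quick head movements and discards
    # slow drift, so the long-term direction stays on the authored path.
    out, prev_x, prev_y = [], samples[0], 0.0
    for x in samples:
        y = alpha * (prev_y + x - prev_x)
        out.append(y)
        prev_x, prev_y = x, y
    return out

def blended_yaw(path_yaw, user_yaw, alpha=0.9):
    # Partially-automated viewing direction: the defined viewing path
    # plus the high-pass-filtered user input, sample by sample.
    hp = high_pass(user_yaw, alpha)
    return [p + h for p, h in zip(path_yaw, hp)]
```

With a constant (non-moving) user input, the filter output is zero everywhere and the viewing direction follows the authored path exactly.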
Patent number: 11493999
Abstract: Some embodiments of a method may include tracking a distance between a local VR device and a remote VR device; selecting an experience-sharing mode from a plurality of experience-sharing modes based on the tracked distance; and providing a different degree of immersive user experience to a user at the local VR device depending on the selected experience-sharing mode, wherein as the distance between the local VR device and the remote VR device decreases, the experience-sharing mode changes and the degree of immersive user experience increases. Some embodiments of another method may include detecting a gesture made by a user; selecting an immersiveness mode from a plurality of immersiveness modes based on the detected gesture; and providing a degree of immersive user experience to a local VR device user depending on the selected immersiveness mode.
Type: Grant
Filed: April 26, 2019
Date of Patent: November 8, 2022
Assignee: PCMS Holdings, Inc.
Inventors: Tatu V. J. Harviainen, Marko Palviainen
Publication number: 20220351456
Abstract: Systems and methods are described for rendering a 3D synthetic scene. A display client receives point cloud samples of the scene, where the point cloud samples include a point location, one or more viewing directions, and color information for each of the viewing directions. The point cloud samples may be generated by a server using ray tracing. The display client combines information from the point cloud samples with a locally generated rendering of at least a portion of the scene to generate a combined rendering of the scene, and the client displays the combined rendering. The number of point cloud samples may be adjusted adaptively based on performance metrics at the client.
Type: Application
Filed: July 2, 2020
Publication date: November 3, 2022
Inventors: Tatu V. J. Harviainen, Marko Palviainen
Patent number: 11451881
Abstract: Systems and methods are described for displaying 360-degree video using a viewing direction and/or a viewpoint position that is determined based on a user-selected level of automation. If full automation is selected by the user, the viewing direction and/or viewport position is determined by a defined viewing path. If partial automation is selected, the viewing direction and/or viewport position is determined by user input in combination with the defined viewing path. For example, a high-pass-filtered version of the user input may be added to the defined viewing path to obtain a partially-automated viewing direction and/or viewpoint position. Example systems and methods may be implemented using a head-mounted display (HMD).
Type: Grant
Filed: December 12, 2018
Date of Patent: September 20, 2022
Assignee: InterDigital Madison Patent Holdings, SAS
Inventors: Tatu V. J. Harviainen, Marko Palviainen
Publication number: 20220068023
Abstract: Systems and methods are described for mirroring 3D content from a first display (which may be a handheld 2D display) to a second display (which may be a 3D display, such as a light field display). The 3D content is initially displayed (e.g. as a 2D projection) on the first display. The relative positions and/or orientations of the first and second displays are determined. The position of a user viewing the content may also be determined, or may be inferred from the position and orientation of the first display. The second display is provided with information used to display the 3D content with a size and/or orientation that preserves the original apparent size and/or apparent orientation of that content from the perspective of the user.
Type: Application
Filed: November 26, 2019
Publication date: March 3, 2022
Inventors: Tatu V. J. Harviainen, Marko Palviainen
System and method for distributing and rendering content as spherical video and 3D asset combination
Patent number: 11202051
Abstract: Motion parallax effects can be emulated for 3D video content. At a head mounted display (HMD), motion parallax may be emulated through: receiving a 3D video at the HMD; obtaining a model of an object in the 3D video; obtaining a processed 3D video in which the 3D video is processed to remove the object from the 3D video; tracking a change in position of the HMD by a sensor of the HMD; rendering the processed 3D video at the HMD; and rendering the model of the object at a position in the processed 3D video based on the tracked change in position of the HMD. Multilayer spherical video, indicating the depths of objects therein, which may be used for motion parallax emulation may also be generated.
Type: Grant
Filed: May 11, 2018
Date of Patent: December 14, 2021
Assignee: PCMS Holdings, Inc.
Inventors: Tatu V. J. Harviainen, Marko Palviainen
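The parallax emulation described in this entry (re-rendering an extracted object at a position based on tracked HMD movement) can be sketched as a depth-scaled offset: near objects shift more than far ones for the same head movement. The inverse-depth scaling and the 2D screen-space treatment are simplifying assumptions, not details from the patent.

```python
def parallax_offset(obj_pos, obj_depth, hmd_delta, strength=1.0):
    # Shift the extracted object opposite to the tracked head movement,
    # scaled by inverse depth: nearby objects move more than distant ones.
    dx, dy = hmd_delta
    inv_d = 1.0 / obj_depth
    return (obj_pos[0] - strength * dx * inv_d,
            obj_pos[1] - strength * dy * inv_d)
```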
Publication number: 20210240279
Abstract: Some embodiments of a method may include tracking a distance between a local VR device and a remote VR device; selecting an experience-sharing mode from a plurality of experience-sharing modes based on the tracked distance; and providing a different degree of immersive user experience to a user at the local VR device depending on the selected experience-sharing mode, wherein as the distance between the local VR device and the remote VR device decreases, the experience-sharing mode changes and the degree of immersive user experience increases. Some embodiments of another method may include detecting a gesture made by a user; selecting an immersiveness mode from a plurality of immersiveness modes based on the detected gesture; and providing a degree of immersive user experience to a local VR device user depending on the selected immersiveness mode.
Type: Application
Filed: April 26, 2019
Publication date: August 5, 2021
Inventors: Tatu V. J. Harviainen, Marko Palviainen
System and method for distributing and rendering content as spherical video and 3D asset combination
Publication number: 20210084278
Abstract: Motion parallax effects can be emulated for 3D video content. At a head mounted display (HMD), motion parallax may be emulated through: receiving a 3D video at the HMD; obtaining a model of an object in the 3D video; obtaining a processed 3D video in which the 3D video is processed to remove the object from the 3D video; tracking a change in position of the HMD by a sensor of the HMD; rendering the processed 3D video at the HMD; and rendering the model of the object at a position in the processed 3D video based on the tracked change in position of the HMD. Multilayer spherical video, indicating the depths of objects therein, which may be used for motion parallax emulation may also be generated.
Type: Application
Filed: May 11, 2018
Publication date: March 18, 2021
Inventors: Tatu V. J. Harviainen, Marko Palviainen
Publication number: 20200336801
Abstract: Systems and methods are described for displaying 360-degree video using a viewing direction and/or a viewpoint position that is determined based on a user-selected level of automation. If full automation is selected by the user, the viewing direction and/or viewport position is determined by a defined viewing path. If partial automation is selected, the viewing direction and/or viewport position is determined by user input in combination with the defined viewing path. For example, a high-pass-filtered version of the user input may be added to the defined viewing path to obtain a partially-automated viewing direction and/or viewpoint position. Example systems and methods may be implemented using a head-mounted display (HMD).
Type: Application
Filed: December 12, 2018
Publication date: October 22, 2020
Inventors: Tatu V. J. Harviainen, Marko Palviainen
Publication number: 20180254959
Abstract: Methods, apparatus, and systems for performing any of generating, adapting, selecting, transferring, and displaying a user interface (UI) in a network are provided. A representative method of generating a user interface on a user interface device for operating a device on a network includes detecting a first predetermined user interaction with the user interface device indicative of a desire to use the user interface device to operate a device on a network, and, responsive to detection of the first predetermined user interaction, generating by the user interface device a first user interface, wherein the first user interface is generated at a location relative to the user interface device based on the location of the detected first predetermined user interaction with the user interface device.
Type: Application
Filed: September 1, 2016
Publication date: September 6, 2018
Applicant: PCMS Holdings, Inc.
Inventors: Jani Mantyjarvi, Jussi Ronkainen, Marko Palviainen, Markus Tuomikoski
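The location-relative interface generation in this application could be as simple as centring the generated UI on the detected interaction point and clamping it to the screen; the coordinate convention and the clamping behaviour here are assumptions for illustration.

```python
def ui_anchor(touch_xy, size_wh, screen_wh):
    # Center the generated UI on the interaction point, then clamp the
    # top-left corner so the whole UI stays on screen.
    x = min(max(touch_xy[0] - size_wh[0] / 2, 0), screen_wh[0] - size_wh[0])
    y = min(max(touch_xy[1] - size_wh[1] / 2, 0), screen_wh[1] - size_wh[1])
    return (x, y)
```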