Patents by Inventor Marko Palviainen

Marko Palviainen has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11954789
    Abstract: Systems and methods are described for rendering a 3D synthetic scene. A display client receives point cloud samples of the scene, where the point cloud samples include a point location, one or more viewing directions, and color information for each of the viewing directions. The point cloud samples may be generated by a server using ray tracing. The display client combines information from the point cloud samples with a locally generated rendering of at least a portion of the scene to generate a combined rendering of the scene, and the client displays the combined rendering. The number of point cloud samples may be adjusted adaptively based on performance metrics at the client.
    Type: Grant
    Filed: July 2, 2020
    Date of Patent: April 9, 2024
    Assignee: InterDigital VC Holdings, Inc.
    Inventors: Tatu V. J. Harviainen, Marko Palviainen
  • Publication number: 20230377273
    Abstract: Systems and methods are described for mirroring content from a 2D display device to a 3D display device. Techniques are provided that include receiving information identifying a virtual 3D object and first scale information, where the virtual 3D object is displayed on a 2D display device to a user using the first scale information. A second scale is then determined based on the first scale information, and a 3D representation of the virtual 3D object is displayed on a 3D display device using the second scale. Provided techniques further include receiving first orientation information, where the virtual 3D object is further displayed using the first orientation information. A second orientation is determined based on the first orientation information and on a received location of the user relative to the 3D display device, and the 3D representation is further displayed using the second orientation.
    Type: Application
    Filed: July 25, 2023
    Publication date: November 23, 2023
    Inventors: Tatu V. J. Harviainen, Marko Palviainen
  • Patent number: 11741673
    Abstract: Systems and methods are described for mirroring 3D content from a first display (which may be a handheld 2D display) to a second display (which may be a 3D display, such as a light field display). The 3D content is initially displayed (e.g., as a 2D projection) on the first display. The relative positions and/or orientations of the first and second displays are determined. The position of a user viewing the content may also be determined or may be inferred from the position and orientation of the first display. The second display is provided with information used to display the 3D content with a size and/or orientation that preserves the original apparent size and/or apparent orientation of that content from the perspective of the user.
    Type: Grant
    Filed: November 26, 2019
    Date of Patent: August 29, 2023
    Assignee: InterDigital Madison Patent Holdings, SAS
    Inventors: Tatu V. J. Harviainen, Marko Palviainen
  • Publication number: 20230032824
    Abstract: Some embodiments of a method may include tracking a distance between a local VR device and a remote VR device; selecting an experience-sharing mode from a plurality of experience-sharing modes based on the tracked distance; and providing a different degree of immersive user experience to a user at the local VR device depending on the selected experience-sharing mode, wherein as the distance between the local VR device and the remote VR device decreases, the experience-sharing mode changes and the degree of immersive user experience increases. Some embodiments of another method may include detecting a gesture made by a user; selecting an immersiveness mode from a plurality of immersiveness modes based on the detected gesture; and providing a degree of immersive user experience to a local VR device user depending on the selected immersiveness mode.
    Type: Application
    Filed: October 14, 2022
    Publication date: February 2, 2023
    Inventors: Tatu V. J. Harviainen, Marko Palviainen
  • Publication number: 20220394355
    Abstract: Systems and methods are described for displaying 360-degree video using a viewing direction and/or a viewpoint position that is determined based on a user-selected level of automation. If full automation is selected by the user, the viewing direction and/or viewport position is determined by a defined viewing path. If partial automation is selected, the viewing direction and/or viewport position is determined by user input in combination with the defined viewing path. For example, a high-pass-filtered version of the user input may be added to the defined viewing path to obtain a partially-automated viewing direction and/or viewpoint position.
    Type: Application
    Filed: August 12, 2022
    Publication date: December 8, 2022
    Inventors: Tatu V. J. Harviainen, Marko Palviainen
  • Patent number: 11493999
    Abstract: Some embodiments of a method may include tracking a distance between a local VR device and a remote VR device; selecting an experience-sharing mode from a plurality of experience-sharing modes based on the tracked distance; and providing a different degree of immersive user experience to a user at the local VR device depending on the selected experience-sharing mode, wherein as the distance between the local VR device and the remote VR device decreases, the experience-sharing mode changes and the degree of immersive user experience increases. Some embodiments of another method may include detecting a gesture made by a user; selecting an immersiveness mode from a plurality of immersiveness modes based on the detected gesture; and providing a degree of immersive user experience to a local VR device user depending on the selected immersiveness mode.
    Type: Grant
    Filed: April 26, 2019
    Date of Patent: November 8, 2022
    Assignee: PCMS Holdings, Inc.
    Inventors: Tatu V. J. Harviainen, Marko Palviainen
  • Publication number: 20220351456
    Abstract: Systems and methods are described for rendering a 3D synthetic scene. A display client receives point cloud samples of the scene, where the point cloud samples include a point location, one or more viewing directions, and color information for each of the viewing directions. The point cloud samples may be generated by a server using ray tracing. The display client combines information from the point cloud samples with a locally generated rendering of at least a portion of the scene to generate a combined rendering of the scene, and the client displays the combined rendering. The number of point cloud samples may be adjusted adaptively based on performance metrics at the client.
    Type: Application
    Filed: July 2, 2020
    Publication date: November 3, 2022
    Inventors: Tatu V. J. Harviainen, Marko Palviainen
  • Patent number: 11451881
    Abstract: Systems and methods are described for displaying 360-degree video using a viewing direction and/or a viewpoint position that is determined based on a user-selected level of automation. If full automation is selected by the user, the viewing direction and/or viewport position is determined by a defined viewing path. If partial automation is selected, the viewing direction and/or viewport position is determined by user input in combination with the defined viewing path. For example, a high-pass-filtered version of the user input may be added to the defined viewing path to obtain a partially-automated viewing direction and/or viewpoint position. Example systems and methods may be implemented using a head-mounted display (HMD).
    Type: Grant
    Filed: December 12, 2018
    Date of Patent: September 20, 2022
    Assignee: InterDigital Madison Patent Holdings, SAS
    Inventors: Tatu V. J. Harviainen, Marko Palviainen
  • Publication number: 20220068023
    Abstract: Systems and methods are described for mirroring 3D content from a first display (which may be a handheld 2D display) to a second display (which may be a 3D display, such as a light field display). The 3D content is initially displayed (e.g., as a 2D projection) on the first display. The relative positions and/or orientations of the first and second displays are determined. The position of a user viewing the content may also be determined or may be inferred from the position and orientation of the first display. The second display is provided with information used to display the 3D content with a size and/or orientation that preserves the original apparent size and/or apparent orientation of that content from the perspective of the user.
    Type: Application
    Filed: November 26, 2019
    Publication date: March 3, 2022
    Inventors: Tatu V. J. Harviainen, Marko Palviainen
  • Patent number: 11202051
    Abstract: Motion parallax effects can be emulated for 3D video content. At a head mounted display (HMD), motion parallax may be emulated through: receiving a 3D video at the HMD; obtaining a model of an object in the 3D video; obtaining a processed 3D video in which the 3D video is processed to remove the object from the 3D video; tracking a change in position of the HMD by a sensor of the HMD; rendering the processed 3D video at the HMD; and rendering the model of the object at a position in the processed 3D video based on the tracked change in position of the HMD. Multilayer spherical video, indicating the depths of objects therein, which may be used for motion parallax emulation may also be generated.
    Type: Grant
    Filed: May 11, 2018
    Date of Patent: December 14, 2021
    Assignee: PCMS Holdings, Inc.
    Inventors: Tatu V. J. Harviainen, Marko Palviainen
  • Publication number: 20210240279
    Abstract: Some embodiments of a method may include tracking a distance between a local VR device and a remote VR device; selecting an experience-sharing mode from a plurality of experience-sharing modes based on the tracked distance; and providing a different degree of immersive user experience to a user at the local VR device depending on the selected experience-sharing mode, wherein as the distance between the local VR device and the remote VR device decreases, the experience-sharing mode changes and the degree of immersive user experience increases. Some embodiments of another method may include detecting a gesture made by a user; selecting an immersiveness mode from a plurality of immersiveness modes based on the detected gesture; and providing a degree of immersive user experience to a local VR device user depending on the selected immersiveness mode.
    Type: Application
    Filed: April 26, 2019
    Publication date: August 5, 2021
    Inventors: Tatu V. J. Harviainen, Marko Palviainen
  • Publication number: 20210084278
    Abstract: Motion parallax effects can be emulated for 3D video content. At a head mounted display (HMD), motion parallax may be emulated through: receiving a 3D video at the HMD; obtaining a model of an object in the 3D video; obtaining a processed 3D video in which the 3D video is processed to remove the object from the 3D video; tracking a change in position of the HMD by a sensor of the HMD; rendering the processed 3D video at the HMD; and rendering the model of the object at a position in the processed 3D video based on the tracked change in position of the HMD. Multilayer spherical video, indicating the depths of objects therein, which may be used for motion parallax emulation may also be generated.
    Type: Application
    Filed: May 11, 2018
    Publication date: March 18, 2021
    Inventors: Tatu V. J. Harviainen, Marko Palviainen
  • Publication number: 20200336801
    Abstract: Systems and methods are described for displaying 360-degree video using a viewing direction and/or a viewpoint position that is determined based on a user-selected level of automation. If full automation is selected by the user, the viewing direction and/or viewport position is determined by a defined viewing path. If partial automation is selected, the viewing direction and/or viewport position is determined by user input in combination with the defined viewing path. For example, a high-pass-filtered version of the user input may be added to the defined viewing path to obtain a partially-automated viewing direction and/or viewpoint position. Example systems and methods may be implemented using a head-mounted display (HMD).
    Type: Application
    Filed: December 12, 2018
    Publication date: October 22, 2020
    Inventors: Tatu V. J. Harviainen, Marko Palviainen
  • Publication number: 20180254959
    Abstract: Methods, apparatus, and systems for performing any of generating, adapting, selecting, transferring, and displaying a user interface (UI) in a network are provided. A representative method of generating a user interface on a user interface device for operating a device on a network includes detecting a first predetermined user interaction with the user interface device indicative of a desire to use the user interface device to operate a device on a network, and responsive to detection of the first predetermined user interaction, generating by the user interface device a first user interface, wherein the first user interface is generated at a location relative to the user interface device based on the location of the detected first predetermined user interaction with the user interface device.
    Type: Application
    Filed: September 1, 2016
    Publication date: September 6, 2018
    Applicant: PCMS Holdings, Inc.
    Inventors: Jani Mantyjarvi, Jussi Ronkainen, Marko Palviainen, Markus Tuomikoski
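
Several of the 360-degree video entries above (e.g., patent 11451881) describe partial automation as adding a high-pass-filtered version of user input to a defined viewing path. The sketch below illustrates that idea for a yaw-only viewing direction; it is an illustrative interpretation, not code from the patents, and the filter form, the `alpha` constant, and the function names are all assumptions.

```python
def high_pass(samples, alpha=0.8):
    """First-order high-pass filter over a sequence of yaw angles (degrees).

    A constant (slowly varying) user input is filtered out, so only quick,
    deliberate head movements deviate from the defined path.
    """
    out = []
    prev_in = samples[0]
    prev_out = 0.0
    for x in samples:
        y = alpha * (prev_out + x - prev_in)  # standard discrete high-pass step
        out.append(y)
        prev_in, prev_out = x, y
    return out


def blend_viewing_direction(path_yaw, user_yaw, alpha=0.8):
    """Partially automated viewing direction: defined path plus
    high-pass-filtered user input, per-frame."""
    return [p + u for p, u in zip(path_yaw, high_pass(user_yaw, alpha))]
```

With a constant user input, the high-pass output is zero and the blended direction follows the defined path exactly (full automation as a limiting case); rapid user motion is passed through and offsets the path temporarily.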