Patents Assigned to Varjo Technologies Oy
  • Patent number: 11863788
    Abstract: An encoder for encoding images includes at least one processor configured to transform a given input pixel of an input image having (x, y) coordinates in a Cartesian coordinate system into a given transformed pixel of a transformed image having (ρ, θ) coordinates in a log-polar coordinate system, using a log-polar transformation in which a radial distance (ρ) of the given transformed pixel is a logarithm of a distance of the given input pixel from an origin in the Cartesian coordinate system, and an angular distance (θ) of the given transformed pixel is a sum of an arctangent of a slope of a line connecting the given input pixel to the origin and a function of the radial distance, encode the transformed image, by employing a compression algorithm, into an encoded image, and send the encoded image to a display apparatus for subsequent decoding thereat.
    Type: Grant
    Filed: April 8, 2022
    Date of Patent: January 2, 2024
    Assignee: Varjo Technologies Oy
    Inventors: Mikko Strandborg, Ville Miettinen
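    Illustrative sketch: the claimed log-polar mapping can be restated in a few lines of Python. The unspecified "function of the radial distance" is passed in as f (a placeholder returning 0 by default), and the symbol names rho/theta are assumptions, not taken from the patent text.
    ```python
    import math

    def log_polar(x: float, y: float, f=lambda rho: 0.0):
        # Distance of the given input pixel from the origin; assumes (x, y) != (0, 0).
        r = math.hypot(x, y)
        rho = math.log(r)                   # radial distance: logarithm of that distance
        theta = math.atan2(y, x) + f(rho)   # arctangent of the slope, plus a function of rho
        return rho, theta
    ```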
  • Patent number: 11863786
    Abstract: A method of transmitting image data in an image display system, includes dividing the image data into framebuffers, and for each framebuffer: dividing the framebuffer into a number of vertical stripes, each stripe including one or more scanlines, dividing each vertical stripe into at least a first and a second block, each of the first and the second block comprising pixel data to be displayed in an area of the image, and storing first pixel data in the first block with a first resolution and second pixel data in the second block having a second resolution which is lower than the first resolution, transmitting the framebuffer over the digital display interface to a decoder unit, and unpacking the framebuffer, including upscaling the pixel data in the second block to compensate for the lower second resolution and optionally upscaling the pixel data in the first block.
    Type: Grant
    Filed: May 21, 2021
    Date of Patent: January 2, 2024
    Assignee: Varjo Technologies Oy
    Inventors: Mikko Strandborg, Oiva Arvo Oskari Sahlsten, Ville Miettinen
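    Illustrative sketch: one way to read the pack/unpack step, assuming a NumPy framebuffer stripe, a column index `split` separating the two blocks, 2× decimation for the lower-resolution block, and nearest-neighbour upscaling on unpacking; none of these specifics come from the patent.
    ```python
    import numpy as np

    def pack_stripe(stripe: np.ndarray, split: int, scale: int = 2):
        # First block: full resolution. Second block: every `scale`-th column, i.e. a
        # lower second resolution. Assumes (width - split) is divisible by `scale`.
        return stripe[:, :split], stripe[:, split::scale]

    def unpack_stripe(first: np.ndarray, second: np.ndarray, scale: int = 2):
        # Upscale the low-resolution block (nearest neighbour) to compensate, then re-join.
        upscaled = np.repeat(second, scale, axis=1)
        return np.concatenate([first, upscaled], axis=1)
    ```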
  • Publication number: 20230412932
    Abstract: An imaging system includes first camera having negative distortion; second camera, second field of view of second camera being wider than first field of view of first camera, wherein first field of view fully overlaps with portion of second field of view, second camera having negative distortion at said portion and positive distortion at remaining portion; and processor(s) configured to: capture first image and second image; determine overlapping image segment and non-overlapping image segment of second image; and generate output image from first image and second image, wherein: inner image segment of output image is generated from at least one of: first image, overlapping image segment, and peripheral image segment of output image is generated from non-overlapping image segment.
    Type: Application
    Filed: May 20, 2022
    Publication date: December 21, 2023
    Applicant: Varjo Technologies Oy
    Inventor: Mikko Ollila
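    Illustrative sketch: composing the output image from the two cameras, assuming both images have already been warped onto a common output grid and that `overlap_mask` marks where the second field of view overlaps the first; these assumptions are mine, not the patent's.
    ```python
    import numpy as np

    def compose_output(first_img: np.ndarray, second_img: np.ndarray, overlap_mask: np.ndarray):
        # Periphery: start from the wide second image (its non-overlapping segment).
        out = second_img.copy()
        # Inner segment: take pixels from the first image where the fields of view overlap.
        out[overlap_mask] = first_img[overlap_mask]
        return out
    ```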
  • Publication number: 20230379594
    Abstract: An imaging system includes first camera; second camera, second field of view of second camera being wider than first field of view of first camera, wherein first field of view overlaps with portion of second field of view; and processor(s) configured to: capture first images and second images, wherein overlapping image segment and non-overlapping image segment of second image correspond to said portion and remaining portion of second field of view; determine blurred region(s) (B1, B2) of first image; and generate output image in manner that: inner image segment of output image is generated from: region(s) of overlapping image segment that corresponds to blurred region(s) of first image, and remaining region of first image that is not blurred, and peripheral image segment of output image is generated from non-overlapping image segment.
    Type: Application
    Filed: May 20, 2022
    Publication date: November 23, 2023
    Applicant: Varjo Technologies Oy
    Inventors: Mikko Ollila, Petteri Timonen
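    Illustrative sketch: the blur-aware variant, assuming the overlapping segment of the second image is co-registered with the first image and that blurred regions are found with a simple Laplacian-variance test on fixed tiles; the test, tile size and threshold are illustrative choices rather than the claimed method.
    ```python
    import cv2

    def compose_with_blur_fallback(first_img, second_overlap, threshold=100.0, patch=32):
        out = first_img.copy()
        gray = cv2.cvtColor(first_img, cv2.COLOR_BGR2GRAY)
        h, w = gray.shape
        for y in range(0, h, patch):
            for x in range(0, w, patch):
                tile = gray[y:y + patch, x:x + patch]
                # Low variance of the Laplacian suggests the tile is blurred.
                if cv2.Laplacian(tile, cv2.CV_64F).var() < threshold:
                    out[y:y + patch, x:x + patch] = second_overlap[y:y + patch, x:x + patch]
        return out
    ```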
  • Publication number: 20230336944
    Abstract: Disclosed is a computer-implemented method comprising: tracking positions and orientations of devices (104, 106, 204a-204f, A-F, 402, 404) within real-world environment (300), each device comprising active sensor(s) (108, 110, 206a-206f); classifying devices into groups, based on positions and orientations of devices within real-world environment, wherein a group has devices whose active sensors are likely to interfere with each other; and controlling active sensors of devices in the group to operate by employing multiplexing.
    Type: Application
    Filed: April 14, 2022
    Publication date: October 19, 2023
    Applicant: Varjo Technologies Oy
    Inventors: Ville Timonen, Mika-Petteri Lundgren
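    Illustrative sketch: grouping devices whose active sensors are likely to interfere (here a simple distance-and-facing heuristic on 2-D poses) and giving each group member its own time slot, i.e. time-division multiplexing; the heuristic, thresholds and frame period are assumptions.
    ```python
    import math

    def facing(a, b):
        # True if device a's forward direction (yaw, radians) points roughly towards b.
        dx, dy = b["pos"][0] - a["pos"][0], b["pos"][1] - a["pos"][1]
        bearing = math.atan2(dy, dx)
        diff = abs((a["yaw"] - bearing + math.pi) % (2 * math.pi) - math.pi)
        return diff < math.pi / 2

    def likely_to_interfere(a, b, max_dist=5.0):
        d = math.hypot(b["pos"][0] - a["pos"][0], b["pos"][1] - a["pos"][1])
        return d <= max_dist and facing(a, b) and facing(b, a)

    def assign_time_slots(group, frame_period_ms=33.3):
        # Time-division multiplexing: each device in the group gets its own slot.
        slot = frame_period_ms / len(group)
        return {dev["id"]: (i * slot, (i + 1) * slot) for i, dev in enumerate(group)}
    ```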
  • Publication number: 20230334691
    Abstract: Disclosed is a system (100) comprising devices (102a, 102b, 200, 300, 304), each device comprising active illuminator, active sensor and processor, wherein processor (108a) of device (102a, 300) is configured to: project pattern of light onto surroundings (302), whilst detecting reflections of pattern of light off surroundings; determine shapes of surfaces present in surroundings and distances of surfaces from pose of device; obtain pattern information indicative of other pattern(s) of light projected by other active illuminator(s) (104b) of other device(s) (102b, 304); detect reflections of other pattern(s) from pose of device; determine relative pose of other device(s) with respect to pose of device; and send, to server (110, 208), surface information indicative of shapes and distances of surfaces from pose of device, along with pose information indicative of relative pose of other device(s) with respect to pose of device.
    Type: Application
    Filed: April 14, 2022
    Publication date: October 19, 2023
    Applicant: Varjo Technologies Oy
    Inventors: Ari Antti Erik Peuhkurinen, Urho Konttori
  • Publication number: 20230328283
    Abstract: An encoder for encoding images includes at least one processor configured to transform a given input pixel of an input image having (x, y) coordinates in a Cartesian coordinate system into a given transformed pixel of a transformed image having (ρ, θ) coordinates in a log-polar coordinate system, using a log-polar transformation in which a radial distance (ρ) of the given transformed pixel is a logarithm of a distance of the given input pixel from an origin in the Cartesian coordinate system, and an angular distance (θ) of the given transformed pixel is a sum of an arctangent of a slope of a line connecting the given input pixel to the origin and a function of the radial distance, encode the transformed image, by employing a compression algorithm, into an encoded image, and send the encoded image to a display apparatus for subsequent decoding thereat.
    Type: Application
    Filed: April 8, 2022
    Publication date: October 12, 2023
    Applicant: Varjo Technologies Oy
    Inventors: Mikko Strandborg, Ville Miettinen
  • Publication number: 20230326074
    Abstract: Disclosed is a system (100, 200) comprising server (102, 202, 308) and data repository (104, 204, 310) storing three-dimensional (3D) environment model, wherein server is configured to: receive, from client device (106, 206, 300), first image(s) of real-world environment captured by camera(s) (108, 208, 302) of client device, along with information indicative of first measured pose of client device measured by pose-tracking means (110, 210, 304) of client device; utilise 3D environment model to generate first reconstructed image(s) from perspective of first measured pose; determine first spatial transformation indicative of difference in first measured pose and first actual pose of client device; calculate first actual pose, based on first measured pose and first spatial transformation; and send information indicative of at least one of: first actual pose, first spatial transformation, to client device for enabling client device to calculate subsequent actual poses.
    Type: Application
    Filed: April 8, 2022
    Publication date: October 12, 2023
    Applicant: Varjo Technologies Oy
    Inventors: Mikko Strandborg, Pekka Väänänen, Petteri Timonen
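    Illustrative sketch: applying the first spatial transformation to a measured pose to obtain the actual pose, with poses represented as 4x4 homogeneous matrices and the correction applied by left-multiplication; the matrix representation is an assumption.
    ```python
    import numpy as np

    def correct_pose(measured_pose: np.ndarray, spatial_transform: np.ndarray) -> np.ndarray:
        # Both arguments are 4x4 homogeneous transforms; the product is the actual pose.
        return spatial_transform @ measured_pose

    # The client device can keep applying the same transform to subsequent measured
    # poses until the server sends an updated one.
    ```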
  • Publication number: 20230316577
    Abstract: A method implemented by a server communicably coupled to at least two devices, each device including camera(s), the devices being present within same real-world environment. The method includes: receiving, from the devices, images captured by respective cameras of the devices; identifying one of the devices whose camera has camera parameter(s) better than camera parameter(s) of camera of another of the devices; training neural network using images captured by camera of one of the devices as ground truth material and using images captured by camera of another of the devices as training material; generating correction information to correct images captured by camera of another of the devices using trained neural network; and correcting the images captured by the camera of another of the devices by utilising the correction information at the server, or sending correction information to another of the devices for correcting the images.
    Type: Application
    Filed: March 22, 2022
    Publication date: October 5, 2023
    Applicant: Varjo Technologies Oy
    Inventor: Mikko Ollila
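    Illustrative sketch: the training step, with images from the better camera as ground truth and aligned images from the other camera as training material; the tiny CNN, loss, optimiser and the hypothetical `paired_loader` of aligned image pairs are all assumptions.
    ```python
    import torch
    import torch.nn as nn

    model = nn.Sequential(                      # small stand-in for the trained network
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 3, 3, padding=1),
    )
    optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.MSELoss()

    for weak_img, good_img in paired_loader:    # hypothetical loader of aligned image pairs
        pred = model(weak_img)                  # images from the camera to be corrected
        loss = loss_fn(pred, good_img)          # images from the better camera = ground truth
        optimiser.zero_grad()
        loss.backward()
        optimiser.step()
    ```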
  • Patent number: 11758077
    Abstract: Disclosed is a system (100) comprising server(s) (102) communicably coupled to devices (104a, 104b, 202a-202c). Server(s) is/are configured to: receive depth information generated by said device using active illuminator and active sensor; detect when at least two devices are in same surroundings, based on at least partial matching of depth information received from at least two devices; and send instructions to at least two devices to generate new depth information by: controlling active illuminator (106a) of one of at least two devices (104a, 202a) to project sequence of light patterns on same surroundings, whilst switching off active illuminator (106b) of another of at least two devices (104b, 202b), and controlling first active sensor (108a) of one of at least two devices and second active sensor (108b) of another of at least two devices to sense simultaneously reflections of sequence of light patterns.
    Type: Grant
    Filed: April 14, 2022
    Date of Patent: September 12, 2023
    Assignee: Varjo Technologies Oy
    Inventor: Mikko Ollila
  • Patent number: 11727658
    Abstract: A system including server(s) configured to: receive, from host device, visible-light images of real-world environment captured by visible-light camera(s); process visible-light images to generate three-dimensional (3D) environment model; receive, from client device, information indicative of pose of client device; utilise 3D environment model to generate reconstructed image(s) and reconstructed depth map(s); determine position of each pixel of reconstructed image(s); receive, from host device, current visible-light image(s); receive, from host device, information indicative of current pose of host device, or determine said current pose; determine, for pixel of reconstructed image(s), whether or not corresponding pixel exists in current visible-light image(s); replace initial pixel values of pixel in reconstructed image(s) with pixel values of corresponding pixel in current visible-light image(s), when corresponding pixel exists in current visible-light image(s); and send reconstructed image(s) to client device.
    Type: Grant
    Filed: October 1, 2021
    Date of Patent: August 15, 2023
    Assignee: Varjo Technologies Oy
    Inventors: Mikko Strandborg, Petteri Timonen
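    Illustrative sketch: the pixel-replacement step, where a reconstructed pixel whose 3-D position reprojects into the host device's current visible-light image has its initial value replaced by the live pixel; `reproject()` is a hypothetical helper standing in for projection through the host's current pose and is assumed to return integer pixel coordinates or None.
    ```python
    def refresh_reconstruction(recon_img, recon_positions, current_img, current_pose, reproject):
        # recon_positions[y, x] holds the 3-D position determined for that reconstructed pixel.
        out = recon_img.copy()
        h, w = current_img.shape[:2]
        for y in range(out.shape[0]):
            for x in range(out.shape[1]):
                uv = reproject(recon_positions[y, x], current_pose)   # hypothetical helper
                if uv is None:
                    continue                                          # no corresponding pixel
                u, v = uv
                if 0 <= v < h and 0 <= u < w:
                    out[y, x] = current_img[v, u]                     # replace initial pixel values
        return out
    ```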
  • Publication number: 20230251744
    Abstract: A system including: user device(s) providing user with user interface; input device; and sensor(s). Processor of user device(s) is configured to: identify surface of object and determine pose of user's hand and input device; determine first distance and second distance; determine whether input device is in contact with: user's hand, surface of object, or both user's hand and surface. When input device is in contact with surface or both user's hand and surface, processor is configured to control input device to operate in first mode. When input device is in contact with user's hand, processor is configured to control input device to operate in second mode. System enables user to interact with user interface by operating input device as computer mouse during first mode and as six-degrees-of-freedom controller during second mode.
    Type: Application
    Filed: February 9, 2022
    Publication date: August 10, 2023
    Applicant: Varjo Technologies Oy
    Inventor: Ari Antti Erik Peuhkurinen
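    Illustrative sketch: the mode decision, assuming contact flags are already available from the sensor(s); the enum names and the no-contact fallback are assumptions.
    ```python
    from enum import Enum

    class Mode(Enum):
        MOUSE = 1        # first mode: input device operates as a computer mouse
        SIX_DOF = 2      # second mode: operates as a six-degrees-of-freedom controller

    def select_mode(touching_hand: bool, touching_surface: bool, current: Mode) -> Mode:
        if touching_surface:             # surface, or both hand and surface -> first mode
            return Mode.MOUSE
        if touching_hand:                # hand only -> second mode
            return Mode.SIX_DOF
        return current                   # no contact: keep the previous mode (assumption)
    ```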
  • Publication number: 20230245408
    Abstract: A system includes server(s) configured to: receive plurality of images of real-world environment captured by camera(s); process a number of images to detect plurality of objects present in a real-world environment and generate a three-dimensional environment model of the real-world environment; classify each of the objects as either a static or dynamic object; receive current image(s) of the real-world environment; process the current image(s) to detect object(s); determine whether or not the object(s) is/are from amongst the plurality of objects; determine whether the object(s) is a static object or dynamic object when it is determined that the object(s) is/are from amongst the plurality of objects; and for each dynamic object that is represented in the three-dimensional environment model but not in a current image(s), apply a first visual effect to a representation of the dynamic object in the three-dimensional environment model for indicating staleness of the representation.
    Type: Application
    Filed: February 2, 2022
    Publication date: August 3, 2023
    Applicant: Varjo Technologies Oy
    Inventors: Mikko Strandborg, Petteri Timonen
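    Illustrative sketch: the staleness rule for dynamic objects; the abstract only says "a first visual effect", so `apply_fade` is a hypothetical callback (for example, desaturating the object's representation in the model).
    ```python
    def mark_stale_objects(model_objects, object_ids_in_current_image, apply_fade):
        # A dynamic object represented in the 3-D environment model but absent from the
        # current image(s) gets the first visual effect to indicate staleness.
        for obj in model_objects:
            if obj["dynamic"] and obj["id"] not in object_ids_in_current_image:
                apply_fade(obj)          # hypothetical callback applying the visual effect
    ```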
  • Publication number: 20230245345
    Abstract: Disclosed is a system (100) for implementing object-based camera calibration, the system comprising first camera (102) and processor(s) (104) configured to: detect occurrence of calibration event(s); obtain object information indicative of actual features of calibration objects; capture image(s); detect features of at least portion of real-world object(s) represented in image(s) (200); identify calibration object(s) that match real-world object(s), based on comparison of detected features of at least portion of real-world object(s) with actual features of calibration objects; determine camera calibration parameters, based on differences between actual features of calibration object(s) and detected features of at least portion of real-world object(s); and correct distortion(s) in the image(s) and subsequent images captured by first camera using the camera calibration parameters.
    Type: Application
    Filed: February 2, 2022
    Publication date: August 3, 2023
    Applicant: Varjo Technologies Oy
    Inventors: Ari Antti Erik Peuhkurinen, Kai Inha
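    Illustrative sketch: turning matched features of an identified calibration object into camera calibration parameters with OpenCV, then undistorting subsequent images; the use of cv2.calibrateCamera and the already-matched 3-D/2-D correspondences are assumptions about how the determination could be carried out.
    ```python
    import numpy as np
    import cv2

    def calibrate_from_object(object_points, image_points, image_size):
        # object_points: known 3-D feature positions of the identified calibration object;
        # image_points: the matching detected 2-D features in the captured image.
        ret, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
            [np.asarray(object_points, np.float32)],
            [np.asarray(image_points, np.float32)],
            image_size, None, None)
        return camera_matrix, dist_coeffs

    def undistort(image, camera_matrix, dist_coeffs):
        # Correct distortion(s) in this and subsequent images with the same parameters.
        return cv2.undistort(image, camera_matrix, dist_coeffs)
    ```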
  • Publication number: 20230213755
    Abstract: An imaging system including: first camera and second camera; depth-mapping means; gaze-tracking means; and processor configured to: generate depth map of real-world scene; determine gaze directions of first eye and second eye; identify line of sight and conical region of interest; determine optical depths of first object and second object present in conical region; determine one of first camera and second camera having lesser occlusion in real-world scene; adjust optical focus of one of first camera and second camera to focus on one of first object and second object having greater optical depth, and adjust optical focus of another of first camera and second camera to focus on another of first object and second object; and capture first image(s) and second image(s) using adjusted optical focuses of cameras.
    Type: Application
    Filed: January 3, 2022
    Publication date: July 6, 2023
    Applicant: Varjo Technologies Oy
    Inventor: Mikko Ollila
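    Illustrative sketch: one reading of the focus-assignment rule, where the less-occluded camera focuses on the object at the greater optical depth and the other camera on the nearer object; the camera objects and their `set_focus()` method are hypothetical, and the pairing of cameras to depths is an interpretation.
    ```python
    def assign_focus(cam_a, cam_b, occlusion_a, occlusion_b, depth_near, depth_far):
        # The camera with lesser occlusion focuses on the object at the greater optical
        # depth; the other camera focuses on the nearer object.
        less_occluded, other = (cam_a, cam_b) if occlusion_a <= occlusion_b else (cam_b, cam_a)
        less_occluded.set_focus(depth_far)
        other.set_focus(depth_near)
    ```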
  • Patent number: 11688040
    Abstract: An imaging system for correcting visual artifacts during production of extended-reality images for display apparatus. The imaging system includes at least first camera and second camera for capturing first image and second image of real-world environment, respectively; and processor(s) configured to: analyse first and second images to identify visual artifact(s) and determine image segment of one of first image and second image that corresponds to visual artifact(s); generate image data for image segment, based on at least one of: information pertaining to virtual object, other image segment(s) neighbouring image segment, corresponding image segment in other of first image and second image, previous extended-reality image(s), photogrammetric model of real-world environment; and process one of first image and second image, based on image data, to produce extended-reality image for display apparatus.
    Type: Grant
    Filed: April 9, 2021
    Date of Patent: June 27, 2023
    Assignee: Varjo Technologies Oy
    Inventor: Mikko Ollila
  • Patent number: 11688046
    Abstract: A system including image sensor(s) including a plurality of pixels arranged on a photo-sensitive surface thereof; and image signal processor(s) configured to: receive, from image sensor(s), a plurality of image signals captured by corresponding pixels of image sensor(s); and process the plurality of image signals to generate at least one image, wherein, when processing, image signal processor(s) is configured to: determine, for a given image signal to be processed, a position of a given pixel on the photo-sensitive surface that is employed to capture the given image signal; and selectively perform a sequence of image signal processes on the given image signal and control a plurality of parameters employed for performing the sequence of image signal processes, based on the position of the given pixel.
    Type: Grant
    Filed: April 23, 2021
    Date of Patent: June 27, 2023
    Assignee: Varjo Technologies Oy
    Inventors: Mikko Ollila, Kai Inha
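    Illustrative sketch: selecting image-signal-processing parameters from a pixel's position on the photo-sensitive surface, here by normalised distance from the optical centre with a two-zone split; the zones, parameter names and values are illustrative.
    ```python
    import math

    def params_for_pixel(x, y, width, height, fovea_radius=0.3):
        # Normalised radial distance of the pixel from the centre of the sensor.
        cx, cy = width / 2, height / 2
        r = math.hypot(x - cx, y - cy) / math.hypot(cx, cy)
        if r < fovea_radius:                       # central zone: heavier processing
            return {"denoise": 0.2, "sharpen": 1.0, "demosaic": "high_quality"}
        return {"denoise": 0.6, "sharpen": 0.4, "demosaic": "fast"}   # periphery: cheaper
    ```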
  • Publication number: 20230195217
    Abstract: A display apparatus including: light source(s); gaze-tracking means; and processor(s) configured to: determine gaze directions of user's eyes; send, to rendering server, information indicative of gaze direction determined at first time instant; receive image frame(s) generated according to gaze, and being optionally timestamped with second time instant; display image frame(s) at third time instant; determine time lag between any one of: first time instant and third time instant, or second time instant and third time instant; detect whether or not time lag exceeds first predefined threshold; when time lag exceeds first predefined threshold, switch on gaze-lock mode; select forward line of vision as fixed gaze direction; send, to rendering server, information indicative of fixed gaze; and receive image frames generated according to fixed gaze; and display image frames.
    Type: Application
    Filed: December 17, 2021
    Publication date: June 22, 2023
    Applicant: Varjo Technologies Oy
    Inventors: Ari Antti Erik Peuhkurinen, Evgeny Zuev
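    Illustrative sketch: the gaze-lock decision, comparing the lag between the first time instant (gaze measured) and the third time instant (frame displayed) against the first predefined threshold; the threshold value and the forward-gaze vector are assumptions.
    ```python
    FIRST_THRESHOLD_MS = 50.0            # assumed value for the first predefined threshold
    FORWARD_GAZE = (0.0, 0.0, 1.0)       # assumed vector for the forward line of vision

    def choose_gaze_to_send(first_instant_ms, third_instant_ms, live_gaze):
        time_lag = third_instant_ms - first_instant_ms   # display time minus gaze time
        if time_lag > FIRST_THRESHOLD_MS:                # switch on gaze-lock mode
            return FORWARD_GAZE                          # fixed gaze direction
        return live_gaze
    ```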
  • Publication number: 20230186500
    Abstract: A computer-implemented method including: capturing visible-light images via visible-light camera(s) from view points in real-world environment, wherein 3D positions of view points are represented in coordinate system; dividing 3D space of real-world environment into 3D grid of convex-polyhedral regions; creating 3D data structure including nodes representing convex-polyhedral regions of 3D space; determining 3D positions of pixels of visible-light images based on 3D positions of view points; dividing each visible-light image into portions, wherein 3D positions of pixels of given portion of said visible-light image fall inside corresponding convex-polyhedral region; and storing, in each node, portions of visible-light images whose pixels' 3D positions fall inside corresponding convex-polyhedral region, wherein each portion of visible-light image is stored in corresponding node.
    Type: Application
    Filed: December 10, 2021
    Publication date: June 15, 2023
    Applicant: Varjo Technologies Oy
    Inventors: Mikko Strandborg, Petteri Timonen
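    Illustrative sketch: storing image portions in the nodes of a 3-D grid, with axis-aligned cubes standing in for the convex-polyhedral regions; the cell size and the use of one pixel position per portion are assumptions.
    ```python
    from collections import defaultdict

    CELL = 1.0   # metres per grid cell (assumed)

    def cell_of(p):
        # Index of the grid cell (a cube standing in for a convex-polyhedral region).
        return (int(p[0] // CELL), int(p[1] // CELL), int(p[2] // CELL))

    class GridStore:
        def __init__(self):
            self.nodes = defaultdict(list)           # node per region -> stored image portions

        def store_portion(self, portion_pixel_positions, portion_image):
            # All pixels of a portion fall inside one region, so one position suffices.
            self.nodes[cell_of(portion_pixel_positions[0])].append(portion_image)
    ```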
  • Patent number: 11675430
    Abstract: A display apparatus including: light source(s); gaze-tracking means; and processor(s) configured to: determine gaze directions of user's eyes; send, to rendering server, information indicative of gaze direction determined at first time instant; receive image frame(s) generated according to gaze, and being optionally timestamped with second time instant; display image frame(s) at third time instant; determine time lag between any one of: first time instant and third time instant, or second time instant and third time instant; detect whether or not time lag exceeds first predefined threshold; when time lag exceeds first predefined threshold, switch on gaze-lock mode; select forward line of vision as fixed gaze direction; send, to rendering server, information indicative of fixed gaze; and receive image frames generated according to fixed gaze; and display image frames.
    Type: Grant
    Filed: December 17, 2021
    Date of Patent: June 13, 2023
    Assignee: Varjo Technologies Oy
    Inventors: Ari Antti Erik Peuhkurinen, Evgeny Zuev