Patents by Inventor Juri Platonov

Juri Platonov has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11709542
    Abstract: A multi-user virtual reality system for providing a virtual reality experience to a plurality of users in a body of water includes a reference system adapted for emitting and/or receiving signals. The multi-user virtual reality system includes equipment configured to be mounted to a first user in the body of water, wherein the equipment includes a first display unit and a first signal emitting or receiving system adapted for emitting and/or receiving signals. The multi-user virtual reality system includes equipment configured to be mounted to a second user in the body of water, wherein the equipment includes a second display unit and a second signal emitting or receiving system adapted for emitting and/or receiving signals. The multi-user virtual reality system includes a data processing system including one or more data processing units.
    Type: Grant
    Filed: July 30, 2021
    Date of Patent: July 25, 2023
    Assignee: Shhuna GmbH
    Inventors: Juri Platonov, Pawel Dürr
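The abstract above describes a system architecture rather than a specific algorithm: a fixed reference system, per-user equipment consisting of a display unit plus a signal emitting/receiving system, and a data processing system. The following is a minimal structural sketch of such a setup; the class names (SignalTransceiver, UserEquipment, MultiUserVRSystem) and the stub pose update are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class SignalTransceiver:
    """A signal emitting and/or receiving system (e.g. an acoustic transducer)."""
    device_id: str

@dataclass
class UserEquipment:
    """Equipment mounted to one user in the body of water: a display unit plus a transceiver."""
    display_unit: str
    transceiver: SignalTransceiver

@dataclass
class MultiUserVRSystem:
    reference: SignalTransceiver                 # fixed reference system in or near the water
    users: list = field(default_factory=list)    # one UserEquipment per user

    def update_poses(self, measurements):
        """Data processing unit stub: map per-device signal measurements
        (device_id -> (range_m, bearing_rad)) to a pose per display unit."""
        poses = {}
        for eq in self.users:
            rng, bearing = measurements.get(eq.transceiver.device_id, (0.0, 0.0))
            poses[eq.display_unit] = {"range_m": rng, "bearing_rad": bearing}
        return poses

# Example: two divers tracked relative to a single reference transceiver.
system = MultiUserVRSystem(
    reference=SignalTransceiver("ref0"),
    users=[UserEquipment("hmd1", SignalTransceiver("tx1")),
           UserEquipment("hmd2", SignalTransceiver("tx2"))],
)
print(system.update_poses({"tx1": (2.5, 0.1), "tx2": (4.0, -0.3)}))
```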
  • Patent number: 11378427
    Abstract: A method of sensor fusion for systems including at least one sensor and at least one target object is provided. The method includes receiving configuration data at a processing device. The configuration data includes a description of a first system including one or more sensors and one or more target objects. The configuration data includes an indication that one or more geometric parameters and/or one or more sensor parameters of the first system are unknown. The method includes receiving an instruction at the processing device that the received configuration data is to be adjusted into adjusted configuration data. The adjusted configuration data includes a description of a second system including one or more sensors and one or more target objects, wherein the second system is different from the first system. The adjusted configuration data includes an indication that one or more geometric parameters and/or one or more sensor parameters of the second system are unknown.
    Type: Grant
    Filed: May 27, 2020
    Date of Patent: July 5, 2022
    Assignee: Shhuna GmbH
    Inventors: Juri Platonov, Pawel Kaczmarczyk
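The claimed method centres on configuration data that describes the sensors and target objects of a system, flags which geometric or sensor parameters are unknown, and can be adjusted by instruction into the description of a different system. Below is a hedged sketch of what such configuration data and an adjustment step might look like; all field names, the use of None to mark unknowns, and the instruction format are assumptions, not the patented data model.

```python
import copy

# Hypothetical configuration structure; field names are illustrative only.
first_config = {
    "sensors": [
        {"id": "cam0", "type": "camera", "intrinsics": None},   # None marks an unknown sensor parameter
        {"id": "imu0", "type": "imu", "bias": [0.0, 0.0, 0.0]},
    ],
    "targets": [
        {"id": "marker1", "pose": None},                         # unknown geometric parameter
    ],
}

def adjust_configuration(config, instruction):
    """Apply an adjustment instruction, yielding a description of a different (second) system.
    `instruction` is a dict such as {"add_sensor": {...}} or
    {"mark_unknown": ("sensors", "imu0", "bias")}."""
    adjusted = copy.deepcopy(config)
    if "add_sensor" in instruction:
        adjusted["sensors"].append(instruction["add_sensor"])
    if "mark_unknown" in instruction:
        section, item_id, param = instruction["mark_unknown"]
        for item in adjusted[section]:
            if item["id"] == item_id:
                item[param] = None        # None = parameter left for the fusion back end to estimate
    return adjusted

second_config = adjust_configuration(
    first_config, {"add_sensor": {"id": "cam1", "type": "camera", "intrinsics": None}}
)
print(len(second_config["sensors"]))      # -> 3: the second system differs from the first
```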
  • Publication number: 20220043507
    Abstract: A multi-user virtual reality system for providing a virtual reality experience to a plurality of users in a body of water includes a reference system adapted for emitting and/or receiving signals. The multi-user virtual reality system includes equipment configured to be mounted to a first user in the body of water, wherein the equipment includes a first display unit and a first signal emitting or receiving system adapted for emitting and/or receiving signals. The multi-user virtual reality system includes equipment configured to be mounted to a second user in the body of water, wherein the equipment includes a second display unit and a second signal emitting or receiving system adapted for emitting and/or receiving signals. The multi-user virtual reality system includes a data processing system including one or more data processing units.
    Type: Application
    Filed: July 30, 2021
    Publication date: February 10, 2022
    Inventors: Juri Platonov, Pawel Dürr
  • Publication number: 20200378808
    Abstract: A method of sensor fusion for systems including at least one sensor and at least one target object is provided. The method includes receiving configuration data at a processing device. The configuration data includes a description of a first system including one or more sensors and one or more target objects. The configuration data includes an indication that one or more geometric parameters and/or one or more sensor parameters of the first system are unknown. The method includes receiving an instruction at the processing device that the received configuration data is to be adjusted into adjusted configuration data. The adjusted configuration data includes a description of a second system including one or more sensors and one or more target objects, wherein the second system is different from the first system. The adjusted configuration data includes an indication that one or more geometric parameters and/or one or more sensor parameters of the second system are unknown.
    Type: Application
    Filed: May 27, 2020
    Publication date: December 3, 2020
    Inventors: Juri Platonov, Pawel Kaczmarczyk
  • Patent number: 9262828
    Abstract: A method for determining the orientation of a video camera attached to a vehicle relative to the vehicle coordinate system is disclosed. Using a video camera, an incremental motion is measured based on the optical flow of the video stream. A linear motion component and a rotational motion component are determined. Based on a determined rotational angle, the incremental motion is classified as linear or as rotational. The directional vector of the linear motion is used to estimate the longitudinal axis of the vehicle, and the rotational vector of the rotational motion is used for estimating the normal to the vehicle plane. Based thereon, the orientation of the camera follows.
    Type: Grant
    Filed: January 17, 2013
    Date of Patent: February 16, 2016
    Assignee: ESG Elektroniksystem- und Logistik-GmbH
    Inventors: Christopher Pflug, Juri Platonov, Thomas Gebauer, Pawel Kaczmarczyk
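The abstract outlines a concrete estimation procedure: classify each incremental camera motion as linear or rotational by its rotation angle, use the directions of the linear increments to estimate the vehicle's longitudinal axis, use the rotation axes of the rotational increments to estimate the normal to the vehicle plane, and assemble the camera-to-vehicle orientation from those two axes. Below is a compact sketch of that idea; the function name, the angle threshold, and the simple averaging are assumptions rather than the patented implementation.

```python
import numpy as np

def estimate_camera_to_vehicle_rotation(directions, rot_axes, rot_angles, angle_thresh=0.01):
    """Estimate the camera-to-vehicle rotation from incremental motions measured in the
    camera frame (e.g. by visual odometry on the optical flow).
    directions: (N,3) unit translation directions; rot_axes: (N,3) unit rotation axes;
    rot_angles: (N,) rotation angles in radians."""
    directions = np.asarray(directions, float)
    rot_axes = np.asarray(rot_axes, float)
    rot_angles = np.asarray(rot_angles, float)

    linear = np.abs(rot_angles) < angle_thresh     # classify increments: small angle -> linear motion
    rotational = ~linear

    # Longitudinal axis of the vehicle: mean direction of the linear increments.
    x_axis = directions[linear].mean(axis=0)
    x_axis /= np.linalg.norm(x_axis)

    # Normal to the vehicle plane: mean rotation axis of the rotational increments (turns about the vertical).
    z_axis = rot_axes[rotational].mean(axis=0)
    z_axis /= np.linalg.norm(z_axis)

    # Complete a right-handed orthonormal basis; rows are vehicle axes expressed in camera coordinates.
    y_axis = np.cross(z_axis, x_axis)
    y_axis /= np.linalg.norm(y_axis)
    z_axis = np.cross(x_axis, y_axis)
    return np.vstack([x_axis, y_axis, z_axis])     # rotation matrix mapping camera frame to vehicle frame
```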
  • Patent number: 9129164
    Abstract: A driver assist system is provided that generates a video signal representing a vehicle environment outside a vehicle. At least one feature is extracted from the video signal. A reference is selected from a plurality of reference features stored as location attributes in a map database. The extracted feature is compared to at least one reference feature. An object in the vehicle environment is identified based on the comparison of the extracted feature and the reference feature. An indication is provided to a driver of the vehicle on the basis of the identified object. In one example, the system includes a video capturing device, an indicating device, a vehicle-based processing resource and access to a map database server. Processing tasks may be distributed among the vehicle-based processing resource and an external processing resource.
    Type: Grant
    Filed: February 3, 2010
    Date of Patent: September 8, 2015
    Assignee: Harman Becker Automotive Systems GmbH
    Inventors: Juri Platonov, Alexey Pryakhin, Peter Kunath
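The core step in this patent is comparing features extracted from the video signal with reference features stored as location attributes in a map database, identifying objects from the closest matches, and indicating them to the driver. A self-contained sketch of that matching step follows; the data layout, the Euclidean descriptor distance, and the threshold value are assumptions, not the patented implementation.

```python
import numpy as np

def identify_objects(extracted_descriptors, reference_features, match_thresh=0.7):
    """extracted_descriptors: (N, D) array of descriptors extracted from the video signal.
    reference_features: list of (descriptor, object_label) pairs selected from the map
    database as location attributes near the current vehicle position.
    Returns the labels of identified objects for indication to the driver."""
    identified = []
    ref_desc = np.stack([d for d, _ in reference_features])       # (M, D)
    labels = [label for _, label in reference_features]
    for desc in np.atleast_2d(extracted_descriptors):
        dists = np.linalg.norm(ref_desc - desc, axis=1)           # compare to each reference feature
        best = int(np.argmin(dists))
        if dists[best] < match_thresh:                            # accept only close matches
            identified.append(labels[best])
    return identified

# Example: two reference features from the map, one extracted feature matching the first.
refs = [(np.array([0.1, 0.9, 0.2]), "speed limit 60"),
        (np.array([0.8, 0.1, 0.5]), "pedestrian crossing")]
print(identify_objects(np.array([[0.12, 0.88, 0.21]]), refs))     # -> ['speed limit 60']
```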
  • Publication number: 20150036885
    Abstract: A method for determining the orientation of a video camera attached to a vehicle relative to the vehicle coordinate system, comprising: measuring an incremental vehicle motion in the coordinate system of the video camera based on the optical flow of the video stream of the video camera; representing the incremental vehicle motion by a motion consisting of a linear motion component and a rotational motion, wherein the linear motion component is represented by a directional vector and the rotational motion is represented by a rotation axis and a rotational angle; classifying the measured incremental vehicle motion as a linear motion or as a rotational motion based on the determined rotational angle; measuring at least one incremental motion which has been classified as a linear motion; measuring at least one incremental motion which has been classified as a rotational motion; using the directional vector of the at least one measured linear motion for estimating the longitudinal axis of the vehicle; and using the rotation axis of the at least one measured rotational motion for estimating the normal to the vehicle plane, from which the orientation of the camera follows.
    Type: Application
    Filed: January 17, 2013
    Publication date: February 5, 2015
    Inventors: Christopher Pflug, Juri Platonov, Thomas Gebauer, Pawel Kaczmarczyk
  • Patent number: 8929604
    Abstract: A vision system comprises a camera that captures an image and a processor coupled to process the received image to determine at least one feature descriptor for the image. The processor includes an interface to access annotated map data that includes geo-referenced feature descriptors. The processor is configured to perform a matching procedure between the at least one feature descriptor determined for the at least one image and the retrieved geo-referenced feature descriptors.
    Type: Grant
    Filed: November 9, 2011
    Date of Patent: January 6, 2015
    Assignee: Harman Becker Automotive Systems GmbH
    Inventors: Juri Platonov, Alexey Pryakhin, Peter Kunath
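This patent again revolves around a matching procedure, here between descriptors computed from a captured image and geo-referenced descriptors retrieved from annotated map data. The sketch below differs from the previous one by applying a nearest/second-nearest ratio test and returning the geo-positions of the matched map entries; the dictionary layout and the ratio value are assumptions, not the system's actual API.

```python
import numpy as np

def match_to_map(image_descriptors, geo_referenced, ratio=0.8):
    """image_descriptors: (N, D) descriptors computed from the captured image.
    geo_referenced: list of dicts {"descriptor": (D,) array, "lat": float, "lon": float}
    retrieved from the annotated map data (at least two entries assumed).
    Returns (image_index, map_entry) pairs that pass the ratio test."""
    map_desc = np.stack([g["descriptor"] for g in geo_referenced])
    matches = []
    for i, d in enumerate(np.atleast_2d(image_descriptors)):
        dists = np.linalg.norm(map_desc - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:       # keep only unambiguous matches
            matches.append((i, geo_referenced[best]))
    return matches

# Example: one image descriptor matched against two geo-referenced map descriptors.
map_data = [{"descriptor": np.array([0.2, 0.7]), "lat": 48.14, "lon": 11.58},
            {"descriptor": np.array([0.9, 0.1]), "lat": 48.15, "lon": 11.60}]
print(match_to_map(np.array([[0.21, 0.69]]), map_data))
```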
  • Publication number: 20120114178
    Abstract: A vision system comprises a camera that captures an image and a processor coupled to process the received image to determine at least one feature descriptor for the image. The processor includes an interface to access annotated map data that includes geo-referenced feature descriptors. The processor is configured to perform a matching procedure between the at least one feature descriptor determined for the at least one image and the retrieved geo-referenced feature descriptors.
    Type: Application
    Filed: November 9, 2011
    Publication date: May 10, 2012
    Inventors: Juri Platonov, Alexey Pryakhin, Peter Kunath
  • Publication number: 20110090071
    Abstract: A driver assist system is provided that generates a video signal representing a vehicle environment outside a vehicle. At least one feature is extracted from the video signal. A reference is selected from a plurality of reference features stored as location attributes in a map database. The extracted feature is compared to at least one reference feature. An object in the vehicle environment is identified based on the comparison of the extracted feature and the reference feature. An indication is provided to a driver of the vehicle on the basis of the identified object. In one example, the system includes a video capturing device, an indicating device, a vehicle-based processing resource and access to a map database server. Processing tasks may be distributed among the vehicle-based processing resource and an external processing resource.
    Type: Application
    Filed: February 3, 2010
    Publication date: April 21, 2011
    Applicant: Harman Becker Automotive Systems GmbH
    Inventors: Juri Platonov, Alexey Pryakhin, Peter Kunath
  • Patent number: 7889193
    Abstract: A data model which is designed for being superposed with an image of a real object in an optical object tracking process is determined by the following steps: providing a three-dimensional CAD model (10) for representing the real object, and thereafter generating different synthetic two-dimensional views (31 to 34) of said CAD model (10). Each generated view (31 to 34) is subjected to edge extraction for determining at least one extracted edge (38, 39) in the respective view, with the edges (38, 39) extracted from said respective views (31 to 34) being transformed to a three-dimensional contour model (85, 91) corresponding to said data model to be determined. This permits rapid and efficient generation of a contour model as a data model intended for being superposed with an image of a real object.
    Type: Grant
    Filed: January 30, 2007
    Date of Patent: February 15, 2011
    Assignee: Metaio GmbH
    Inventors: Juri Platonov, Marion Langer
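The abstract describes a pipeline: render synthetic two-dimensional views of a CAD model, extract edges in each view, and transform the extracted edges back into a three-dimensional contour model. The sketch below reduces rendering and edge extraction to simple pinhole projection over known CAD edge points, so it only illustrates the data flow; the function names, the toy geometry, and the "long segment" edge criterion are assumptions, not the patented method.

```python
import numpy as np

def look_at_rotation(cam_pos):
    """Rotation that points the camera from cam_pos toward the origin (the CAD model centre)."""
    z = -cam_pos / np.linalg.norm(cam_pos)               # viewing direction
    x = np.cross([0.0, 0.0, 1.0], z); x /= np.linalg.norm(x)
    y = np.cross(z, x)
    return np.vstack([x, y, z])

def project(points, cam_pos, f=500.0):
    """Project 3D points into a synthetic 2D view with a pinhole camera of focal length f."""
    R = look_at_rotation(cam_pos)
    pc = (R @ (points - cam_pos).T).T                    # points in camera coordinates
    return f * pc[:, :2] / pc[:, 2:3]

def extract_strong_edges(uv, min_len=10.0):
    """Stand-in for 2D edge extraction: keep consecutive point pairs forming long segments."""
    return [(i, i + 1) for i in range(len(uv) - 1)
            if np.linalg.norm(uv[i + 1] - uv[i]) > min_len]

def lift_to_contour(edges_2d, points_3d):
    """Transform extracted 2D edges back into 3D segments of the contour model
    (here via the known correspondence to the CAD points)."""
    return [(points_3d[i], points_3d[j]) for i, j in edges_2d]

# Sample CAD edge points (a unit square) seen from several synthetic viewpoints.
cad_points = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], float)
contour_model = []
for cam_pos in [np.array([3.0, 0.5, 2.0]), np.array([0.5, 3.0, 2.0])]:
    uv = project(cad_points, cam_pos)
    contour_model += lift_to_contour(extract_strong_edges(uv), cad_points)
print(len(contour_model), "contour segments collected")
```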
  • Patent number: 7592997
    Abstract: The invention relates to a system for determining the position of a user and/or a moving device by means of tracking methods, in particular for augmented reality applications, with an interface (9) to integrate at least one sensor type and/or data generator (1, 2, 3, 4) of a tracking method, a configuration unit (20) to describe communication between the tracking methods and/or tracking algorithms and at least one processing unit (5, 6, 7, 8, 10, 11, 12, 13, 16) to calculate the position of the user and/or the moving device based on the data supplied by the tracking methods and/or tracking algorithms.
    Type: Grant
    Filed: June 1, 2005
    Date of Patent: September 22, 2009
    Assignee: Siemens Aktiengesellschaft
    Inventors: Jan-Friso Evers-Senne, Jan-Michael Frahm, Mehdi Hamadou, Dirk Jahn, Peter Georg Meier, Juri Platonov, Didier Stricker, Jens Weidenhausen
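The claimed system is an architecture: an interface for integrating sensor types and data generators of tracking methods, a configuration unit describing how the tracking methods communicate, and processing units that calculate the position from the supplied data. A minimal sketch of such a pluggable design follows; the class names, the stub trackers, and the confidence-weighted averaging are assumptions, not the patented implementation.

```python
from abc import ABC, abstractmethod

class TrackingMethod(ABC):
    """Interface through which a sensor type or data generator is integrated."""
    @abstractmethod
    def pose_estimate(self):
        """Return (x, y, z, confidence) or None if no estimate is available."""

class MarkerTracker(TrackingMethod):
    def pose_estimate(self):
        return (1.0, 0.0, 0.5, 0.9)      # stand-in for an optical marker-based estimate

class InertialTracker(TrackingMethod):
    def pose_estimate(self):
        return (1.1, 0.1, 0.5, 0.4)      # stand-in for an inertial estimate

class ProcessingUnit:
    """Calculates the user/device position from the data supplied by the tracking methods,
    wired together according to a configuration describing which methods feed the fusion."""
    def __init__(self, configuration):
        self.trackers = [cls() for cls in configuration["trackers"]]

    def position(self):
        estimates = [e for e in (t.pose_estimate() for t in self.trackers) if e]
        total = sum(e[3] for e in estimates)
        return tuple(sum(e[i] * e[3] for e in estimates) / total for i in range(3))

config = {"trackers": [MarkerTracker, InertialTracker]}   # role of the configuration unit
print(ProcessingUnit(config).position())                  # confidence-weighted fused position
```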
  • Publication number: 20070182739
    Abstract: A data model which is designed for being superposed with an image of a real object in an optical object tracking process is determined by the following steps: providing a three-dimensional CAD model (10) for representing the real object, and thereafter generating different synthetic two-dimensional views (31 to 34) of said CAD model (10). Each generated view (31 to 34) is subjected to edge extraction for determining at least one extracted edge (38, 39) in the respective view, with the edges (38, 39) extracted from said respective views (31 to 34) being transformed to a three-dimensional contour model (85, 91) corresponding to said data model to be determined. This permits rapid and efficient generation of a contour model as a data model intended for being superposed with an image of a real object.
    Type: Application
    Filed: January 30, 2007
    Publication date: August 9, 2007
    Inventors: Juri Platonov, Marion Langer
  • Publication number: 20050275722
    Abstract: The invention relates to a system for determining the position of a user and/or a moving device by means of tracking methods, in particular for augmented reality applications, with an interface (9) to integrate at least one sensor type and/or data generator (1, 2, 3, 4) of a tracking method, a configuration unit (20) to describe communication between the tracking methods and/or tracking algorithms and at least one processing unit (5, 6, 7, 8, 10, 11, 12, 13, 16) to calculate the position of the user and/or the moving device based on the data supplied by the tracking methods and/or tracking algorithms.
    Type: Application
    Filed: June 1, 2005
    Publication date: December 15, 2005
    Inventors: Jan-Friso Evers-Senne, Jan-Michael Frahm, Mehdi Hamadou, Dirk Jahn, Peter Meier, Juri Platonov, Didier Stricker, Jens Weidenhausen