Patents by Inventor Oleg Naroditsky

Oleg Naroditsky has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240104778
    Abstract: Methods for performing a camera calibration process for outward-facing cameras on devices such as head-mounted display devices are disclosed. Using cameras with overlapping fields of view, relative rotational parameters of the cameras with respect to one another may be determined using an optimization technique such as a two-view bundle adjustment algorithm. A statistical analysis of the relative rotational parameters of the cameras, determined for a plurality of moments in time, may then be made to provide updated relative rotational parameters for recalibration of the cameras. Camera calibration processes such as those disclosed may not depend on tracking points of interest over multiple moments in time, but rather rely on a convergence of the relative rotational parameters determined for respective moments in time.
    Type: Application
    Filed: September 13, 2023
    Publication date: March 28, 2024
    Applicant: Apple Inc.
    Inventors: Tianheng Wang, Stergios Roumeliotis, Shuntaro Yamazaki, Oleg Naroditsky
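
As a rough illustration of the statistical step described in the abstract above, the sketch below takes per-moment estimates of the camera-to-camera rotation (assumed to come from some two-view bundle adjustment, which is not reproduced here), computes their chordal mean, and only rewrites the stored extrinsic rotation once the estimates have converged. All function names and thresholds are illustrative, not taken from the patent.

```python
import numpy as np

def mean_rotation(rotations):
    """Chordal L2 mean of a list of 3x3 rotation matrices (projected back via SVD)."""
    m = np.mean(rotations, axis=0)
    u, _, vt = np.linalg.svd(m)
    r = u @ vt
    if np.linalg.det(r) < 0:          # keep a proper rotation (det = +1)
        u[:, -1] *= -1
        r = u @ vt
    return r

def rotation_angle_deg(r):
    """Geodesic angle of a rotation matrix, in degrees."""
    cos = np.clip((np.trace(r) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(cos))

def maybe_recalibrate(per_moment_rotations, stored_rotation,
                      spread_thresh_deg=0.05, update_thresh_deg=0.1):
    """Update the stored camera-to-camera rotation once per-moment estimates converge.

    per_moment_rotations: relative rotations estimated independently at many moments
    in time (e.g. by two-view bundle adjustment on the overlapping field of view).
    """
    mean_r = mean_rotation(per_moment_rotations)
    # Spread of the samples about their mean: a small spread means the estimates converged.
    spread = max(rotation_angle_deg(mean_r.T @ r) for r in per_moment_rotations)
    if spread > spread_thresh_deg:
        return stored_rotation, False                  # not converged yet
    # Only rewrite the calibration if it actually drifted from the stored value.
    drift = rotation_angle_deg(stored_rotation.T @ mean_r)
    if drift > update_thresh_deg:
        return mean_r, True
    return stored_rotation, False

# Example: three identical estimates around the identity converge and replace a
# stored rotation that has drifted by roughly 0.06 degrees.
eps = 1e-3
samples = [np.eye(3)] * 3
stored = np.array([[1, 0, 0],
                   [0, np.cos(eps), -np.sin(eps)],
                   [0, np.sin(eps),  np.cos(eps)]])
print(maybe_recalibrate(samples, stored, update_thresh_deg=0.01))
```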
  • Patent number: 11943679
    Abstract: Location mapping and navigation user interfaces may be generated and presented via mobile computing devices. A mobile device may detect its location and orientation using internal systems, and may capture image data using a device camera. The mobile device also may retrieve map information from a map server corresponding to the current location of the device. Using the image data captured at the device, the current location data, and the corresponding local map information, the mobile device may determine or update a current orientation reading for the device. Location errors and updated location data also may be determined for the device, and a map user interface may be generated and displayed on the mobile device using the updated device orientation and/or location data.
    Type: Grant
    Filed: July 8, 2022
    Date of Patent: March 26, 2024
    Assignee: Apple Inc.
    Inventors: Robert William Mayor, Isaac T. Miller, Adam S. Howell, Vinay R. Majjigi, Oliver Ruepp, Daniel Ulbricht, Oleg Naroditsky, Christian Lipski, Sean P. Cier, Hyojoon Bae, Saurabh Godha, Patrick J. Coleman
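
One piece of the abstract above, fusing the device's internally sensed orientation with an orientation derived from matching camera imagery against local map data, can be illustrated with a toy complementary filter. The weighting, angle convention, and function names are assumptions for this sketch; the patent does not specify this particular filter.

```python
def wrap_deg(a):
    """Wrap an angle to the interval (-180, 180] degrees."""
    return (a + 180.0) % 360.0 - 180.0

def fuse_heading(internal_heading_deg, vision_heading_deg, vision_weight=0.3):
    """Blend the device's internal heading (IMU/compass) with a heading inferred
    from matching camera imagery against local map data."""
    innovation = wrap_deg(vision_heading_deg - internal_heading_deg)
    return wrap_deg(internal_heading_deg + vision_weight * innovation)

# Example: the compass says 350 deg, image-to-map matching suggests 10 deg.
print(fuse_heading(350.0, 10.0))   # -> -4.0 (i.e. 356 deg), pulled across the 0/360 wrap
```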
  • Publication number: 20240095958
    Abstract: Methods for performing a calibration process for outward-facing cameras on devices such as head-mounted display devices are disclosed. Extrinsic parameters of the camera are first estimated using inputs to the calibration process such as information from an inertial measurement unit and points of interest within images captured by the camera that are tracked with time. Then, extrinsic and intrinsic parameters are concurrently determined in an optimization problem such that updated values of said parameters may be stored and used by applications that run on the device and make use of the camera. The calibration process may be extended to concurrently calibrate multiple cameras based, at least in part, on information from the inertial measurement unit that is local to the camera.
    Type: Application
    Filed: September 13, 2023
    Publication date: March 21, 2024
    Applicant: Apple Inc.
    Inventors: Shuntaro Yamazaki, Ravi Teja Sukhavasi, Oleg Naroditsky, Stergios Roumeliotis, Daniel C. Byrnes
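
A minimal sketch of the joint-refinement stage named in the abstract above: starting from initial guesses, an extrinsic rotation and an intrinsic focal length are refined together by minimizing reprojection error over tracked points of interest. The IMU-based initialization is not reproduced, and the simple pinhole model, parameterization, and use of scipy are assumptions, not the patented method.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(points_world, rotvec, focal):
    """Pinhole projection with a rotation-only extrinsic and a single focal length."""
    cam = Rotation.from_rotvec(rotvec).apply(points_world)
    return focal * cam[:, :2] / cam[:, 2:3]

def refine(points_world, observations, rotvec0, focal0):
    """Jointly refine the extrinsic rotation and the intrinsic focal length."""
    def residuals(x):
        return (project(points_world, x[:3], x[3]) - observations).ravel()
    x = least_squares(residuals, np.r_[rotvec0, focal0]).x
    return x[:3], x[3]

# Tiny synthetic check: create observations with a "true" calibration, then
# recover it from perturbed initial guesses.
rng = np.random.default_rng(0)
pts = rng.uniform([-1, -1, 4], [1, 1, 8], size=(30, 3))
true_rot, true_f = np.array([0.02, -0.01, 0.03]), 600.0
obs = project(pts, true_rot, true_f)
rot, f = refine(pts, obs, rotvec0=np.zeros(3), focal0=500.0)
print(rot.round(4), round(f, 2))
```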
  • Publication number: 20240019522
    Abstract: Performing a combination localization technique includes determining a target localization parameter, selecting a combination localization technique in accordance with the target localization parameter, including a first localization technique associated with a first power error time profile, and a second localization technique associated with a second power error time profile. A device location is determined using the first localization technique for a first time period in accordance with the first power error time profile, and an updated device location is determined using the second localization technique in response to a triggering condition. A further updated device location is determined using the first localization technique following the first time period based on the updated device location. At least one of a combined energy value and a combined maximum error rate for the combination localization technique satisfies the target localization parameter.
    Type: Application
    Filed: July 17, 2023
    Publication date: January 18, 2024
    Inventors: Siddharth S. Hazra, Brad W. Simeral, Oleg Naroditsky, Lukas Polok, Adam S. Howell, Mehrad Tavakoli
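
The combination technique in the abstract above can be illustrated by a toy scheduler: a low-power technique whose error grows over time runs continuously, and a higher-power technique is triggered when the error bound would otherwise be violated, so the combined energy and worst-case error can be checked against a target. The specific power and error numbers below are made up for illustration.

```python
from dataclasses import dataclass

@dataclass
class Technique:
    name: str
    power_mw: float               # average power draw while the technique is active
    error_growth_m_per_s: float   # how fast position error grows between fixes
    fix_error_m: float            # position error right after a fix

# Illustrative power/error profiles (not from the patent).
cheap = Technique("low-power", power_mw=5.0, error_growth_m_per_s=0.5, fix_error_m=5.0)
accurate = Technique("high-power", power_mw=150.0, error_growth_m_per_s=0.0, fix_error_m=1.0)

def simulate(duration_s, target_max_error_m, dt=1.0):
    """Run the cheap technique continuously and trigger the accurate one whenever the
    error bound would be violated; report combined energy (J) and worst-case error (m)."""
    error, energy_j, worst = accurate.fix_error_m, 0.0, 0.0
    for _ in range(int(duration_s / dt)):
        error += cheap.error_growth_m_per_s * dt
        energy_j += cheap.power_mw * dt / 1000.0
        if error > target_max_error_m:                 # triggering condition
            error = accurate.fix_error_m
            energy_j += accurate.power_mw * dt / 1000.0
        worst = max(worst, error)
    return energy_j, worst

print(simulate(duration_s=600.0, target_max_error_m=10.0))
```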
  • Patent number: 11740321
    Abstract: Systems, methods, and computer readable media to track and estimate the accuracy of a visual inertial odometry (VIO) system. Various embodiments are able to receive one or more VIO feature measurements associated with a set of image frames from a VIO system and generate a plurality of feature models to estimate health values for the VIO system. The various embodiments determine a plurality of feature health values with the feature models based on the VIO feature measurements and compare the feature health values with ground truth health scores associated with the set of image frames to determine one or more errors. The feature model parameters are updated based on the comparison of the feature health values with the ground truth health scores.
    Type: Grant
    Filed: March 20, 2018
    Date of Patent: August 29, 2023
    Assignee: Apple Inc.
    Inventors: Oleg Naroditsky, Kuen-Han Lin, Dimitrios Kottas
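
A minimal sketch of the feature-model idea in the abstract above, assuming a simple linear model: per-frame VIO feature measurements are mapped to a health value, and the model parameters are updated from the error against ground-truth health scores. The feature choices, learning rate, and synthetic data are illustrative, not from the patent.

```python
import numpy as np

def predict_health(params, features):
    """Linear feature model: estimated health value from per-frame VIO feature measurements."""
    return features @ params

def update_params(params, features, ground_truth, lr=0.5):
    """One gradient step on the squared error between predicted and ground-truth health."""
    error = predict_health(params, features) - ground_truth
    grad = features.T @ error / len(ground_truth)
    return params - lr * grad

# Illustrative per-frame measurements (e.g. tracked feature count, mean reprojection
# error, inlier fraction) with made-up ground-truth health scores.
rng = np.random.default_rng(1)
features = rng.uniform(size=(200, 3))
truth = features @ np.array([0.2, -0.5, 0.9])       # hidden "true" relationship
params = np.zeros(3)
for _ in range(2000):
    params = update_params(params, features, truth)
print(params.round(2))                               # approaches [0.2, -0.5, 0.9]
```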
  • Publication number: 20220345849
    Abstract: Location mapping and navigation user interfaces may be generated and presented via mobile computing devices. A mobile device may detect its location and orientation using internal systems, and may capture image data using a device camera. The mobile device also may retrieve map information from a map server corresponding to the current location of the device. Using the image data captured at the device, the current location data, and the corresponding local map information, the mobile device may determine or update a current orientation reading for the device. Location errors and updated location data also may be determined for the device, and a map user interface may be generated and displayed on the mobile device using the updated device orientation and/or location data.
    Type: Application
    Filed: July 8, 2022
    Publication date: October 27, 2022
    Applicant: Apple Inc.
    Inventors: Robert William Mayor, Isaac T. Miller, Adam S. Howell, Vinay R. Majjigi, Oliver Ruepp, Daniel Ulbricht, Oleg Naroditsky, Christian Lipski, Sean P. Cier, Hyojoon Bae, Saurabh Godha, Patrick J. Coleman
  • Patent number: 11412350
    Abstract: Location mapping and navigation user interfaces may be generated and presented via mobile computing devices. A mobile device may detect its location and orientation using internal systems, and may capture image data using a device camera. The mobile device also may retrieve map information from a map server corresponding to the current location of the device. Using the image data captured at the device, the current location data, and the corresponding local map information, the mobile device may determine or update a current orientation reading for the device. Location errors and updated location data also may be determined for the device, and a map user interface may be generated and displayed on the mobile device using the updated device orientation and/or location data.
    Type: Grant
    Filed: September 19, 2019
    Date of Patent: August 9, 2022
    Assignee: Apple Inc.
    Inventors: Robert William Mayor, Isaac T. Miller, Adam S. Howell, Vinay R. Majjigi, Oliver Ruepp, Daniel Ulbricht, Oleg Naroditsky, Christian Lipski, Sean P. Cier, Hyojoon Bae, Saurabh Godha, Patrick J. Coleman
  • Publication number: 20220092859
    Abstract: Implementations of the subject technology provide extended reality display devices that can be used on and/or off of a moving platform. Systems and methods are disclosed for separating out the motion of the moving platform from other motions of the device so that virtual content can be displayed without erroneous motions caused by the motion of the moving platform. The subject technology can provide extended reality settings on any suitable moveable platform such as in a car, a watercraft, an aircraft, a train, or any other vehicle.
    Type: Application
    Filed: September 17, 2021
    Publication date: March 24, 2022
    Inventors: Abdelhamid Dine, Kuen-Han Lin, Stergios Roumeliotis, Oleg Naroditsky
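
The core idea in the abstract above, separating the moving platform's motion from the user's own head motion, reduces in the simplest rotational case to differencing two gyroscope readings. This toy sketch assumes both readings are already expressed in a common frame, which glosses over the frame alignment and estimation the patent is actually concerned with.

```python
import numpy as np

def device_motion_relative_to_platform(omega_device, omega_platform):
    """Angular velocity of the head-mounted device relative to the moving platform.

    omega_device: gyroscope reading on the device (includes the vehicle's own turning).
    omega_platform: gyroscope reading from a sensor rigidly attached to the vehicle.
    Both are assumed, for this sketch, to be expressed in a common frame.
    """
    return np.asarray(omega_device) - np.asarray(omega_platform)

# Example: the car turns at 0.2 rad/s about the vertical axis while the user's head
# is still; the displayed virtual content should see no head rotation at all.
print(device_motion_relative_to_platform([0.0, 0.0, 0.2], [0.0, 0.0, 0.2]))  # -> [0. 0. 0.]
```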
  • Patent number: 11127161
    Abstract: In some implementations, a first electronic device including a first image sensor uses a processor to perform a method. The method involves obtaining a first set of keyframes based on images of a physical environment captured by the first image sensor. The method generates a mapping defining relative locations of keyframes of the first set of keyframes. The method receives a keyframe corresponding to an image of the physical environment captured at a second, different electronic device and localizes the received keyframe to the mapping. The method then receives an anchor from the second electronic device that defines a position of a virtual object relative to the keyframe. The method displays a CGR environment including the virtual object at a location based on the anchor and the mapping.
    Type: Grant
    Filed: July 13, 2020
    Date of Patent: September 21, 2021
    Assignee: Apple Inc.
    Inventors: Abdelhamid Dine, Kuen-Han Lin, Oleg Naroditsky
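
Once the received keyframe has been localized into the host device's mapping, placing the shared virtual object is a transform composition. The sketch below assumes 4x4 homogeneous poses and illustrative names such as T_map_keyframe and T_keyframe_anchor; the localization step itself is not reproduced.

```python
import numpy as np

def make_pose(rotation, translation):
    """4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

def anchor_in_host_map(T_map_keyframe, T_keyframe_anchor):
    """Pose of the shared virtual-object anchor in the host device's map frame.

    T_map_keyframe: pose of the received keyframe after localization against the
    host's keyframe mapping.
    T_keyframe_anchor: anchor pose relative to that keyframe, as sent by the second device.
    """
    return T_map_keyframe @ T_keyframe_anchor

# Example: keyframe localized 2 m ahead of the host map origin, anchor 0.5 m to the
# keyframe's right; the anchor lands at (0.5, 0, 2) in the host map.
T_mk = make_pose(np.eye(3), [0.0, 0.0, 2.0])
T_ka = make_pose(np.eye(3), [0.5, 0.0, 0.0])
print(anchor_in_host_map(T_mk, T_ka)[:3, 3])   # -> [0.5 0.  2. ]
```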
  • Patent number: 11118911
    Abstract: A method of creating a local map includes: receiving, at a mobile electronic data processing apparatus, a request from a server to generate a map of a specified destination; sending to the server a message accepting the request to generate the map responsive to receiving, at a user input of the mobile electronic data processing device, a user command indicating acceptance of the request; generating, using a processor, information related to construction of the map; and transmitting, from the mobile electronic data processing apparatus, the information related to construction of the map.
    Type: Grant
    Filed: March 16, 2018
    Date of Patent: September 14, 2021
    Assignee: Apple Inc.
    Inventors: Alex Flint, Oleg Naroditsky, Andriy Grygorenko, Oriel Bergig
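
The request/accept/contribute flow in the abstract above can be sketched as a few messages handled on the device side. The message types, fields, and handler below are purely illustrative; the patent does not define any wire format.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MapRequest:
    destination: str        # the specified destination the server wants mapped

@dataclass
class MapAcceptance:
    destination: str

@dataclass
class MapContribution:
    destination: str
    payload: bytes          # information related to construction of the map

def handle_request(request: MapRequest, user_accepts: bool
                   ) -> Tuple[Optional[MapAcceptance], Optional[MapContribution]]:
    """Device-side flow: accept only on an explicit user command, then generate and
    transmit map-construction information back to the server."""
    if not user_accepts:
        return None, None
    acceptance = MapAcceptance(request.destination)
    payload = f"map data for {request.destination}".encode()   # stand-in for real capture
    return acceptance, MapContribution(request.destination, payload)

print(handle_request(MapRequest("cafe on 5th"), user_accepts=True))
```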
  • Publication number: 20210092555
    Abstract: Location mapping and navigation user interfaces may be generated and presented via mobile computing devices. A mobile device may detect its location and orientation using internal systems, and may capture image data using a device camera. The mobile device also may retrieve map information from a map server corresponding to the current location of the device. Using the image data captured at the device, the current location data, and the corresponding local map information, the mobile device may determine or update a current orientation reading for the device. Location errors and updated location data also may be determined for the device, and a map user interface may be generated and displayed on the mobile device using the updated device orientation and/or location data.
    Type: Application
    Filed: September 19, 2019
    Publication date: March 25, 2021
    Inventors: Robert William Mayor, Isaac T. Miller, Adam S. Howell, Vinay R. Majjigi, Oliver Ruepp, Daniel Ulbricht, Oleg Naroditsky, Christian Lipski, Sean P. Cier, Hyojoon Bae, Saurabh Godha, Patrick J. Coleman
  • Publication number: 20200349735
    Abstract: In some implementations, a first electronic device including a first image sensor uses a processor to perform a method. The method involves obtaining a first set of keyframes based on images of a physical environment captured by the first image sensor. The method generates a mapping defining relative locations of keyframes of the first set of keyframes. The method receives a keyframe corresponding to an image of the physical environment captured at a second, different electronic device and localizes the received keyframe to the mapping. The method then receives an anchor from the second electronic device that defines a position of a virtual object relative to the keyframe. The method displays a CGR environment including the virtual object at a location based on the anchor and the mapping.
    Type: Application
    Filed: July 13, 2020
    Publication date: November 5, 2020
    Inventors: Abdelhamid Dine, Kuen-Han Lin, Oleg Naroditsky
  • Patent number: 10748302
    Abstract: In some implementations, a first electronic device including a first image sensor uses a processor to perform a method. The method involves obtaining a first set of keyframes based on images of a physical environment captured by the first image sensor. The method generates a mapping defining relative locations of keyframes of the first set of keyframes. The method receives a keyframe corresponding to an image of the physical environment captured at a second, different electronic device and localizes the received keyframe to the mapping. The method then receives an anchor from the second electronic device that defines a position of a virtual object relative to the keyframe. The method displays a CGR environment including the virtual object at a location based on the anchor and the mapping.
    Type: Grant
    Filed: May 2, 2019
    Date of Patent: August 18, 2020
    Assignee: Apple Inc.
    Inventors: Abdelhamid Dine, Kuen-Han Lin, Oleg Naroditsky
  • Publication number: 20190164040
    Abstract: Systems, methods, and computer readable media to track and estimate the accuracy of a visual inertial odometry (VIO) system. Various embodiments are able to receive one or more VIO feature measurements associated with a set of image frames from a VIO system and generate a plurality of feature models to estimate health values for the VIO system. The various embodiments determine a plurality of feature health values with the feature models based on the VIO feature measurements and compare the feature health values with ground truth health scores associated with the set of image frames to determine one or more errors. The feature model parameters are updated based on the comparison of the feature health values with the ground truth health scores.
    Type: Application
    Filed: March 20, 2018
    Publication date: May 30, 2019
    Inventors: Oleg Naroditsky, Kuen-Han Lin, Dimitrios Kottas
  • Patent number: 10152795
    Abstract: A method includes: receiving sensor measurements from a pre-processing module, in which the sensor measurements include image data and inertial data for a device; transferring, using a processor, information derived from the sensor measurements, from a first set of variables associated with a first window of time to a second set of variables associated with a second window of time, in which the first and second windows consecutively overlap in time; and outputting, to a post-processing module, a state of the device based on the transferred information.
    Type: Grant
    Filed: August 12, 2016
    Date of Patent: December 11, 2018
    Assignees: Apple Inc., Regents of the University of Minnesota
    Inventors: Alex Flint, Oleg Naroditsky, Christopher P. Broaddus, Andriy Grygorenko, Stergios Roumeliotis, Oriel Bergig
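
A toy sketch of the windowing idea in the abstract above: states from the overlap of consecutive time windows are carried over to seed the next window, while older states are dropped. A real estimator would marginalize the old states into a prior and solve a joint optimization over image and inertial data; the state model and names here are placeholders.

```python
from collections import deque

class SlidingWindowEstimator:
    """Toy sliding-window estimator: the states in the overlap of consecutive
    windows are transferred to initialize the next window's estimates."""

    def __init__(self, window_size=5, overlap=2):
        self.window_size = window_size
        self.overlap = overlap
        self.states = deque()           # (timestamp, state estimate) pairs

    def add_measurement(self, timestamp, imu_delta, image_hint):
        # Propagate the newest state with the inertial increment, nudged toward the
        # (stand-in) visual measurement; a real system solves a joint optimization.
        prev = self.states[-1][1] if self.states else 0.0
        state = prev + imu_delta + 0.1 * (image_hint - prev)
        self.states.append((timestamp, state))
        if len(self.states) > self.window_size:
            self._slide()
        return state

    def _slide(self):
        # Transfer information between windows: keep only the overlapping states;
        # everything older is dropped (a real filter would fold it into a prior).
        while len(self.states) > self.overlap:
            self.states.popleft()

est = SlidingWindowEstimator()
for t in range(10):
    print(round(est.add_measurement(t, imu_delta=1.0, image_hint=float(t)), 2))
```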
  • Publication number: 20180283877
    Abstract: A method of creating a local map includes: receiving, at a mobile electronic data processing apparatus, a request from a server to generate a map of a specified destination; sending to the server a message accepting the request to generate the map responsive to receiving, at a user input of the mobile electronic data processing device, a user command indicating acceptance of the request; generating, using a processor, information related to construction of the map; and transmitting, from the mobile electronic data processing apparatus, the information related to construction of the map.
    Type: Application
    Filed: March 16, 2018
    Publication date: October 4, 2018
    Inventors: Alex Flint, Oleg Naroditsky, Andriy Grygorenko, Oriel Bergig
  • Patent number: 9964409
    Abstract: A method of creating a local map includes: receiving, at a mobile electronic data processing apparatus, a request from a server to generate a map of a specified destination; sending to the server a message accepting the request to generate the map responsive to receiving, at a user input of the mobile electronic data processing device, a user command indicating acceptance of the request; generating, using a processor, information related to construction of the map; and transmitting, from the mobile electronic data processing apparatus, the information related to construction of the map.
    Type: Grant
    Filed: May 27, 2015
    Date of Patent: May 8, 2018
    Assignee: Apple Inc.
    Inventors: Alex Flint, Oleg Naroditsky, Andriy Grygorenko, Oriel Bergig
  • Patent number: 9892563
    Abstract: A system and method for generating a mixed-reality environment is provided. The system and method provide a user-worn sub-system communicatively connected to a synthetic object computer module. The user-worn sub-system may utilize a plurality of user-worn sensors to capture and process data regarding a user's pose and location. The synthetic object computer module may generate and provide to the user-worn sub-system synthetic objects based on information defining a user's real-world scene or environment, indicating the user's pose and location. The synthetic objects may then be rendered on a user-worn display, thereby inserting the synthetic objects into the user's field of view. Rendering the synthetic objects on the user-worn display creates the virtual effect for the user that the synthetic objects are present in the real world.
    Type: Grant
    Filed: March 21, 2017
    Date of Patent: February 13, 2018
    Assignee: SRI International
    Inventors: Rakesh Kumar, Taragay Oskiper, Oleg Naroditsky, Supun Samarasekera, Zhiwei Zhu, Janet Yonga Kim Knowles
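
One step of the rendering described in the abstract above can be sketched as transforming a synthetic object from the world frame into the user's head frame and projecting it onto the display. The pinhole model, focal length, and display resolution below are assumptions for illustration only, not the system's actual rendering pipeline.

```python
import numpy as np

def world_to_view(point_world, head_rotation, head_position):
    """Express a world-frame point in the user's head (display) frame."""
    return head_rotation.T @ (np.asarray(point_world) - np.asarray(head_position))

def project_to_display(point_view, focal_px=900.0, center=(960.0, 540.0)):
    """Pinhole projection onto display pixel coordinates; None if behind the viewer."""
    x, y, z = point_view
    if z <= 0:
        return None
    return (center[0] + focal_px * x / z, center[1] + focal_px * y / z)

# One frame of the loop: a synthetic object fixed at (0, 0, 3) in the world, viewed
# from the origin with no head rotation, renders at the center of the display.
pixel = project_to_display(world_to_view([0.0, 0.0, 3.0], np.eye(3), [0.0, 0.0, 0.0]))
print(pixel)    # -> (960.0, 540.0)
```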
  • Patent number: 9734414
    Abstract: A system and method for efficiently locating in 3D an object of interest in a target scene using video information captured by a plurality of cameras. The system and method provide for multi-camera visual odometry wherein pose estimates are generated for each camera by all of the cameras in the multi-camera configuration. Furthermore, the system and method can locate and identify salient landmarks in the target scene using any of the cameras in the multi-camera configuration and compare the identified landmark against a database of previously identified landmarks. In addition, the system and method provide for the integration of video-based pose estimations with position measurement data captured by one or more secondary measurement sensors, such as, for example, Inertial Measurement Units (IMUs) and Global Positioning System (GPS) units.
    Type: Grant
    Filed: August 25, 2015
    Date of Patent: August 15, 2017
    Assignee: SRI International
    Inventors: Supun Samarasekera, Rakesh Kumar, Taragay Oskiper, Zhiwei Zhu, Oleg Naroditsky, Harpreet Sawhney
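
A minimal sketch of one aspect of the abstract above: each camera's own pose estimate is mapped through its known mounting transform into a common rig frame, and the per-camera estimates are then combined. The mean-of-translations fusion and all names are illustrative assumptions; the patent's landmark matching and IMU/GPS integration are not reproduced.

```python
import numpy as np

def pose_to_rig(T_world_camera, T_rig_camera):
    """Convert one camera's estimated world pose into a pose estimate for the shared rig."""
    return T_world_camera @ np.linalg.inv(T_rig_camera)

def fuse_translations(rig_poses):
    """Combine per-camera rig-pose estimates; here just the mean translation
    (a real system would also fuse rotations and weight by uncertainty)."""
    return np.mean([T[:3, 3] for T in rig_poses], axis=0)

def translation(t):
    T = np.eye(4)
    T[:3, 3] = t
    return T

# Two cameras mounted 0.1 m left/right of the rig center each estimate their own
# world pose; both imply the same rig position, so the fused estimate is exact.
T_rig_cam = [translation([-0.1, 0, 0]), translation([0.1, 0, 0])]
T_world_cam = [translation([0.9, 0, 2.0]), translation([1.1, 0, 2.0])]
rig_estimates = [pose_to_rig(Tw, Tr) for Tw, Tr in zip(T_world_cam, T_rig_cam)]
print(fuse_translations(rig_estimates))    # -> [1. 0. 2.]
```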
  • Publication number: 20170193710
    Abstract: A system and method for generating a mixed-reality environment is provided. The system and method provide a user-worn sub-system communicatively connected to a synthetic object computer module. The user-worn sub-system may utilize a plurality of user-worn sensors to capture and process data regarding a user's pose and location. The synthetic object computer module may generate and provide to the user-worn sub-system synthetic objects based on information defining a user's real-world scene or environment, indicating the user's pose and location. The synthetic objects may then be rendered on a user-worn display, thereby inserting the synthetic objects into the user's field of view. Rendering the synthetic objects on the user-worn display creates the virtual effect for the user that the synthetic objects are present in the real world.
    Type: Application
    Filed: March 21, 2017
    Publication date: July 6, 2017
    Inventors: Rakesh Kumar, Taragay Oskiper, Oleg Naroditsky, Supun Samarasekera, Zhiwei Zhu, Janet Kim