Patents by Inventor Leonid Naimark

Leonid Naimark has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20220211447
    Abstract: A camera tracking bar of a camera tracking system for computer assisted surgery navigation. The camera tracking bar includes a first set of stereo tracking cameras having a first resolution and a first field of view and spaced apart on the camera tracking bar by a first baseline distance. The camera tracking bar also includes a second set of stereo tracking cameras having a second resolution and a second field of view and spaced apart on the camera tracking bar by a second baseline distance that is less than the first baseline distance. The second set of stereo tracking cameras is positioned between the first set of stereo tracking cameras, and the resolution and/or the field of view of the second set of stereo tracking cameras is different from the resolution and/or the field of view of the first set of stereo tracking cameras. A communication interface provides camera video streams to the camera tracking subsystem.
    Type: Application
    Filed: March 21, 2022
    Publication date: July 7, 2022
    Inventors: Thomas Calloway, Leonid Naimark
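
    This application, the grant that follows (11317973), and publication 20210378755 further down share this abstract: a tracking bar carrying two stereo pairs that differ in baseline, resolution, and field of view. As a rough, non-authoritative illustration of why such a mixed configuration is useful, the Python sketch below models each pair with standard pinhole/stereo relations and picks the pair better suited to a given target range; the class, the numbers, and the selection heuristic are illustrative assumptions, not taken from the patent.

    ```python
    import math
    from dataclasses import dataclass

    @dataclass
    class StereoPair:
        """Hypothetical description of one stereo pair on the tracking bar."""
        name: str
        baseline_m: float                  # spacing between the two cameras
        image_width_px: int                # horizontal resolution
        fov_deg: float                     # horizontal field of view
        disparity_noise_px: float = 0.25   # assumed matching uncertainty

        @property
        def focal_px(self) -> float:
            # Pinhole model: focal length in pixels from image width and FOV.
            return (self.image_width_px / 2) / math.tan(math.radians(self.fov_deg) / 2)

        @property
        def min_overlap_range_m(self) -> float:
            # Range at which the two parallel camera frusta begin to overlap.
            return (self.baseline_m / 2) / math.tan(math.radians(self.fov_deg) / 2)

        def depth_error_m(self, range_m: float) -> float:
            # Standard stereo relation: sigma_z ~ z^2 * sigma_d / (f * b).
            return range_m ** 2 * self.disparity_noise_px / (self.focal_px * self.baseline_m)

    def pick_pair(pairs, range_m):
        """Among pairs whose views overlap at the target range, prefer the one
        with the smallest expected depth error."""
        usable = [p for p in pairs if p.min_overlap_range_m <= range_m] or pairs
        return min(usable, key=lambda p: p.depth_error_m(range_m))

    # A wide-baseline outer pair and a narrow-baseline, wide-FOV inner pair (numbers invented).
    outer = StereoPair("outer", baseline_m=0.50, image_width_px=2048, fov_deg=50.0)
    inner = StereoPair("inner", baseline_m=0.15, image_width_px=1280, fov_deg=85.0)

    for r in (0.4, 1.5, 3.0):
        best = pick_pair([outer, inner], r)
        print(f"{r:.1f} m -> {best.name} pair, sigma_z ~ {best.depth_error_m(r) * 1e3:.2f} mm")
    ```

    The trade-off the sketch captures: the wide-baseline pair gives finer depth resolution at range, while the short-baseline, wide-field pair is the only one whose views overlap for targets very close to the bar.
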
  • Patent number: 11317973
    Abstract: A camera tracking bar of a camera tracking system for computer assisted surgery navigation. The camera tracking bar includes a first set of stereo tracking cameras having a first resolution and a first field of view and spaced apart on the camera tracking bar by a first baseline distance. The camera tracking bar also includes a second set of stereo tracking cameras having a second resolution and a second field of view and spaced apart on the camera tracking bar by a second baseline distance that is less than the first baseline distance. The second set of stereo tracking cameras is positioned between the first set of stereo tracking cameras, and the resolution and/or the field of view of the second set of stereo tracking cameras is different from the resolution and/or the field of view of the first set of stereo tracking cameras. A communication interface provides camera video streams to the camera tracking subsystem.
    Type: Grant
    Filed: June 9, 2020
    Date of Patent: May 3, 2022
    Assignee: Globus Medical, Inc.
    Inventors: Thomas Calloway, Leonid Naimark
  • Publication number: 20220125522
    Abstract: Devices, systems, and methods for robot-assisted surgery. Navigable instrumentation, which is capable of being navigated by a surgeon using the surgical robot system, and navigation software allow for the navigated placement of interbody fusion devices or other surgical devices.
    Type: Application
    Filed: February 26, 2021
    Publication date: April 28, 2022
    Inventors: Thomas Calloway, Amaya Raphaelson, Leonid Naimark
  • Publication number: 20210378755
    Abstract: A camera tracking bar of a camera tracking system for computer assisted surgery navigation. The camera tracking bar includes a first set of stereo tracking cameras having a first resolution and a first field of view and spaced apart on the camera tracking bar by a first baseline distance. The camera tracking bar also includes a second set of stereo tracking cameras having a second resolution and a second field of view and spaced apart on the camera tracking bar by a second baseline distance that is less than the first baseline distance. The second set of stereo tracking cameras is positioned between the first set of stereo tracking cameras, and the resolution and/or the field of view of the second set of stereo tracking cameras is different from the resolution and/or the field of view of the first set of stereo tracking cameras. A communication interface provides camera video streams to the camera tracking subsystem.
    Type: Application
    Filed: June 9, 2020
    Publication date: December 9, 2021
    Inventors: Thomas Calloway, Leonid Naimark
  • Patent number: 10018469
    Abstract: A method of using signal processing representations from IMU/INS devices to identify terrain types. Using orientation and pace invariant gait dynamics images (GDIs) to identify terrain types. Utilizing signal processing representations from IMU/INS devices to determine relative position in GPS-denied areas. Using orientation and pace invariant gait dynamics images (GDIs) to determine relative position in GPS-denied areas. A method of using signal processing representations from IMU/INS devices to determine absolute position using GDI terrain IDs. A method of using signal processing representations from IMU/INS devices to identify position relative to land classes. Using orientation and pace invariant gait dynamics images (GDIs) to identify position relative to land classes.
    Type: Grant
    Filed: June 21, 2016
    Date of Patent: July 10, 2018
    Assignee: BAE Systems Information and Electronic Systems Integration Inc.
    Inventors: Leonid Naimark, Yunbin Deng, Geoffrey S. Meltzner, Yu Zhong
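
    The abstract above (repeated in the application that follows) is built around gait dynamics images (GDIs) computed from body-worn IMU/INS signals. The sketch below shows one plausible way such an orientation-invariant representation can be formed: pairwise inner products of acceleration vectors are unchanged by any fixed rotation of the sensor. The toy nearest-template terrain classifier, the parameter choices, and the omission of pace normalization are editorial simplifications; this is a reconstruction from the abstract, not the patented method.

    ```python
    import numpy as np

    def gait_dynamics_image(accel, lags=64):
        """Build a simple orientation-invariant gait dynamics image (GDI) from a
        window of 3-axis accelerometer samples with shape (N, 3).

        Entry [j, i] is the inner product between the acceleration vector at
        sample i and the vector at sample i + j; inner products are invariant
        to any fixed rotation of the sensor frame."""
        accel = np.asarray(accel, dtype=float)
        n = accel.shape[0]
        gdi = np.zeros((lags, n - lags))
        for j in range(lags):
            a = accel[: n - lags]
            b = accel[j : j + n - lags]
            gdi[j] = np.einsum("ij,ij->i", a, b)   # row-wise dot products
        return gdi

    def classify_terrain(gdi, templates):
        """Toy nearest-template classifier: `templates` maps a terrain label to a
        stored reference GDI; return the label whose template is closest."""
        def dist(a, b):
            m = min(a.shape[1], b.shape[1])
            return np.linalg.norm(a[:, :m] - b[:, :m])
        return min(templates, key=lambda label: dist(gdi, templates[label]))
    ```
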
  • Publication number: 20170363427
    Abstract: A method of using signal processing representations from IMU/INS devices to identify terrain types. Using orientation and pace invariant gait dynamics images (GDIs) to identify terrain types. Utilizing signal processing representations from IMU/INS devices to determine relative position in GPS-denied areas. Using orientation and pace invariant gait dynamics images (GDIs) to determine relative position in GPS-denied areas. A method of using signal processing representations from IMU/INS devices to determine absolute position using GDI terrain IDs. A method of using signal processing representations from IMU/INS devices to identify position relative to land classes. Using orientation and pace invariant gait dynamics images (GDIs) to identify position relative to land classes.
    Type: Application
    Filed: June 21, 2016
    Publication date: December 21, 2017
    Inventors: Leonid Naimark, Yunbin Deng, Geoffrey S. Meltzner, Yu Zhong
  • Publication number: 20140267234
    Abstract: A multi-device system for mobile devices to acquire and share 3D maps of an environment. The mobile devices determine features of the environment and construct a local map and coordinate system for the features identified by the mobile device. The mobile devices may create a joint map by joining the local map of another mobile device or by merging the local maps created by the mobile devices. To merge maps, the coordinate system of each device may be constrained in degrees of freedom using information from sensors on the devices to determine the global position and orientation of each device. When the devices operate on a joint map, the devices share information about new features to extend the range of features on the map and share information about augmented reality objects manipulated by users of each device.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Inventors: Anselm Hook, Pierre Fite-Georgel, Matt Miesnieks, Anthony Maes, Marc Gardeya, Leonid Naimark
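
    The entry above has each device build a local map in its own coordinate system and then merge maps by constraining those frames with sensor-derived global position and orientation. Below is a minimal 2-D sketch of that idea: each device's local origin is anchored in a shared world frame (say, GPS position plus compass heading), and a feature is re-expressed in another device's frame by composing the two rigid transforms. The GlobalPose type, the 2-D simplification, and the numbers are illustrative assumptions, not the patent's formulation.

    ```python
    import math
    from dataclasses import dataclass

    @dataclass
    class GlobalPose:
        """Assumed per-device anchor: where the device's local map origin sits
        in a shared world frame, with heading in radians."""
        x: float
        y: float
        heading: float

    def local_to_world(pose, px, py):
        """Map a point from a device's local map frame into the shared world frame."""
        c, s = math.cos(pose.heading), math.sin(pose.heading)
        return (pose.x + c * px - s * py, pose.y + s * px + c * py)

    def world_to_local(pose, wx, wy):
        """Inverse of local_to_world."""
        c, s = math.cos(pose.heading), math.sin(pose.heading)
        dx, dy = wx - pose.x, wy - pose.y
        return (c * dx + s * dy, -s * dx + c * dy)

    def merge_feature(feature_xy, src, dst):
        """Re-express a feature from the source device's map in the destination
        device's map by going through the shared world frame."""
        return world_to_local(dst, *local_to_world(src, *feature_xy))

    # Example: device B saw a feature 2 m ahead of it; express it in device A's map.
    device_a = GlobalPose(x=10.0, y=5.0, heading=0.0)
    device_b = GlobalPose(x=12.0, y=5.0, heading=math.pi / 2)
    print(merge_feature((2.0, 0.0), src=device_b, dst=device_a))   # approximately (2.0, 2.0)
    ```

    Anchoring both maps this way removes the degrees of freedom that would otherwise have to be recovered purely from feature matching between the devices.
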
  • Publication number: 20130218461
    Abstract: A system and a method are disclosed for a dead reckoning module. The dead reckoning module receives sensor information from an inertial sensor module indicating translation and rotation information. The dead reckoning module determines the position of the inertial sensor module between two periods of time by calculating a movement from a first time period to an intermediate time period and a movement from a second time period to the intermediate time period, and determines the total movement between the first and second time periods using the movements relating to the intermediate period.
    Type: Application
    Filed: February 22, 2013
    Publication date: August 22, 2013
    Inventor: Leonid Naimark
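
    The dead reckoning abstract above computes the motion between two times by working from each end toward an intermediate time and then combining the partial movements. One possible reading of that scheme, reduced to a 1-D Euler-integration sketch, follows; the handling of the end velocity and the discretization are editorial assumptions, not the patented algorithm.

    ```python
    def displacement_forward(accel, dt, v0=0.0):
        """Double-integrate 1-D acceleration samples forward in time (simple
        Euler scheme); returns (displacement, final velocity)."""
        v, x = v0, 0.0
        for a in accel:
            x += v * dt + 0.5 * a * dt * dt
            v += a * dt
        return x, v

    def displacement_between(accel, dt, t_mid_index, v0=0.0, v_end=None):
        """Meet-at-an-intermediate-time sketch: integrate the first half forward
        from the start and the second half backward from the end, then add the
        two partial displacements.  Running time backward flips the sign of
        velocity, so the backward pass uses reversed samples and -v_end."""
        fwd, v_mid = displacement_forward(accel[:t_mid_index], dt, v0)
        if v_end is None:
            # If the end velocity is unknown, derive it by carrying the
            # intermediate velocity forward through the remaining samples.
            v_end = displacement_forward(accel[t_mid_index:], dt, v_mid)[1]
        back, _ = displacement_forward(accel[t_mid_index:][::-1], dt, -v_end)
        return fwd - back   # total displacement from the first to the second time

    # Example: constant 1 m/s^2 acceleration sampled at 1 Hz for 4 s -> 8.0 m.
    print(displacement_between([1.0, 1.0, 1.0, 1.0], dt=1.0, t_mid_index=2))
    ```
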
  • Publication number: 20130215230
    Abstract: A system and a method are disclosed for capturing real world objects and reconstructing a three-dimensional representation of real world objects. The position of the viewing system relative to the three-dimensional representation is calculated using information from a camera and an inertial motion unit. The position of the viewing system and the three-dimensional representation allow the viewing system to move relative to the real objects and enable virtual content to be shown with collision and occlusion with real world objects.
    Type: Application
    Filed: February 22, 2013
    Publication date: August 22, 2013
    Inventors: Matt Miesnieks, Silka Miesnieks, Yohan Baillot, Marc Gardeya, Leonid Naimark, Anthony Maes, John Sietsma
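
    The entry above reconstructs real-world geometry so that virtual content can collide with and be occluded by real objects. Assuming a depth map of the reconstructed scene and a rendered virtual layer (with its own depth) are already available, a toy per-pixel occlusion composite could look like the sketch below; the array layout and names are assumptions made for illustration.

    ```python
    import numpy as np

    def composite_with_occlusion(camera_rgb, scene_depth, virtual_rgb, virtual_depth):
        """Toy occlusion test: a virtual pixel is drawn only where the virtual
        object is closer to the camera than the reconstructed real surface.
        Depths are in metres; np.inf marks pixels the virtual object misses."""
        visible = virtual_depth < scene_depth       # (H, W) boolean mask
        out = camera_rgb.copy()
        out[visible] = virtual_rgb[visible]         # overwrite only unoccluded pixels
        return out
    ```
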
  • Patent number: 8224024
    Abstract: The spatial location and azimuth of an object are computed from the locations, in a single camera image, of exactly two points on the object and information about an orientation of the object. One or more groups of four or more collinear markers are located in an image, and for each group, first and second outer markers are determined, the distances from each outer marker to the nearest marker in the same group are compared, and the outer marker with a closer nearest marker is identified as the first outer marker. Based on known distances between the outer markers and the marker nearest the first outer marker, an amount of perspective distortion of the group of markers in the image is estimated. Based on the perspective distortion, relative distances from each other point in the group to one of the outer markers are determined. Based on the relative distances, the group is identified.
    Type: Grant
    Filed: October 4, 2006
    Date of Patent: July 17, 2012
    Assignee: InterSense, LLC
    Inventors: Eric Foxlin, Leonid Naimark
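
    This grant and its earlier publication (20070081695) below order a group of collinear markers, take the two endpoints as the outer markers, and label as the first outer marker the one whose nearest neighbour in the group is closer. The sketch below implements just that ordering-and-labelling step for 2-D image points; the SVD line fit is an implementation choice for the sketch, not something specified in the patent.

    ```python
    import numpy as np

    def order_collinear(points):
        """Order roughly collinear 2-D image points along their principal axis."""
        pts = np.asarray(points, dtype=float)
        centred = pts - pts.mean(axis=0)
        direction = np.linalg.svd(centred)[2][0]   # dominant direction of the group
        return pts[np.argsort(centred @ direction)]

    def identify_first_outer(points):
        """Of the two outer (endpoint) markers, the one whose nearest neighbour
        in the group is closer is labelled the first outer marker.
        Returns (first_outer, second_outer)."""
        ordered = order_collinear(points)
        gap_start = np.linalg.norm(ordered[1] - ordered[0])     # endpoint -> its neighbour
        gap_end = np.linalg.norm(ordered[-1] - ordered[-2])
        if gap_start <= gap_end:
            return ordered[0], ordered[-1]
        return ordered[-1], ordered[0]

    # Example: the left end of this unevenly spaced group has the closer neighbour.
    first, second = identify_first_outer([(10, 10), (20, 11), (40, 13), (70, 16)])
    print(first, second)   # first outer is the marker near (10, 10)
    ```
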
  • Publication number: 20120089292
    Abstract: A method for navigating a moving object (vehicle) utilizing a Navigation manager module and comprising the steps of: communicating with all sensors, processing units, the mission manager, and the navigation managers of other vehicles; configuring and reconfiguring sensors based on mission scenario objectives and on in-vehicle and global constraints; grouping sensors according to their relationship to the vehicle and environment, where an entire sensor group is seen by the navigation manager as a single sensor; a processing unit containing an Update Filter; and a dynamically updated API database.
    Type: Application
    Filed: February 12, 2011
    Publication date: April 12, 2012
    Inventors: Leonid Naimark, William H. Weedon, III, Marcos Antonio Bergamo
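
    The navigation-manager abstract above groups sensors so that an entire group is seen by the manager as a single sensor. A minimal composite-pattern sketch of that idea follows; the Measurement fields, the inverse-variance fusion rule, and the configuration interface are assumptions made for illustration, not the patent's design.

    ```python
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Measurement:
        source: str
        value: float
        variance: float

    class Sensor:
        """Minimal interface the navigation manager talks to."""
        def __init__(self, name: str):
            self.name = name
            self.enabled = True
        def read(self) -> Measurement:
            raise NotImplementedError
        def configure(self, **settings) -> None:
            self.enabled = settings.get("enabled", self.enabled)

    class ConstantSensor(Sensor):
        """Stub sensor returning a fixed reading, for the usage example."""
        def __init__(self, name, value, variance):
            super().__init__(name)
            self.value, self.variance = value, variance
        def read(self) -> Measurement:
            return Measurement(self.name, self.value, self.variance)

    class SensorGroup(Sensor):
        """Composite: a whole group exposes the same interface as one sensor,
        here fusing its members with an inverse-variance weighted average."""
        def __init__(self, name: str, members: List[Sensor]):
            super().__init__(name)
            self.members = members
        def read(self) -> Measurement:
            ms = [m.read() for m in self.members if m.enabled]
            w = [1.0 / m.variance for m in ms]
            value = sum(wi * m.value for wi, m in zip(w, ms)) / sum(w)
            return Measurement(self.name, value, 1.0 / sum(w))
        def configure(self, **settings) -> None:
            for m in self.members:
                m.configure(**settings)

    group = SensorGroup("gps_group", [ConstantSensor("gps_a", 10.2, 4.0),
                                      ConstantSensor("gps_b", 9.8, 1.0)])
    print(group.read())   # fused estimate, weighted toward the lower-variance sensor
    ```
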
  • Patent number: 7231063
    Abstract: A new fiducial design allows having thousands of different codes. These fiducials are printed on a standard black-and-white (or color) printer and can easily be mounted on walls, ceilings, and objects in a room. The design includes a “solid” outer mono-color ring and a dense 2-D interior coding scheme. Image processing algorithms are implemented in a Smart Camera with a built-in DSP that runs all required image-processing tasks. A tracking system implementation includes an inertial measurement unit and one outward-looking wide-angle Smart Camera, with possible extensions to a number of stationary inward-looking cameras. The system operates in various real-world lighting conditions without any user intervention, thanks to homomorphic image processing for extracting fiducials in the presence of very non-uniform lighting.
    Type: Grant
    Filed: August 9, 2002
    Date of Patent: June 12, 2007
    Assignee: InterSense, Inc.
    Inventors: Leonid Naimark, Eric Foxlin
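
    This grant and the application below (20040028258) attribute the system's robustness to non-uniform lighting to homomorphic image processing before fiducial extraction. A minimal sketch of that classic technique, taking the log of the image, subtracting a low-pass illumination estimate, and thresholding the remainder, is shown here; the Gaussian filter, its sigma, and the mean threshold are editorial choices, not the patent's.

    ```python
    import numpy as np
    from scipy import ndimage

    def homomorphic_flatten(image, sigma=25.0, eps=1e-6):
        """Model: image = illumination * reflectance.  Taking the log turns the
        product into a sum; subtracting a heavily blurred copy removes the
        slowly varying illumination, leaving the high-contrast fiducial detail."""
        log_img = np.log(image.astype(float) + eps)
        illumination = ndimage.gaussian_filter(log_img, sigma)   # low-pass estimate
        return log_img - illumination

    def binarize_fiducials(image, sigma=25.0):
        """Flatten the lighting, then apply one global threshold to separate the
        dark ring and code cells from the light background."""
        flat = homomorphic_flatten(image, sigma)
        return flat < flat.mean()
    ```
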
  • Publication number: 20070081695
    Abstract: The spatial location and azimuth of an object are computed from the locations, in a single camera image, of exactly two points on the object and information about an orientation of the object. One or more groups of four or more collinear markers are located in an image, and for each group, first and second outer markers are determined, the distances from each outer marker to the nearest marker in the same group are compared, and the outer marker with a closer nearest marker is identified as the first outer marker. Based on known distances between the outer markers and the marker nearest the first outer marker, an amount of perspective distortion of the group of markers in the image is estimated. Based on the perspective distortion, relative distances from each other point in the group to one of the outer markers are determined. Based on the relative distances, the group is identified.
    Type: Application
    Filed: October 4, 2006
    Publication date: April 12, 2007
    Inventors: Eric Foxlin, Leonid Naimark
  • Publication number: 20040028258
    Abstract: A new fiducial design allows having thousands of different codes. These fiducials are printed on a standard black-and-white (or color) printer and can easily be mounted on walls, ceilings, and objects in a room. The design includes a “solid” outer mono-color ring and a dense 2-D interior coding scheme. Image processing algorithms are implemented in a Smart Camera with a built-in DSP that runs all required image-processing tasks. A tracking system implementation includes an inertial measurement unit and one outward-looking wide-angle Smart Camera, with possible extensions to a number of stationary inward-looking cameras. The system operates in various real-world lighting conditions without any user intervention, thanks to homomorphic image processing for extracting fiducials in the presence of very non-uniform lighting.
    Type: Application
    Filed: August 9, 2002
    Publication date: February 12, 2004
    Inventors: Leonid Naimark, Eric Foxlin