Patents by Inventor Benjamin Joseph Uscinski
Benjamin Joseph Uscinski has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Patent number: 12100098
  Abstract: A method to reconstruct an environment is provided. The method makes fresh and accurate 3D reconstruction data of environments available to a wide variety of XR applications, with low processing time and low usage of computational resources and storage space. The 3D reconstruction data are structured so that they can be efficiently shared between users for multi-user experiences. The method includes obtaining plane segments of an environment, identifying surface planes of the environment by, for example, filtering and grouping the plane segments or by ad hoc selection of the plane segments by a user, and inferring corner points of the environment based on the surface planes. The corner points are used to build a 3D representation of the environment when an XR application requires it. (See the illustrative reconstruction sketch after this listing.)
  Type: Grant
  Filed: October 20, 2022
  Date of Patent: September 24, 2024
  Assignee: Magic Leap, Inc.
  Inventor: Benjamin Joseph Uscinski
- Publication number: 20230037459
  Abstract: A method to reconstruct an environment is provided. The method makes fresh and accurate 3D reconstruction data of environments available to a wide variety of XR applications, with low processing time and low usage of computational resources and storage space. The 3D reconstruction data are structured so that they can be efficiently shared between users for multi-user experiences. The method includes obtaining plane segments of an environment, identifying surface planes of the environment by, for example, filtering and grouping the plane segments or by ad hoc selection of the plane segments by a user, and inferring corner points of the environment based on the surface planes. The corner points are used to build a 3D representation of the environment when an XR application requires it.
  Type: Application
  Filed: October 20, 2022
  Publication date: February 9, 2023
  Applicant: Magic Leap, Inc.
  Inventor: Benjamin Joseph Uscinski
- Patent number: 11508141
  Abstract: A method to reconstruct an environment is provided. The method makes fresh and accurate 3D reconstruction data of environments available to a wide variety of XR applications, with low processing time and low usage of computational resources and storage space. The 3D reconstruction data are structured so that they can be efficiently shared between users for multi-user experiences. The method includes obtaining plane segments of an environment, identifying surface planes of the environment by, for example, filtering and grouping the plane segments or by ad hoc selection of the plane segments by a user, and inferring corner points of the environment based on the surface planes. The corner points are used to build a 3D representation of the environment when an XR application requires it.
  Type: Grant
  Filed: June 25, 2020
  Date of Patent: November 22, 2022
  Assignee: Magic Leap, Inc.
  Inventor: Benjamin Joseph Uscinski
- Patent number: 11379036
  Abstract: Systems and methods for eye tracking calibration in a wearable system are described. The wearable system can present three-dimensional (3D) virtual content and allow a user to interact with the 3D virtual content using eye gaze. During an eye tracking calibration, the wearable system can validate that a user is indeed looking at a calibration target while the eye tracking data is acquired. The validation may be performed based on data associated with the user's head pose and vestibulo-ocular reflex. (See the illustrative calibration-validation sketch after this listing.)
  Type: Grant
  Filed: June 18, 2021
  Date of Patent: July 5, 2022
  Assignee: Magic Leap, Inc.
  Inventors: Benjamin Joseph Uscinski, Yan Xu, Bradley Vincent Stuart
- Publication number: 20220011859
  Abstract: Systems and methods for eye tracking calibration in a wearable system are described. The wearable system can present three-dimensional (3D) virtual content and allow a user to interact with the 3D virtual content using eye gaze. During an eye tracking calibration, the wearable system can validate that a user is indeed looking at a calibration target while the eye tracking data is acquired. The validation may be performed based on data associated with the user's head pose and vestibulo-ocular reflex.
  Type: Application
  Filed: June 18, 2021
  Publication date: January 13, 2022
  Inventors: Benjamin Joseph Uscinski, Yan Xu, Bradley Vincent Stuart
- Patent number: 11068055
  Abstract: Systems and methods for eye tracking calibration in a wearable system are described. The wearable system can present three-dimensional (3D) virtual content and allow a user to interact with the 3D virtual content using eye gaze. During an eye tracking calibration, the wearable system can validate that a user is indeed looking at a calibration target while the eye tracking data is acquired. The validation may be performed based on data associated with the user's head pose and vestibulo-ocular reflex.
  Type: Grant
  Filed: April 22, 2020
  Date of Patent: July 20, 2021
  Assignee: Magic Leap, Inc.
  Inventors: Benjamin Joseph Uscinski, Yan Xu, Bradley Vincent Stuart
- Publication number: 20210004630
  Abstract: A method to reconstruct an environment is provided. The method makes fresh and accurate 3D reconstruction data of environments available to a wide variety of XR applications, with low processing time and low usage of computational resources and storage space. The 3D reconstruction data are structured so that they can be efficiently shared between users for multi-user experiences. The method includes obtaining plane segments of an environment, identifying surface planes of the environment by, for example, filtering and grouping the plane segments or by ad hoc selection of the plane segments by a user, and inferring corner points of the environment based on the surface planes. The corner points are used to build a 3D representation of the environment when an XR application requires it.
  Type: Application
  Filed: June 25, 2020
  Publication date: January 7, 2021
  Applicant: Magic Leap, Inc.
  Inventor: Benjamin Joseph Uscinski
- Publication number: 20200249755
  Abstract: Systems and methods for eye tracking calibration in a wearable system are described. The wearable system can present three-dimensional (3D) virtual content and allow a user to interact with the 3D virtual content using eye gaze. During an eye tracking calibration, the wearable system can validate that a user is indeed looking at a calibration target while the eye tracking data is acquired. The validation may be performed based on data associated with the user's head pose and vestibulo-ocular reflex.
  Type: Application
  Filed: April 22, 2020
  Publication date: August 6, 2020
  Inventors: Benjamin Joseph Uscinski, Yan Xu, Bradley Vincent Stuart
- Patent number: 10671160
  Abstract: Systems and methods for eye tracking calibration in a wearable system are described. The wearable system can present three-dimensional (3D) virtual content and allow a user to interact with the 3D virtual content using eye gaze. During an eye tracking calibration, the wearable system can validate that a user is indeed looking at a calibration target while the eye tracking data is acquired. The validation may be performed based on data associated with the user's head pose and vestibulo-ocular reflex.
  Type: Grant
  Filed: May 30, 2018
  Date of Patent: June 2, 2020
  Assignee: Magic Leap, Inc.
  Inventors: Benjamin Joseph Uscinski, Yan Xu, Bradley Vincent Stuart
- Publication number: 20180348861
  Abstract: Systems and methods for eye tracking calibration in a wearable system are described. The wearable system can present three-dimensional (3D) virtual content and allow a user to interact with the 3D virtual content using eye gaze. During an eye tracking calibration, the wearable system can validate that a user is indeed looking at a calibration target while the eye tracking data is acquired. The validation may be performed based on data associated with the user's head pose and vestibulo-ocular reflex.
  Type: Application
  Filed: May 30, 2018
  Publication date: December 6, 2018
  Inventors: Benjamin Joseph Uscinski, Yan Xu, Bradley Vincent Stuart
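The reconstruction entries above (patent numbers 12100098 and 11508141 and publication numbers 20230037459 and 20210004630) all describe the same workflow: group plane segments into surface planes, then infer corner points from those planes. The Python sketch below is only a rough illustration of that idea, not Magic Leap's patented implementation; the function names, grouping thresholds, and plane representation (unit normal n and offset d with n·x = d) are assumptions made for the example.

```python
# Illustrative sketch of the workflow in the reconstruction abstracts above.
# All names, thresholds, and data layouts are hypothetical.
import itertools
import numpy as np

def group_segments(segments, normal_tol=0.95, offset_tol=0.05):
    """Group plane segments (unit normal n, offset d with n.x = d) that lie on
    the same surface plane; return one averaged (normal, offset) per group."""
    groups = []  # each group is [normal, offset, count]
    for n, d in segments:
        for group in groups:
            gn, gd, count = group
            if np.dot(n, gn) > normal_tol and abs(d - gd) < offset_tol:
                merged = gn * count + n
                group[0] = merged / np.linalg.norm(merged)  # averaged normal
                group[1] = (gd * count + d) / (count + 1)   # averaged offset
                group[2] = count + 1
                break
        else:
            groups.append([np.asarray(n, dtype=float), float(d), 1])
    return [(g[0], g[1]) for g in groups]

def infer_corners(planes, cond_limit=1e3):
    """Intersect every well-conditioned triple of surface planes (for example a
    floor and two walls) to obtain candidate corner points of the environment."""
    corners = []
    for (n1, d1), (n2, d2), (n3, d3) in itertools.combinations(planes, 3):
        A = np.vstack([n1, n2, n3])
        if np.linalg.cond(A) < cond_limit:   # skip near-parallel triples
            corners.append(np.linalg.solve(A, np.array([d1, d2, d3])))
    return corners

# Example: a toy room with a floor (z = 0) and two walls (x = 0, y = 0).
segments = [(np.array([0.0, 0.0, 1.0]), 0.0),
            (np.array([0.0, 0.0, 1.0]), 0.01),  # noisy duplicate of the floor
            (np.array([1.0, 0.0, 0.0]), 0.0),
            (np.array([0.0, 1.0, 0.0]), 0.0)]
print(infer_corners(group_segments(segments)))  # roughly [0, 0, 0.005]
```

Collecting the corner points from every well-conditioned triple gives the coarse 3D representation of the environment that the abstracts refer to; an XR application could request that representation on demand.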
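The eye-tracking-calibration entries (patent numbers 11379036, 11068055, and 10671160 and their related publications) describe validating from head-pose data that the user is actually looking at a calibration target before eye-gaze samples are accepted. The sketch below illustrates only that general idea; the frame layout, field names, and angular threshold are hypothetical and are not taken from the patents, and the vestibulo-ocular-reflex check mentioned in the abstracts is omitted.

```python
# Illustrative sketch of head-pose-based validation during eye tracking
# calibration. Field names and the angular threshold are hypothetical.
import numpy as np

def head_is_aimed_at_target(head_position, head_forward, target_position,
                            max_angle_deg=10.0):
    """True when the head's forward direction points within max_angle_deg of
    the direction from the head to the calibration target."""
    to_target = target_position - head_position
    to_target = to_target / np.linalg.norm(to_target)
    forward = head_forward / np.linalg.norm(head_forward)
    cos_angle = float(np.clip(np.dot(forward, to_target), -1.0, 1.0))
    return np.degrees(np.arccos(cos_angle)) <= max_angle_deg

def collect_calibration_samples(frames, target_position):
    """Keep only the eye-gaze samples captured while the head pose indicated
    that the user was plausibly looking at the calibration target."""
    validated = []
    for frame in frames:  # frame: {"head_position", "head_forward", "gaze"}
        if head_is_aimed_at_target(frame["head_position"],
                                   frame["head_forward"],
                                   target_position):
            validated.append(frame["gaze"])
    return validated
```

Only the accepted samples would then be used to fit the calibration, which is the validation step the abstracts describe at a high level.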