Patents by Inventor Michael Kasper

Michael Kasper has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250094150
    Abstract: Systems and methods for synthesizing software application code for a software application. An example method includes receiving, from a user device, an initial prompt comprising a natural language description of a plurality of specification elements defining a functionality and a scope of a software application; synthesizing software application code for the software application by processing the initial prompt through a synthesis model, wherein the synthesis model is trained to: generate, based on the initial prompt, a specification document comprising the specification elements; and generate the software application code for the software application by determining a bidirectional mapping between the software application code and the specification elements; and deploying, to a deployment platform, the generated software application code for execution.
    Type: Application
    Filed: September 19, 2024
    Publication date: March 20, 2025
    Inventors: Michael Kasper, Calvin Smith, Fernando Nobre, Nima Keivan, Theodore Hoff
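    Illustrative sketch (not from the patent record): the abstract above describes generating a specification document from a natural-language prompt and synthesizing code while maintaining a bidirectional mapping between specification elements and the generated code. One minimal way to represent such a mapping is sketched below in Python; the names SpecElement, CodeRegion, and SynthesisResult are assumptions for illustration, not the claimed implementation.
    ```python
    from dataclasses import dataclass, field


    @dataclass(frozen=True)
    class SpecElement:
        """One specification element parsed from the natural-language prompt."""
        element_id: str
        description: str


    @dataclass(frozen=True)
    class CodeRegion:
        """A contiguous region of synthesized source code."""
        path: str
        start_line: int
        end_line: int


    @dataclass
    class SynthesisResult:
        """Synthesized code plus a bidirectional spec <-> code mapping."""
        spec_to_code: dict = field(default_factory=dict)   # SpecElement -> set of CodeRegion
        code_to_spec: dict = field(default_factory=dict)   # CodeRegion -> set of SpecElement

        def link(self, element: SpecElement, region: CodeRegion) -> None:
            # Record the association in both directions so either side can be queried.
            self.spec_to_code.setdefault(element, set()).add(region)
            self.code_to_spec.setdefault(region, set()).add(element)

        def regions_for(self, element: SpecElement) -> set:
            return self.spec_to_code.get(element, set())

        def elements_for(self, region: CodeRegion) -> set:
            return self.code_to_spec.get(region, set())
    ```
    With a structure like this, editing one specification element points directly at the code regions that would need to be regenerated, and any code region can be traced back to the prompt elements that motivated it.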
  • Publication number: 20250086824
    Abstract: Wearable systems and methods for operation thereof incorporating headset and controller inside-out tracking are disclosed. A wearable system may include a headset and a controller. The wearable system may cause fiducials of the controller to flash. The wearable system may track a pose of the controller by capturing headset images using a headset camera, identifying the fiducials in the headset images, and tracking the pose of the controller based on the identified fiducials in the headset images and based on a pose of the headset. While tracking the pose of the controller, the wearable system may capture controller images using a controller camera. The wearable system may identify two-dimensional feature points in each controller image and determine three-dimensional map points based on the two-dimensional feature points and the pose of the controller.
    Type: Application
    Filed: November 22, 2024
    Publication date: March 13, 2025
    Applicant: Magic Leap, Inc.
    Inventors: Dominik Michael Kasper, Martin Georg Zahnert, Manel Quim Sanchez Nicuesa, Rafa Gomez-Jordana Manas, Nathan Yuki Baumli, Koon Keong Shee, Zachary C. Nienstedt, Emily Elizabeth Mount, Lomesh Agarwal, Andrea Lampart
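    Illustrative sketch (not from the patent record): the entry above tracks the controller's pose from fiducials seen by headset cameras and then turns two-dimensional feature points from the controller's own camera into three-dimensional map points. The Python below sketches only that second step, triangulating a point observed from two known controller-camera poses under a simple pinhole model; the function names and the midpoint method are assumptions, not the claimed pipeline.
    ```python
    import numpy as np


    def pixel_to_ray(pixel, K):
        """Back-project a pixel to a unit ray in camera coordinates (pinhole model)."""
        u, v = pixel
        ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
        return ray / np.linalg.norm(ray)


    def triangulate_map_point(pixel_a, pose_a, pixel_b, pose_b, K):
        """Estimate a 3D map point from two controller-camera observations.

        pose_a, pose_b: (R, t) world-from-camera rotation and translation
        (X_world = R @ X_camera + t), e.g. obtained from controller tracking.
        """
        R_a, t_a = pose_a
        R_b, t_b = pose_b
        # Express each viewing ray in world coordinates, anchored at its camera center.
        d_a = R_a @ pixel_to_ray(pixel_a, K)
        d_b = R_b @ pixel_to_ray(pixel_b, K)
        # Midpoint method: find scalars s, u with t_a + s*d_a ~= t_b + u*d_b.
        A = np.stack([d_a, -d_b], axis=1)            # 3x2 system
        (s, u), *_ = np.linalg.lstsq(A, t_b - t_a, rcond=None)
        p_a = t_a + s * d_a
        p_b = t_b + u * d_b
        return 0.5 * (p_a + p_b)                     # map point in world coordinates
    ```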
  • Patent number: 12249024
    Abstract: Examples of the disclosure describe systems and methods for presenting virtual content on a wearable head device. In some embodiments, a state of a wearable head device is determined by minimizing a total error based on a reduced weight associated with a reprojection error. A view reflecting the determined state of the wearable head device is presented via a display of the wearable head device. In some embodiments, a wearable head device calculates a preintegration term based on the image data received via a sensor of the wearable head device and the inertial data received via a first IMU and a second IMU of the wearable head device. The wearable head device estimates a position of the device based on the preintegration term, and the wearable head device presents the virtual content based on the position of the device.
    Type: Grant
    Filed: February 12, 2024
    Date of Patent: March 11, 2025
    Assignee: Magic Leap, Inc.
    Inventors: Yu-Hsiang Huang, Evan Gregory Levine, Igor Napolskikh, Dominik Michael Kasper, Manel Quim Sanchez Nicuesa, Sergiu Sima, Benjamin Langmann, Ashwin Swaminathan, Martin Georg Zahnert, Blazej Marek Czuprynski, Joao Antonio Pereira Faro, Christoph Tobler, Omid Ghasemalizadeh
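    Illustrative sketch (not from the patent record): the abstract above refers to computing a preintegration term from inertial data received between frames. The Python below is a generic, textbook-style IMU preintegration that accumulates relative rotation, velocity, and position deltas from gyroscope and accelerometer samples, assuming biases and gravity are handled by the caller; it is not Magic Leap's implementation.
    ```python
    import numpy as np


    def exp_so3(w):
        """Rodrigues formula: rotation matrix for a rotation vector w (radians)."""
        theta = np.linalg.norm(w)
        if theta < 1e-12:
            return np.eye(3)
        k = w / theta
        K = np.array([[0.0, -k[2], k[1]],
                      [k[2], 0.0, -k[0]],
                      [-k[1], k[0], 0.0]])
        return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)


    def preintegrate(gyro, accel, dt):
        """Accumulate relative rotation/velocity/position between two frames.

        gyro, accel: (N, 3) arrays of angular rate (rad/s) and specific force
        (m/s^2) samples from one IMU; dt: sample period in seconds.
        """
        dR = np.eye(3)       # relative rotation over the interval
        dv = np.zeros(3)     # relative velocity change
        dp = np.zeros(3)     # relative position change
        for w, a in zip(gyro, accel):
            a_rot = dR @ a   # acceleration rotated into the frame at the interval start
            dp += dv * dt + 0.5 * a_rot * dt ** 2
            dv += a_rot * dt
            dR = dR @ exp_so3(w * dt)
        return dR, dv, dp
    ```
    In a full estimator, deltas like these from each IMU would enter a total error alongside a down-weighted reprojection term, which is then minimized to obtain the headset state, as the abstract describes.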
  • Publication number: 20250076973
    Abstract: Wearable systems and methods for operation thereof incorporating headset and controller localization using headset cameras and controller fiducials are disclosed. A wearable system may include a headset and a controller. The wearable system may alternate between performing headset tracking and performing controller tracking by repeatedly capturing images using a headset camera of the headset during headset tracking frames and controller tracking frames. The wearable system may cause the headset camera to capture a first exposure image having an exposure above a threshold and cause the headset camera to capture a second exposure image having an exposure below the threshold. The wearable system may determine a fiducial interval during which fiducials of the controller are to flash at a fiducial frequency and a fiducial period. The wearable system may cause the fiducials to flash during the fiducial interval in accordance with the fiducial frequency and the fiducial period.
    Type: Application
    Filed: November 22, 2024
    Publication date: March 6, 2025
    Applicant: Magic Leap, Inc.
    Inventors: Zachary C. Nienstedt, Daniel Roberts, Christopher Michael Lopez, Brian Edward Oliver Bucknor, Samuel A. Miller, Nathan Yuki Baumli, Dominik Michael Kasper, Manel Quim Sanchez Nicuesa, Andrea Lampart, Rafa Gomez-Jordana Manas, Martin Georg Zahnert, Nikola Stan, Emily Elizabeth Mount
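    Illustrative sketch (not from the patent record): the entry above interleaves headset-tracking and controller-tracking frames, pairs high- and low-exposure captures, and flashes the controller fiducials during a computed interval at a given frequency and period. The Python below only sketches how such a schedule might be expressed; the timing constants, field names, and the simple even/odd interleave are assumptions, not values or logic from the patent.
    ```python
    from dataclasses import dataclass


    @dataclass
    class FrameSlot:
        index: int
        purpose: str          # "headset" or "controller"
        exposure: str         # "high" or "low"
        flash_fiducials: bool


    def build_schedule(num_frames, frame_period_ms=16.6,
                       fiducial_frequency_hz=30.0, fiducial_period_ms=2.0):
        """Interleave tracking frames and mark when the fiducials should flash.

        Controller-tracking frames use a short (low) exposure so that flashing
        fiducials dominate the image; headset-tracking frames use a normal
        (high) exposure so that environment features are visible.
        """
        flash_spacing_ms = 1000.0 / fiducial_frequency_hz   # time between flashes
        schedule = []
        for i in range(num_frames):
            is_controller_frame = (i % 2 == 1)               # simple alternation
            frame_start_ms = i * frame_period_ms
            # Flash only when a controller frame overlaps the next flash window.
            flash_due = (frame_start_ms % flash_spacing_ms) < fiducial_period_ms
            schedule.append(FrameSlot(
                index=i,
                purpose="controller" if is_controller_frame else "headset",
                exposure="low" if is_controller_frame else "high",
                flash_fiducials=is_controller_frame and flash_due,
            ))
        return schedule
    ```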
  • Patent number: 12211271
    Abstract: To determine the head pose of a user, a head-mounted display system having an imaging device can obtain a current image of a real-world environment, with points corresponding to salient points which will be used to determine the head pose. The salient points are patch-based and include a first salient point projected onto the current image from a previous image and a second salient point extracted from the current image. Each salient point is subsequently matched with real-world points based on descriptor-based map information indicating locations of salient points in the real-world environment. The orientation of the imaging device is determined based on the matching and based on the relative positions of the salient points in the view captured in the current image. The orientation may be used to extrapolate the head pose of the wearer of the head-mounted display system.
    Type: Grant
    Filed: October 9, 2023
    Date of Patent: January 28, 2025
    Assignee: Magic Leap, Inc.
    Inventors: Martin Georg Zahnert, Joao Antonio Pereira Faro, Miguel Andres Granados Velasquez, Dominik Michael Kasper, Ashwin Swaminathan, Anush Mohan, Prateek Singhal
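    Illustrative sketch (not from the patent record): the abstract above matches patch-based salient points in the current image against mapped real-world points and recovers the imaging device's orientation from those matches. The Python below sketches only the orientation step, aligning the bearing vectors of matched image points with the corresponding world directions via an SVD solution to Wahba's problem; the matching itself is omitted and the names are assumptions.
    ```python
    import numpy as np


    def bearings_from_pixels(pixels, K):
        """Convert matched pixel locations (N x 2) to unit bearing vectors in the camera frame."""
        homogeneous = np.hstack([np.asarray(pixels, dtype=float), np.ones((len(pixels), 1))])
        rays = (np.linalg.inv(K) @ homogeneous.T).T
        return rays / np.linalg.norm(rays, axis=1, keepdims=True)


    def estimate_orientation(camera_bearings, world_directions):
        """Rotation R (world -> camera) minimizing sum_i ||b_i - R d_i||^2.

        camera_bearings: unit bearings b_i of the matched salient points.
        world_directions: unit vectors d_i toward the matched map points,
        taken from the (approximately known) viewpoint.
        """
        B = camera_bearings.T @ world_directions        # 3x3 attitude profile matrix
        U, _, Vt = np.linalg.svd(B)
        d = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
        return U @ np.diag([1.0, 1.0, d]) @ Vt          # enforce det(R) = +1
    ```
    The recovered orientation could then be extrapolated forward to predict head pose, as the abstract notes.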
  • Patent number: 12209435
    Abstract: A door lock may include a latch bolt head and a blocking pin. The latch bolt head may be configured to move between an extended position and a retracted position to selectively engage a latch head pocket of a door jamb. The latch bolt head may also be configured to move between a first rotational position and a second rotational position, wherein the blocking pin is configured to prevent movement of the latch bolt head from the first rotational position to the second rotational position when the blocking pin is in an engaged position.
    Type: Grant
    Filed: March 11, 2021
    Date of Patent: January 28, 2025
    Assignee: Sargent Manufacturing Company
    Inventors: Michael Bedford, Brian R. Fournier, Scott Kasper, William Middelaer, David Nguyen, Ray Nolan, Christine Voelker, Todd C. Zimmer
  • Publication number: 20240233372
    Abstract: To determine the head pose of a user, a head-mounted display system having an imaging device can obtain a current image of a real-world environment, with points corresponding to salient points which will be used to determine the head pose. The salient points are patch-based and include a first salient point projected onto the current image from a previous image and a second salient point extracted from the current image. Each salient point is subsequently matched with real-world points based on descriptor-based map information indicating locations of salient points in the real-world environment. The orientation of the imaging device is determined based on the matching and based on the relative positions of the salient points in the view captured in the current image. The orientation may be used to extrapolate the head pose of the wearer of the head-mounted display system.
    Type: Application
    Filed: October 9, 2023
    Publication date: July 11, 2024
    Inventors: Martin Georg Zahnert, Joao Antonio Pereira Faro, Miguel Andres Granados Velasquez, Dominik Michael Kasper, Ashwin Swaminathan, Anush Mohan, Prateek Singhal
  • Publication number: 20240185510
    Abstract: Examples of the disclosure describe systems and methods for presenting virtual content on a wearable head device. In some embodiments, a state of a wearable head device is determined by minimizing a total error based on a reduced weight associated with a reprojection error. A view reflecting the determined state of the wearable head device is presented via a display of the wearable head device. In some embodiments, a wearable head device calculates a preintegration term based on the image data received via a sensor of the wearable head device and the inertial data received via a first IMU and a second IMU of the wearable head device. The wearable head device estimates a position of the device based on the preintegration term, and the wearable head device presents the virtual content based on the position of the device.
    Type: Application
    Filed: February 12, 2024
    Publication date: June 6, 2024
    Inventors: Yu-Hsiang Huang, Evan Gregory Levine, Igor Napolskikh, Dominik Michael Kasper, Manel Quim Sanchez Nicuesa, Sergiu Sima, Benjamin Langmann, Ashwin Swaminathan, Martin Georg Zahnert, Blazej Marek Czuprynski, Joao Antonio Pereira Faro, Christoph Tobler, Omid Ghasemalizadeh
  • Publication number: 20240135707
    Abstract: To determine the head pose of a user, a head-mounted display system having an imaging device can obtain a current image of a real-world environment, with points corresponding to salient points which will be used to determine the head pose. The salient points are patch-based and include a first salient point projected onto the current image from a previous image and a second salient point extracted from the current image. Each salient point is subsequently matched with real-world points based on descriptor-based map information indicating locations of salient points in the real-world environment. The orientation of the imaging device is determined based on the matching and based on the relative positions of the salient points in the view captured in the current image. The orientation may be used to extrapolate the head pose of the wearer of the head-mounted display system.
    Type: Application
    Filed: October 8, 2023
    Publication date: April 25, 2024
    Inventors: Martin Georg Zahnert, Joao Antonio Pereira Faro, Miguel Andres Granados Velasquez, Dominik Michael Kasper, Ashwin Swaminathan, Anush Mohan, Prateek Singhal
  • Publication number: 20240135656
    Abstract: A cross reality system enables portable devices to access stored maps and efficiently and accurately render virtual content specified in relation to those maps. The system may process images acquired with a portable device to quickly and accurately localize the portable device to the persisted maps by constraining the result of localization based on the estimated direction of gravity of a persisted map and the coordinate frame in which data in a localization request is posed. The system may actively align the data in the localization request with an estimated direction of gravity during the localization processing, and/or a portable device may establish a coordinate frame, aligned with an estimated direction of gravity, in which the data in the localization request is posed, such that subsequently acquired data for inclusion in a localization request, when posed in that coordinate frame, is passively aligned with the estimated direction of gravity.
    Type: Application
    Filed: December 26, 2023
    Publication date: April 25, 2024
    Applicant: Magic Leap, Inc.
    Inventors: Javier Victorio Gomez Gonzalez, Miguel Andres Granados Velasquez, Mukta Prasad, Dominik Michael Kasper, Eran Guendelman, Keng-Sheng Lin
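    Illustrative sketch (not from the patent record): the abstract above constrains localization by aligning the coordinate frame of a localization request with the persisted map's estimated direction of gravity. The Python below shows one common way to express such a constraint, computing the rotation that takes the request frame's gravity estimate onto the map's gravity direction and pre-rotating the request data with it; the function names and framing are assumptions, not the claimed system.
    ```python
    import numpy as np


    def rotation_aligning(a, b):
        """Smallest rotation R such that R @ a points along b (a, b are 3-vectors)."""
        a = a / np.linalg.norm(a)
        b = b / np.linalg.norm(b)
        v = np.cross(a, b)
        c = float(np.dot(a, b))
        if np.isclose(c, -1.0):
            # Antiparallel case: rotate 180 degrees about any axis orthogonal to a.
            axis = np.cross(a, [1.0, 0.0, 0.0])
            if np.linalg.norm(axis) < 1e-8:
                axis = np.cross(a, [0.0, 1.0, 0.0])
            axis /= np.linalg.norm(axis)
            return 2.0 * np.outer(axis, axis) - np.eye(3)
        V = np.array([[0.0, -v[2], v[1]],
                      [v[2], 0.0, -v[0]],
                      [-v[1], v[0], 0.0]])
        return np.eye(3) + V + (V @ V) / (1.0 + c)


    def gravity_align_request(points, gravity_in_request_frame, map_gravity=(0.0, 0.0, -1.0)):
        """Pre-rotate localization-request points (N x 3) so gravity matches the map's."""
        R = rotation_aligning(np.asarray(gravity_in_request_frame, dtype=float),
                              np.asarray(map_gravity, dtype=float))
        return points @ R.T
    ```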
  • Patent number: 11935180
    Abstract: Examples of the disclosure describe systems and methods for presenting virtual content on a wearable head device. In some embodiments, a state of a wearable head device is determined by minimizing a total error based on a reduced weight associated with a reprojection error. A view reflecting the determined state of the wearable head device is presented via a display of the wearable head device. In some embodiments, a wearable head device calculates a preintegration term based on the image data received via a sensor of the wearable head device and the inertial data received via a first IMU and a second IMU of the wearable head device. The wearable head device estimates a position of the device based on the preintegration term, and the wearable head device presents the virtual content based on the position of the device.
    Type: Grant
    Filed: April 7, 2022
    Date of Patent: March 19, 2024
    Assignee: Magic Leap, Inc.
    Inventors: Yu-Hsiang Huang, Evan Gregory Levine, Igor Napolskikh, Dominik Michael Kasper, Manel Quim Sanchez Nicuesa, Sergiu Sima, Benjamin Langmann, Ashwin Swaminathan, Martin Georg Zahnert, Blazej Marek Czuprynski, Joao Antonio Pereira Faro, Christoph Tobler, Omid Ghasemalizadeh
  • Patent number: 11900547
    Abstract: A cross reality system enables portable devices to access stored maps and efficiently and accurately render virtual content specified in relation to those maps. The system may process images acquired with a portable device to quickly and accurately localize the portable device to the persisted maps by constraining the result of localization based on the estimated direction of gravity of a persisted map and the coordinate frame in which data in a localization request is posed. The system may actively align the data in the localization request with an estimated direction of gravity during the localization processing, and/or a portable device may establish a coordinate frame, aligned with an estimated direction of gravity, in which the data in the localization request is posed, such that subsequently acquired data for inclusion in a localization request, when posed in that coordinate frame, is passively aligned with the estimated direction of gravity.
    Type: Grant
    Filed: April 28, 2021
    Date of Patent: February 13, 2024
    Assignee: Magic Leap, Inc.
    Inventors: Javier Victorio Gomez Gonzalez, Miguel Andres Granados Velasquez, Mukta Prasad, Dominik Michael Kasper, Eran Guendelman, Keng-Sheng Lin
  • Patent number: 11823450
    Abstract: To determine the head pose of a user, a head-mounted display system having an imaging device can obtain a current image of a real-world environment, with points corresponding to salient points which will be used to determine the head pose. The salient points are patch-based and include a first salient point projected onto the current image from a previous image and a second salient point extracted from the current image. Each salient point is subsequently matched with real-world points based on descriptor-based map information indicating locations of salient points in the real-world environment. The orientation of the imaging device is determined based on the matching and based on the relative positions of the salient points in the view captured in the current image. The orientation may be used to extrapolate the head pose of the wearer of the head-mounted display system.
    Type: Grant
    Filed: October 14, 2022
    Date of Patent: November 21, 2023
    Inventors: Martin Georg Zahnert, Joao Antonio Pereira Faro, Miguel Andres Granados Velasquez, Dominik Michael Kasper, Ashwin Swaminathan, Anush Mohan, Prateek Singhal
  • Publication number: 20230034363
    Abstract: To determine the head pose of a user, a head-mounted display system having an imaging device can obtain a current image of a real-world environment, with points corresponding to salient points which will be used to determine the head pose. The salient points are patch-based and include a first salient point projected onto the current image from a previous image and a second salient point extracted from the current image. Each salient point is subsequently matched with real-world points based on descriptor-based map information indicating locations of salient points in the real-world environment. The orientation of the imaging device is determined based on the matching and based on the relative positions of the salient points in the view captured in the current image. The orientation may be used to extrapolate the head pose of the wearer of the head-mounted display system.
    Type: Application
    Filed: October 14, 2022
    Publication date: February 2, 2023
    Inventors: Martin Georg Zahnert, Joao Antonio Pereira Faro, Miguel Andres Granados Velasquez, Dominik Michael Kasper, Ashwin Swaminathan, Anush Mohan, Prateek Singhal
  • Patent number: 11501529
    Abstract: To determine the head pose of a user, a head-mounted display system having an imaging device can obtain a current image of a real-world environment, with points corresponding to salient points which will be used to determine the head pose. The salient points are patch-based and include a first salient point projected onto the current image from a previous image and a second salient point extracted from the current image. Each salient point is subsequently matched with real-world points based on descriptor-based map information indicating locations of salient points in the real-world environment. The orientation of the imaging device is determined based on the matching and based on the relative positions of the salient points in the view captured in the current image. The orientation may be used to extrapolate the head pose of the wearer of the head-mounted display system.
    Type: Grant
    Filed: March 5, 2021
    Date of Patent: November 15, 2022
    Assignee: Magic Leap, Inc.
    Inventors: Martin Georg Zahnert, Joao Antonio Pereira Faro, Miguel Andres Granados Velasquez, Dominik Michael Kasper, Ashwin Swaminathan, Anush Mohan, Prateek Singhal
  • Publication number: 20220230382
    Abstract: Examples of the disclosure describe systems and methods for presenting virtual content on a wearable head device. In some embodiments, a state of a wearable head device is determined by minimizing a total error based on a reduced weight associated with a reprojection error. A view reflecting the determined state of the wearable head device is presented via a display of the wearable head device. In some embodiments, a wearable head device calculates a preintegration term based on the image data received via a sensor of the wearable head device and the inertial data received via a first IMU and a second IMU of the wearable head device. The wearable head device estimates a position of the device based on the preintegration term, and the wearable head device presents the virtual content based on the position of the device.
    Type: Application
    Filed: April 7, 2022
    Publication date: July 21, 2022
    Inventors: Yu-Hsiang Huang, Evan Gregory Levine, Igor Napolskikh, Dominik Michael Kasper, Manel Quim Sanchez Nicuesa, Sergiu Sima, Benjamin Langmann, Ashwin Swaminathan, Martin Georg Zahnert, Blazej Marek Czuprynski, Joao Antonio Pereira Faro, Christoph Tobler, Omid Ghasemalizadeh
  • Patent number: 11328475
    Abstract: Examples of the disclosure describe systems and methods for presenting virtual content on a wearable head device. In some embodiments, a state of a wearable head device is determined by minimizing a total error based on a reduced weight associated with a reprojection error. A view reflecting the determined state of the wearable head device is presented via a display of the wearable head device. In some embodiments, a wearable head device calculates a first preintegration term and second preintegration term based on the image data received via a sensor of the wearable head device and the inertial data received via a first IMU and a second IMU of the wearable head device. The wearable head device estimates a position of the device based on the first and second preintegration terms, and the wearable head device presents the virtual content based on the position of the device.
    Type: Grant
    Filed: October 16, 2020
    Date of Patent: May 10, 2022
    Assignee: Magic Leap, Inc.
    Inventors: Yu-Hsiang Huang, Evan Gregory Levine, Igor Napolskikh, Dominik Michael Kasper, Manel Quim Sanchez Nicuesa, Sergiu Sima, Benjamin Langmann, Ashwin Swaminathan, Martin Georg Zahnert, Blazej Marek Czuprynski, Joao Antonio Pereira Faro, Christoph Tobler, Omid Ghasemalizadeh
  • Publication number: 20210343087
    Abstract: A cross reality system enables portable devices to access stored maps and efficiently and accurately render virtual content specified in relation to those maps. The system may process images acquired with a portable device to quickly and accurately localize the portable device to the persisted maps by constraining the result of localization based on the estimated direction of gravity of a persisted map and the coordinate frame in which data in a localization request is posed. The system may actively align the data in the localization request with an estimated direction of gravity during the localization processing, and/or a portable device may establish a coordinate frame, aligned with an estimated direction of gravity, in which the data in the localization request is posed, such that subsequently acquired data for inclusion in a localization request, when posed in that coordinate frame, is passively aligned with the estimated direction of gravity.
    Type: Application
    Filed: April 28, 2021
    Publication date: November 4, 2021
    Applicant: Magic Leap, Inc.
    Inventors: Javier Victorio Gomez Gonzalez, Miguel Andres Granados Velasquez, Mukta Prasad, Dominik Michael Kasper, Eran Guendelman, Keng-Sheng Lin
  • Publication number: 20210334537
    Abstract: To determine the head pose of a user, a head-mounted display system having an imaging device can obtain a current image of a real-world environment, with points corresponding to salient points which will be used to determine the head pose. The salient points are patch-based and include a first salient point projected onto the current image from a previous image and a second salient point extracted from the current image. Each salient point is subsequently matched with real-world points based on descriptor-based map information indicating locations of salient points in the real-world environment. The orientation of the imaging device is determined based on the matching and based on the relative positions of the salient points in the view captured in the current image. The orientation may be used to extrapolate the head pose of the wearer of the head-mounted display system.
    Type: Application
    Filed: March 5, 2021
    Publication date: October 28, 2021
    Inventors: Martin Georg Zahnert, Joao Antonio Pereira Faro, Miguel Andres Granados Velasquez, Dominik Michael Kasper, Ashwin Swaminathan, Anush Mohan, Prateek Singhal
  • Publication number: 20210118218
    Abstract: Examples of the disclosure describe systems and methods for presenting virtual content on a wearable head device. In some embodiments, a state of a wearable head device is determined by minimizing a total error based on a reduced weight associated with a reprojection error. A view reflecting the determined state of the wearable head device is presented via a display of the wearable head device. In some embodiments, a wearable head device calculates a first preintegration term and second preintegration term based on the image data received via a sensor of the wearable head device and the inertial data received via a first IMU and a second IMU of the wearable head device. The wearable head device estimates a position of the device based on the first and second preintegration terms, and the wearable head device presents the virtual content based on the position of the device.
    Type: Application
    Filed: October 16, 2020
    Publication date: April 22, 2021
    Inventors: Yu-Hsiang Huang, Evan Gregory Levine, Igor Napolskikh, Dominik Michael Kasper, Manel Quim Sanchez Nicuesa, Sergiu Sima, Benjamin Langmann, Ashwin Swaminathan, Martin Georg Zahnert, Blazej Marek Czuprynski, Joao Antonio Pereira Faro, Christoph Tobler, Omid Ghasemalizadeh