Patents by Inventor David Charles Lundmark

David Charles Lundmark has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20210297808
    Abstract: This disclosure describes techniques for receiving information that is wirelessly transmitted to a mobile computing system by wireless devices that are proximal to a route being traveled by the mobile computing system, and presenting at least a portion of the received information through a display of the mobile computing system. The information can be displayed according to a computing experience that is determined for a user of the mobile computing system (e.g., user-selected, inferred based on a stored schedule of the user, and so forth). Different sets of location-based information can be transmitted to the mobile computing system from different wireless devices that the mobile computing system comes into proximity with while traveling along a route. In some instances, the information can be locally stored on the wireless device(s) to reduce latency.
    Type: Application
    Filed: July 3, 2019
    Publication date: September 23, 2021
    Applicant: Magic Leap, Inc.
    Inventor: David Charles LUNDMARK
  • Publication number: 20210287382
    Abstract: An apparatus for providing virtual content in an environment in which first and second users can interact with each other, comprising: a communication interface configured to communicate with a first display screen worn by the first user and/or a second display screen worn by the second user; and a processing unit configured to: obtain a first position of the first user, determine a first set of anchor point(s) based on the first position of the first user, obtain a second position of the second user, determine a second set of anchor point(s) based on the second position of the second user, determine one or more common anchor points that are in both the first set and the second set, and provide the virtual content for experience by the first user and/or the second user based on at least one of the one or more common anchor points.
    Type: Application
    Filed: March 12, 2021
    Publication date: September 16, 2021
    Applicant: MAGIC LEAP, INC.
    Inventors: Daniel LeWinn LEHRICH, Marissa Jean TRAIN, David Charles LUNDMARK
  • Publication number: 20210177370
    Abstract: A method of viewing a patient, including inserting a catheter, is described for health procedure navigation. A CT scan is carried out on a body part of the patient. Raw data from the CT scan is processed to create three-dimensional image data, which is stored in a data store. Projectors generate light in a pattern representative of the image data, and waveguides guide the light to a retina of an eye of a viewer while light from an external surface of the body is transmitted to the retina, so that the viewer sees the external surface of the body augmented with a rendering of the processed image data of the body part.
    Type: Application
    Filed: August 22, 2019
    Publication date: June 17, 2021
    Applicant: Magic Leap, Inc.
    Inventors: Nastasja U ROBAINA, Praveen BABU J D, David Charles LUNDMARK, Alexander ILIC
  • Publication number: 20210151010
    Abstract: An apparatus configured to be head-worn by a user includes: a transparent screen configured to allow the user to see therethrough; a sensor system configured to sense a characteristic of a physical object in an environment in which the user is located; and a processing unit coupled to the sensor system, the processing unit configured to: cause the screen to display a user-controllable object, and cause the screen to display an image of a feature that results from a virtual interaction between the user-controllable object and the physical object, so that the feature will appear to be a part of the physical object in the environment or appear to be emanating from the physical object.
    Type: Application
    Filed: November 11, 2020
    Publication date: May 20, 2021
    Applicant: MAGIC LEAP, INC.
    Inventors: David Charles LUNDMARK, Gregory Michael BROADMORE
  • Publication number: 20210145525
    Abstract: The invention relates to a viewing system for use in a surgical environment. Various real object detection devices detect locations of real objects in a real environment, such as a patient and a body part of the patient, medical staff, robots, a cutting tool on a robot, an implant transferred by the robot into the body part, surgical tools, and disposable items. A map generator generates a map that forms a digital representation, or digital twin, of the real environment. Various guiding modules, including a room setup module, an anatomy registration module, a surgical planning module, and a surgical execution module, make use of the digital representation to guide virtual or real objects.
    Type: Application
    Filed: November 13, 2020
    Publication date: May 20, 2021
    Applicant: Magic Leap, Inc.
    Inventors: Jennifer Miglionico ESPOSITO, Daniel Andrew GARCIA, Manuela TAMAYO, Emilio Patrick SHIRONOSHITA, David Charles LUNDMARK
  • Publication number: 20210056764
    Abstract: Examples of wearable systems and methods can use multiple inputs (e.g., gesture, head pose, eye gaze, voice, totem, and/or environmental factors (e.g., location)) to determine a command that should be executed and objects in the three-dimensional (3D) environment that should be operated on. The wearable system can detect when different inputs converge together, such as when a user seeks to select a virtual object using multiple inputs such as eye gaze, head pose, hand gesture, and totem input. Upon detecting an input convergence, the wearable system can perform a transmodal filtering scheme that leverages the converged inputs to assist in properly interpreting what command the user is providing or what object the user is targeting.
    Type: Application
    Filed: November 5, 2020
    Publication date: February 25, 2021
    Inventors: Paul Lacey, Samuel A. Miller, Nicholas Atkinson Kramer, David Charles Lundmark
  • Patent number: 10930076
    Abstract: Systems and methods for matching content elements to surfaces in a spatially organized 3D environment. The method includes receiving content, identifying one or more elements in the content, determining one or more surfaces, matching the one or more elements to the one or more surfaces, and displaying the one or more elements as virtual content onto the one or more surfaces.
    Type: Grant
    Filed: May 1, 2018
    Date of Patent: February 23, 2021
    Assignee: Magic Leap, Inc.
    Inventors: Denys Bastov, Victor Ng-Thow-Hing, Benjamin Zaaron Reinhardt, Leonid Zolotarev, Yannick Pellet, Aleksei Marchenko, Brian Everett Meaney, Marc Coleman Shelton, Megan Ann Geiman, John A. Gotcher, Matthew Schon Bogue, Shivakumar Balasubramanyam, Jeffrey Edward Ruediger, David Charles Lundmark
  • Publication number: 20210048972
    Abstract: A computer-implemented method of facilitating communication between first and second users includes displaying, by a first head-worn device, a first virtual object to the first user wearing the first head-worn device. The method also includes displaying, by a second head-worn device, a second virtual object to the second user wearing the second head-worn device. The method further includes facilitating, by the first and second head-worn devices, communications between the first and second users using the first and second virtual objects to simulate the first and second users being present in a common environment.
    Type: Application
    Filed: August 11, 2020
    Publication date: February 18, 2021
    Applicant: MAGIC LEAP, INC.
    Inventors: Rodolpho C. CARDENUTO, David Charles LUNDMARK
  • Patent number: 10861242
    Abstract: Examples of wearable systems and methods can use multiple inputs (e.g., gesture, head pose, eye gaze, voice, totem, and/or environmental factors (e.g., location)) to determine a command that should be executed and objects in the three-dimensional (3D) environment that should be operated on. The wearable system can detect when different inputs converge together, such as when a user seeks to select a virtual object using multiple inputs such as eye gaze, head pose, hand gesture, and totem input. Upon detecting an input convergence, the wearable system can perform a transmodal filtering scheme that leverages the converged inputs to assist in properly interpreting what command the user is providing or what object the user is targeting.
    Type: Grant
    Filed: May 21, 2019
    Date of Patent: December 8, 2020
    Assignee: Magic Leap, Inc.
    Inventors: Paul Lacey, Samuel A. Miller, Nicholas Atkinson Kramer, David Charles Lundmark
  • Publication number: 20190391391
    Abstract: An apparatus for use with an image display device configured to be head-worn by a user includes: a screen; and a processing unit configured to assign a first area of the screen to sense finger-action of the user; wherein the processing unit is configured to generate an electronic signal to cause a change in content displayed by the display device based on the finger-action of the user sensed by the assigned first area of the screen of the apparatus.
    Type: Application
    Filed: June 21, 2019
    Publication date: December 26, 2019
    Applicant: MAGIC LEAP, INC.
    Inventors: Lorena PAZMINO, Andrea Isabel MONTOYA, Savannah NILES, Alexander ROCHA, Mario Antonio BRAGG, Parag GOEL, Jeffrey Scott SOMMERS, David Charles LUNDMARK
  • Publication number: 20190362557
    Abstract: Examples of wearable systems and methods can use multiple inputs (e.g., gesture, head pose, eye gaze, voice, totem, and/or environmental factors (e.g., location)) to determine a command that should be executed and objects in the three-dimensional (3D) environment that should be operated on. The wearable system can detect when different inputs converge together, such as when a user seeks to select a virtual object using multiple inputs such as eye gaze, head pose, hand gesture, and totem input. Upon detecting an input convergence, the wearable system can perform a transmodal filtering scheme that leverages the converged inputs to assist in properly interpreting what command the user is providing or what object the user is targeting.
    Type: Application
    Filed: May 21, 2019
    Publication date: November 28, 2019
    Inventors: Paul Lacey, Samuel A. Miller, Nicholas Atkinson Kramer, David Charles Lundmark
  • Publication number: 20180315248
    Abstract: Systems and methods for matching content elements to surfaces in a spatially organized 3D environment. The method includes receiving content, identifying one or more elements in the content, determining one or more surfaces, matching the one or more elements to the one or more surfaces, and displaying the one or more elements as virtual content onto the one or more surfaces.
    Type: Application
    Filed: May 1, 2018
    Publication date: November 1, 2018
    Applicant: Magic Leap, Inc.
    Inventors: Denys Bastov, Victor Ng-Thow-Hing, Benjamin Zaaron Reinhardt, Leonid Zolotarev, Yannick Pellet, Aleksei Marchenko, Brian Everett Meaney, Marc Coleman Shelton, Megan Ann Geiman, John A. Gotcher, Matthew Schon Bogue, Shivakumar Balasubramanyam, Jeffrey Edward Ruediger, David Charles Lundmark
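The common-anchor-point selection described in publication 20210287382 can be sketched in a few lines. This is an illustrative reading, not the patented implementation: it assumes anchor points are keyed by identifier in a shared coordinate frame and that each user's set is chosen by distance to the user's position; all names and the radius parameter are hypothetical.

```python
import math

def nearby_anchors(anchors, position, radius):
    """Return the IDs of anchor points within `radius` of a user's position."""
    return {
        anchor_id
        for anchor_id, point in anchors.items()
        if math.dist(point, position) <= radius
    }

def common_anchors(anchors, pos_a, pos_b, radius):
    """Anchor points usable by both users: the intersection of their per-user sets."""
    return nearby_anchors(anchors, pos_a, radius) & nearby_anchors(anchors, pos_b, radius)

# Hypothetical anchor map: id -> (x, y) coordinates in a shared frame.
anchors = {"a1": (0, 0), "a2": (5, 0), "a3": (10, 0)}
print(common_anchors(anchors, (1, 0), (6, 0), radius=5.0))  # -> {'a2'}
```

Providing content anchored at a point in the intersection means both users can localize the same virtual object against a landmark that each of their devices can track.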
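The element-to-surface matching in patent 10930076 (and its parent application, publication 20180315248) can be illustrated with a simple greedy assignment. This is a minimal sketch under assumed simplifications, not the granted claims: elements and surfaces are reduced to 2D bounding boxes, larger elements are placed first, and each element goes on the smallest free surface that fits it; the identifiers and the size heuristic are hypothetical.

```python
def match_elements_to_surfaces(elements, surfaces):
    """Greedily assign each content element to the smallest free surface that
    can hold it. Both `elements` and `surfaces` map ids to (width, height)."""
    free = dict(surfaces)  # surfaces still available for placement
    placement = {}
    # Place the largest elements first so they get the scarce big surfaces.
    for elem_id, (ew, eh) in sorted(
        elements.items(), key=lambda kv: -(kv[1][0] * kv[1][1])
    ):
        candidates = [
            (w * h, sid) for sid, (w, h) in free.items() if w >= ew and h >= eh
        ]
        if candidates:
            _, sid = min(candidates)  # smallest surface that still fits
            placement[elem_id] = sid
            del free[sid]
    return placement

elements = {"video": (16, 9), "text_panel": (4, 3)}
surfaces = {"wall": (20, 10), "table": (5, 4)}
print(match_elements_to_surfaces(elements, surfaces))
```

A production system would weigh many more factors (viewing angle, occlusion, user preference), but the core shape of the method in the abstract (identify elements, determine surfaces, match, display) reduces to an assignment problem like this one.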
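The input-convergence idea shared by patent 10861242 and publications 20210056764 and 20190362557 can be caricatured as a vote across modalities. A real transmodal filter would fuse continuous gaze, head-pose, and gesture rays; this sketch assumes each modality has already been resolved to a candidate object identifier, which is a simplification of the disclosure, and all names are illustrative.

```python
from collections import Counter

def converged_target(modal_targets, threshold=2):
    """Given per-modality candidate targets (e.g. eye gaze, head pose, and
    hand gesture each nominating an object id), return the object that at
    least `threshold` modalities agree on, or None if the inputs diverge."""
    counts = Counter(t for t in modal_targets.values() if t is not None)
    if not counts:
        return None
    target, votes = counts.most_common(1)[0]
    return target if votes >= threshold else None

# Hypothetical readings: gaze and head pose converge on the same object.
readings = {"eye_gaze": "cube_7", "head_pose": "cube_7", "gesture": "sphere_2"}
print(converged_target(readings))  # -> cube_7
```

Detecting this kind of agreement is what lets the wearable system disambiguate intent: when two or more input channels point at the same object, the system can commit to it and interpret the remaining input (say, a voice command) against that target.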