Patents by Inventor Andrew Jackson KLEIN

Andrew Jackson KLEIN has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11914759
    Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
    Type: Grant
    Filed: January 19, 2022
    Date of Patent: February 27, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew Jackson Klein, Cory Ryan Bramall, Kyle Mouritsen, Ethan Harris Arnowitz, Jeremy Bruce Kersey, Victor Jia, Justin Thomas Savino, Stephen Michael Lucas, Darren A. Bennett
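A minimal Python sketch of the multi-factor summoning logic described in the abstract above. The indication names, the two-indication threshold, and the grace-period length are illustrative assumptions, not details taken from the patent.

```python
import time

# Illustrative indication names; the abstract lists a palm-facing gesture, an eye
# gaze, a head gaze, and a finger position as example indications.
KNOWN_INDICATIONS = {"palm_facing", "eye_gaze", "head_gaze", "finger_position"}

class ControlObjectSummoner:
    def __init__(self, required: int = 2, grace_period_s: float = 1.5) -> None:
        self.required = required              # simultaneous indications that signal intent
        self.grace_period_s = grace_period_s  # timer period after indications are lost
        self.visible = False
        self._lost_at: float | None = None

    def update(self, active: set[str], now: float | None = None) -> bool:
        """Return True if the control object (menu/keyboard/panel) should be shown."""
        now = time.monotonic() if now is None else now
        active = active & KNOWN_INDICATIONS
        if len(active) >= self.required:
            # Multi-factor intent detected: summon (or keep showing) the control object.
            self.visible, self._lost_at = True, None
        elif self.visible and active:
            # At least one indication remains: keep the control object displayed.
            self._lost_at = None
        elif self.visible:
            # Indications lost: keep displaying during a timer period, then hide.
            if self._lost_at is None:
                self._lost_at = now
            elif now - self._lost_at > self.grace_period_s:
                self.visible, self._lost_at = False, None
        return self.visible

summoner = ControlObjectSummoner()
summoner.update({"palm_facing", "eye_gaze"})   # two indications -> object is summoned
summoner.update({"eye_gaze"})                  # one remains -> stays visible
```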
  • Publication number: 20240004545
    Abstract: Systems and methods for attaching a virtual input device to a virtual object in a mixed reality (MR) environment are provided. The system includes a memory, a processor communicatively coupled to the memory, and a display device. The display device is configured to display an MR environment provided by at least one application implemented by the processor. The mixed reality environment includes a virtual object corresponding to an application, and a virtual input device. The at least one application docks the virtual input device to the virtual object with an offset relative to the virtual object.
    Type: Application
    Filed: September 15, 2023
    Publication date: January 4, 2024
    Inventors: Andrew Jackson KLEIN, Hendry EFFENDY, Ethan Harris ARNOWITZ, Jonathon Burnham COBB, Melissa HELLMUND VEGA, Stuart John MAYHEW, Jeremy Bruce KERSEY
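A small Python sketch of the docking behavior summarized in the abstract above: the input device's pose is re-derived each frame from the object it is docked to plus a fixed offset. The vector type, class names, and offset values are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Vec3:
    x: float
    y: float
    z: float

    def __add__(self, other: "Vec3") -> "Vec3":
        return Vec3(self.x + other.x, self.y + other.y, self.z + other.z)

class DockedInputDevice:
    """A virtual input device (e.g., a keyboard) docked to a virtual object
    (e.g., an application window) with an offset relative to that object."""

    def __init__(self, offset: Vec3) -> None:
        self.offset = offset
        self.position = Vec3(0.0, 0.0, 0.0)

    def follow(self, object_position: Vec3) -> Vec3:
        # Re-derive the device pose from the docked-to object every frame, so
        # moving the window drags the keyboard along with it at a fixed offset.
        self.position = object_position + self.offset
        return self.position

# Usage: dock a keyboard 0.3 m below and 0.1 m in front of a window.
keyboard = DockedInputDevice(offset=Vec3(0.0, -0.3, 0.1))
print(keyboard.follow(Vec3(1.0, 1.5, 2.0)))   # Vec3(x=1.0, y=1.2, z=2.1)
```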
  • Patent number: 11797175
    Abstract: Systems and methods for attaching a virtual input device to a virtual object in a mixed reality (MR) environment are provided. The system includes a memory, a processor communicatively coupled to the memory, and a display device. The display device is configured to display an MR environment provided by at least one application implemented by the processor. The mixed reality environment includes a virtual object corresponding to an application, and a virtual input device. The at least one application docks the virtual input device to the virtual object with an offset relative to the virtual object.
    Type: Grant
    Filed: February 2, 2022
    Date of Patent: October 24, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew Jackson Klein, Hendry Effendy, Ethan Harris Arnowitz, Jonathon Burnham Cobb, Melissa Hellmund Vega, Stuart John Mayhew, Jeremy Bruce Kersey
  • Publication number: 20230137920
    Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
    Type: Application
    Filed: January 19, 2022
    Publication date: May 4, 2023
    Inventors: Andrew Jackson KLEIN, Cory Ryan BRAMALL, Kyle MOURITSEN, Ethan Harris ARNOWITZ, Jeremy Bruce KERSEY, Victor JIA, Justin Thomas SAVINO, Stephen Michael LUCAS, Darren A. BENNETT
  • Publication number: 20230135974
    Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
    Type: Application
    Filed: December 27, 2021
    Publication date: May 4, 2023
    Inventors: Andrew Jackson KLEIN, Cory Ryan BRAMALL, Kyle MOURITSEN, Ethan Harris ARNOWITZ, Jeremy Bruce KERSEY, Victor JIA, Justin Thomas SAVINO, Stephen Michael LUCAS, Darren BENNETT
  • Publication number: 20230138952
    Abstract: Systems and methods for attaching a virtual input device to a virtual object in a mixed reality (MR) environment are provided. The system includes a memory, a processor communicatively coupled to the memory, and a display device. The display device is configured to display an MR environment provided by at least one application implemented by the processor. The mixed reality environment includes a virtual object corresponding to an application, and a virtual input device. The at least one application docks the virtual input device to the virtual object with an offset relative to the virtual object.
    Type: Application
    Filed: February 2, 2022
    Publication date: May 4, 2023
    Inventors: Andrew Jackson KLEIN, Hendry EFFENDY, Ethan Harris ARNOWITZ, Jonathon Burnham COBB, Melissa HELLMUND VEGA, Stuart John MAYHEW, Jeremy Bruce KERSEY
  • Patent number: 11467709
    Abstract: The disclosed technology is generally directed to mixed-reality devices. In one example of the technology, a first mixed-reality guide is provided to mixed-reality devices, enabling the mixed-reality devices to operate the first mixed-reality guide while providing a mixed-reality view, such that: while the first mixed-reality guide is navigated to a step of the set of steps of the first mixed-reality guide, the mixed-reality view includes a hologram at a real-world location in the real-world environment at which work associated with the step is to be performed. From each mixed-reality device, mixed-reality data is received based on use of at least the first mixed-reality guide on the mixed-reality device. The mixed-reality data includes spatial telemetry data collected for at least one step of the first mixed-reality guide. A presentation that is based on the mixed-reality data is provided. The first mixed-reality guide is enabled to be altered based on the mixed-reality data.
    Type: Grant
    Filed: February 7, 2020
    Date of Patent: October 11, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alexandre Pierre Michel Godin, Andrew Jackson Klein, Arni Mar Thrastarson, Charla Marie Pereira, Cydney Brooke Nielsen, Darren Alexander Bennett, Jason Drew Vantomme, Joel Jamon Rendon, Kjartan Olafsson, Mahesh Keshav Kamat, Maya Alethea Miller-Vedam, Ryan Martin Nadel, Robert István Butterworth
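A short Python sketch of the telemetry-aggregation idea in the abstract above: per-step spatial telemetry gathered from many devices is rolled up into a summary an author could review before altering the guide. The record fields and metrics are assumptions for illustration.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical telemetry records: (guide_id, step_index, seconds_spent,
# distance_from_hologram_m). The abstract describes spatial telemetry collected
# per step of a mixed-reality guide and presented back to the guide's author.
telemetry = [
    ("assembly-guide", 1, 42.0, 0.4),
    ("assembly-guide", 1, 55.0, 1.1),
    ("assembly-guide", 2, 18.0, 0.2),
]

def summarize(records):
    """Aggregate per-step telemetry so an author can see which steps need rework."""
    by_step = defaultdict(list)
    for guide_id, step, seconds, distance in records:
        by_step[(guide_id, step)].append((seconds, distance))
    return {
        key: {
            "avg_seconds": mean(r[0] for r in rows),
            "avg_distance_m": mean(r[1] for r in rows),
            "samples": len(rows),
        }
        for key, rows in by_step.items()
    }

print(summarize(telemetry))
```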
  • Publication number: 20220157026
    Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
    Type: Application
    Filed: July 23, 2021
    Publication date: May 19, 2022
    Inventors: Darren Alexander BENNETT, David J.W. SEYMOUR, Charla M. PEREIRA, Enrico William GULD, Kin Hang CHU, Julia Faye TAYLOR-HELL, Jonathon Burnham COBB, Helen Joan Hem LAM, You-Da YANG, Dean Alan WADSWORTH, Andrew Jackson KLEIN
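A minimal Python sketch of the scene/beat structure described in the abstract above. The class names and the single shared cursor are illustrative assumptions; the patent's coordinated multi-device rendering is reduced here to advancing one scene/beat index.

```python
from dataclasses import dataclass, field

@dataclass
class Beat:
    # A beat choreographs behaviors of 3D assets (asset names and behaviors are illustrative).
    asset_behaviors: dict[str, str] = field(default_factory=dict)

@dataclass
class Scene:
    name: str
    beats: list[Beat] = field(default_factory=list)

class Presentation:
    """Navigates scenes and beats; every device would render the current beat."""

    def __init__(self, scenes: list[Scene]) -> None:
        self.scenes = scenes
        self.scene_idx = 0
        self.beat_idx = 0

    def current(self) -> tuple[Scene, Beat]:
        scene = self.scenes[self.scene_idx]
        return scene, scene.beats[self.beat_idx]

    def next_beat(self) -> tuple[Scene, Beat]:
        # Advance within the scene, or roll over into the next scene's first beat.
        scene = self.scenes[self.scene_idx]
        if self.beat_idx + 1 < len(scene.beats):
            self.beat_idx += 1
        elif self.scene_idx + 1 < len(self.scenes):
            self.scene_idx, self.beat_idx = self.scene_idx + 1, 0
        return self.current()

# Usage: two scenes, advancing from the last beat of one into the next scene.
deck = Presentation([
    Scene("intro", [Beat({"rocket": "fade_in"}), Beat({"rocket": "spin"})]),
    Scene("launch", [Beat({"rocket": "lift_off"})]),
])
deck.next_beat()
deck.next_beat()   # crosses into the "launch" scene
```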
  • Patent number: 11137874
    Abstract: The disclosed technology is generally directed to mixed reality devices. In one example of the technology, a mixed-reality view is caused to be provided to an operator. The mixed-reality view includes both a real-world environment of the operator and holographic aspects. While the operator is navigated to a step of the task, the mixed-reality view is caused to include a step card, such that the step card includes at least one instruction associated with the step. The operator is enabled to adjust a state associated with the step card. While the state associated with the step card is a first state: a gaze determination associated with a gaze of the operator is made; and responsive to a positive gaze determination, the step card is caused to move to a location that is associated with a real-world location of the gaze of the operator.
    Type: Grant
    Filed: May 28, 2019
    Date of Patent: October 5, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew Jackson Klein, Ethan Harris Arnowitz, Kevin Thomas Mather, Kyle Mouritsen
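A compact Python sketch of the step-card behavior in the abstract above: while the card is in its first ("follow") state, a positive gaze determination relocates it to the gazed-at real-world location. The state names are assumed, and offsets/smoothing are omitted.

```python
from dataclasses import dataclass
from enum import Enum, auto

class CardState(Enum):
    FOLLOW = auto()   # first state: the card follows the operator's gaze
    PINNED = auto()   # operator has pinned the card in place

@dataclass
class StepCard:
    instruction: str
    position: tuple[float, float, float] = (0.0, 0.0, 0.0)
    state: CardState = CardState.FOLLOW

    def on_gaze(self, gaze_hit: tuple[float, float, float] | None) -> None:
        """Move the card toward the real-world gaze location while in FOLLOW state."""
        if self.state is CardState.FOLLOW and gaze_hit is not None:
            # Positive gaze determination: relocate the card near where the
            # operator is looking.
            self.position = gaze_hit

card = StepCard("Step 3: torque the bolt to 12 Nm")
card.on_gaze((1.2, 1.6, 0.8))     # card follows the gaze
card.state = CardState.PINNED
card.on_gaze((4.0, 1.0, 2.0))     # pinned: the card stays put
```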
  • Patent number: 11126319
    Abstract: The disclosed technology is generally directed to mixed reality, augmented reality, and/or virtual reality devices. In one example of the technology, a first hologram is caused to be displayed to an operator with a first gaze selection area that is associated with a first selectable option. A gaze location that is associated with a gaze of the operator is evaluated. Responsive to the gaze location coinciding with the first gaze selection area of the first hologram, a first gaze dwell timer is begun. A total duration of the first gaze dwell timer is adjusted based on the gaze location being in a specific portion of the first gaze selection area. Responsive to the gaze dwell timer finishing, the first selectable option is caused to be selected.
    Type: Grant
    Filed: August 2, 2019
    Date of Patent: September 21, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew Jackson Klein, Ethan Harris Arnowitz, Cory Ryan Bramall, Victor Jia
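A brief Python sketch of the gaze-dwell selection in the abstract above: a dwell timer starts when the gaze enters the selection area, and the total required duration is adjusted when the gaze sits in a specific (here, inner) portion of that area. The durations and the two-zone model are illustrative assumptions.

```python
class GazeDwellButton:
    """Gaze-dwell selection with a shortened dwell when the gaze is in a 'hot'
    inner portion of the selection area (durations are illustrative)."""

    def __init__(self, base_dwell_s: float = 1.0, fast_dwell_s: float = 0.6) -> None:
        self.base_dwell_s = base_dwell_s
        self.fast_dwell_s = fast_dwell_s
        self.elapsed = 0.0
        self.selected = False

    def update(self, dt: float, in_area: bool, in_inner_portion: bool) -> bool:
        """Advance the dwell timer by dt seconds; return True once the option selects."""
        if self.selected:
            return True
        if not in_area:
            self.elapsed = 0.0          # gaze left the selection area: reset the timer
            return False
        self.elapsed += dt
        # Adjust the total required duration based on which portion of the
        # selection area the gaze location is in.
        required = self.fast_dwell_s if in_inner_portion else self.base_dwell_s
        if self.elapsed >= required:
            self.selected = True
        return self.selected

button = GazeDwellButton()
for _ in range(10):   # ten frames at ~0.1 s with the gaze in the inner portion
    done = button.update(0.1, in_area=True, in_inner_portion=True)
print(done)           # True: the option was selected after the shortened dwell
```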
  • Patent number: 11087548
    Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
    Type: Grant
    Filed: September 26, 2019
    Date of Patent: August 10, 2021
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Darren Alexander Bennett, David J. W. Seymour, Charla M. Pereira, Enrico William Guld, Kin Hang Chu, Julia Faye Taylor-Hell, Jonathon Burnham Cobb, Helen Joan Hem Lam, You-Da Yang, Dean Alan Wadsworth, Andrew Jackson Klein
  • Patent number: 10936146
    Abstract: The disclosed technology is generally directed to mixed reality devices. In one example of the technology, a mixed-reality view is caused to be provided to an operator, wherein the mixed-reality view includes both a real-world environment of the operator and holographic aspects. The operator is enabled to navigate among a plurality of steps of a task, such that for at least one step of the plurality of steps of the task, while the operator is navigated to the step of the task: the mixed-reality view is caused to include at least one instruction associated with the step. The mixed-reality view is caused to include a hologram at a real-world location in the real-world environment at which work associated with the step is to be performed. The mixed-reality view is caused to continually include a visual tether from the instruction to the real-world location.
    Type: Grant
    Filed: May 30, 2019
    Date of Patent: March 2, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Darren Alexander Bennett, Charla Marie Pereira, Andrew Jackson Klein, Robert István Butterworth, Sean Robert Puller, Tsz Fung Wan, Kevin Thomas Mather, Dean Alan Wadsworth
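A tiny Python sketch of the visual-tether idea in the abstract above: each frame, a line is drawn from the step's instruction to the real-world location where the work is performed. Sampling points along the tether is an illustrative rendering detail, not taken from the patent.

```python
def tether_points(card_pos, work_pos, samples: int = 8):
    """Evenly spaced points from the instruction card to the work location,
    recomputed every frame so the tether continually connects the two."""
    return [
        tuple(c + (w - c) * i / (samples - 1) for c, w in zip(card_pos, work_pos))
        for i in range(samples)
    ]

card_position = (0.3, 1.5, 0.5)   # the step's instruction floats near the operator
work_location = (2.0, 1.1, 3.5)   # hologram anchored where the work is performed
for point in tether_points(card_position, work_location):
    print(point)                  # a renderer would draw a polyline through these
```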
  • Patent number: 10824294
    Abstract: Various methods and systems for implementing three-dimensional resource integration are provided. 3D resource integration includes integration of 3D resources into different types of functionality, such as operating system, file explorer, application, and augmented reality functionality. In operation, an indication to perform an operation with a 3D object is received. One or more 3D resource controls, associated with the operation, are accessed. The 3D resource control is a defined set of instructions on how to integrate 3D resources with 3D objects for generating 3D-based graphical interfaces associated with application features and operating system features. An input based on one or more control elements of the one or more 3D resource controls is received. The input includes the one or more control elements that operate to generate a 3D-based graphical interface for the operation. Based on receiving the input, the operation is executed with the 3D object and the 3D-based graphical interface.
    Type: Grant
    Filed: May 16, 2017
    Date of Patent: November 3, 2020
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Enrico William Guld, Jason A. Carter, Heather Joanne Alekson, Andrew Jackson Klein, David J. W. Seymour, Kathleen P. Mulcahy, Charla M. Pereira, Evan Lewis Jones, William Axel Olsen, Adam Roy Mitchell, Daniel Lee Osborn, Zachary D. Wiesnoski, Struan Andrew Robertson, Michael Edward Harnisch, William Robert Schnurr, Helen Joan Hem Lam, Darren Alexander Bennett, Kin Hang Chu
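A brief Python sketch of the "3D resource control" idea from the abstract above, modeled as a registry that maps an operation to instructions for building a 3D-based graphical interface. The registry, decorator, and the "rotate" control are hypothetical illustrations.

```python
from typing import Callable

# Hypothetical registry: each 3D resource control maps an operation name to a
# callable that builds the 3D-based graphical interface for that operation.
RESOURCE_CONTROLS: dict[str, Callable[[str], dict]] = {}

def resource_control(operation: str):
    """Register a control describing how to integrate 3D resources for an operation."""
    def register(builder: Callable[[str], dict]):
        RESOURCE_CONTROLS[operation] = builder
        return builder
    return register

@resource_control("rotate")
def rotate_interface(object_name: str) -> dict:
    # Control elements (handles, rings) for rotating the selected 3D object.
    return {"object": object_name, "elements": ["x-ring", "y-ring", "z-ring"]}

def perform_operation(operation: str, object_name: str) -> dict:
    """Look up the registered control and execute the operation with its interface."""
    builder = RESOURCE_CONTROLS[operation]
    interface = builder(object_name)
    return {"operation": operation, "interface": interface}

print(perform_operation("rotate", "engine_model"))
```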
  • Publication number: 20200273255
    Abstract: The disclosed technology is generally directed to mixed-reality devices. In one example of the technology, a first mixed-reality guide is provided to mixed-reality devices, enabling the mixed-reality devices to operate the first mixed-reality guide while providing a mixed-reality view, such that: while the first mixed-reality guide is navigated to a step of the set of steps of the first mixed-reality guide, the mixed-reality view includes a hologram at a real-world location in the real-world environment at which work associated with the step is to be performed. From each mixed-reality device, mixed-reality data is received based on use of at least the first mixed-reality guide on the mixed-reality device. The mixed-reality data includes spatial telemetry data collected for at least one step of the first mixed-reality guide. A presentation that is based on the mixed-reality data is provided. The first mixed-reality guide is enabled to be altered based on the mixed-reality data.
    Type: Application
    Filed: February 7, 2020
    Publication date: August 27, 2020
    Inventors: Alexandre Pierre Michel GODIN, Andrew Jackson KLEIN, Arni Mar THRASTARSON, Charla Marie PEREIRA, Cydney Brooke NIELSEN, Darren Alexander BENNETT, Jason Drew VANTOMME, Joel Jamon RENDON, Kjartan OLAFSSON, Mahesh Keshav KAMAT, Maya Alethea MILLER-VEDAM, Ryan Martin NADEL, Robert István BUTTERWORTH
  • Publication number: 20200273254
    Abstract: The disclosed technology is generally directed to mixed reality devices. In one example of the technology, a mixed-reality view is caused to be provided to an operator, wherein the mixed-reality view includes both a real-world environment of the operator and holographic aspects. The operator is enabled to navigate among a plurality of steps of a task, such that for at least one step of the plurality of steps of the task, while the operator is navigated to the step of the task: the mixed-reality view is caused to include at least one instruction associated with the step. The mixed-reality view is caused to include a hologram at a real-world location in the real-world environment at which work associated with the step is to be performed. The mixed-reality view is caused to continually include a visual tether from the instruction to the real-world location.
    Type: Application
    Filed: May 30, 2019
    Publication date: August 27, 2020
    Inventors: Darren Alexander BENNETT, Charla Marie PEREIRA, Andrew Jackson KLEIN, Robert István BUTTERWORTH, Sean Robert PULLER, Tsz Fung WAN, Kevin Thomas MATHER, Dean Alan WADSWORTH
  • Publication number: 20200273252
    Abstract: The disclosed technology is generally directed to mixed reality devices. In one example of the technology, a mixed-reality view is caused to be provided to an operator. The mixed-reality view includes both a real-world environment of the operator and holographic aspects. While the operator is navigated to a step of the task, the mixed-reality view is caused to include a step card, such that the step card includes at least one instruction associated with the step. The operator is enabled to adjust a state associated with the step card. While the state associated with the step card is a first state: a gaze determination associated with a gaze of the operator is made; and responsive to a positive gaze determination, the step card is caused to move to a location that is associated with a real-world location of the gaze of the operator.
    Type: Application
    Filed: May 28, 2019
    Publication date: August 27, 2020
    Inventors: Andrew Jackson KLEIN, Ethan Harris ARNOWITZ, Kevin Thomas MATHER, Kyle MOURITSEN
  • Publication number: 20200272231
    Abstract: The disclosed technology is generally directed to mixed reality, augmented reality, and/or virtual reality devices. In one example of the technology, a first hologram is caused to be displayed to an operator with a first gaze selection area that is associated with a first selectable option. A gaze location that is associated with a gaze of the operator is evaluated. Responsive to the gaze location coinciding with the first gaze selection area of the first hologram, a first gaze dwell timer is begun. A total duration of the first gaze dwell timer is adjusted based on the gaze location being in a specific portion of the first gaze selection area. Responsive to the gaze dwell timer finishing, the first selectable option is caused to be selected.
    Type: Application
    Filed: August 2, 2019
    Publication date: August 27, 2020
    Inventors: Andrew Jackson KLEIN, Ethan Harris ARNOWITZ, Cory Ryan BRAMALL, Victor JIA
  • Publication number: 20200160604
    Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
    Type: Application
    Filed: September 26, 2019
    Publication date: May 21, 2020
    Inventors: Darren Alexander BENNETT, David J.W. SEYMOUR, Charla M. PEREIRA, Enrico William GULD, Kin Hang CHU, Julia Faye TAYLOR-HELL, Jonathon Burnham COBB, Helen Joan Hem LAM, You-Da YANG, Dean Alan WADSWORTH, Andrew Jackson KLEIN
  • Patent number: 10438414
    Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
    Type: Grant
    Filed: January 26, 2018
    Date of Patent: October 8, 2019
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Darren Alexander Bennett, David J. W. Seymour, Charla M. Pereira, Enrico William Guld, Kin Hang Chu, Julia Faye Taylor-Hell, Jonathon Burnham Cobb, Helen Joan Hem Lam, You-Da Yang, Dean Alan Wadsworth, Andrew Jackson Klein
  • Publication number: 20190236842
    Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
    Type: Application
    Filed: January 26, 2018
    Publication date: August 1, 2019
    Inventors: Darren Alexander BENNETT, David J.W. SEYMOUR, Charla M. PEREIRA, Enrico William GULD, Kin Hang CHU, Julia Faye TAYLOR-HELL, Jonathon Burnham COBB, Helen Joan Hem LAM, You-Da YANG, Dean Alan WADSWORTH, Andrew Jackson KLEIN