Patents by Inventor Kyle Mouritsen
Kyle Mouritsen has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Publication number: 20240402795
  Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
  Type: Application
  Filed: July 12, 2024
  Publication date: December 5, 2024
  Inventors: Andrew Jackson KLEIN, Cory Ryan BRAMALL, Kyle MOURITSEN, Ethan Harris ARNOWITZ, Jeremy Bruce KERSEY, Victor JIA, Justin Thomas SAVINO, Stephen Michael LUCAS, Darren A. BENNETT
  (A hypothetical code sketch of this summoning behavior appears after this listing.)
- Patent number: 12067159
  Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
  Type: Grant
  Filed: December 27, 2021
  Date of Patent: August 20, 2024
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Andrew Jackson Klein, Cory Ryan Bramall, Kyle Mouritsen, Ethan Harris Arnowitz, Jeremy Bruce Kersey, Victor Jia, Justin Thomas Savino, Stephen Michael Lucas, Darren A. Bennett
- Publication number: 20240168542
  Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
  Type: Application
  Filed: January 22, 2024
  Publication date: May 23, 2024
  Inventors: Andrew Jackson KLEIN, Cory Ryan BRAMALL, Kyle MOURITSEN, Ethan Harris ARNOWITZ, Jeremy Bruce KERSEY, Victor JIA, Justin Thomas SAVINO, Stephen Michael LUCAS, Darren A. BENNETT
- Patent number: 11914759
  Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
  Type: Grant
  Filed: January 19, 2022
  Date of Patent: February 27, 2024
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Andrew Jackson Klein, Cory Ryan Bramall, Kyle Mouritsen, Ethan Harris Arnowitz, Jeremy Bruce Kersey, Victor Jia, Justin Thomas Savino, Stephen Michael Lucas, Darren A. Bennett
- Publication number: 20230137920
  Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
  Type: Application
  Filed: January 19, 2022
  Publication date: May 4, 2023
  Inventors: Andrew Jackson KLEIN, Cory Ryan BRAMALL, Kyle MOURITSEN, Ethan Harris ARNOWITZ, Jeremy Bruce KERSEY, Victor JIA, Justin Thomas SAVINO, Stephen Michael LUCAS, Darren A. BENNETT
- Publication number: 20230135974
  Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
  Type: Application
  Filed: December 27, 2021
  Publication date: May 4, 2023
  Inventors: Andrew Jackson KLEIN, Cory Ryan BRAMALL, Kyle MOURITSEN, Ethan Harris ARNOWITZ, Jeremy Bruce KERSEY, Victor JIA, Justin Thomas SAVINO, Stephen Michael LUCAS, Darren BENNETT
- Patent number: 11137874
  Abstract: The disclosed technology is generally directed to mixed reality devices. In one example of the technology, a mixed-reality view is caused to be provided to an operator. The mixed-reality view includes both a real-world environment of the operator and holographic aspects. While the operator is navigated to a step of the task, the mixed-reality view is caused to include a step card, such that the step card includes at least one instruction associated with the step. The operator is enabled to adjust a state associated with the step card. While the state associated with the step card is a first state: a gaze determination associated with a gaze of the operator is made; and responsive to a positive gaze determination, the step card is caused to move to a location that is associated with a real-world location of the gaze of the operator.
  Type: Grant
  Filed: May 28, 2019
  Date of Patent: October 5, 2021
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Andrew Jackson Klein, Ethan Harris Arnowitz, Kevin Thomas Mather, Kyle Mouritsen
  (A hypothetical code sketch of this step-card behavior appears after this listing.)
- Patent number: 11010016
  Abstract: The disclosed technology is generally directed to mixed reality devices. In one example of the technology, a mixed-reality view is provided to an operator. The mixed-reality view includes both a real-world environment of the operator and holographic aspects. The holographic aspects include at least a first hologram. The first hologram is displayed with a first hologram menu, such that the first hologram menu includes visible selectable menu options associated with the first hologram. In the mixed-reality view, the first hologram menu, including the visible selectable menu options of the first hologram menu, is rotated to face the operator as the operator moves, such that the first hologram menu rotates in a curved manner about the first hologram as the first hologram menu rotates. Responsive to the operator making a selection on the first hologram menu, the first hologram in the mixed-reality view is altered based on the selection.
  Type: Grant
  Filed: May 7, 2020
  Date of Patent: May 18, 2021
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Victor Jia, Kevin Thomas Mather, Robert István Butterworth, Kyle Mouritsen
  (A hypothetical code sketch of this hologram-menu behavior appears after this listing.)
- Publication number: 20200273252
  Abstract: The disclosed technology is generally directed to mixed reality devices. In one example of the technology, a mixed-reality view is caused to be provided to an operator. The mixed-reality view includes both a real-world environment of the operator and holographic aspects. While the operator is navigated to a step of the task, the mixed-reality view is caused to include a step card, such that the step card includes at least one instruction associated with the step. The operator is enabled to adjust a state associated with the step card. While the state associated with the step card is a first state: a gaze determination associated with a gaze of the operator is made; and responsive to a positive gaze determination, the step card is caused to move to a location that is associated with a real-world location of the gaze of the operator.
  Type: Application
  Filed: May 28, 2019
  Publication date: August 27, 2020
  Inventors: Andrew Jackson KLEIN, Ethan Harris ARNOWITZ, Kevin Thomas MATHER, Kyle MOURITSEN
- Publication number: 20200272303
  Abstract: The disclosed technology is generally directed to mixed reality devices. In one example of the technology, a mixed-reality view is provided to an operator. The mixed-reality view includes both a real-world environment of the operator and holographic aspects. The holographic aspects include at least a first hologram. The first hologram is displayed with a first hologram menu, such that the first hologram menu includes visible selectable menu options associated with the first hologram. In the mixed-reality view, the first hologram menu, including the visible selectable menu options of the first hologram menu, is rotated to face the operator as the operator moves, such that the first hologram menu rotates in a curved manner about the first hologram as the first hologram menu rotates. Responsive to the operator making a selection on the first hologram menu, the first hologram in the mixed-reality view is altered based on the selection.
  Type: Application
  Filed: May 7, 2020
  Publication date: August 27, 2020
  Inventors: Victor JIA, Kevin Thomas MATHER, Robert István BUTTERWORTH, Kyle MOURITSEN
- Patent number: 10671241
  Abstract: The disclosed technology is generally directed to mixed reality devices. In one example of the technology, a mixed-reality view is provided to an operator. The mixed-reality view includes both a real-world environment of the operator and holographic aspects. The holographic aspects include at least a first hologram. The first hologram is displayed with a first hologram menu, such that the first hologram menu includes visible selectable menu options associated with the first hologram. In the mixed-reality view, the first hologram menu, including the visible selectable menu options of the first hologram menu, is rotated to face the operator as the operator moves, such that the first hologram menu rotates in a curved manner about the first hologram as the first hologram menu rotates. Responsive to the operator making a selection on the first hologram menu, the first hologram in the mixed-reality view is altered based on the selection.
  Type: Grant
  Filed: May 30, 2019
  Date of Patent: June 2, 2020
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Victor Jia, Kevin Thomas Mather, Robert István Butterworth, Kyle Mouritsen
- Patent number: D775652
  Type: Grant
  Filed: November 18, 2015
  Date of Patent: January 3, 2017
  Assignee: Microsoft Corporation
  Inventors: Peter Duyen Hung Hoang, Joey Hoi-Man Lee, Darren Woo, Jun Ho Moon, Kyle Mouritsen, Shannon Yen Yun Lee, Helen Lam, Preet Mangat
- Patent number: D789959
  Type: Grant
  Filed: August 24, 2015
  Date of Patent: June 20, 2017
  Assignee: Microsoft Corporation
  Inventors: Peter Duyen Hung Hoang, Joey Hoi-Man Lee, Darren Woo, Jun Ho Moon, Kyle Mouritsen, Shannon Lee, Helen Lam, Preet Mangat
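The abstracts above describe the claimed techniques only at the level of behavior. As a rough illustration of the multi-factor summoning behavior described in publication 20240402795 and its related filings, the following Python sketch shows one way such logic could be expressed. It is not code from the patents; the signal names, the two-signal threshold, and the grace-period value are assumptions made for the example.

```python
# Hypothetical sketch only -- not code from the patents. It mimics the behavior the
# abstracts describe: summon a control object when multiple intention signals agree,
# keep it visible while any signal remains, and tolerate a brief dropout via a timer.
import time

REQUIRED_SIGNALS = 2          # assumed threshold: two or more simultaneous indications
GRACE_PERIOD_SECONDS = 1.5    # assumed dropout timer; the filings do not give a value


class ControlObjectSummoner:
    """Tracks intention indications and decides whether a menu/keyboard/panel is shown."""

    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.visible = False
        self.lost_at = None  # time when the last indication disappeared

    def update(self, palm_facing, eye_gaze_on_hand, head_gaze_on_hand, finger_raised):
        """Call once per frame with boolean intention indications; returns visibility."""
        active = sum([palm_facing, eye_gaze_on_hand, head_gaze_on_hand, finger_raised])

        if not self.visible:
            # Summon only when enough simultaneous indications signal a request.
            if active >= REQUIRED_SIGNALS:
                self.visible = True
                self.lost_at = None
        else:
            if active >= 1:
                # Keep displaying while at least one indication remains.
                self.lost_at = None
            else:
                # All indications lost: keep displaying during a grace timer.
                if self.lost_at is None:
                    self.lost_at = self.clock()
                elif self.clock() - self.lost_at > GRACE_PERIOD_SECONDS:
                    self.visible = False
                    self.lost_at = None
        return self.visible


if __name__ == "__main__":
    summoner = ControlObjectSummoner()
    # Palm-facing gesture plus eye gaze on the hand -> the control object appears.
    print(summoner.update(palm_facing=True, eye_gaze_on_hand=True,
                          head_gaze_on_hand=False, finger_raised=False))  # True
    # A single remaining indication keeps it visible.
    print(summoner.update(True, False, False, False))  # True
```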
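Similarly, here is a hypothetical sketch of the gaze-following step card described in patent 11137874 and publication 20200273252: while the card is in an assumed "follow" state, a positive gaze determination moves it to the gazed-at real-world location. The state names, data types, and example instruction are invented for illustration.

```python
# Hypothetical sketch only -- not the patented implementation. A step card shows the
# instruction for the current task step, and while it is in a "follow" state it
# relocates to the real-world location the operator is gazing at.
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]


@dataclass
class StepCard:
    instruction: str
    state: str = "follow"             # assumed states: "follow" or "pinned"
    position: Vec3 = (0.0, 0.0, 1.0)  # assumed default placement in front of the operator


def gaze_determination(gaze_target: Optional[Vec3]) -> bool:
    """Positive when the operator's gaze resolves to a real-world location."""
    return gaze_target is not None


def update_step_card(card: StepCard, gaze_target: Optional[Vec3]) -> StepCard:
    """Move the card to the gazed-at location while it is in the follow state."""
    if card.state == "follow" and gaze_determination(gaze_target):
        card.position = gaze_target
    return card


if __name__ == "__main__":
    card = StepCard(instruction="Step 3: torque the four mounting bolts")  # example text
    update_step_card(card, gaze_target=(1.2, 0.4, 2.0))
    print(card.position)   # (1.2, 0.4, 2.0) -- the card follows the gaze
    card.state = "pinned"  # the operator adjusts the state; the card now stays put
    update_step_card(card, gaze_target=(0.0, 0.0, 3.0))
    print(card.position)   # unchanged
```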
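Finally, a hypothetical sketch of the curved hologram-menu rotation described in patents 11010016 and 10671241 and publication 20200272303: the menu is kept on a fixed-radius circle around the hologram, on the side facing the operator, so it sweeps around the hologram in a curve as the operator moves. The radius, coordinate convention, and function names are assumptions.

```python
# Hypothetical sketch only -- not the patented implementation. The hologram's menu
# orbits the hologram at a fixed radius so it always sits between the hologram and
# the operator, facing the operator, rotating in a curved path as the operator walks.
import math
from typing import Tuple

Vec2 = Tuple[float, float]  # ground-plane coordinates (x, z); height omitted for brevity

MENU_RADIUS = 0.5  # assumed offset of the menu from the hologram's centre, in metres


def menu_pose(hologram_pos: Vec2, operator_pos: Vec2) -> Tuple[Vec2, float]:
    """Return the menu position on a circle around the hologram and its facing angle."""
    dx = operator_pos[0] - hologram_pos[0]
    dz = operator_pos[1] - hologram_pos[1]
    angle = math.atan2(dz, dx)  # direction from the hologram towards the operator
    position = (hologram_pos[0] + MENU_RADIUS * math.cos(angle),
                hologram_pos[1] + MENU_RADIUS * math.sin(angle))
    return position, angle  # the menu faces along `angle`, i.e. towards the operator


if __name__ == "__main__":
    hologram = (0.0, 0.0)
    # As the operator walks around the hologram, the menu sweeps around it in a curve.
    for operator in [(2.0, 0.0), (0.0, 2.0), (-2.0, 0.0)]:
        pos, facing = menu_pose(hologram, operator)
        print(f"operator={operator} -> menu at ({pos[0]:.2f}, {pos[1]:.2f}), "
              f"facing {math.degrees(facing):.0f} degrees")
```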