Patents by Inventor Jeremy Bruce Kersey

Jeremy Bruce Kersey has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). Short illustrative code sketches of the three underlying techniques appear after the listing.

  • Patent number: 11914759
    Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
    Type: Grant
    Filed: January 19, 2022
    Date of Patent: February 27, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew Jackson Klein, Cory Ryan Bramall, Kyle Mouritsen, Ethan Harris Arnowitz, Jeremy Bruce Kersey, Victor Jia, Justin Thomas Savino, Stephen Michael Lucas, Darren A. Bennett
  • Publication number: 20240004545
    Abstract: Systems and methods for attaching a virtual input device to a virtual object in a mixed reality (MR) environment are provided. The system includes a memory, a processor communicatively coupled to the memory, and a display device. The display device is configured to display an MR environment provided by at least one application implemented by the processor. The MR environment includes a virtual object corresponding to an application and a virtual input device. The at least one application docks the virtual input device to the virtual object with an offset relative to the virtual object.
    Type: Application
    Filed: September 15, 2023
    Publication date: January 4, 2024
    Inventors: Andrew Jackson Klein, Hendry Effendy, Ethan Harris Arnowitz, Jonathon Burnham Cobb, Melissa Hellmund Vega, Stuart John Mayhew, Jeremy Bruce Kersey
  • Patent number: 11797175
    Abstract: Systems and methods for attaching a virtual input device to a virtual object in a mixed reality (MR) environment are provided. The system includes a memory, a processor communicatively coupled to the memory, and a display device. The display device is configured to display an MR environment provided by at least one application implemented by the processor. The MR environment includes a virtual object corresponding to an application and a virtual input device. The at least one application docks the virtual input device to the virtual object with an offset relative to the virtual object.
    Type: Grant
    Filed: February 2, 2022
    Date of Patent: October 24, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew Jackson Klein, Hendry Effendy, Ethan Harris Arnowitz, Jonathon Burnham Cobb, Melissa Hellmund Vega, Stuart John Mayhew, Jeremy Bruce Kersey
  • Publication number: 20230137920
    Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
    Type: Application
    Filed: January 19, 2022
    Publication date: May 4, 2023
    Inventors: Andrew Jackson Klein, Cory Ryan Bramall, Kyle Mouritsen, Ethan Harris Arnowitz, Jeremy Bruce Kersey, Victor Jia, Justin Thomas Savino, Stephen Michael Lucas, Darren A. Bennett
  • Publication number: 20230138952
    Abstract: Systems and methods for attaching a virtual input device to a virtual object in a mixed reality (MR) environment are provided. The system includes a memory, a processor communicatively coupled to the memory, and a display device. The display device is configured to display an MR environment provided by at least one application implemented by the processor. The MR environment includes a virtual object corresponding to an application and a virtual input device. The at least one application docks the virtual input device to the virtual object with an offset relative to the virtual object.
    Type: Application
    Filed: February 2, 2022
    Publication date: May 4, 2023
    Inventors: Andrew Jackson Klein, Hendry Effendy, Ethan Harris Arnowitz, Jonathon Burnham Cobb, Melissa Hellmund Vega, Stuart John Mayhew, Jeremy Bruce Kersey
  • Publication number: 20230135974
    Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
    Type: Application
    Filed: December 27, 2021
    Publication date: May 4, 2023
    Inventors: Andrew Jackson Klein, Cory Ryan Bramall, Kyle Mouritsen, Ethan Harris Arnowitz, Jeremy Bruce Kersey, Victor Jia, Justin Thomas Savino, Stephen Michael Lucas, Darren Bennett
  • Patent number: 10264320
    Abstract: Embodiments described herein enable user interaction with a video segment. A hit-zone file, which includes hit-zone data, is produced and stored for a video segment, wherein the hit-zone data corresponds to spatial regions that define hit-zones for hidden objects included in the video segment. The hit-zone file is provided to a computing system so that when the computing system displays the video segment the hit-zone file adds hit-zones for the hidden objects included in the video segment. The hit-zone file is produced separate from the video segment. Each of the hit-zones is defined by a different portion of the hit-zone data and corresponds to a different one of the hidden objects included in the video segment. The spatial regions that define the hit-zones for hidden objects are not visible to a user of the computing system that views the video segment with the hit-zones added.
    Type: Grant
    Filed: January 21, 2015
    Date of Patent: April 16, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kirk Christopher Gibbons, David Seymour, Preetinderpal Singh Mangat, William Michael Mozell, William Ben Hanke, Dashan Yue, Jeremy Bruce Kersey, Henry Stuart Denison Watson, Enrico William Guld
  • Publication number: 20150355826
    Abstract: Embodiments described herein enable user interaction with a video segment. A hit-zone file, which includes hit-zone data, is produced and stored for a video segment, wherein the hit-zone data corresponds to spatial regions that define hit-zones for hidden objects included in the video segment. The hit-zone file is provided to a computing system so that when the computing system displays the video segment the hit-zone file adds hit-zones for the hidden objects included in the video segment. The hit-zone file is produced separate from the video segment. Each of the hit-zones is defined by a different portion of the hit-zone data and corresponds to a different one of the hidden objects included in the video segment. The spatial regions that define the hit-zones for hidden objects are not visible to a user of the computing system that views the video segment with the hit-zones added.
    Type: Application
    Filed: January 21, 2015
    Publication date: December 10, 2015
    Applicant: Microsoft Corporation
    Inventors: Kirk Christopher Gibbons, David Seymour, Preetinderpal Singh Mangat, William Michael Mozell, William Ben Hanke, Dashan Yue, Jeremy Bruce Kersey, Henry Stuart Denison Watson, Enrico William Guld
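
The multi-factor summoning patents above (patent 11914759 and publications 20230137920 and 20230135974) gate the appearance of a control object on several simultaneous indications of intent, keep it visible while at least one indication remains, and allow a timer period before dismissing it once indications are lost. The following is a minimal sketch of that logic; the names (Indication, SummonController), the threshold, and the grace period are hypothetical illustrations, not the patented implementation.

```python
# Minimal sketch of the multi-factor summoning logic described in the
# abstract. Names, thresholds, and the grace period are hypothetical.
import time
from enum import Enum, auto


class Indication(Enum):
    """Signals that can indicate a summoning intent."""
    PALM_FACING = auto()
    EYE_GAZE = auto()
    HEAD_GAZE = auto()
    FINGER_POSITION = auto()


class SummonController:
    """Shows a control object (menu, keyboard, input panel) only when
    several indications are observed at once, keeps it visible while at
    least one indication remains, and allows a short grace period
    before dismissing it once every indication is lost."""

    def __init__(self, required: int = 2, grace_seconds: float = 1.5):
        self.required = required            # simultaneous indications to summon
        self.grace_seconds = grace_seconds  # timer period after loss
        self.visible = False
        self._grace_deadline: float | None = None

    def update(self, active: set[Indication], now: float | None = None) -> bool:
        """Call once per frame with the currently active indications;
        returns whether the control object should be displayed."""
        now = time.monotonic() if now is None else now
        if not self.visible:
            # Multi-factor check: summon only on enough simultaneous signals.
            if len(active) >= self.required:
                self.visible = True
                self._grace_deadline = None
        elif active:
            # At least one indication remains: keep displaying.
            self._grace_deadline = None
        elif self._grace_deadline is None:
            # All indications lost: keep displaying until the timer expires.
            self._grace_deadline = now + self.grace_seconds
        elif now >= self._grace_deadline:
            self.visible = False
            self._grace_deadline = None
        return self.visible
```

In a live AR runtime the active set would be rebuilt each frame from hand-tracking, eye-gaze, and head-pose signals, and displaying the object proximate to the user (for example, docked to a hand) would fall to the renderer.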
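The docking patents (patent 11797175 and publications 20240004545 and 20230138952) attach a virtual input device to a virtual object with an offset relative to that object. The sketch below illustrates the core idea under stated assumptions: the Pose type and offset values are invented for the example, and a full implementation would compose orientation as well as position.

```python
# Minimal sketch of docking a virtual input device to a virtual object
# with a fixed offset, per the abstract. The Pose type and the offset
# values are illustrative assumptions; a full implementation would also
# compose orientation, not just position.
from dataclasses import dataclass


@dataclass(frozen=True)
class Pose:
    x: float
    y: float
    z: float


def dock(target: Pose, offset: Pose) -> Pose:
    """Place the input device at the target's pose plus a fixed offset,
    so it follows whatever object it is docked to."""
    return Pose(target.x + offset.x, target.y + offset.y, target.z + offset.z)


# Per-frame update: wherever the application's virtual object moves,
# the docked keyboard follows, held just below and in front of it.
window_pose = Pose(0.0, 1.5, 2.0)
keyboard_offset = Pose(0.0, -0.4, -0.1)
keyboard_pose = dock(window_pose, keyboard_offset)
print(keyboard_pose)  # Pose(x=0.0, y=1.1, z=1.9)
```

Keeping the offset fixed relative to the object, rather than to the user, means the input device stays in a predictable place as the docked object is moved or resized.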
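The hit-zone patents (patent 10264320 and publication 20150355826) store interactive regions in a file produced separately from the video itself, so invisible hit-zones can be overlaid on hidden objects at playback time. The sketch below assumes a hypothetical JSON layout of time ranges and normalized rectangles; the patent does not specify this file format.

```python
# Minimal sketch of a separate hit-zone file, per the abstract:
# invisible, time-indexed regions overlaid on a video segment so a
# click can be matched to a hidden object. The JSON layout is an
# illustrative assumption, not the patent's actual file format.
import json

hit_zone_file = json.loads("""
{
  "zones": [
    {"object": "key",  "start": 12.0, "end": 18.5,
     "rect": [0.62, 0.40, 0.70, 0.55]},
    {"object": "coin", "start": 30.0, "end": 41.0,
     "rect": [0.10, 0.75, 0.18, 0.85]}
  ]
}
""")


def hit_test(t: float, x: float, y: float) -> str | None:
    """Return the hidden object at normalized video coordinates (x, y)
    at playback time t, or None if the click misses every hit-zone."""
    for zone in hit_zone_file["zones"]:
        x0, y0, x1, y1 = zone["rect"]
        if zone["start"] <= t <= zone["end"] and x0 <= x <= x1 and y0 <= y <= y1:
            return zone["object"]
    return None


print(hit_test(15.0, 0.65, 0.50))  # "key"
print(hit_test(15.0, 0.05, 0.05))  # None
```

Because the hit-zone data lives in its own file, the same video segment can be shipped unmodified and the zones authored, corrected, or localized independently of it.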