Patents by Inventor Jeremy Bruce Kersey
Jeremy Bruce Kersey has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Publication number: 20240402795
  Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
  Type: Application
  Filed: July 12, 2024
  Publication date: December 5, 2024
  Inventors: Andrew Jackson Klein, Cory Ryan Bramall, Kyle Mouritsen, Ethan Harris Arnowitz, Jeremy Bruce Kersey, Victor Jia, Justin Thomas Savino, Stephen Michael Lucas, Darren A. Bennett
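The multi-factor summoning behavior described in this abstract (and repeated in the related filings below) reduces to a small state machine: summon the control object when two or more indications coincide, keep it while any single indication remains, and hide it only after a grace timer expires once all indications are lost. The following Python sketch is purely illustrative; the class, field names, and thresholds are hypothetical assumptions and are not drawn from the patent claims.

```python
from dataclasses import dataclass

@dataclass
class Indications:
    """One frame's worth of intention signals (all names hypothetical)."""
    palm_facing: bool = False        # palm-facing gesture detected
    eye_gaze_on_hand: bool = False   # eye gaze directed at the hand
    head_gaze_on_hand: bool = False  # head gaze directed at the hand
    finger_raised: bool = False      # finger-position indication

    def count(self) -> int:
        return sum((self.palm_facing, self.eye_gaze_on_hand,
                    self.head_gaze_on_hand, self.finger_raised))


class ControlObjectSummoner:
    """Decides whether a control object (e.g., a menu) should be visible."""

    def __init__(self, required: int = 2, grace_s: float = 1.0):
        self.required = required  # indications needed to summon
        self.grace_s = grace_s    # timer period after all indications are lost
        self.visible = False
        self._lost_at = None      # clock time when all indications were lost

    def update(self, ind: Indications, now_s: float) -> bool:
        n = ind.count()
        if not self.visible:
            # Summon only when several indications coincide (multi-factor check).
            if n >= self.required:
                self.visible = True
                self._lost_at = None
        elif n >= 1:
            # At least one indication remains: keep displaying.
            self._lost_at = None
        else:
            # All indications lost: keep displaying during the timer period.
            if self._lost_at is None:
                self._lost_at = now_s
            elif now_s - self._lost_at > self.grace_s:
                self.visible = False
        return self.visible
```

A host application would call update() once per frame with the current signals and a monotonic clock reading, then show or hide the control object based on the returned flag.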
- Patent number: 12086407
  Abstract: Systems and methods for attaching a virtual input device to a virtual object in a mixed reality (MR) environment are provided. The system includes a memory, a processor communicatively coupled to the memory, and a display device. The display device is configured to display an MR environment provided by at least one application implemented by the processor. The MR environment includes a virtual object corresponding to an application, and a virtual input device. The at least one application docks the virtual input device to the virtual object with an offset relative to the virtual object.
  Type: Grant
  Filed: September 15, 2023
  Date of Patent: September 10, 2024
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Andrew Jackson Klein, Hendry Effendy, Ethan Harris Arnowitz, Jonathon Burnham Cobb, Melissa Hellmund Vega, Stuart John Mayhew, Jeremy Bruce Kersey
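The docking described in this family of filings amounts to composing the virtual object's pose with a fixed local-frame offset, so the input device tracks the object it is attached to. Below is a minimal, hypothetical Python sketch of that transform; the function name and the numeric offset are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def docked_pose(obj_pos: np.ndarray, obj_rot: np.ndarray,
                offset: np.ndarray) -> np.ndarray:
    """World-space position of an input device docked to a virtual object.

    obj_rot is a 3x3 rotation matrix; offset is expressed in the object's
    local frame, so the device keeps its relative placement as the object
    moves and rotates.
    """
    return obj_pos + obj_rot @ offset


# Hypothetical usage: a virtual keyboard docked 0.25 m below a virtual window.
window_pos = np.array([0.0, 1.5, 2.0])
window_rot = np.eye(3)                        # window facing the user
keyboard_offset = np.array([0.0, -0.25, -0.1])
print(docked_pose(window_pos, window_rot, keyboard_offset))
```

Expressing the offset in the object's local frame (rather than in world space) is what keeps the keyboard "below" the window even after the window is moved or rotated.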
- Patent number: 12067159
  Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
  Type: Grant
  Filed: December 27, 2021
  Date of Patent: August 20, 2024
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Andrew Jackson Klein, Cory Ryan Bramall, Kyle Mouritsen, Ethan Harris Arnowitz, Jeremy Bruce Kersey, Victor Jia, Justin Thomas Savino, Stephen Michael Lucas, Darren A. Bennett
- Publication number: 20240168542
  Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
  Type: Application
  Filed: January 22, 2024
  Publication date: May 23, 2024
  Inventors: Andrew Jackson Klein, Cory Ryan Bramall, Kyle Mouritsen, Ethan Harris Arnowitz, Jeremy Bruce Kersey, Victor Jia, Justin Thomas Savino, Stephen Michael Lucas, Darren A. Bennett
- Patent number: 11914759
  Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
  Type: Grant
  Filed: January 19, 2022
  Date of Patent: February 27, 2024
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Andrew Jackson Klein, Cory Ryan Bramall, Kyle Mouritsen, Ethan Harris Arnowitz, Jeremy Bruce Kersey, Victor Jia, Justin Thomas Savino, Stephen Michael Lucas, Darren A. Bennett
- Publication number: 20240004545
  Abstract: Systems and methods for attaching a virtual input device to a virtual object in a mixed reality (MR) environment are provided. The system includes a memory, a processor communicatively coupled to the memory, and a display device. The display device is configured to display an MR environment provided by at least one application implemented by the processor. The MR environment includes a virtual object corresponding to an application, and a virtual input device. The at least one application docks the virtual input device to the virtual object with an offset relative to the virtual object.
  Type: Application
  Filed: September 15, 2023
  Publication date: January 4, 2024
  Inventors: Andrew Jackson Klein, Hendry Effendy, Ethan Harris Arnowitz, Jonathon Burnham Cobb, Melissa Hellmund Vega, Stuart John Mayhew, Jeremy Bruce Kersey
- Patent number: 11797175
  Abstract: Systems and methods for attaching a virtual input device to a virtual object in a mixed reality (MR) environment are provided. The system includes a memory, a processor communicatively coupled to the memory, and a display device. The display device is configured to display an MR environment provided by at least one application implemented by the processor. The MR environment includes a virtual object corresponding to an application, and a virtual input device. The at least one application docks the virtual input device to the virtual object with an offset relative to the virtual object.
  Type: Grant
  Filed: February 2, 2022
  Date of Patent: October 24, 2023
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Andrew Jackson Klein, Hendry Effendy, Ethan Harris Arnowitz, Jonathon Burnham Cobb, Melissa Hellmund Vega, Stuart John Mayhew, Jeremy Bruce Kersey
- Publication number: 20230138952
  Abstract: Systems and methods for attaching a virtual input device to a virtual object in a mixed reality (MR) environment are provided. The system includes a memory, a processor communicatively coupled to the memory, and a display device. The display device is configured to display an MR environment provided by at least one application implemented by the processor. The MR environment includes a virtual object corresponding to an application, and a virtual input device. The at least one application docks the virtual input device to the virtual object with an offset relative to the virtual object.
  Type: Application
  Filed: February 2, 2022
  Publication date: May 4, 2023
  Inventors: Andrew Jackson Klein, Hendry Effendy, Ethan Harris Arnowitz, Jonathon Burnham Cobb, Melissa Hellmund Vega, Stuart John Mayhew, Jeremy Bruce Kersey
- Publication number: 20230137920
  Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
  Type: Application
  Filed: January 19, 2022
  Publication date: May 4, 2023
  Inventors: Andrew Jackson Klein, Cory Ryan Bramall, Kyle Mouritsen, Ethan Harris Arnowitz, Jeremy Bruce Kersey, Victor Jia, Justin Thomas Savino, Stephen Michael Lucas, Darren A. Bennett
- Publication number: 20230135974
  Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
  Type: Application
  Filed: December 27, 2021
  Publication date: May 4, 2023
  Inventors: Andrew Jackson Klein, Cory Ryan Bramall, Kyle Mouritsen, Ethan Harris Arnowitz, Jeremy Bruce Kersey, Victor Jia, Justin Thomas Savino, Stephen Michael Lucas, Darren Bennett
- Patent number: 10264320
  Abstract: Embodiments described herein enable user interaction with a video segment. A hit-zone file, which includes hit-zone data, is produced and stored for a video segment, wherein the hit-zone data corresponds to spatial regions that define hit-zones for hidden objects included in the video segment. The hit-zone file is provided to a computing system so that, when the computing system displays the video segment, the hit-zone file adds hit-zones for the hidden objects included in the video segment. The hit-zone file is produced separately from the video segment. Each of the hit-zones is defined by a different portion of the hit-zone data and corresponds to a different one of the hidden objects included in the video segment. The spatial regions that define the hit-zones for hidden objects are not visible to a user of the computing system that views the video segment with the hit-zones added.
  Type: Grant
  Filed: January 21, 2015
  Date of Patent: April 16, 2019
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Kirk Christopher Gibbons, David Seymour, Preetinderpal Singh Mangat, William Michael Mozell, William Ben Hanke, Dashan Yue, Jeremy Bruce Kersey, Henry Stuart Denison Watson, Enrico William Guld
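Because the hit-zone file is produced and shipped separately from the video, interaction reduces to a time-and-point lookup against invisible regions. The sketch below illustrates one plausible encoding and hit test in Python; the file schema, field names, and coordinate convention are assumptions for illustration, not the format described in the patent.

```python
import json
from dataclasses import dataclass

@dataclass
class HitZone:
    object_id: str   # hidden object this zone belongs to
    start_s: float   # start of the time range during which the zone is active
    end_s: float     # end of that time range
    x: float         # invisible rectangle, in video coordinates
    y: float
    w: float
    h: float

def load_hit_zones(path: str) -> list[HitZone]:
    """Parse a hit-zone file that is stored separately from the video segment."""
    with open(path) as f:
        return [HitZone(**z) for z in json.load(f)["zones"]]

def hit_test(zones: list[HitZone], t_s: float, px: float, py: float):
    """Return the hidden object hit at time t_s and point (px, py), if any."""
    for z in zones:
        if (z.start_s <= t_s <= z.end_s
                and z.x <= px <= z.x + z.w
                and z.y <= py <= z.y + z.h):
            return z.object_id
    return None
```

At playback time, a click at video coordinates (px, py) and timestamp t_s either resolves to a hidden object's id or misses, without the zones ever being rendered to the viewer.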
- Publication number: 20150355826
  Abstract: Embodiments described herein enable user interaction with a video segment. A hit-zone file, which includes hit-zone data, is produced and stored for a video segment, wherein the hit-zone data corresponds to spatial regions that define hit-zones for hidden objects included in the video segment. The hit-zone file is provided to a computing system so that, when the computing system displays the video segment, the hit-zone file adds hit-zones for the hidden objects included in the video segment. The hit-zone file is produced separately from the video segment. Each of the hit-zones is defined by a different portion of the hit-zone data and corresponds to a different one of the hidden objects included in the video segment. The spatial regions that define the hit-zones for hidden objects are not visible to a user of the computing system that views the video segment with the hit-zones added.
  Type: Application
  Filed: January 21, 2015
  Publication date: December 10, 2015
  Applicant: Microsoft Corporation
  Inventors: Kirk Christopher Gibbons, David Seymour, Preetinderpal Singh Mangat, William Michael Mozell, William Ben Hanke, Dashan Yue, Jeremy Bruce Kersey, Henry Stuart Denison Watson, Enrico William Guld