Patents by Inventor John Bevis

John Bevis has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10613642
    Abstract: Embodiments are disclosed herein that relate to tuning gesture recognition characteristics for a device configured to receive gesture-based user inputs. For example, one disclosed embodiment provides a head-mounted display device including a plurality of sensors, a display configured to present a user interface, a logic machine, and a storage machine that holds instructions executable by the logic machine to detect a gesture based upon information received from a first sensor of the plurality of sensors, perform an action in response to detecting the gesture, and determine whether the gesture matches an intended gesture input. The instructions are further executable to update a gesture parameter that defines the intended gesture input if it is determined that the gesture detected does not match the intended gesture input.
    Type: Grant
    Filed: March 12, 2014
    Date of Patent: April 7, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael Scavezze, Adam G. Poulos, John Bevis, Jeremy Lee, Daniel Joseph McCulloch, Nicholas Gervase Fajt
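The tuning loop this abstract describes — detect a gesture, act on it, then loosen or tighten the defining parameter when the detection disagrees with the user's intent — can be sketched roughly as follows. All names, the single-threshold parameter, and the step size are illustrative assumptions; the patent does not specify an implementation.

```python
class GestureTuner:
    """Adapts a detection threshold when a detected gesture turns out
    not to match the intended gesture input (a rough sketch)."""

    def __init__(self, threshold=0.5, step=0.05):
        self.threshold = threshold  # gesture parameter defining the intended input
        self.step = step            # how aggressively to retune

    def process(self, sensor_score, user_confirmed):
        """sensor_score: confidence from one of the sensors;
        user_confirmed: whether the user accepted the triggered action."""
        detected = sensor_score >= self.threshold
        if detected and not user_confirmed:
            # Detected gesture did not match the intended input:
            # raise the threshold so similar motions no longer trigger it.
            self.threshold = min(1.0, self.threshold + self.step)
        elif not detected and user_confirmed:
            # An intended gesture was missed: loosen the parameter.
            self.threshold = max(0.0, self.threshold - self.step)
        return detected
```

In this sketch the "update" of the gesture parameter is a simple threshold nudge; the claims cover the general idea of revising the parameter on mismatch, not any particular update rule.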
  • Patent number: 10222981
    Abstract: Embodiments that relate to displaying holographic keyboard and hand images in a holographic environment are provided. In one embodiment depth information of an actual position of a user's hand is received from a capture device. The user's hand is spaced by an initial actual distance from the capture device, and a holographic keyboard image is displayed spatially separated by a virtual distance from a holographic hand image. The user's hand is determined to move to an updated actual distance from the capture device. In response, the holographic keyboard image is maintained spatially separated by substantially the virtual distance from the holographic hand image.
    Type: Grant
    Filed: September 6, 2017
    Date of Patent: March 5, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Rotem Bennet, Lewey Geselowitz, Wei Zhang, Adam G. Poulos, John Bevis, Kim Pascal Pimmel, Nicholas Gervase Fajt
  • Publication number: 20170364261
    Abstract: Embodiments that relate to displaying holographic keyboard and hand images in a holographic environment are provided. In one embodiment depth information of an actual position of a user's hand is received from a capture device. The user's hand is spaced by an initial actual distance from the capture device, and a holographic keyboard image is displayed spatially separated by a virtual distance from a holographic hand image. The user's hand is determined to move to an updated actual distance from the capture device. In response, the holographic keyboard image is maintained spatially separated by substantially the virtual distance from the holographic hand image.
    Type: Application
    Filed: September 6, 2017
    Publication date: December 21, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Rotem Bennet, Lewey Geselowitz, Wei Zhang, Adam G. Poulos, John Bevis, Kim Pascal Pimmel, Nicholas Gervase Fajt
  • Patent number: 9766806
    Abstract: Embodiments that relate to displaying holographic keyboard and hand images in a holographic environment are provided. In one embodiment depth information of an actual position of a user's hand is received. Using the depth information, a holographic hand image representing the user's hand is displayed in a virtual hand plane in the holographic environment. In response to receiving a keyboard activation input from the user and using the depth information, the holographic keyboard image is adaptively displayed in a virtual keyboard plane in the holographic environment at a virtual distance under the holographic hand image representing the user's hand.
    Type: Grant
    Filed: July 15, 2014
    Date of Patent: September 19, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Rotem Bennet, Lewey Geselowitz, Wei Zhang, Adam G. Poulos, John Bevis, Kim Pascal Pimmel, Nicholas Gervase Fajt
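The placement rule in this abstract — upon a keyboard activation input, display the keyboard plane a virtual distance under the hand image derived from depth data — reduces to a small geometric step. The coordinates, activation flag, and 0.07 m offset below are illustrative assumptions, not values from the patent.

```python
def place_keyboard(hand_pos, keyboard_activated, offset_m=0.07):
    """hand_pos: (x, y, z) of the holographic hand image, derived from
    the capture device's depth information. Returns the keyboard-plane
    origin a fixed virtual distance under the hand, or None if the
    keyboard has not been activated."""
    if not keyboard_activated:
        return None
    x, y, z = hand_pos
    # Keyboard plane sits offset_m below the hand image in the
    # holographic environment, wherever the hand currently is.
    return (x, y - offset_m, z)
```

Because the keyboard is positioned relative to the hand image rather than at a fixed world position, the hand-to-keyboard separation is preserved as the user's hand moves — which is also the behavior claimed in the related patent 10222981 above.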
  • Patent number: 9761057
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Grant
    Filed: November 21, 2016
    Date of Patent: September 12, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
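The core test in this abstract — decide whether an object lies outside the user's field of view and, if so, provide a positional indication — can be illustrated with a one-axis angular check. The angles, the 90° field of view, and the signed-turn return value are hypothetical simplifications for illustration.

```python
def offscreen_indicator(obj_bearing_deg, view_dir_deg, fov_deg=90.0):
    """Return None if the object is inside the field of view, otherwise
    the signed angle the user would turn to face it (one possible
    'indication of positional information')."""
    # Wrap the bearing difference into (-180, 180] so that objects
    # behind the user resolve to the shorter turning direction.
    diff = (obj_bearing_deg - view_dir_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= fov_deg / 2:
        return None  # visible: no indicator needed
    return diff
```

A real see-through display would map this angle to an on-screen cue (an edge arrow, for instance); the sketch only shows the visibility test and the positional quantity such a cue could encode.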
  • Publication number: 20170069143
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Application
    Filed: November 21, 2016
    Publication date: March 9, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Patent number: 9563331
    Abstract: Technology is described for a web-like hierarchical menu interface which displays a menu in a web-like hierarchical menu display configuration in a near-eye display (NED). The web-like hierarchical menu display configuration links menu levels and menu items within a menu level with flexible spatial dimensions for menu elements. One or more processors executing the interface select a web-like hierarchical menu display configuration based on the available menu space and the user's head view direction, determined from a 3D mapping of the NED field of view data and stored user head comfort rules. Activation parameters in menu item selection criteria are adjusted to be user specific based on head motion data tracked from one or more sensors while the user wears the NED. Menu display layout may be triggered by changes in the user's head view direction and the available menu space about the user's head.
    Type: Grant
    Filed: June 28, 2013
    Date of Patent: February 7, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Adam G. Poulos, Anthony J. Ambrus, Cameron G. Brown, Jason Scott, Brian J. Mount, Daniel J. McCulloch, John Bevis, Wei Zhang
  • Patent number: 9501873
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Grant
    Filed: July 22, 2015
    Date of Patent: November 22, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Patent number: 9390561
    Abstract: Methods for generating and displaying personalized virtual billboards within an augmented reality environment are described. The personalized virtual billboards may facilitate the sharing of personalized information between persons within an environment who have varying degrees of acquaintance (e.g., ranging from close familial relationships to strangers). In some embodiments, a head-mounted display device (HMD) may detect a mobile device associated with a particular person within an environment, acquire a personalized information set corresponding with the particular person, generate a virtual billboard based on the personalized information set, and display the virtual billboard on the HMD. The personalized information set may include information associated with the particular person such as shopping lists and classified advertisements.
    Type: Grant
    Filed: April 12, 2013
    Date of Patent: July 12, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Cameron G. Brown, Abby Lee, Brian J. Mount, Daniel J. McCulloch, Michael J. Scavezze, Ryan L. Hastings, John Bevis, Mike Thomas, Ron Amador-Leon
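The pipeline this abstract walks through — detect a nearby mobile device, acquire that person's personalized information set, and generate a billboard from it — is essentially a lookup-and-render flow. The class, field names, and text rendering below are assumptions made for illustration; the patent does not prescribe any data model.

```python
from dataclasses import dataclass

@dataclass
class PersonalizedInfo:
    person_id: str
    items: list  # e.g. shopping lists, classified advertisements

def make_billboard(detected_device_id, info_store):
    """Given the ID of a mobile device detected in the environment,
    look up the associated person's personalized information set and
    return billboard text for display on the HMD (None if unknown)."""
    info = info_store.get(detected_device_id)
    if info is None:
        return None
    return f"[{info.person_id}] " + "; ".join(info.items)
```

A real HMD would render this as a world-anchored virtual billboard rather than a string, and would gate the lookup on the sharing relationship between the two people; the sketch shows only the data flow.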
  • Publication number: 20160018985
    Abstract: Embodiments that relate to displaying holographic keyboard and hand images in a holographic environment are provided. In one embodiment depth information of an actual position of a user's hand is received. Using the depth information, a holographic hand image representing the user's hand is displayed in a virtual hand plane in the holographic environment. In response to receiving a keyboard activation input from the user and using the depth information, the holographic keyboard image is adaptively displayed in a virtual keyboard plane in the holographic environment at a virtual distance under the holographic hand image representing the user's hand.
    Type: Application
    Filed: July 15, 2014
    Publication date: January 21, 2016
    Inventors: Rotem Bennet, Lewey Geselowitz, Wei Zhang, Adam G. Poulos, John Bevis, Kim Pascal Pimmel, Nicholas Gervase Fajt
  • Publication number: 20150325054
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Application
    Filed: July 22, 2015
    Publication date: November 12, 2015
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Publication number: 20150261318
    Abstract: Embodiments are disclosed herein that relate to tuning gesture recognition characteristics for a device configured to receive gesture-based user inputs. For example, one disclosed embodiment provides a head-mounted display device including a plurality of sensors, a display configured to present a user interface, a logic machine, and a storage machine that holds instructions executable by the logic machine to detect a gesture based upon information received from a first sensor of the plurality of sensors, perform an action in response to detecting the gesture, and determine whether the gesture matches an intended gesture input. The instructions are further executable to update a gesture parameter that defines the intended gesture input if it is determined that the gesture detected does not match the intended gesture input.
    Type: Application
    Filed: March 12, 2014
    Publication date: September 17, 2015
    Inventors: Michael Scavezze, Adam G. Poulos, John Bevis, Jeremy Lee, Daniel Joseph McCulloch, Nicholas Gervase Fajt
  • Patent number: 9129430
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Grant
    Filed: June 25, 2013
    Date of Patent: September 8, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Publication number: 20150007114
    Abstract: Technology is described for a web-like hierarchical menu interface which displays a menu in a web-like hierarchical menu display configuration in a near-eye display (NED). The web-like hierarchical menu display configuration links menu levels and menu items within a menu level with flexible spatial dimensions for menu elements. One or more processors executing the interface select a web-like hierarchical menu display configuration based on the available menu space and the user's head view direction, determined from a 3D mapping of the NED field of view data and stored user head comfort rules. Activation parameters in menu item selection criteria are adjusted to be user specific based on head motion data tracked from one or more sensors while the user wears the NED. Menu display layout may be triggered by changes in the user's head view direction and the available menu space about the user's head.
    Type: Application
    Filed: June 28, 2013
    Publication date: January 1, 2015
    Inventors: Adam G. Poulos, Anthony J. Ambrus, Cameron G. Brown, Jason Scott, Brian J. Mount, Daniel J. McCulloch, John Bevis, Wei Zhang
  • Publication number: 20140375683
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Publication number: 20140306994
    Abstract: Methods for generating and displaying personalized virtual billboards within an augmented reality environment are described. The personalized virtual billboards may facilitate the sharing of personalized information between persons within an environment who have varying degrees of acquaintance (e.g., ranging from close familial relationships to strangers). In some embodiments, a head-mounted display device (HMD) may detect a mobile device associated with a particular person within an environment, acquire a personalized information set corresponding with the particular person, generate a virtual billboard based on the personalized information set, and display the virtual billboard on the HMD. The personalized information set may include information associated with the particular person such as shopping lists and classified advertisements.
    Type: Application
    Filed: April 12, 2013
    Publication date: October 16, 2014
    Inventors: Cameron G. Brown, Abby Lee, Brian J. Mount, Daniel J. McCulloch, Michael J. Scavezze, Ryan L. Hastings, John Bevis, Mike Thomas, Ron Amador-Leon
  • Publication number: 20140240351
    Abstract: Embodiments that relate to providing motion amplification to a virtual environment are disclosed. For example, in one disclosed embodiment a mixed reality augmentation program receives from a head-mounted display device motion data that corresponds to motion of a user in a physical environment. The program presents via the display device the virtual environment in motion in a principal direction, with the principal direction motion being amplified by a first multiplier as compared to the motion of the user in a corresponding principal direction. The program also presents the virtual environment in motion in a secondary direction, where the secondary direction motion is amplified by a second multiplier as compared to the motion of the user in a corresponding secondary direction, and the second multiplier is less than the first multiplier.
    Type: Application
    Filed: February 27, 2013
    Publication date: August 28, 2014
    Inventors: Michael Scavezze, Nicholas Gervase Fajt, Arnulfo Zepeda Navratil, Jason Scott, Adam Benjamin Smith-Kipnis, Brian Mount, John Bevis, Cameron Brown, Tony Ambrus, Phillip Charles Heckinger, Dan Kroymann, Matthew G. Kaplan, Aaron Krauss
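The anisotropic amplification this abstract claims — virtual motion scaled by a larger multiplier along the principal direction than along the secondary one — is a two-factor scaling. The multiplier values below are invented for illustration; the claim only requires that the second multiplier be less than the first.

```python
def amplify(user_delta, principal_mult=4.0, secondary_mult=2.0):
    """user_delta: (principal, secondary) displacement of the user in
    the physical environment; returns the corresponding displacement
    presented in the virtual environment."""
    # The claim requires the secondary multiplier to be the smaller one.
    assert secondary_mult < principal_mult
    dp, ds = user_delta
    return (dp * principal_mult, ds * secondary_mult)
```

The effect is that a small physical space maps onto a larger virtual one, with forward progress (the principal direction) exaggerated more than sideways drift.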
  • Patent number: 6745818
    Abstract: A method and apparatus for converting liquid alloy into its thixotropic state and for fabricating high-integrity components by subsequently injecting the thixotropic alloy into a die cavity. The apparatus includes a liquid metal feeder, a high-shear twin-screw extruder, a shot assembly and a central control system. The apparatus and method can offer net-shaped components characterized by close-to-zero porosity, fine and equiaxed particles with a uniform distribution in the eutectic matrix, and a large range of solid volume fractions.
    Type: Grant
    Filed: August 5, 2002
    Date of Patent: June 8, 2004
    Assignee: Brunel University
    Inventors: Zhongyun Fan, Michael John Bevis, Shouxun Ji
  • Publication number: 20040089437
    Abstract: A method and apparatus are provided for fabricating continuous castings with fine and uniform microstructure, which can be used as feedstock for secondary processing routes, such as thixoforming, forging and machining, or for direct application in industry. An overheated liquid alloy is fed into a high-shear device (for example, a twin-screw extruder) and sheared intensively to produce a sheared liquid alloy or a semisolid slurry, wherein the sheared liquid alloy is at a temperature close to its liquidus; the semisolid slurry is then transferred to a shaping device for production of continuous castings with fine and uniform microstructures through a solidification process. The shaping device is any device capable of forming continuous castings.
    Type: Application
    Filed: June 27, 2003
    Publication date: May 13, 2004
    Inventors: Zhongyung Fan, Shouxun Ji, Michael John Bevis
  • Patent number: D791341
    Type: Grant
    Filed: July 8, 2015
    Date of Patent: July 4, 2017
    Assignee: Architectural Metalworks Australia
    Inventor: Samuel John Bevis