Patents by Inventor Chris McKenzie

Chris McKenzie has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10803663
    Abstract: A method for depth sensor aided estimation of virtual reality environment boundaries includes generating depth data at a depth sensor of an electronic device based on a local environment proximate the electronic device. A set of initial boundary data is estimated based on the depth data, wherein the set of initial boundary data defines an exterior boundary of a virtual bounded floor plan. The virtual bounded floor plan is generated based at least in part on the set of initial boundary data. Additionally, a relative pose of the electronic device within the virtual bounded floor plan is determined and a collision warning is displayed on a display of the electronic device based on the relative pose.
    Type: Grant
    Filed: August 2, 2017
    Date of Patent: October 13, 2020
    Assignee: GOOGLE LLC
    Inventors: Zhaoguang Wang, Mugur Marculescu, Chris McKenzie, Ambrus Csaszar, Ivan Dryanovski
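    Sketch: at a high level, the claimed flow reduces to estimating a floor-plan boundary from depth samples and comparing the device's tracked pose against that boundary. The Python below is a minimal illustration under invented assumptions (floor-projected 2D points, an axis-aligned boundary, a 0.5 m warning threshold); it is not the patented implementation.

    ```python
    from dataclasses import dataclass
    from typing import List, Tuple

    Point = Tuple[float, float]  # (x, z) position on the floor plane, in meters


    @dataclass
    class FloorPlanBoundary:
        """Axis-aligned exterior boundary estimated from depth samples (illustrative only)."""
        min_x: float
        max_x: float
        min_z: float
        max_z: float

        def distance_to_edge(self, pose: Point) -> float:
            """Smallest distance from a pose inside the boundary to any edge."""
            x, z = pose
            return min(x - self.min_x, self.max_x - x, z - self.min_z, self.max_z - z)


    def estimate_boundary(depth_points: List[Point]) -> FloorPlanBoundary:
        """Estimate initial boundary data as the extent of floor-projected depth samples."""
        xs = [p[0] for p in depth_points]
        zs = [p[1] for p in depth_points]
        return FloorPlanBoundary(min(xs), max(xs), min(zs), max(zs))


    def collision_warning(boundary: FloorPlanBoundary, pose: Point, threshold_m: float = 0.5) -> bool:
        """Return True when the device pose is within threshold_m of the exterior boundary."""
        return boundary.distance_to_edge(pose) < threshold_m


    if __name__ == "__main__":
        samples = [(-2.0, -1.5), (2.1, -1.4), (2.0, 1.6), (-1.9, 1.5)]  # stand-in depth data
        plan = estimate_boundary(samples)
        print(collision_warning(plan, pose=(1.8, 0.0)))  # True: close to the +x edge
        print(collision_warning(plan, pose=(0.0, 0.0)))  # False: near the center
    ```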
  • Patent number: 10795449
    Abstract: Methods and apparatus using gestures to share private windows in shared virtual environments are disclosed herein. An example method includes detecting a gesture of a user in a virtual environment associated with a private window in the virtual environment, the private window associated with the user, determining whether the gesture represents a signal to share the private window with another, and, when the gesture represents a signal to share the private window, changing the status of the private window to a shared window.
    Type: Grant
    Filed: December 9, 2016
    Date of Patent: October 6, 2020
    Assignee: GOOGLE LLC
    Inventors: Alexander James Faaborg, Chris McKenzie
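    Sketch: the abstract describes a state change on a window object driven by gesture classification. The Python below is a schematic illustration; the gesture names, the window model, and the sharing rule are assumptions made for the example.

    ```python
    from dataclasses import dataclass, field
    from typing import Set


    @dataclass
    class VirtualWindow:
        owner: str
        shared_with: Set[str] = field(default_factory=set)

        @property
        def is_private(self) -> bool:
            return not self.shared_with


    def gesture_signals_share(gesture: str) -> bool:
        """Decide whether a recognized gesture is a share signal (illustrative rule)."""
        return gesture in {"push_toward_user", "hand_over"}


    def handle_gesture(window: VirtualWindow, gesture: str, target_user: str) -> VirtualWindow:
        """Change a private window to a shared window when the gesture signals sharing."""
        if window.is_private and gesture_signals_share(gesture):
            window.shared_with.add(target_user)
        return window


    if __name__ == "__main__":
        w = handle_gesture(VirtualWindow(owner="alice"), "push_toward_user", "bob")
        print(w.is_private, sorted(w.shared_with))  # False ['bob']
    ```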
  • Patent number: 10606344
    Abstract: In a system for dynamic switching and merging of head, gesture and touch input in virtual reality, focus may be set on a first virtual object in response to a first input implementing one of a number of different input modes. The first object may then be manipulated in the virtual world in response to a second input implementing another input mode. In response to a third input, focus may be shifted from the first object to a second object if, for example, a priority value of the third input is higher than a priority value of the first input. If the priority value of the third input is less than that of the first input, focus may remain on the first object. In response to certain trigger inputs, a display of virtual objects may be shifted between a far field display and a near field display to accommodate a particular mode of interaction with the virtual objects.
    Type: Grant
    Filed: September 13, 2018
    Date of Patent: March 31, 2020
    Assignee: GOOGLE LLC
    Inventors: Alexander James Faaborg, Manuel Christian Clement, Chris McKenzie
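    Sketch: the core of the abstract is a focus manager that only yields focus to an input mode with a higher priority value than the mode that set it. The Python below illustrates that rule under an assumed priority ordering (touch > gesture > head gaze); the ordering and object names are invented for the example.

    ```python
    from dataclasses import dataclass
    from typing import Optional

    # Illustrative priority ordering for the input modes named in the abstract.
    INPUT_PRIORITY = {"touch": 3, "gesture": 2, "head_gaze": 1}


    @dataclass
    class FocusState:
        focused_object: Optional[str] = None
        focus_mode: Optional[str] = None

        def request_focus(self, obj: str, mode: str) -> bool:
            """Shift focus only if the new input mode outranks the one that set focus."""
            if self.focused_object is None or INPUT_PRIORITY[mode] > INPUT_PRIORITY[self.focus_mode]:
                self.focused_object, self.focus_mode = obj, mode
                return True
            return False


    if __name__ == "__main__":
        state = FocusState()
        state.request_focus("menu_panel", "head_gaze")        # first object gets focus
        print(state.request_focus("toolbox", "touch"))         # True: touch outranks head gaze
        print(state.request_focus("menu_panel", "head_gaze"))  # False: lower priority, focus stays
        print(state.focused_object)                            # toolbox
    ```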
  • Patent number: 10496156
    Abstract: A system and method of operating an audio visual system generating an immersive virtual experience may include generating, by a head-mounted audio visual device, a virtual world immersive experience within a virtual space while physically moving within a physical space, displaying, by the head-mounted audio visual device within the virtual space, a visual target marker indicating a target location in the physical space, receiving, by the head-mounted audio visual device, a teleport control signal, and moving a virtual location of the head-mounted audio visual device within the virtual space from a first virtual location to a second virtual location in response to receiving the teleport control signal.
    Type: Grant
    Filed: May 15, 2017
    Date of Patent: December 3, 2019
    Assignee: GOOGLE LLC
    Inventors: Robbie Tilton, Robert Carl Jagnow, Alexander James Faaborg, Chris McKenzie
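    Sketch: the teleport flow in the abstract amounts to showing a target marker and, on a teleport control signal, moving the device's virtual location to the marked target. A minimal Python illustration, with rendering and input handling omitted:

    ```python
    from dataclasses import dataclass
    from typing import Tuple

    Vec3 = Tuple[float, float, float]


    @dataclass
    class HeadMountedDevice:
        virtual_location: Vec3
        target_marker: Vec3 = (0.0, 0.0, 0.0)

        def display_target_marker(self, target: Vec3) -> None:
            """Show the visual target marker at a candidate location (rendering omitted)."""
            self.target_marker = target

        def on_teleport_signal(self) -> None:
            """Move the device's virtual location to the marked target when the signal arrives."""
            self.virtual_location = self.target_marker


    if __name__ == "__main__":
        hmd = HeadMountedDevice(virtual_location=(0.0, 1.6, 0.0))
        hmd.display_target_marker((4.0, 1.6, -2.5))  # user aims at a spot in the space
        hmd.on_teleport_signal()                     # e.g. a controller button press
        print(hmd.virtual_location)                  # (4.0, 1.6, -2.5)
    ```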
  • Patent number: 10347053
    Abstract: Techniques disclosed herein involve adaptively or dynamically displaying virtual objects in a virtual reality (VR) environment, and representations, within the VR environment, of physical objects in the physical environment, i.e., outside the VR environment, in order to alert users within the VR environment. For example, if a projected movement of a user indicates that the user will move close to a physical object in the physical world, the representation of the physical object changes from an un-displayed state, in which the physical object is not visible in the VR environment, to a displayed state in which the physical object is at least partially depicted inside the VR environment. In this way, what is displayed inside the VR environment can include both virtual objects as well as representations of physical objects from the physical space.
    Type: Grant
    Filed: May 17, 2017
    Date of Patent: July 9, 2019
    Assignee: GOOGLE LLC
    Inventors: Chris McKenzie, Adam Glazier, Clayton Woodward Bavor, Jr.
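    Sketch: the display decision in the abstract hinges on the user's projected movement. The Python below uses a deliberately simple linear projection and a distance threshold as stand-ins; both are assumptions for illustration, not the patented logic.

    ```python
    from typing import Tuple

    Point = Tuple[float, float]


    def projected_position(position: Point, velocity: Point, horizon_s: float = 1.0) -> Point:
        """Linearly project the user's position horizon_s seconds ahead (simplistic model)."""
        return (position[0] + velocity[0] * horizon_s, position[1] + velocity[1] * horizon_s)


    def should_display_physical_object(user_pos: Point, user_vel: Point,
                                       object_pos: Point, alert_radius_m: float = 1.0) -> bool:
        """Switch the object's representation to the displayed state if the user is projected
        to come within alert_radius_m of it."""
        px, py = projected_position(user_pos, user_vel)
        dist = ((px - object_pos[0]) ** 2 + (py - object_pos[1]) ** 2) ** 0.5
        return dist < alert_radius_m


    if __name__ == "__main__":
        couch = (2.0, 0.0)
        print(should_display_physical_object((0.0, 0.0), (1.5, 0.0), couch))  # True: heading toward it
        print(should_display_physical_object((0.0, 0.0), (0.0, 1.5), couch))  # False: moving away
    ```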
  • Patent number: 10334076
    Abstract: In a system for pairing a first device and a second device in a virtual reality environment, the first device may be a sending device, and the second device may be a receiving device. The sending device may transmit an electromagnetic signal that is received by the receiving device. The receiving device may process the electromagnetic signal to verify physical proximity of the receiving device and the sending device, and extract identification information related to the sending device for pairing. The receiving device may display one or more virtual pairing indicators to be manipulated to verify the user's intention to pair the first and second devices.
    Type: Grant
    Filed: February 21, 2017
    Date of Patent: June 25, 2019
    Assignee: GOOGLE LLC
    Inventors: Chris McKenzie, Murphy Stein, Alan Browning
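    Sketch: the abstract combines proximity verification of the received signal with a user action on a virtual pairing indicator. The Python below is a rough illustration; treating received signal strength as the proximity check and the -40 dBm threshold are assumptions invented for the example.

    ```python
    from dataclasses import dataclass


    @dataclass
    class ReceivedSignal:
        device_id: str        # identification information encoded by the sending device
        strength_dbm: float   # received signal strength, used here as a proximity proxy


    def verify_proximity(signal: ReceivedSignal, min_strength_dbm: float = -40.0) -> bool:
        """Treat a sufficiently strong signal as evidence the sender is physically nearby."""
        return signal.strength_dbm >= min_strength_dbm


    def handle_pairing(signal: ReceivedSignal, user_confirmed_indicator: bool) -> bool:
        """Pair only when proximity is verified and the user manipulated the virtual
        pairing indicator to confirm intent."""
        return verify_proximity(signal) and user_confirmed_indicator


    if __name__ == "__main__":
        sig = ReceivedSignal(device_id="controller-01", strength_dbm=-32.0)
        print(handle_pairing(sig, user_confirmed_indicator=True))   # True: nearby and confirmed
        print(handle_pairing(sig, user_confirmed_indicator=False))  # False: no confirmation
    ```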
  • Publication number: 20190179497
    Abstract: Methods, apparatus, and computer-readable media are described herein related to a user interface (UI) for a computing device, such as a head-mountable device (HMD). The UI allows a user of the HMD to navigate through a timeline of ordered screens or cards shown on the graphic display of the HMD. The cards on the timeline may be chronologically ordered based on times associated with each card. Numerous cards may be added to the timeline such that a user may scroll through the timeline to search for a specific card. The HMD may be configured to group cards on the timeline. The cards may be grouped by multiple time periods and by various content types within each respective time period. The cards may also be grouped based on durations between the present/on-going time period and each respective time period.
    Type: Application
    Filed: February 14, 2019
    Publication date: June 13, 2019
    Inventors: Chris McKenzie, Antonio Bernardo Monteiro Costa, Richard Dennis The
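    Sketch: the timeline behavior described here is, in essence, chronological ordering followed by grouping on time period and content type. The Python below groups cards by day and type; the Card fields and the choice of day-level periods are assumptions for illustration.

    ```python
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Dict, List, Tuple


    @dataclass
    class Card:
        timestamp: datetime
        content_type: str  # e.g. "photo", "message", "navigation"
        title: str


    def group_timeline(cards: List[Card]) -> Dict[Tuple[str, str], List[Card]]:
        """Order cards chronologically, then group them by day and content type."""
        ordered = sorted(cards, key=lambda c: c.timestamp)
        groups: Dict[Tuple[str, str], List[Card]] = {}
        for card in ordered:
            key = (card.timestamp.strftime("%Y-%m-%d"), card.content_type)
            groups.setdefault(key, []).append(card)
        return groups


    if __name__ == "__main__":
        cards = [
            Card(datetime(2014, 3, 1, 9), "photo", "Morning run"),
            Card(datetime(2014, 3, 1, 10), "photo", "Coffee"),
            Card(datetime(2014, 3, 2, 8), "message", "Hi there"),
        ]
        for (day, kind), items in group_timeline(cards).items():
            print(day, kind, [c.title for c in items])
    ```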
  • Patent number: 10275023
    Abstract: In a virtual reality system, an optical tracking device may detect and track a user's eye gaze direction and/or movement, and/or sensors may detect and track a user's head gaze direction and/or movement, relative to virtual user interfaces displayed in a virtual environment. A processor may process the detected gaze direction and/or movement as a user input, and may translate the user input into a corresponding interaction in the virtual environment. Gaze directed swipes on a virtual keyboard displayed in the virtual environment may be detected and tracked, and translated into a corresponding text input, either alone or together with user input(s) received by the controller. The user may also interact with other types of virtual interfaces in the virtual environment using gaze direction and movement to provide an input, either alone or together with a controller input.
    Type: Grant
    Filed: December 21, 2016
    Date of Patent: April 30, 2019
    Assignee: GOOGLE LLC
    Inventors: Chris McKenzie, Chun Yat Frank Li, Hayes S. Raffle
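    Sketch: decoding a gaze-directed swipe can be thought of as collecting the keys the gaze crosses and matching them against a lexicon. The Python below is a toy illustration; the consecutive-duplicate collapsing, the subsequence matching, and the three-word lexicon are invented simplifications, and real swipe decoding is far more involved.

    ```python
    from typing import List

    WORD_LIST = ["help", "held", "hop"]  # stand-in lexicon


    def collapse_repeats(keys: List[str]) -> List[str]:
        """Drop consecutive duplicate key hits from a continuous gaze path."""
        out: List[str] = []
        for key in keys:
            if not out or out[-1] != key:
                out.append(key)
        return out


    def is_subsequence(word: str, path: List[str]) -> bool:
        """True when the word's letters appear in order somewhere along the gaze path."""
        it = iter(path)
        return all(letter in it for letter in word)


    def decode_gaze_swipe(gaze_keys: List[str]) -> List[str]:
        """Translate the keys crossed by a gaze-directed swipe into candidate words."""
        path = collapse_repeats(gaze_keys)
        return [word for word in WORD_LIST if is_subsequence(word, path)]


    if __name__ == "__main__":
        # Keys the user's gaze crossed while swiping across a virtual keyboard.
        swipe = ["h", "h", "g", "e", "r", "t", "l", "p"]
        print(decode_gaze_swipe(swipe))  # ['help']
    ```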
  • Patent number: 10254923
    Abstract: Methods, apparatus, and computer-readable media are described herein related to a user interface (UI) for a computing device, such as a head-mountable device (HMD). The UI allows a user of the HMD to navigate through a timeline of ordered screens or cards shown on the graphic display of the HMD. The cards on the timeline may be chronologically ordered based on times associated with each card. Numerous cards may be added to the timeline such that a user may scroll through the timeline to search for a specific card. The HMD may be configured to group cards on the timeline. The cards may be grouped by multiple time periods and by various content types within each respective time period. The cards may also be grouped based on durations between the present/on-going time period and each respective time period.
    Type: Grant
    Filed: December 18, 2015
    Date of Patent: April 9, 2019
    Assignee: Google LLC
    Inventors: Chris McKenzie, Antonio Bernardo Monteiro Costa, Richard Dennis The
  • Publication number: 20190043259
    Abstract: A method for depth sensor aided estimation of virtual reality environment boundaries includes generating depth data at a depth sensor of an electronic device based on a local environment proximate the electronic device. A set of initial boundary data is estimated based on the depth data, wherein the set of initial boundary data defines an exterior boundary of a virtual bounded floor plan. The virtual bounded floor plan is generated based at least in part on the set of initial boundary data. Additionally, a relative pose of the electronic device within the virtual bounded floor plan is determined and a collision warning is displayed on a display of the electronic device based on the relative pose.
    Type: Application
    Filed: August 2, 2017
    Publication date: February 7, 2019
    Inventors: Zhaoguang Wang, Mugur Marculescu, Chris McKenzie, Ambrus Csaszar, Ivan Dryanovski
  • Publication number: 20190033989
    Abstract: A method for generating virtual reality environment boundaries includes receiving, at a depth sensor of an electronic device, depth data from a local environment proximate the electronic device. Further, a set of outer boundary data is received that defines an exterior boundary of a virtual bounded floor plan. A virtual bounded floor plan is generated based at least in part on the set of outer boundary data and the depth data from the local environment. Further, a relative pose of the electronic device within the virtual bounded floor plan is determined and a collision warning is displayed on a display of the electronic device based on the relative pose.
    Type: Application
    Filed: July 31, 2017
    Publication date: January 31, 2019
    Inventors: Zhaoguang Wang, Mugur Marculescu, Chris McKenzie, Ambrus Csaszar, Ivan Dryanovski
  • Publication number: 20190011979
    Abstract: In a system for dynamic switching and merging of head, gesture and touch input in virtual reality, focus may be set on a first virtual object in response to a first input implementing one of a number of different input modes. The first object may then be manipulated in the virtual world in response to a second input implementing another input mode. In response to a third input, focus may be shifted from the first object to a second object if, for example, a priority value of the third input is higher than a priority value of the first input. If the priority value of the third input is less than that of the first input, focus may remain on the first object. In response to certain trigger inputs, a display of virtual objects may be shifted between a far field display and a near field display to accommodate a particular mode of interaction with the virtual objects.
    Type: Application
    Filed: September 13, 2018
    Publication date: January 10, 2019
    Inventors: Alexander James Faaborg, Manuel Christian Clement, Chris McKenzie
  • Patent number: 10101803
    Abstract: In a system for dynamic switching and merging of head, gesture and touch input in virtual reality, a virtual object may be selected by a user in response to a first input implementing one of a number of different input modes. Once selected, with focus established on the first object by the first input, the first object may be manipulated in the virtual world in response to a second input implementing another of the different input modes. In response to a third input, another object may be selected, and focus may be shifted from the first object to the second object if, for example, a priority value of the third input is higher than a priority value of the first input that established focus on the first object. If the priority value of the third input is less than the priority value of the first input that established focus on the first object, focus may remain on the first object.
    Type: Grant
    Filed: August 26, 2015
    Date of Patent: October 16, 2018
    Assignee: Google LLC
    Inventors: Alexander James Faaborg, Manuel Christian Clement, Chris McKenzie
  • Publication number: 20170337750
    Abstract: Techniques disclosed herein involve adaptively or dynamically displaying virtual objects in a virtual reality (VR) environment, and representations, within the VR environment, of physical objects in the physical environment, i.e., outside the VR environment, in order to alert users within the VR environment. For example, if a projected movement of a user indicates that the user will move close to a physical object in the physical world, the representation of the physical object changes from an un-displayed state, in which the physical object is not visible in the VR environment, to a displayed state in which the physical object is at least partially depicted inside the VR environment. In this way, what is displayed inside the VR environment can include both virtual objects as well as representations of physical objects from the physical space.
    Type: Application
    Filed: May 17, 2017
    Publication date: November 23, 2017
    Inventors: Chris McKenzie, Adam Glazier, Clayton Woodward Bavor, Jr.
  • Publication number: 20170336863
    Abstract: A system and method of operating an audio visual system generating an immersive virtual experience may include generating, by a head-mounted audio visual device, a virtual world immersive experience within a virtual space while physically moving within a physical space, displaying, by the head-mounted audio visual device within the virtual space, a visual target marker indicating a target location in the physical space, receiving, by the head-mounted audio visual device, a teleport control signal, and moving a virtual location of the head-mounted audio visual device within the virtual space from a first virtual location to a second virtual location in response to receiving the teleport control signal.
    Type: Application
    Filed: May 15, 2017
    Publication date: November 23, 2017
    Inventors: Robbie Tilton, Robert Carl Jagnow, Alexander James Faaborg, Chris McKenzie
  • Publication number: 20170322623
    Abstract: In a virtual reality system, an optical tracking device may detect and track a user's eye gaze direction and/or movement, and/or sensors may detect and track a user's head gaze direction and/or movement, relative to virtual user interfaces displayed in a virtual environment. A processor may process the detected gaze direction and/or movement as a user input, and may translate the user input into a corresponding interaction in the virtual environment. Gaze directed swipes on a virtual keyboard displayed in the virtual environment may be detected and tracked, and translated into a corresponding text input, either alone or together with user input(s) received by the controller. The user may also interact with other types of virtual interfaces in the virtual environment using gaze direction and movement to provide an input, either alone or together with a controller input.
    Type: Application
    Filed: December 21, 2016
    Publication date: November 9, 2017
    Inventors: Chris McKenzie, Chun Yat Frank Li, Hayes S. Raffle
  • Publication number: 20170244811
    Abstract: In a system for pairing a first device and a second device in a virtual reality environment, the first device may be a sending device, and the second device may be a receiving device. The sending device may transmit an electromagnetic signal that is received by the receiving device. The receiving device may process the electromagnetic signal to verify physical proximity of the receiving device and the sending device, and extract identification information related to the sending device for pairing. The receiving device may display one or more virtual pairing indicators to be manipulated to verify the user's intention to pair the first and second devices.
    Type: Application
    Filed: February 21, 2017
    Publication date: August 24, 2017
    Inventors: Chris McKenzie, Murphy Stein, Alan Browning
  • Publication number: 20170168585
    Abstract: Methods and apparatus using gestures to share private windows in shared virtual environments are disclosed herein. An example method includes detecting a gesture of a user in a virtual environment associated with a private window in the virtual environment, the private window associated with the user, determining whether the gesture represents a signal to share the private window with another, and, when the gesture represents a signal to share the private window, changing the status of the private window to a shared window.
    Type: Application
    Filed: December 9, 2016
    Publication date: June 15, 2017
    Inventors: Alexander James Faaborg, Chris McKenzie
  • Publication number: 20170103574
    Abstract: A system and method of operating an audio visual system generating an immersive virtual experience may detect when a user approaches a physical boundary of a real world space, and may generate an alert indicating the proximity of the physical boundary. Activity in and interaction with the immersive virtual experience may be temporarily paused as the user completes a physical re-orientation in the real world space in response to the alert. Upon detection of completion of the physical re-orientation in the real world space, activity in and interaction with the immersive virtual experience may resume at the point at which activity was temporarily paused. This may provide for relatively continuous movement in the immersive virtual experience within the boundaries of the real world space.
    Type: Application
    Filed: October 13, 2015
    Publication date: April 13, 2017
    Inventors: Alexander James Faaborg, Chris McKenzie, Adam Glazier
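    Sketch: the abstract describes a pause-on-alert, resume-after-reorientation cycle. The Python below models that as a small state machine; the state names and timestamp bookkeeping are assumptions made for the example.

    ```python
    class ImmersiveSession:
        """Toy state machine for the pause-on-boundary / resume-after-reorientation flow."""

        def __init__(self) -> None:
            self.state = "running"
            self.paused_at: float = 0.0  # experience timestamp where activity was paused

        def on_boundary_proximity(self, experience_time: float) -> str:
            """Alert the user and pause interaction when a physical boundary is near."""
            if self.state == "running":
                self.state, self.paused_at = "paused", experience_time
            return "alert: approaching physical boundary"

        def on_reorientation_complete(self) -> float:
            """Resume the experience at the point where it was paused."""
            if self.state == "paused":
                self.state = "running"
            return self.paused_at


    if __name__ == "__main__":
        session = ImmersiveSession()
        print(session.on_boundary_proximity(experience_time=42.5))
        print(session.on_reorientation_complete())  # 42.5: resumes where it paused
    ```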
  • Patent number: 9607440
    Abstract: In one aspect, an HMD is disclosed that provides a technique for generating a composite image representing the view of a wearer of the HMD. The HMD may include a display and a front-facing camera, and may be configured to perform certain functions. For instance, the HMD may be configured to make a determination that a trigger event occurred and responsively both generate a first image that is indicative of content displayed on the display, and cause the camera to capture a second image that is indicative of a real-world field-of-view associated with the HMD. Further, the HMD may be configured to generate a composite image that combines the generated first image and the captured second image.
    Type: Grant
    Filed: September 7, 2016
    Date of Patent: March 28, 2017
    Assignee: Google Inc.
    Inventors: Richard The, Max Benjamin Braun, Chris McKenzie, Alexander Hanbing Chen
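    Sketch: on a trigger event the HMD captures what it is displaying and what its camera sees, then combines the two. The Python below blends two tiny stand-in grayscale images to show the compositing step; the image representation and the 50/50 alpha blend are assumptions for illustration.

    ```python
    from typing import List

    Image = List[List[int]]  # tiny stand-in: rows of grayscale pixel values (0-255)


    def capture_display_content() -> Image:
        """Stand-in for the first image: what the HMD is currently rendering."""
        return [[200, 200], [200, 200]]


    def capture_camera_frame() -> Image:
        """Stand-in for the second image: the front-facing camera's real-world view."""
        return [[40, 60], [80, 100]]


    def composite(first: Image, second: Image, alpha: float = 0.5) -> Image:
        """Blend the displayed content over the camera frame to form the composite image."""
        return [[round(alpha * a + (1.0 - alpha) * b) for a, b in zip(row_a, row_b)]
                for row_a, row_b in zip(first, second)]


    def on_trigger_event() -> Image:
        """On a trigger (e.g. a capture command), generate both images and combine them."""
        return composite(capture_display_content(), capture_camera_frame())


    if __name__ == "__main__":
        print(on_trigger_event())  # [[120, 130], [140, 150]]
    ```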