Patents by Inventor Barrett Fox

Barrett Fox has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
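Several of the listed filings revolve around a small set of recurring concepts: augments placed on surfaces, hand-gesture detection from captured image data, stylus input, and motion tracking that combines RGB and IR pixels. Brief illustrative code sketches of these concepts follow the listing.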

  • Publication number: 20240094860
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real world space from the perspectives of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
    Type: Application
    Filed: February 24, 2023
    Publication date: March 21, 2024
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
  • Patent number: 11847753
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Grant
    Filed: January 9, 2023
    Date of Patent: December 19, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
  • Patent number: 11769304
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Grant
    Filed: November 9, 2021
    Date of Patent: September 26, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
  • Patent number: 11651573
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Grant
    Filed: October 12, 2021
    Date of Patent: May 16, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
  • Patent number: 11599237
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real world space from the perspectives of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
    Type: Grant
    Filed: February 12, 2021
    Date of Patent: March 7, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
  • Patent number: 11422669
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises a head-mounted display configured to output artificial reality content; a stylus; a stylus action detector configured to detect movement of the stylus, detect a stylus selection action, and after detecting the stylus selection action, detect further movement of the stylus; a UI engine configured to generate stylus movement content in response to detecting movement of the stylus, and generate a UI input element in response to detecting the stylus selection action; and a rendering engine configured to render the stylus movement content and the UI input element as overlays to the artificial reality content, and update the stylus movement content based on the further movement of the stylus.
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: August 23, 2022
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
  • Publication number: 20220244834
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises an image capture device configured to capture image data representative of a physical environment; a head-mounted display (HMD) configured to output artificial reality content; a gesture detector configured to identify, from the image data, a gesture comprising a motion of two fingers from a hand to form a pinching configuration and a subsequent pulling motion while in the pinching configuration; a user interface (UI) engine configured to generate a UI input element in response to identifying the gesture; and a rendering engine configured to render the UI input element as an overlay to at least some of the artificial reality content.
    Type: Application
    Filed: April 12, 2022
    Publication date: August 4, 2022
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
  • Patent number: 11334212
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises an image capture device configured to capture image data representative of a physical environment; a head-mounted display (HMD) configured to output artificial reality content; a gesture detector configured to identify, from the image data, a gesture comprising a motion of two fingers from a hand to form a pinching configuration and a subsequent pulling motion while in the pinching configuration; a user interface (UI) engine configured to generate a UI input element in response to identifying the gesture; and a rendering engine configured to render the UI input element as an overlay to at least some of the artificial reality content.
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: May 17, 2022
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
  • Publication number: 20220122329
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Application
    Filed: October 12, 2021
    Publication date: April 21, 2022
    Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
  • Publication number: 20220068035
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Application
    Filed: November 9, 2021
    Publication date: March 3, 2022
    Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
  • Patent number: 11227445
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Grant
    Filed: August 31, 2020
    Date of Patent: January 18, 2022
    Assignee: Facebook Technologies, LLC
    Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
  • Patent number: 11176755
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Grant
    Filed: August 31, 2020
    Date of Patent: November 16, 2021
    Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
  • Patent number: 11086475
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device configured to capture image data representative of a physical environment, a head-mounted display (HMD) configured to output artificial reality content, a gesture detector configured to identify, from the image data, a gesture comprising a configuration of a hand that is substantially stationary for at least a threshold period of time and positioned such that a thumb of the hand and at least one other finger form approximately a circle or approximately a circular segment, a user interface (UI) engine to generate a UI element in response to the identified gesture, and a rendering engine to render the UI element as an overlay to the artificial reality content.
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: August 10, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
  • Patent number: 11043192
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a gesture detector, a user interface (UI) engine, and a rendering engine. The image capture device captures image data representative of a physical environment. The HMD outputs artificial reality content. The gesture detector identifies, from the image data, a gesture including a configuration of a hand that is substantially stationary for at least a threshold period of time and positioned such that an index finger and a thumb of the hand form approximately a right angle. The UI engine generates a UI element in response to the identified gesture. The rendering engine renders the UI element as an overlay to the artificial reality content.
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: June 22, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
  • Publication number: 20210165555
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real world space from the perspectives of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
    Type: Application
    Filed: February 12, 2021
    Publication date: June 3, 2021
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
  • Patent number: 11003307
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a gesture detector, a user interface (UI) engine, and a rendering engine. The image capture device is configured to capture image data representative of a physical environment. The HMD is configured to output artificial reality content including a representation of a wrist. The rendering engine is configured to render a user interface (UI) element. The gesture detector is configured to identify a gesture that includes a gripping motion of two or more digits of a hand to form a gripping configuration at the location of the UI element, and a pulling motion away from the wrist while in the gripping configuration.
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: May 11, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
  • Patent number: 10990240
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system captures image data representative of a physical environment and outputs artificial reality content. The artificial reality system renders a container that includes application content items as an overlay to the artificial reality content. The artificial reality system identifies, from the image data, a selection gesture comprising a configuration of a hand that is substantially stationary for a threshold period of time at a first location corresponding to a first application content item within the container, and a subsequent movement of the hand from the first location to a second location outside the container. The artificial reality system renders the first application content item at the second location in response.
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: April 27, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
  • Patent number: 10955929
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system captures image data representative of a physical environment and outputs the artificial reality content. The artificial reality system identifies, from the image data, a gesture comprising a motion of a first digit of a hand and a second digit of the hand to form a pinching configuration a particular number of times within a threshold amount of time. The artificial reality system assigns one or more input characters to one or more of a plurality of digits of the hand and processes a selection of a first input character of the one or more input characters assigned to the second digit of the hand in response to the identified gesture.
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: March 23, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
  • Patent number: 10921949
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real world space from the perspectives of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
    Type: Grant
    Filed: July 12, 2019
    Date of Patent: February 16, 2021
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
  • Patent number: 10921879
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a user interface (UI) engine, and a rendering engine. The image capture device captures image data representative of a physical environment. The HMD outputs artificial reality content, the artificial reality content including an assistant element. The gesture detector identifies, from the image data, a gesture that includes a gripping motion of two or more digits of a hand to form a gripping configuration at a location that corresponds to the assistant element, and subsequent to the gripping motion, a throwing motion of the hand with respect to the assistant element. The UI engine generates a UI element in response to identifying the gesture.
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: February 16, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
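
Illustrative code sketches

The Ultrahaptics filings above (publication numbers 20240094860 and 20210165555, patent numbers 11599237 and 10921949) describe tracking real objects by combining RGB and IR pixels from a wearable sensor system's cameras, then sharing an augmented version of the captured stream with a second user. The sketch below only illustrates the idea of fusing an RGB frame and an IR frame into a foreground mask for motion tracking; the array shapes, threshold values, and use of NumPy are assumptions, not the patented method.

```python
from typing import Optional

import numpy as np

def foreground_mask(rgb: np.ndarray, ir: np.ndarray,
                    ir_threshold: float = 0.6,
                    motion_threshold: float = 0.1,
                    previous_gray: Optional[np.ndarray] = None) -> np.ndarray:
    """Combine IR brightness (warm hand vs. cool background) with RGB frame differencing."""
    gray = rgb.mean(axis=2)                        # HxWx3 RGB -> HxW luminance in [0, 1]
    ir_hot = ir > ir_threshold                     # IR pixels likely belonging to the tracked object
    if previous_gray is None:
        return ir_hot                              # no earlier frame: fall back to IR alone
    moving = np.abs(gray - previous_gray) > motion_threshold
    return ir_hot & moving                         # keep pixels that are both warm and moving

# Usage sketch with small synthetic frames.
rng = np.random.default_rng(0)
previous = rng.random((4, 4, 3)).mean(axis=2)
rgb_frame = rng.random((4, 4, 3))
ir_frame = rng.random((4, 4))
print(foreground_mask(rgb_frame, ir_frame, previous_gray=previous).astype(int))
```

The mask would feed the motion tracker; the augmented frames built from it could then be streamed to the second user's wearable system, as the abstracts describe.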
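The "augments and surfaces" filings (patent numbers 11847753, 11769304, 11651573, 11227445, and 11176755, plus publications 20220122329 and 20220068035) describe an augment as a virtual container holding presentation data, context, and logic, placed on a surface whose layout changes how augments display. The following Python sketch is a minimal illustration of that data model under assumed names (Augment, Surface, clock_logic); the patents do not prescribe any particular implementation.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Augment:
    """A virtual container in 3D space: presentation data, context, and logic."""
    name: str
    presentation: Dict[str, str]                              # e.g. {"label": "Clock"}
    context: Dict[str, str] = field(default_factory=dict)     # e.g. {"display_mode": "ambient"}
    logic: Callable[["Augment", Dict[str, str]], None] = lambda augment, ctx: None

    def update_context(self, shell_info: Dict[str, str]) -> None:
        # Rules for the augment are evaluated against information from the shell.
        self.context.update(shell_info)
        self.logic(self, self.context)

@dataclass
class Surface:
    """A surface has a layout and properties that change how augments on it display."""
    layout: str                                                # e.g. "grid", "freeform"
    augments: List[Augment] = field(default_factory=list)

    def place(self, augment: Augment) -> None:
        augment.context["surface_layout"] = self.layout
        self.augments.append(augment)

# Usage sketch: a clock augment that switches presentation by display mode.
def clock_logic(augment: Augment, ctx: Dict[str, str]) -> None:
    augment.presentation["label"] = ("Clock (minimal)" if ctx.get("display_mode") == "ambient"
                                     else "Clock (full)")

wall = Surface(layout="grid")
clock = Augment(name="clock", presentation={"label": "Clock"}, logic=clock_logic)
wall.place(clock)
clock.update_context({"display_mode": "ambient", "time_of_day": "evening"})
print(clock.presentation["label"])   # -> Clock (minimal)
```

The point of the sketch is the shape of the abstraction: the surface contributes layout context, the shell contributes runtime context, and the augment's own logic decides how its presentation responds.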
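Patent 11422669 describes a stylus action detector that tracks stylus movement, detects a stylus selection action, and keeps tracking afterward so a UI engine can render stylus movement content and a UI input element as overlays. Below is a simplified event-loop sketch; the StylusEvent shape and the overlay command strings are invented for illustration only.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class StylusEvent:
    position: Tuple[float, float, float]
    selected: bool          # True when the stylus selection action (e.g. a button press) fires

def process_stylus(events: List[StylusEvent]) -> List[str]:
    """Turn a stream of stylus events into overlay render commands."""
    overlays: List[str] = []
    selection_seen = False
    for event in events:
        overlays.append(f"stroke point at {event.position}")        # stylus movement content
        if event.selected and not selection_seen:
            selection_seen = True
            overlays.append("show UI input element")                 # generated on selection action
        elif selection_seen:
            overlays.append(f"update stroke with {event.position}")  # further movement updates content
    return overlays

# Usage sketch: move, select, then keep moving.
for command in process_stylus([
    StylusEvent((0.0, 0.0, 0.0), False),
    StylusEvent((0.1, 0.0, 0.0), True),
    StylusEvent((0.2, 0.1, 0.0), False),
]):
    print(command)
```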
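The pinch-and-pull filings (patent 11334212 and publication 20220244834) describe a gesture detector that identifies two fingers forming a pinching configuration followed by a pulling motion while pinched, after which a UI engine generates an input element. A small sketch of that detection loop follows; the distance thresholds, the Vec3 alias, and the fingertip inputs are assumptions, not the patented implementation.

```python
import math
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

def dist(a: Vec3, b: Vec3) -> float:
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def detect_pinch_and_pull(
    thumb_tips: List[Vec3],
    index_tips: List[Vec3],
    pinch_threshold: float = 0.02,   # assumed: fingertips within 2 cm count as pinched
    pull_threshold: float = 0.10,    # assumed: 10 cm of travel while pinched counts as a pull
) -> Optional[int]:
    """Return the frame index where a pinch-then-pull completes, or None."""
    pinch_start: Optional[Vec3] = None
    for i, (thumb, index) in enumerate(zip(thumb_tips, index_tips)):
        pinched = dist(thumb, index) < pinch_threshold
        if pinched and pinch_start is None:
            pinch_start = index                      # pinching configuration formed
        elif pinched and pinch_start is not None:
            if dist(index, pinch_start) > pull_threshold:
                return i                             # pulling motion while still pinched
        else:
            pinch_start = None                       # pinch released before a pull
    return None

# Usage sketch: a pinch at frame 1 followed by a steady pull along x.
thumbs = [(0.00, 0.0, 0.0), (0.01, 0.0, 0.0), (0.05, 0.0, 0.0), (0.13, 0.0, 0.0)]
indexes = [(0.05, 0.0, 0.0), (0.02, 0.0, 0.0), (0.06, 0.0, 0.0), (0.14, 0.0, 0.0)]
frame = detect_pinch_and_pull(thumbs, indexes)
if frame is not None:
    print(f"pinch-and-pull completed at frame {frame}: render UI input element overlay")
```

The other gesture patents in the listing (the circle and right-angle hand poses, grip-and-pull from the wrist, grip-and-throw of an assistant element, and dragging an item out of a container) follow the same pattern: classify a hand configuration from image data, watch for a qualifying motion, then hand off to the UI and rendering engines.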
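Patent 10955929 describes assigning input characters to the digits of a hand and selecting a character based on how many times the thumb and a given digit pinch within a threshold amount of time. The sketch below shows only the counting logic, with hypothetical character assignments and timing values.

```python
from typing import Dict, List, Optional, Tuple

# Assumed assignment of input characters to digits (the patent leaves the mapping open).
CHARS_PER_DIGIT: Dict[str, List[str]] = {
    "index":  ["a", "b", "c"],
    "middle": ["d", "e", "f"],
    "ring":   ["g", "h", "i"],
    "pinky":  ["j", "k", "l"],
}

def select_character(
    pinch_events: List[Tuple[float, str]],    # (timestamp in seconds, digit pinched)
    window: float = 1.0,                      # assumed threshold amount of time
) -> Optional[str]:
    """Count pinches of one digit inside the time window; the count picks the character."""
    if not pinch_events:
        return None
    digit = pinch_events[0][1]
    start = pinch_events[0][0]
    count = sum(1 for t, d in pinch_events if d == digit and t - start <= window)
    options = CHARS_PER_DIGIT.get(digit, [])
    if 1 <= count <= len(options):
        return options[count - 1]
    return None

# Two quick pinches of the middle finger within one second -> "e".
print(select_character([(0.00, "middle"), (0.40, "middle")]))
```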