Patents by Inventor Etienne Pinchon

Etienne Pinchon has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11086475
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device configured to capture image data representative of a physical environment, a head-mounted display (HMD) configured to output artificial reality content, a gesture detector configured to identify, from the image data, a gesture comprising a configuration of a hand that is substantially stationary for at least a threshold period of time and positioned such that a thumb of the hand and at least one other finger form approximately a circle or approximately a circular segment, a user interface (UI) engine to generate a UI element in response to the identified gesture, and a rendering engine to render the UI element as an overlay to the artificial reality content. (A sketch of this hold-to-trigger detection follows this entry.)
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: August 10, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
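
The claimed gesture reduces to two checks: the thumb tip staying close to another fingertip (the approximate circle), and the hand staying roughly still for a threshold time. The following is a minimal Python sketch under assumed conventions; the landmark names ("thumb_tip", "index_tip", and so on) and all thresholds are illustrative placeholders, not values from the patent.

```python
import math
import time

# Hypothetical landmark format: dict of joint name -> (x, y, z) in meters.
CIRCLE_DISTANCE_M = 0.02   # thumb tip within 2 cm of a fingertip ~ "circle"
MOVE_RESET_M = 0.03        # hand drift that restarts the hold timer
HOLD_SECONDS = 0.5         # "at least a threshold period of time"

class CircleGestureDetector:
    def __init__(self):
        self._held_since = None
        self._anchor = None

    def update(self, landmarks, now=None):
        """Return True once the thumb and another fingertip form an
        approximate circle and hold it, roughly stationary, long enough."""
        now = time.monotonic() if now is None else now
        thumb = landmarks["thumb_tip"]
        forming = any(
            math.dist(thumb, landmarks[tip]) < CIRCLE_DISTANCE_M
            for tip in ("index_tip", "middle_tip", "ring_tip", "pinky_tip")
        )
        if not forming:
            self._held_since = None
            return False
        if self._held_since is None or math.dist(thumb, self._anchor) > MOVE_RESET_M:
            self._held_since, self._anchor = now, thumb   # (re)start the hold
            return False
        return now - self._held_since >= HOLD_SECONDS
```
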
  • Patent number: 11086406
    Abstract: A hand interaction system can use a three-state model to differentiate between normal hand movements, such as reaching for an object, and hand input gestures. The three-state model can specify a sequence of states including: 1) a neutral state, 2) a tracking state, and 3) an active state. In the neutral state, the hand interaction system monitors for a gesture signaling a transition to the tracking state but does not otherwise interpret a gesture corresponding to the active state as input. Once a gesture causes a transition to the intermediate tracking state, the hand interaction system can recognize a further active state transition gesture, allowing active state interaction. Thus, the monitoring for the intermediate tracking state provides a gating mechanism, making it less likely that the hand interaction system will interpret hand movements as input when not so intended by the user. (A sketch of the three-state model as a small state machine follows this entry.)
    Type: Grant
    Filed: September 20, 2019
    Date of Patent: August 10, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Varga, Jasper Stevens, Robert Ellis, Jonah Jones
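
The three-state model is essentially a small finite-state machine in which only the active state accepts input. Below is a minimal Python sketch; the trigger gestures ("palm_up", "pinch", "release", "palm_down") are hypothetical placeholders for whatever transition gestures a real system would recognize.

```python
from enum import Enum, auto

class HandState(Enum):
    NEUTRAL = auto()
    TRACKING = auto()
    ACTIVE = auto()

class HandInteractionFSM:
    """Gates active-state input behind an intermediate tracking state."""
    def __init__(self):
        self.state = HandState.NEUTRAL

    def on_gesture(self, gesture):
        if self.state is HandState.NEUTRAL:
            if gesture == "palm_up":           # transition gesture only
                self.state = HandState.TRACKING
            # anything else (e.g. reaching for a mug) is ignored
        elif self.state is HandState.TRACKING:
            if gesture == "pinch":
                self.state = HandState.ACTIVE  # input now accepted
            elif gesture == "palm_down":
                self.state = HandState.NEUTRAL
        elif self.state is HandState.ACTIVE:
            if gesture == "release":
                self.state = HandState.TRACKING

    def accepts_input(self):
        return self.state is HandState.ACTIVE
```

The gating property falls out of the structure: no single gesture moves the hand from NEUTRAL to ACTIVE, so an incidental reach is unlikely to be taken as input.
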
  • Publication number: 20210209860
    Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures. (A sketch of projection-based selection follows this entry.)
    Type: Application
    Filed: March 24, 2021
    Publication date: July 8, 2021
    Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
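
A common way to realize such projections is a ray cast from the hand, with an angular cone used to select and disambiguate distant objects. The sketch below is one plausible reading, not the patented method; the 5-degree cone is an arbitrary illustrative value.

```python
import math

def _sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def _unit(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def select_with_projection(origin, direction, centers, cone_deg=5.0):
    """Cast a ray (the projection) from the hand and return the index of
    the object center closest in angle to it, or None. The angular cone
    both limits selection and disambiguates overlapping targets."""
    ray = _unit(direction)
    best, best_angle = None, math.radians(cone_deg)
    for i, center in enumerate(centers):
        to_obj = _unit(_sub(center, origin))
        cos = max(-1.0, min(1.0, sum(r * t for r, t in zip(ray, to_obj))))
        angle = math.acos(cos)
        if angle < best_angle:
            best, best_angle = i, angle
    return best
```
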
  • Patent number: 11054896
    Abstract: A user views images through a head-mounted display (HMD). The HMD can display virtual interaction objects to a user on a reference plane, such as a keyboard and trackpad. The reference plane can be defined by the user by forming predetermined hand shapes with their hands. After the reference plane is generated, the HMD can display virtual interaction objects on the reference plane. To interact with the virtual objects, the HMD can track the hands of the user relative to the reference plane. For example, a finger tap on a virtual trackpad on the reference plane can indicate a click. Among other advantages, the reference plane is defined based on hand shapes of the user. Thus, the reference plane can be determined without the HMD identifying a surface in the real world (e.g., through object recognition). (A sketch of the reference-plane construction follows this entry.)
    Type: Grant
    Filed: February 7, 2019
    Date of Patent: July 6, 2021
    Assignee: Facebook, Inc.
    Inventors: Bojan Wilytsch, Etienne Pinchon
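
A reference plane can be built from a handful of tracked hand points alone, which is why no real-world surface recognition is needed. The following sketch assumes three non-collinear points captured while the user holds the predetermined hand shapes; it derives the plane and a signed distance that could back a tap test. All names and the tap interpretation are illustrative.

```python
def _sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def _cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def _unit(v):
    n = sum(x * x for x in v) ** 0.5
    return tuple(x / n for x in v)

def plane_from_hand_points(p1, p2, p3):
    """Reference plane from three non-collinear tracked hand positions
    (e.g., captured while the hands hold the predetermined shapes).
    Returns (point_on_plane, unit_normal)."""
    return p1, _unit(_cross(_sub(p2, p1), _sub(p3, p1)))

def signed_distance(point, plane):
    """Fingertip height above the plane; a dip through zero over a
    virtual trackpad region could be treated as a click."""
    origin, normal = plane
    return sum(c * n for c, n in zip(_sub(point, origin), normal))
```
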
  • Patent number: 11043192
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a gesture detector, a user interface (UI) engine, and a rendering engine. The image capture device captures image data representative of a physical environment. The HMD outputs artificial reality content. The gesture detector identifies, from the image data, a gesture including a configuration of a hand that is substantially stationary for at least a threshold period of time and positioned such that an index finger and a thumb of the hand form approximately a right angle. The UI engine generates a UI element in response to the identified gesture. The rendering engine renders the UI element as an overlay to the artificial reality content. (A sketch of the right-angle test follows this entry.)
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: June 22, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
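
The "approximately a right angle" condition is a dot-product test between the index and thumb directions. A minimal sketch follows, with hypothetical landmark names and an arbitrary tolerance (the patent does not publish these values).

```python
import math

def forms_right_angle(landmarks, tolerance_deg=15.0):
    """True when the index finger and thumb point in roughly
    perpendicular directions. Landmark names are illustrative."""
    def direction(base, tip):
        v = tuple(t - b for b, t in zip(landmarks[base], landmarks[tip]))
        n = math.sqrt(sum(x * x for x in v))
        return tuple(x / n for x in v)

    cos = sum(a * b for a, b in zip(direction("index_mcp", "index_tip"),
                                    direction("thumb_mcp", "thumb_tip")))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos))))
    return abs(angle - 90.0) <= tolerance_deg
```

In practice this test would be combined with the same hold timer used for the circle gesture above, so a transient hand pose does not spawn UI.
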
  • Patent number: 11003307
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a gesture detector, a user interface (UI) engine, and a rendering engine. The image capture device is configured to capture image data representative of a physical environment. The HMD is configured to output artificial reality content including a representation of a wrist. The rendering engine is configured to render a user interface (UI) element. The gesture detector is configured to identify a gesture that includes a gripping motion of two or more digits of a hand to form a gripping configuration at the location of the UI element, and a pulling motion away from the wrist while in the gripping configuration. (A sketch of the grip-and-pull detection follows this entry.)
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: May 11, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
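
One plausible decomposition of the grip-and-pull gesture: detect a grip forming at the element, then measure how much farther the grip point moves from the wrist. The class below is a sketch under that assumption; the radii and the pull distance are invented for illustration.

```python
import math

GRIP_RADIUS_M = 0.03    # grip must form at the element (illustrative)
PULL_TRIGGER_M = 0.10   # ~10 cm of pull activates (illustrative)

class GripPullDetector:
    """Grip at a UI element, then pull away from the wrist to trigger."""
    def __init__(self, ui_element_pos):
        self.ui = ui_element_pos
        self.start_offset = None   # grip-to-wrist distance at grip start

    def update(self, grip_pos, wrist_pos, digits_gripping):
        if self.start_offset is None:
            # Grip begins: 2+ digits closed at the element's location.
            if digits_gripping >= 2 and math.dist(grip_pos, self.ui) < GRIP_RADIUS_M:
                self.start_offset = math.dist(grip_pos, wrist_pos)
            return False
        if digits_gripping < 2:            # grip released: reset
            self.start_offset = None
            return False
        # Pull = how much farther the grip point now is from the wrist.
        return math.dist(grip_pos, wrist_pos) - self.start_offset > PULL_TRIGGER_M
```
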
  • Patent number: 10990240
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system captures image data representative of a physical environment and outputs artificial reality content. The artificial reality system renders a container that includes application content items as an overlay to the artificial reality content. The artificial reality system identifies, from the image data, a selection gesture comprising a configuration of a hand that is substantially stationary for a threshold period of time at a first location corresponding to a first application content item within the container, and a subsequent movement of the hand from the first location to a second location outside the container. The artificial reality system renders the first application content item at the second location in response. (A sketch of the hold-then-drag detection follows this entry.)
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: April 27, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
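
The selection gesture is a dwell (stationary hand over the item) followed by a drag out of the container's bounds. A minimal sketch, with illustrative thresholds and a caller-supplied containment test:

```python
import math
import time

HOLD_SECONDS = 0.5     # dwell needed to pick the item up (illustrative)
STATIONARY_M = 0.02    # "at the item's location" radius (illustrative)

class ContainerDragDetector:
    """Stationary hand over an item selects it; moving outside the
    container's bounds yields the new location to render the item at."""
    def __init__(self, item_pos, container_contains):
        self.item_pos = item_pos
        self.contains = container_contains   # callable: pos -> bool
        self.held_since = None
        self.selected = False

    def update(self, hand_pos, now=None):
        now = time.monotonic() if now is None else now
        if not self.selected:
            near = math.dist(hand_pos, self.item_pos) < STATIONARY_M
            if not near:
                self.held_since = None
            elif self.held_since is None:
                self.held_since = now
            elif now - self.held_since >= HOLD_SECONDS:
                self.selected = True          # item picked up
            return None
        if not self.contains(hand_pos):       # moved outside the container
            self.selected = False
            return hand_pos                   # drop: new render location
        return None
```
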
  • Patent number: 10991163
    Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
    Type: Grant
    Filed: September 25, 2019
    Date of Patent: April 27, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
  • Publication number: 20210090337
    Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
    Type: Application
    Filed: September 25, 2019
    Publication date: March 25, 2021
    Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
  • Publication number: 20210090333
    Abstract: A progressive display system can compute a virtual distance between a user and virtual objects. The virtual distance can be based on: a distance between the user and an object, a viewing angle of the object, and/or a footprint of the object in a field of view. The progressive display system can determine where the virtual distance falls in a sequence of distance ranges that correspond to levels of detail. Using a mapping between content sets for the object and levels of detail that correspond to distance ranges, the progressive display system can select content sets to display in relation to the object. As the user moves, the virtual distance will move across thresholds bounding the distance ranges. This causes the progressive display system to select and display other content sets for the distance range in which the current virtual distance falls. (A sketch of the threshold-based content selection follows this entry.)
    Type: Application
    Filed: September 20, 2019
    Publication date: March 25, 2021
    Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Varga, Jasper Stevens, Robert Ellis, Jonah Jones
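
Mapping a scalar virtual distance through a sorted list of thresholds is a textbook level-of-detail lookup. In the sketch below, the blend function is a guess at how distance, viewing angle, and footprint might combine; only the threshold-to-content-set mechanism is taken from the abstract, and all constants are illustrative.

```python
import bisect

# One guess at a blend: farther away, viewed more obliquely, and
# occupying less of the field of view all increase virtual distance.
def virtual_distance(meters, view_angle_deg, footprint_frac):
    return meters * (1.0 + view_angle_deg / 90.0) * (1.0 - footprint_frac)

# Thresholds bound the distance ranges; each range maps to a content set.
THRESHOLDS = [1.0, 3.0, 8.0]                           # illustrative
CONTENT_SETS = ["full_detail", "summary", "title_only", "icon"]

def content_for(meters, view_angle_deg, footprint_frac):
    d = virtual_distance(meters, view_angle_deg, footprint_frac)
    return CONTENT_SETS[bisect.bisect_right(THRESHOLDS, d)]
```

As the user moves, repeated calls to content_for cross the thresholds and swap in the content set for the new range.
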
  • Publication number: 20210090331
    Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
    Type: Application
    Filed: September 20, 2019
    Publication date: March 25, 2021
    Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
  • Publication number: 20210090341
    Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
    Type: Application
    Filed: September 9, 2020
    Publication date: March 25, 2021
    Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
  • Publication number: 20210090332
    Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
    Type: Application
    Filed: September 20, 2019
    Publication date: March 25, 2021
    Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
  • Patent number: 10955929
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system captures image data representative of a physical environment and outputs the artificial reality content. The artificial reality system identifies, from the image data, a gesture comprising a motion of a first digit of a hand and a second digit of the hand to form a pinching configuration a particular number of times within a threshold amount of time. The artificial reality system assigns one or more input characters to one or more of a plurality of digits of the hand and processes a selection of a first input character of the one or more input characters assigned to the second digit of the hand in response to the identified gesture. (A sketch of the pinch-count character selection follows this entry.)
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: March 23, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
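
The described input scheme resembles multi-tap (T9-style) typing, with thumb pinches against a digit standing in for key presses. A sketch under that reading; the character assignments and the time window are illustrative, not from the patent.

```python
import time

# Illustrative multi-tap mapping of input characters to the non-thumb
# digits; the patent does not publish an assignment.
DIGIT_CHARS = {
    "index":  "abc",
    "middle": "def",
    "ring":   "ghi",
    "pinky":  "jkl",
}
WINDOW_S = 0.8   # pinches within this window count toward one selection

class PinchTyper:
    def __init__(self):
        self.digit, self.count, self.first_at = None, 0, 0.0

    def on_pinch(self, digit, now=None):
        """Thumb pinched against `digit`; returns a character committed
        by this event (from the previous sequence), or None."""
        now = time.monotonic() if now is None else now
        if digit != self.digit or now - self.first_at > WINDOW_S:
            committed = self.flush()
            self.digit, self.count, self.first_at = digit, 1, now
            return committed
        self.count += 1   # another pinch on the same digit, same window
        return None

    def flush(self):
        """Commit the pending selection, e.g. when tracking is lost."""
        if self.digit is None:
            return None
        char = DIGIT_CHARS[self.digit][(self.count - 1) % len(DIGIT_CHARS[self.digit])]
        self.digit, self.count = None, 0
        return char
```

For example, two quick pinches on the index finger, followed by a flush or a pinch on another digit, commit "b".
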
  • Patent number: 10921879
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a gesture detector, a user interface (UI) engine, and a rendering engine. The image capture device captures image data representative of a physical environment. The HMD outputs artificial reality content, the artificial reality content including an assistant element. The gesture detector identifies, from the image data, a gesture that includes a gripping motion of two or more digits of a hand to form a gripping configuration at a location that corresponds to the assistant element, and subsequent to the gripping motion, a throwing motion of the hand with respect to the assistant element. The UI engine generates a UI element in response to identifying the gesture. (A sketch of the grip-and-throw detection follows this entry.)
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: February 16, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
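
A throw can be distinguished from an ordinary release by the hand's speed at the moment the grip opens. A minimal sketch, with an invented speed threshold:

```python
import math

THROW_SPEED_M_S = 1.0   # invented: release faster than this is a throw

def detect_throw(positions, timestamps, grip_just_released):
    """Estimate hand speed from the last two tracked samples; a fast
    release after gripping the assistant element counts as a throw."""
    if not grip_just_released or len(positions) < 2:
        return False
    dt = timestamps[-1] - timestamps[-2]
    if dt <= 0:
        return False
    return math.dist(positions[-1], positions[-2]) / dt >= THROW_SPEED_M_S
```
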
  • Patent number: 10890983
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system can include a menu that can be activated and interacted with using one hand. In response to detecting a menu activation gesture performed using one hand, the artificial reality system can cause a menu to be rendered. A menu sliding gesture (e.g., horizontal motion) of the hand can be used to cause a slidably engageable user interface (UI) element to move along a horizontal dimension of the menu while the position of the menu itself is held constant. Motion of the hand orthogonal to the menu sliding gesture (e.g., non-horizontal motion) can cause the menu to be repositioned. The implementation of the artificial reality system does not require use of both hands or use of other input devices in order to interact with the artificial reality system. (A sketch of the motion routing follows this entry.)
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: January 12, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
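
The slider-versus-reposition behavior amounts to routing each frame's hand displacement by which component dominates. One plausible sketch; the threshold is illustrative.

```python
ORTHO_THRESHOLD_M = 0.05   # illustrative: when off-axis motion dominates

def route_menu_motion(delta):
    """Split one frame's hand displacement (dx, dy, dz), expressed in the
    menu's frame with x along the sliding dimension, into slider movement
    or whole-menu repositioning."""
    dx, dy, dz = delta
    if (dy * dy + dz * dz) ** 0.5 > ORTHO_THRESHOLD_M:
        return ("reposition_menu", delta)   # menu follows the hand
    return ("move_slider", dx)              # slider moves; menu stays put
```
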
  • Publication number: 20200387214
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system captures image data representative of a physical environment, renders artificial reality content and a virtual keyboard with a plurality of virtual keys as an overlay to the artificial reality content, and outputs the artificial reality content and the virtual keyboard. The artificial reality system identifies, from the image data, a gesture comprising a first digit of a hand being brought in contact with a second digit of the hand, wherein a point of the contact corresponds to a location of a first virtual key of the plurality of virtual keys of the virtual keyboard. The artificial reality system processes a selection of the first virtual key in response to the identified gesture. (A sketch of the contact-to-key mapping follows this entry.)
    Type: Application
    Filed: June 7, 2019
    Publication date: December 10, 2020
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
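
Once the digit-to-digit contact point is expressed in the keyboard's plane coordinates, key lookup is a rounding operation on a grid. A sketch assuming a hypothetical fixed-pitch layout; a real keyboard would carry its own layout data.

```python
# Hypothetical layout: 26 keys on a 10-column grid, addressed by
# (column, row), with a fixed pitch between key centers.
KEY_SIZE_M = 0.04
KEYS = {(i % 10, i // 10): chr(ord("a") + i) for i in range(26)}

def key_at(contact_uv):
    """Map the contact point (u, v), in meters within the keyboard plane
    measured from the top-left key center, to a virtual key."""
    col = round(contact_uv[0] / KEY_SIZE_M)
    row = round(contact_uv[1] / KEY_SIZE_M)
    return KEYS.get((col, row))   # None when the contact misses the keys
```
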
  • Publication number: 20200388247
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a gesture detector, a user interface (UI) engine, and a rendering engine. The image capture device captures image data representative of a physical environment. The HMD outputs artificial reality content. The gesture detector identifies, from the image data, a gesture including a configuration of a hand that is substantially stationary for at least a threshold period of time and positioned such that an index finger and a thumb of the hand form approximately a right angle. The UI engine generates a UI element in response to the identified gesture. The rendering engine renders the UI element as an overlay to the artificial reality content.
    Type: Application
    Filed: June 7, 2019
    Publication date: December 10, 2020
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
  • Publication number: 20200387213
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a gesture detector, a user interface (UI) engine, and a rendering engine. The image capture device captures image data representative of a physical environment. The HMD outputs artificial reality content, the artificial reality content including an assistant element. The gesture detector identifies, from the image data, a gesture that includes a gripping motion of two or more digits of a hand to form a gripping configuration at a location that corresponds to the assistant element, and subsequent to the gripping motion, a throwing motion of the hand with respect to the assistant element. The UI engine generates a UI element in response to identifying the gesture.
    Type: Application
    Filed: June 7, 2019
    Publication date: December 10, 2020
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
  • Publication number: 20200387229
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system captures image data representative of a physical environment and outputs the artificial reality content. The artificial reality system identifies, from the image data, a gesture comprising a motion of a first digit of a hand and a second digit of the hand to form a pinching configuration a particular number of times within a threshold amount of time. The artificial reality system assigns one or more input characters to one or more of a plurality of digits of the hand and processes a selection of a first input character of the one or more input characters assigned to the second digit of the hand in response to the identified gesture.
    Type: Application
    Filed: June 7, 2019
    Publication date: December 10, 2020
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox