Patents by Inventor Etienne Pinchon
Etienne Pinchon has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
Patent number: 11947111
Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
Type: Grant
Filed: September 2, 2022
Date of Patent: April 2, 2024
Assignee: Meta Platforms Technologies, LLC
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
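The projection mechanism this abstract describes amounts to casting a ray from the user's hand and picking the nearest object it intersects, so near and distant objects are handled uniformly. The sketch below illustrates that idea under simplifying assumptions: objects are approximated by bounding spheres, and all names and the closest-hit rule are invented for illustration, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Sphere:
    """A selectable object approximated by a bounding sphere."""
    name: str
    center: tuple   # (x, y, z) world position
    radius: float

def ray_hit_distance(origin, direction, sphere):
    """Distance along a unit-direction ray to the sphere, or None on a miss."""
    offset = [sphere.center[i] - origin[i] for i in range(3)]
    t = sum(offset[i] * direction[i] for i in range(3))  # project onto the ray
    if t < 0:
        return None  # object is behind the hand
    closest = [origin[i] + t * direction[i] for i in range(3)]
    miss2 = sum((closest[i] - sphere.center[i]) ** 2 for i in range(3))
    return t if miss2 <= sphere.radius ** 2 else None

def select_with_projection(origin, direction, objects):
    """Return the nearest object the projection hits, near or distant."""
    hits = [(d, obj) for obj in objects
            if (d := ray_hit_distance(origin, direction, obj)) is not None]
    return min(hits, key=lambda h: h[0])[1] if hits else None
```

Disambiguation between overlapping candidates here is simply "closest hit"; the patent's richer selection and bimanual-gesture techniques are beyond this sketch.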
Patent number: 11836828
Abstract: The disclosed computer-implemented method may include generating a virtual item within a virtual environment. The method may also include detecting, using various hardware sensors, a current position of a physical object that is to be portrayed within the virtual environment. The method may next include generating a virtual representation of the physical object within the virtual environment. The virtual representation of the physical object may be configured to at least partially follow movements of the physical object relative to the virtual item. The method may also include presenting the virtual item and the generated virtual representation of the physical object within the virtual environment, where the virtual representation of the physical object is at least partially, controllably decoupled from the movements of the physical object relative to the virtual item. Various other methods, systems, and computer-readable media are also disclosed.
Type: Grant
Filed: October 29, 2021
Date of Patent: December 5, 2023
Assignee: Meta Platforms Technologies, LLC
Inventors: Brandon Furtwangler, Norah Riley Smith, Andrew C Johnson, Etienne Pinchon, Samuel Matthew Levatich, Matthew Alan Insley, Jason Charles Leimgruber, Jennifer Lynn Spurlock, Michael Medlock
Publication number: 20230140550
Abstract: The disclosed computer-implemented method may include generating a virtual item within a virtual environment. The method may also include detecting, using various hardware sensors, a current position of a physical object that is to be portrayed within the virtual environment. The method may next include generating a virtual representation of the physical object within the virtual environment. The virtual representation of the physical object may be configured to at least partially follow movements of the physical object relative to the virtual item. The method may also include presenting the virtual item and the generated virtual representation of the physical object within the virtual environment, where the virtual representation of the physical object is at least partially, controllably decoupled from the movements of the physical object relative to the virtual item. Various other methods, systems, and computer-readable media are also disclosed.
Type: Application
Filed: October 29, 2021
Publication date: May 4, 2023
Inventors: Brandon Furtwangler, Norah Riley Smith, Andrew C. Johnson, Etienne Pinchon, Samuel Matthew Levatich, Matthew Alan Insley, Jason Charles Leimgruber, Jennifer Lynn Spurlock, Michael Medlock
Patent number: 11556172
Abstract: Methods and systems described herein are directed to viewpoint coordination of multiple users on artificial reality collaboration models. Multiple users can collaborate on 3D models in an artificial reality environment. Attention indicators can indicate to the other users where a user is looking or where the user is pointing. In some implementations, there is a central model that each user sees from her own perspective (e.g., in the center of a virtual table all users are seated around) and/or instances of the model that only an individual user sees or can manipulate. The user's attention indicators can be based on where the user is looking or pointing on the model instances. Additionally, attention indicators can be translated from a controlling user's perspective to a coordinate system defined by the model, which allows the indicators to be then presented in that model coordinate system for all users.
Type: Grant
Filed: December 22, 2020
Date of Patent: January 17, 2023
Assignee: Meta Platforms Technologies, LLC
Inventors: Jasper Stevens, Etienne Pinchon
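Translating an attention indicator from one user's model instance into the shared model coordinate system reduces, in the simplest case, to removing the source instance's pose and applying the destination instance's. A minimal sketch, assuming each instance differs only by a world-space origin and a uniform scale (the pose model and all names are illustrative assumptions):

```python
def world_to_model(point, instance_origin, instance_scale):
    """Strip an instance's pose: world-space point -> shared model coordinates."""
    return tuple((point[i] - instance_origin[i]) / instance_scale for i in range(3))

def model_to_world(local, instance_origin, instance_scale):
    """Apply another instance's pose: model coordinates -> that user's world."""
    return tuple(instance_origin[i] + local[i] * instance_scale for i in range(3))

def translate_indicator(point, src_origin, src_scale, dst_origin, dst_scale):
    """Move an attention indicator from one user's instance to another's by
    round-tripping through the model coordinate system."""
    return model_to_world(world_to_model(point, src_origin, src_scale),
                          dst_origin, dst_scale)
```

Because the indicator is stored in model coordinates, every user's instance can render it at the same spot on the model regardless of where or how large that instance is in their own view.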
Publication number: 20230011453
Abstract: Aspects of the present disclosure are directed to a teleportation system for artificial reality. The teleportation system can recognize a user making a first gesture (e.g., palm-facing-down fist with thumb out) and in response, can set a first origin point at the location of the first gesture. As the user holds the hand pose of the first gesture and moves it relative to the first origin point, a potential destination point is set according to the distance and direction between the hand pose of the first gesture and the first origin point. Thus, the angle and distance of the user's hand from the first origin point can control the direction and distance of a potential teleport destination. In some cases, the first gesture can include a user's fingers curled into a fist with the thumb tip away from the fingers.
Type: Application
Filed: July 7, 2021
Publication date: January 12, 2023
Inventor: Etienne PINCHON
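The mapping this abstract describes, from hand offset to teleport destination, can be sketched as: take the hand's horizontal displacement from the first-gesture origin, amplify it, and clamp it. The gain and clamp values below are illustrative assumptions; the publication does not specify them.

```python
import math

def teleport_destination(origin_point, hand_pos, gain=4.0, max_distance=10.0):
    """Direction follows the hand's offset from the origin point; distance is
    the offset's length amplified by `gain` and clamped to `max_distance`."""
    dx = hand_pos[0] - origin_point[0]
    dz = hand_pos[2] - origin_point[2]   # ground plane only; ignore height
    dist = math.hypot(dx, dz)
    if dist == 0:
        return origin_point              # hand still at origin: no offset yet
    reach = min(dist * gain, max_distance)
    return (origin_point[0] + dx / dist * reach,
            origin_point[1],
            origin_point[2] + dz / dist * reach)
```

Amplifying the offset lets a small, comfortable hand movement reach a distant destination, while the clamp keeps the target within a usable range.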
Publication number: 20220414999
Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
Type: Application
Filed: September 2, 2022
Publication date: December 29, 2022
Applicant: Meta Platforms Technologies, LLC
Inventors: Jonathan RAVASZ, Etienne PINCHON, Adam Tibor VARGA, Jasper STEVENS, Robert ELLIS, Jonah JONES, Evgenii KRIVORUCHKO
Patent number: 11468644
Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
Type: Grant
Filed: September 9, 2020
Date of Patent: October 11, 2022
Assignee: Meta Platforms Technologies, LLC
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
Patent number: 11461973
Abstract: Embodiments described herein disclose methods and systems directed to locomotion in virtual reality (VR) based on hand gestures of a user. In some implementations, the user can navigate between locations using hand gestures that trigger teleportation. In other implementations, the user can separately control forward/backward movement and the direction the user is facing. The separate control can either be with one hand making different gestures or with different hands controlling movement and orientation. In some implementations, a dragging gesture by the user is interpreted by the VR system to trigger movement. In this way, the user can turn in place without moving forward, move forward without turning, or move forward and turn, while controlling speed, all with a single gesture.
Type: Grant
Filed: December 22, 2020
Date of Patent: October 4, 2022
Assignee: Meta Platforms Technologies, LLC
Inventor: Etienne Pinchon
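The single-gesture control described here, turning in place, moving without turning, or both at once, can be modeled by splitting the drag vector into a forward component (speed) and a lateral component (turn rate). A per-frame sketch; the gains and the local-frame drag representation are illustrative assumptions:

```python
import math

def apply_drag(position, heading, drag_vector, dt,
               speed_gain=3.0, turn_gain=2.0):
    """One frame of drag-gesture locomotion on the ground plane.
    drag_vector = (forward_drag, lateral_drag) in the user's local frame."""
    forward_drag, lateral_drag = drag_vector
    heading += lateral_drag * turn_gain * dt        # lateral drag turns
    speed = forward_drag * speed_gain               # forward drag sets speed
    x, z = position
    x += math.sin(heading) * speed * dt
    z += math.cos(heading) * speed * dt
    return (x, z), heading
```

A purely lateral drag yields a turn with zero translation; a purely forward drag yields straight-line motion; a diagonal drag does both at once, with drag magnitude controlling speed.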
Patent number: 11422669
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises a head-mounted display configured to output artificial reality content; a stylus; a stylus action detector configured to detect movement of the stylus, detect a stylus selection action, and after detecting the stylus selection action, detect further movement of the stylus; a UI engine configured to generate stylus movement content in response to detecting movement of the stylus, and generate a UI input element in response to detecting the stylus selection action; and a rendering engine configured to render the stylus movement content and the UI input element as overlays to the artificial reality content, and update the stylus movement content based on the further movement of the stylus.
Type: Grant
Filed: June 7, 2019
Date of Patent: August 23, 2022
Assignee: FACEBOOK TECHNOLOGIES, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
Publication number: 20220261088
Abstract: The disclosed technology can perform application controls in response to recognizing particular gestures. The disclosed technology can provide a launcher with virtual objects displayed in categories (e.g., history, pinned favorites, people, and a search area). The disclosed technology can perform a clone and configure input pattern, which clones a source virtual object into one or more cloned virtual objects with alternate configuration properties. The disclosed technology can perform a page or peel input pattern, which allows users to page between grids of virtual objects and facilitates peeling items out of the grid. The disclosed technology can perform a clutter and clear input pattern, which can expand multiple elements into individual views, while clearing other virtual objects.
Type: Application
Filed: May 9, 2022
Publication date: August 18, 2022
Applicant: Facebook Technologies, LLC
Inventors: Etienne PINCHON, John Jacob BLAKELEY, Michal HLAVAC, Jasper STEVENS
Publication number: 20220244834
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises an image capture device configured to capture image data representative of a physical environment; a head-mounted display (HMD) configured to output artificial reality content; a gesture detector configured to identify, from the image data, a gesture comprising a motion of two fingers from a hand to form a pinching configuration and a subsequent pulling motion while in the pinching configuration; a user interface (UI) engine configured to generate a UI input element in response to identifying the gesture; and a rendering engine configured to render the UI input element as an overlay to at least some of the artificial reality content.
Type: Application
Filed: April 12, 2022
Publication date: August 4, 2022
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
Publication number: 20220198755
Abstract: Embodiments described herein disclose methods and systems directed to locomotion in virtual reality (VR) based on hand gestures of a user. In some implementations, the user can navigate between locations using hand gestures that trigger teleportation. In other implementations, the user can separately control forward/backward movement and the direction the user is facing. The separate control can either be with one hand making different gestures or with different hands controlling movement and orientation. In some implementations, a dragging gesture by the user is interpreted by the VR system to trigger movement. In this way, the user can turn in place without moving forward, move forward without turning, or move forward and turn, while controlling speed, all with a single gesture.
Type: Application
Filed: December 22, 2020
Publication date: June 23, 2022
Inventor: Etienne Pinchon
Patent number: 11334212
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises an image capture device configured to capture image data representative of a physical environment; a head-mounted display (HMD) configured to output artificial reality content; a gesture detector configured to identify, from the image data, a gesture comprising a motion of two fingers from a hand to form a pinching configuration and a subsequent pulling motion while in the pinching configuration; a user interface (UI) engine configured to generate a UI input element in response to identifying the gesture; and a rendering engine configured to render the UI input element as an overlay to at least some of the artificial reality content.
Type: Grant
Filed: June 7, 2019
Date of Patent: May 17, 2022
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
Publication number: 20220130121
Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
Type: Application
Filed: January 10, 2022
Publication date: April 28, 2022
Inventors: Jonathan RAVASZ, Etienne PINCHON, Adam Tibor VARGA, Jasper STEVENS, Robert ELLIS, Jonah JONES, Evgenii KRIVORUCHKO
Patent number: 11294475
Abstract: Embodiments described herein disclose methods and systems directed to input mode selection in artificial reality. In some implementations, various input modes enable a user to perform precise interactions with a target object without occluding the target object. Some input modes can include rays that extend along a line that intersects an origin point, a control point, and an interaction point. An interaction model can specify when the system switches between input modes, such as modes based solely on gaze, using long or short ray input, or with direct interaction between the user's hand(s) and objects. These transitions can be performed by evaluating rules that take context factors such as whether a user's hands are in view of the user, what posture the hands are in, whether a target object is selected, and whether a target object is within a threshold distance from the user.
Type: Grant
Filed: February 8, 2021
Date of Patent: April 5, 2022
Assignee: Facebook Technologies, LLC
Inventors: Etienne Pinchon, Jennifer Lynn Spurlock, Nathan Aschenbach, Gerrit Hendrik Hofmeester, Roger Ibars Martinez, Christopher Alan Baker, Chris Rojas
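The interaction model's rule evaluation over context factors can be sketched as a priority-ordered decision function. The specific thresholds, posture names, and mode names below are illustrative assumptions, not values from the patent:

```python
def select_input_mode(hands_in_view, hand_posture, target_selected,
                      target_distance, near_threshold=0.6, far_threshold=2.0):
    """Pick an input mode from context factors: are the hands visible, what
    posture are they in, is a target selected, and how far away is it?
    Rules are checked in priority order, mirroring the abstract's factors."""
    if not hands_in_view:
        return "gaze"                    # fall back to gaze-only input
    if target_selected and target_distance is not None \
            and target_distance <= near_threshold:
        return "direct"                  # hands can reach the object directly
    if hand_posture == "pointing":
        if target_distance is not None and target_distance > far_threshold:
            return "long_ray"
        return "short_ray"
    return "gaze"
```

Centralizing the transitions in one rule function keeps mode switches predictable: the same context always yields the same mode.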
Patent number: 11257295
Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
Type: Grant
Filed: March 24, 2021
Date of Patent: February 22, 2022
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
Patent number: 11232643
Abstract: The present embodiments relate to generating 3D objects in an artificial reality environment and collapsing 3D objects into 2D images representing the 3D objects. Users operating extended reality (XR) devices controlling the artificial reality environment can collaboratively create or modify content in the artificial reality environment, using real-world creation objects (such as a hand of the user or a pen held by the user) to create 3D objects. In response to a user triggering a collapse for a 3D object, a 2D image of the 3D object can be created from the user's perspective. The 2D image can replace the 3D object in the artificial reality environment. Presenting the 2D image can reduce the amount of data, processing resources, and power needed to provide the artificial reality environment while also reducing clutter and cognitive load on the user.
Type: Grant
Filed: December 22, 2020
Date of Patent: January 25, 2022
Assignee: Facebook Technologies, LLC
Inventors: Jasper Stevens, Etienne Pinchon, Jonathan Ravasz, Evgenii Krivoruchko, Wai Leong Chak
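The collapse operation replaces a render-heavy 3D object with a flat snapshot captured from the triggering user's viewpoint. A structural sketch of that replacement (the classes and the triangle-count proxy for render cost are illustrative assumptions; a real system would render the object to a texture rather than just record the viewpoint):

```python
from dataclasses import dataclass

@dataclass
class Object3D:
    name: str
    triangles: int          # proxy for render cost

@dataclass
class Billboard2D:
    name: str
    snapshot_from: tuple    # viewer position the 2D image was captured from

def collapse(scene, obj_name, viewer_pos):
    """Swap a 3D object for a 2D stand-in taken from the user's perspective.
    Returns the new billboard, or None if the object was not found."""
    for i, obj in enumerate(scene):
        if isinstance(obj, Object3D) and obj.name == obj_name:
            scene[i] = Billboard2D(obj.name, viewer_pos)
            return scene[i]
    return None
```

After the swap, the scene renders a single textured quad instead of the full mesh, which is the data, processing, and power saving the abstract points to.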
Patent number: 11189099
Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
Type: Grant
Filed: September 20, 2019
Date of Patent: November 30, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
Patent number: 11176745
Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
Type: Grant
Filed: September 20, 2019
Date of Patent: November 16, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
Patent number: 11170576
Abstract: A progressive display system can compute a virtual distance between a user and virtual objects. The virtual distance can be based on: a distance between the user and an object, a viewing angle of the object, and/or a footprint of the object in a field of view. The progressive display system can determine where the virtual distance falls in a sequence of distance ranges that correspond to levels of detail. Using a mapping between content sets for the object and levels of detail that correspond to distance ranges, the progressive display system can select content sets to display in relation to the object. As the user moves, the virtual distance will move across thresholds bounding the distance ranges. This causes the progressive display system to select and display other content sets for the distance range in which the current virtual distance falls.
Type: Grant
Filed: September 20, 2019
Date of Patent: November 9, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Varga, Jasper Stevens, Robert Ellis, Jonah Jones
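The mechanism in this abstract has two parts: fold distance, viewing angle, and screen footprint into one virtual distance, then map the distance range that value falls in to a content set. A sketch under assumed weightings and thresholds (the patent names the contributing factors but this combination rule is an invented illustration):

```python
import bisect

def virtual_distance(distance, viewing_angle_deg, footprint_fraction):
    """Combine the three factors into one scalar: a larger viewing angle and a
    smaller field-of-view footprint both push the virtual distance up."""
    angle_penalty = 1.0 + viewing_angle_deg / 90.0
    footprint = max(footprint_fraction, 0.01)   # avoid division by zero
    return distance * angle_penalty / footprint

def content_for(vd, thresholds=(2.0, 10.0, 50.0),
                content_sets=("full_detail", "summary", "title_only", "icon")):
    """Map the distance range containing vd to its content set; as the user
    moves, vd crossing a threshold switches which set is displayed."""
    return content_sets[bisect.bisect_right(thresholds, vd)]
```

An object straight ahead and filling much of the view gets a small virtual distance and full detail; the same object far away or seen edge-on collapses to a title or icon.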