Patents by Inventor Jonathan Ravasz
Jonathan Ravasz has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20220244834

Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises an image capture device configured to capture image data representative of a physical environment; a head-mounted display (HMD) configured to output artificial reality content; a gesture detector configured to identify, from the image data, a gesture comprising a motion of two fingers from a hand to form a pinching configuration and a subsequent pulling motion while in the pinching configuration; a user interface (UI) engine configured to generate a UI input element in response to identifying the gesture; and a rendering engine configured to render the UI input element as an overlay to at least some of the artificial reality content.

Type: Application
Filed: April 12, 2022
Publication date: August 4, 2022
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
-
Publication number: 20220229524

Abstract: In some embodiments, an electronic device selectively performs operations in response to user inputs depending on whether the inputs are preceded by detecting a ready state. In some embodiments, an electronic device processes user inputs based on an attention zone associated with the user. In some embodiments, an electronic device enhances interactions with user interface elements at different distances and/or angles with respect to a gaze of a user. In some embodiments, an electronic device enhances interactions with user interface elements for mixed direct and indirect interaction modes. In some embodiments, an electronic device manages inputs from two of the user's hands and/or presents visual indications of user inputs. In some embodiments, an electronic device enhances interactions with user interface elements in a three-dimensional environment using visual indications of such interactions. In some embodiments, an electronic device redirects a selection input from one user interface element to another.

Type: Application
Filed: January 20, 2022
Publication date: July 21, 2022
Inventors: Christopher D. MCKENZIE, Pol PLA I CONESA, Stephen O. LEMAY, William A. SORRENTINO, III, Shih-Sang CHIU, Jonathan RAVASZ, Benjamin Hunter BOESEL, Kristi E. BAUERLY
-
Patent number: 11334212

Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises an image capture device configured to capture image data representative of a physical environment; a head-mounted display (HMD) configured to output artificial reality content; a gesture detector configured to identify, from the image data, a gesture comprising a motion of two fingers from a hand to form a pinching configuration and a subsequent pulling motion while in the pinching configuration; a user interface (UI) engine configured to generate a UI input element in response to identifying the gesture; and a rendering engine configured to render the UI input element as an overlay to at least some of the artificial reality content.

Type: Grant
Filed: June 7, 2019
Date of Patent: May 17, 2022
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
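The pinch-and-pull gesture this patent claims can be illustrated with a minimal detector sketch. All names and thresholds below are hypothetical assumptions for illustration; the patent does not specify an implementation:

```python
import math

PINCH_DIST = 0.02  # meters; hypothetical threshold for fingertips touching
PULL_DIST = 0.10   # meters; hypothetical pull distance that triggers the UI element

class PinchPullDetector:
    """Detects a pinch of two fingers followed by a pull while pinched."""

    def __init__(self):
        self.pinch_origin = None  # midpoint where the pinch first formed

    def update(self, thumb_tip, index_tip):
        """Feed one frame of fingertip positions (3-tuples, meters);
        returns True on the frame the pinch-and-pull gesture completes."""
        pinching = math.dist(thumb_tip, index_tip) < PINCH_DIST
        if not pinching:
            self.pinch_origin = None  # pinch released: reset
            return False
        midpoint = tuple((t + i) / 2 for t, i in zip(thumb_tip, index_tip))
        if self.pinch_origin is None:
            self.pinch_origin = midpoint  # pinch just formed this frame
            return False
        # Still pinching: check how far the pinch point has been pulled.
        return math.dist(midpoint, self.pinch_origin) > PULL_DIST
```

A real system would feed this from hand-tracking output derived from the HMD's image capture device, and the UI engine would generate its input element when `update` first returns True.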
-
Publication number: 20220130121

Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.

Type: Application
Filed: January 10, 2022
Publication date: April 28, 2022
Inventors: Jonathan RAVASZ, Etienne PINCHON, Adam Tibor VARGA, Jasper STEVENS, Robert ELLIS, Jonah JONES, Evgenii KRIVORUCHKO
-
Publication number: 20220121344

Abstract: In some embodiments, an electronic device enhances interactions with virtual objects in a three-dimensional environment. In some embodiments, an electronic device enhances interactions with selectable user interface elements. In some embodiments, an electronic device enhances interactions with slider user interface elements. In some embodiments, an electronic device moves virtual objects in a three-dimensional environment and facilitates accessing actions associated with virtual objects.

Type: Application
Filed: September 25, 2021
Publication date: April 21, 2022
Inventors: Israel PASTRANA VICENTE, Jonathan R. DASCOLA, Wesley M. HOLDER, Alexis Henri PALANGIE, Aaron Mackay BURNS, Pol PLA I CONESA, William A. SORRENTINO, III, Stephen O. LEMAY, Christopher D. MCKENZIE, Shih-Sang Chiu, Benjamin Hunter Boesel, Jonathan Ravasz
-
Patent number: 11257295

Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.

Type: Grant
Filed: March 24, 2021
Date of Patent: February 22, 2022
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
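A projection of the kind this family of patents describes is, in essence, a ray cast from the user's hand that is tested against scene objects, with disambiguation between candidates near the ray. A minimal sketch under illustrative assumptions (the angular-tolerance scoring and all names are mine, not the patent's):

```python
import math

def project_selection(origin, direction, objects, max_angle=0.1):
    """Pick the object whose center lies closest to a ray cast from the
    user's hand, enabling selection of objects out of arm's reach.

    origin, direction: 3-tuples (direction need not be normalized).
    objects: dict mapping object name -> center position (3-tuple).
    max_angle: hypothetical angular tolerance in radians.
    Returns the selected object's name, or None if nothing is near the ray.
    """
    def sub(a, b):  return tuple(x - y for x, y in zip(a, b))
    def dot(a, b):  return sum(x * y for x, y in zip(a, b))
    def norm(a):    return math.sqrt(dot(a, a))

    best, best_angle = None, max_angle
    for name, center in objects.items():
        to_obj = sub(center, origin)
        cos_a = dot(direction, to_obj) / (norm(direction) * norm(to_obj))
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        if angle < best_angle:  # disambiguate: smallest angular offset wins
            best, best_angle = name, angle
    return best
```

A full system would layer the bimanual gestures and the global/local mode model on top of a primitive like this; the sketch covers only the select-at-a-distance step.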
-
Patent number: 11232643

Abstract: The present embodiments relate to generating 3D objects in an artificial reality environment and collapsing 3D objects into 2D images representing the 3D objects. Users operating extended reality (XR) devices controlling the artificial reality environment can collaboratively create or modify content in the artificial reality environment, using real-world creation objects (such as a hand of the user or a pen held by the user) to create 3D objects. In response to a user triggering a collapse for a 3D object, a 2D image of the 3D object can be created from the user's perspective. The 2D image can replace the 3D object in the artificial reality environment. Presenting the 2D image can reduce the amount of data, processing resources, and power needed to provide the artificial reality environment while also reducing clutter and cognitive load on the user.

Type: Grant
Filed: December 22, 2020
Date of Patent: January 25, 2022
Assignee: Facebook Technologies, LLC
Inventors: Jasper Stevens, Etienne Pinchon, Jonathan Ravasz, Evgenii Krivoruchko, Wai Leong Chak
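The collapse operation amounts to swapping a scene's heavy 3D node for a lightweight 2D snapshot rendered from the user's viewpoint. A sketch of that data-model swap, with a placeholder `render_snapshot` standing in for a real rasterizer (all types and names here are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class Object3D:
    name: str
    vertices: list = field(default_factory=list)  # heavy geometry

@dataclass
class Image2D:
    name: str
    pixels: bytes  # flat snapshot replacing the geometry

def render_snapshot(obj, viewpoint):
    """Stand-in for rendering the 3D object from the user's perspective;
    a real system would rasterize obj.vertices as seen from viewpoint."""
    return b"\x00" * 16  # placeholder pixel data

def collapse(scene, name, viewpoint):
    """Replace a named 3D object in the scene with a 2D image of it,
    reducing the data and processing needed to present the environment."""
    for i, obj in enumerate(scene):
        if isinstance(obj, Object3D) and obj.name == name:
            scene[i] = Image2D(name, render_snapshot(obj, viewpoint))
    return scene
```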
-
Patent number: 11189099

Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.

Type: Grant
Filed: September 20, 2019
Date of Patent: November 30, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
-
Patent number: 11176745

Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.

Type: Grant
Filed: September 20, 2019
Date of Patent: November 16, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
-
Patent number: 11170576

Abstract: A progressive display system can compute a virtual distance between a user and virtual objects. The virtual distance can be based on: a distance between the user and an object, a viewing angle of the object, and/or a footprint of the object in a field of view. The progressive display system can determine where the virtual distance falls in a sequence of distance ranges that correspond to levels of detail. Using a mapping between content sets for the object and levels of detail that correspond to distance ranges, the progressive display system can select content sets to display in relation to the object. As the user moves, the virtual distance will move across thresholds bounding the distance ranges. This causes the progressive display system to select and display other content sets for the distance range in which the current virtual distance falls.

Type: Grant
Filed: September 20, 2019
Date of Patent: November 9, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Varga, Jasper Stevens, Robert Ellis, Jonah Jones
-
Patent number: 11086475

Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device configured to capture image data representative of a physical environment, a head-mounted display (HMD) configured to output artificial reality content, a gesture detector configured to identify, from the image data, a gesture comprising a configuration of a hand that is substantially stationary for at least a threshold period of time and positioned such that a thumb of the hand and at least one other finger form approximately a circle or approximately a circular segment, a user interface (UI) engine to generate a UI element in response to the identified gesture, and a rendering engine to render the UI element as an overlay to the artificial reality content.

Type: Grant
Filed: June 7, 2019
Date of Patent: August 10, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
-
Patent number: 11086406

Abstract: A hand interaction system can use a three-state model to differentiate between normal hand movements, such as reaching for an object, and hand input gestures. The three-state model can specify a sequence of states including: 1) a neutral state, 2) a tracking state, and 3) an active state. In the neutral state, the hand interaction system monitors for a gesture signaling a transition to the tracking state but does not otherwise interpret a gesture corresponding to the active state as input. Once a gesture causes a transition to the intermediate tracking state, the hand interaction system can recognize a further active state transition gesture, allowing active state interaction. Thus, the monitoring for the intermediate tracking state provides a gating mechanism, making it less likely that the hand interaction system will interpret hand movements as input when not so intended by the user.

Type: Grant
Filed: September 20, 2019
Date of Patent: August 10, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Varga, Jasper Stevens, Robert Ellis, Jonah Jones
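The neutral → tracking → active gating model reads naturally as a small state machine. A sketch, with hypothetical gesture names standing in for whatever transitions an implementation actually recognizes:

```python
NEUTRAL, TRACKING, ACTIVE = "neutral", "tracking", "active"

class HandStateMachine:
    """Gating model: active-state gestures are ignored unless the hand
    has first entered the intermediate tracking state."""

    # Allowed transitions; any other (state, gesture) pair is ignored.
    TRANSITIONS = {
        (NEUTRAL, "tracking_pose"): TRACKING,
        (TRACKING, "activate_pose"): ACTIVE,
    }

    def __init__(self):
        self.state = NEUTRAL

    def observe(self, gesture):
        """Feed one recognized gesture; returns the resulting state."""
        if gesture == "pose_lost":
            self.state = NEUTRAL  # losing the pose always resets
        else:
            self.state = self.TRANSITIONS.get((self.state, gesture), self.state)
        return self.state
```

The gating effect is visible in use: an activation gesture seen in the neutral state changes nothing, so ordinary reaching motions that happen to resemble input are much less likely to trigger actions.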
-
Patent number: 11086392

Abstract: The disclosed computer-implemented method may include communicatively coupling a user interface device to a virtual reality device, capturing inputs from a user on the user interface device, displaying, on the virtual reality device, a virtual representation of the captured inputs from the user, and mirroring, on the virtual reality device, content presented on the user interface device. Various other methods, systems, and computer-readable media are also disclosed.

Type: Grant
Filed: April 9, 2019
Date of Patent: August 10, 2021
Assignee: Facebook Technologies, LLC
Inventors: Sebastian Sztuk, Omar John Pualuan, Jeffrey Witthuhn, Nabeel Farooq Butt, Jonathan Ravasz, Simon Tickner, Robert Ellis, Kayvon Asemani
-
Publication number: 20210209860

Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.

Type: Application
Filed: March 24, 2021
Publication date: July 8, 2021
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
-
Patent number: 11043192

Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a gesture detector, a user interface (UI) engine, and a rendering engine. The image capture device captures image data representative of a physical environment. The HMD outputs artificial reality content. The gesture detector identifies, from the image data, a gesture including a configuration of a hand that is substantially stationary for at least a threshold period of time and positioned such that an index finger and a thumb of the hand form approximately a right angle. The UI engine generates a UI element in response to the identified gesture. The rendering engine renders the UI element as an overlay to the artificial reality content.

Type: Grant
Filed: June 7, 2019
Date of Patent: June 22, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
-
Patent number: 11003307

Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a gesture detector, a user interface (UI) engine, and a rendering engine. The image capture device is configured to capture image data representative of a physical environment. The HMD is configured to output artificial reality content including a representation of a wrist. The rendering engine is configured to render a user interface (UI) element. The gesture detector is configured to identify a gesture that includes a gripping motion of two or more digits of a hand to form a gripping configuration at the location of the UI element, and a pulling motion away from the wrist while in the gripping configuration.

Type: Grant
Filed: June 7, 2019
Date of Patent: May 11, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
-
Patent number: 10990240

Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system captures image data representative of a physical environment and outputs artificial reality content. The artificial reality system renders a container that includes application content items as an overlay to the artificial reality content. The artificial reality system identifies, from the image data, a selection gesture comprising a configuration of a hand that is substantially stationary for a threshold period of time at a first location corresponding to a first application content item within the container, and a subsequent movement of the hand from the first location to a second location outside the container. The artificial reality system renders the first application content item at the second location in response.

Type: Grant
Filed: June 7, 2019
Date of Patent: April 27, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
-
Patent number: 10991163

Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.

Type: Grant
Filed: September 25, 2019
Date of Patent: April 27, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
-
Publication number: 20210090333

Abstract: A progressive display system can compute a virtual distance between a user and virtual objects. The virtual distance can be based on: a distance between the user and an object, a viewing angle of the object, and/or a footprint of the object in a field of view. The progressive display system can determine where the virtual distance falls in a sequence of distance ranges that correspond to levels of detail. Using a mapping between content sets for the object and levels of detail that correspond to distance ranges, the progressive display system can select content sets to display in relation to the object. As the user moves, the virtual distance will move across thresholds bounding the distance ranges. This causes the progressive display system to select and display other content sets for the distance range in which the current virtual distance falls.

Type: Application
Filed: September 20, 2019
Publication date: March 25, 2021
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Varga, Jasper Stevens, Robert Ellis, Jonah Jones
-
Publication number: 20210090337

Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.

Type: Application
Filed: September 25, 2019
Publication date: March 25, 2021
Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko