Patents by Inventor Jonathan Ravasz

Jonathan Ravasz has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240028177
    Abstract: The present disclosure generally relates to user interfaces for electronic devices, including user interfaces for viewing and interacting with media items.
    Type: Application
    Filed: September 28, 2023
    Publication date: January 25, 2024
    Inventors: Israel PASTRANA VICENTE, Benjamin H. BOESEL, Shih-Sang CHIU, Graham R. CLARKE, Miquel ESTANY RODRIGUEZ, Chia Yang LIN, James J. OWEN, Jonathan RAVASZ, William A. SORRENTINO, III
  • Publication number: 20230384907
    Abstract: In some embodiments, a computer system facilitates manipulation of a three-dimensional environment relative to a viewpoint of a user of the computer system. In some embodiments, a computer system facilitates manipulation of virtual objects in a virtual environment. In some embodiments, a computer system facilitates manipulation of a three-dimensional environment relative to a reference point determined based on attention of a user of the computer system.
    Type: Application
    Filed: April 11, 2023
    Publication date: November 30, 2023
    Inventors: Benjamin H. BOESEL, Jonathan RAVASZ, Shih-Sang CHIU, Jordan A. CAZAMIAS, Stephen O. LEMAY, Christopher D. MCKENZIE, Dorian D. DARGAN, David H. HUANG
  • Publication number: 20230343027
    Abstract: Various implementations disclosed herein include devices, systems, and methods for selecting multiple virtual objects within an environment. In some implementations, a method includes receiving a first gesture associated with a first virtual object in an environment. A movement of the first virtual object in the environment within a threshold distance of a second virtual object in the environment is detected. In response to detecting the movement of the first virtual object in the environment within the threshold distance of the second virtual object in the environment, a concurrent movement of the first virtual object and the second virtual object is displayed in the environment based on the first gesture.
    Type: Application
    Filed: March 20, 2023
    Publication date: October 26, 2023
    Inventors: Jordan A. Cazamias, Aaron M. Burns, David M. Schattel, Jonathan Perron, Jonathan Ravasz, Shih-Sang Chiu
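    The grouping mechanism this abstract describes (objects within a threshold distance of a dragged object move with it) can be sketched as follows. This is an illustrative sketch only; the class, function names, and the 0.15 m threshold are assumptions for illustration and are not taken from the patent claims.

    ```python
    from dataclasses import dataclass

    THRESHOLD = 0.15  # metres; illustrative value, not from the patent


    @dataclass
    class VirtualObject:
        name: str
        x: float
        y: float
        z: float


    def distance(a: VirtualObject, b: VirtualObject) -> float:
        """Euclidean distance between two objects' positions."""
        return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2) ** 0.5


    def move(dragged: VirtualObject, others: list[VirtualObject],
             dx: float, dy: float, dz: float) -> list[VirtualObject]:
        """Move the dragged object; any object within THRESHOLD moves concurrently."""
        group = [dragged] + [o for o in others if distance(dragged, o) <= THRESHOLD]
        for obj in group:
            obj.x += dx
            obj.y += dy
            obj.z += dz
        return group
    ```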
  • Publication number: 20230336865
    Abstract: The present disclosure generally relates to techniques and user interfaces for capturing media, displaying a preview of media, displaying a recording indicator, displaying a camera user interface, and/or displaying previously captured media.
    Type: Application
    Filed: November 22, 2022
    Publication date: October 19, 2023
    Inventors: Alexandre DA VEIGA, Lee S. Broughton, Angel Suet Yan CHEUNG, Stephen O. LEMAY, Chia Yang LIN, Behkish J. MANZARI, Ivan MARKOVIC, Alexander MENZIES, Aaron MORING, Jonathan RAVASZ, Tobias RICK, Bryce L. SCHMIDTCHEN, William A. SORRENTINO, III
  • Publication number: 20230333644
    Abstract: Various implementations disclosed herein include devices, systems, and methods for organizing virtual objects within an environment. In some implementations, a method includes obtaining a user input corresponding to a command to associate a virtual object with a region of an environment. A gaze input corresponding to a user focus location in the region is detected. A movement of the virtual object to an object placement location proximate the user focus location is displayed.
    Type: Application
    Filed: March 20, 2023
    Publication date: October 19, 2023
    Inventors: Jordan A. Cazamias, Aaron M. Burns, David M. Schattel, Jonathan Perron, Jonathan Ravasz, Shih-Sang Chiu
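    The placement step this abstract describes (a virtual object moves to a location proximate the user's gaze-derived focus location) might look like the sketch below, which snaps to the focus point or to the nearest free offset if that point is occupied. The grid-offset search and all names here are illustrative assumptions, not the patented method.

    ```python
    def place_near_gaze(focus: tuple[float, float],
                        occupied: set[tuple[float, float]],
                        step: float = 0.1) -> tuple[float, float]:
        """Return the gaze focus location, or the nearest free offset if it is taken."""
        if focus not in occupied:
            return focus
        # Search outward in growing rings of candidate offsets.
        for r in range(1, 100):
            for dx, dy in ((r, 0), (-r, 0), (0, r), (0, -r)):
                cand = (round(focus[0] + dx * step, 3),
                        round(focus[1] + dy * step, 3))
                if cand not in occupied:
                    return cand
        raise RuntimeError("no free placement location found")
    ```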
  • Publication number: 20230334724
    Abstract: Various implementations disclosed herein include devices, systems, and methods for determining a placement of virtual objects in a collection of virtual objects when changing from a first viewing arrangement to a second viewing arrangement based on their respective positions in one of the viewing arrangements. In some implementations, a method includes displaying a set of virtual objects in a first viewing arrangement in a first region of an environment. The set of virtual objects are arranged in a first spatial arrangement. A user input corresponding to a request to change to a second viewing arrangement in a second region of the environment is obtained. A mapping is determined between the first spatial arrangement and a second spatial arrangement based on spatial relationships between the set of virtual objects. The set of virtual objects is displayed in the second viewing arrangement in the second region of the environment.
    Type: Application
    Filed: March 20, 2023
    Publication date: October 19, 2023
    Inventors: Jordan A. Cazamias, Aaron M. Burns, David M. Schattel, Jonathan Perron, Jonathan Ravasz, Shih-Sang Chiu
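    One plausible reading of the mapping step in this abstract (positions in the second viewing arrangement are determined from spatial relationships in the first) is an order-preserving remap: objects keep their top-down, left-to-right ordering when assigned to the new arrangement's slots. This is a hypothetical sketch under that assumption, not the claimed algorithm.

    ```python
    def remap(positions: dict[str, tuple[float, float]],
              slots: list[tuple[float, float]]) -> dict[str, tuple[float, float]]:
        """Assign objects to new slots, preserving top-down, left-to-right order."""
        # Sort objects by descending y (top first), then ascending x (left first).
        ordered = sorted(positions, key=lambda n: (-positions[n][1], positions[n][0]))
        # Sort target slots the same way x-first, then pair them up in order.
        return dict(zip(ordered, sorted(slots)))
    ```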
  • Publication number: 20230316634
    Abstract: In some embodiments, a computer system selectively recenters virtual content to a viewpoint of a user in the presence of physical or virtual obstacles, and/or automatically recenters one or more virtual objects in response to the display generation component changing state, selectively recenters content associated with a communication session between multiple users in response to detected user input, changes the visual prominence of content included in virtual objects based on viewpoint and/or based on a detected attention of a user, modifies visual prominence of one or more virtual objects to resolve apparent obscuring of the one or more virtual objects, modifies visual prominence based on user viewpoint relative to virtual objects, concurrently modifies visual prominence based on various types of user interaction, and/or changes an amount of visual impact of an environmental effect in response to detected user input.
    Type: Application
    Filed: January 19, 2023
    Publication date: October 5, 2023
    Inventors: Shih-Sang CHIU, Benjamin H. BOESEL, Jonathan PERRON, Stephen O. LEMAY, Christopher D. MCKENZIE, Dorian D. DARGAN, Jonathan RAVASZ, Nathan GITTER
  • Publication number: 20230154122
    Abstract: In some embodiments, an electronic device automatically updates the orientation of a virtual object in a three-dimensional environment based on a viewpoint of a user in the three-dimensional environment. In some embodiments, an electronic device automatically updates the orientation of a virtual object in a three-dimensional environment based on viewpoints of a plurality of users in the three-dimensional environment. In some embodiments, the electronic device modifies an appearance of a real object that is between a virtual object and the viewpoint of a user in a three-dimensional environment. In some embodiments, the electronic device automatically selects a location for a user in a three-dimensional environment that includes one or more virtual objects and/or other users.
    Type: Application
    Filed: January 13, 2023
    Publication date: May 18, 2023
    Inventors: Jonathan R. DASCOLA, Alexis Henri PALANGIE, Peter D. ANTON, Stephen O. LEMAY, Jonathan RAVASZ, Shih-Sang CHIU, Christopher D. MCKENZIE, Dorian D. DARGAN
  • Publication number: 20230095282
    Abstract: In one implementation, a method displays a first pairing affordance that is world-locked to a first peripheral device. The method may be performed by an electronic device including a non-transitory memory, one or more processors, a display, and one or more input devices. The method includes detecting the first peripheral device within a three-dimensional (3D) environment via a computer vision technique. The method includes receiving, via the one or more input devices, a first user input that is directed to the first peripheral device within the 3D environment. The method includes, in response to receiving the first user input, displaying, on the display, the first pairing affordance that is world-locked to the first peripheral device within the 3D environment.
    Type: Application
    Filed: September 22, 2022
    Publication date: March 30, 2023
    Inventors: Benjamin R. Blachnitzky, Aaron M. Burns, Anette L. Freiin von Kapri, Arun Rakesh Yoganandan, Benjamin H. Boesel, Evgenii Krivoruchko, Jonathan Ravasz, Shih-Sang Chiu
  • Publication number: 20230100689
    Abstract: In some embodiments, an electronic device facilitates cursor interactions in different regions in a three-dimensional environment. In some embodiments, an electronic device facilitates cursor interactions in content. In some embodiments, an electronic device facilitates cursor movement. In some embodiments, an electronic device facilitates interaction with multiple input devices. In some embodiments, a computer system facilitates cursor movement based on movement of a hand of a user of the computer system and a location of a gaze of the user in the three-dimensional environment. In some embodiments, a computer system facilitates cursor selection and scrolling of content in the three-dimensional environment.
    Type: Application
    Filed: September 24, 2022
    Publication date: March 30, 2023
    Inventors: Shih-Sang CHIU, Christopher D. MCKENZIE, Pol PLA I CONESA, Jonathan RAVASZ
  • Publication number: 20230092282
    Abstract: In some embodiments, an electronic device uses different algorithms for moving objects in a three-dimensional environment based on the directions of such movements. In some embodiments, an electronic device modifies the size of an object in the three-dimensional environment as the distance between that object and a viewpoint of the user changes. In some embodiments, an electronic device selectively resists movement of an object when that object comes into contact with another object in a three-dimensional environment. In some embodiments, an electronic device selectively adds an object to another object in a three-dimensional environment based on whether the other object is a valid drop target for that object. In some embodiments, an electronic device facilitates movement of multiple objects concurrently in a three-dimensional environment. In some embodiments, an electronic device facilitates throwing of objects in a three-dimensional environment.
    Type: Application
    Filed: September 21, 2022
    Publication date: March 23, 2023
    Inventors: Benjamin H. BOESEL, Shih-Sang CHIU, Jonathan RAVASZ, Trevor J. MCINTYRE, Stephen O. LEMAY, Christopher D. MCKENZIE
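    The distance-based resizing this abstract mentions (an object's size is modified as its distance from the user's viewpoint changes) is often implemented by keeping the object's apparent angular size roughly constant, which for small angles means scaling linearly with distance. The linear rule below is a common-practice assumption for illustration, not language from the patent.

    ```python
    def scale_for_distance(base_scale: float, base_distance: float,
                           new_distance: float) -> float:
        """Scale an object so it subtends about the same visual angle at new_distance.

        For small angles, apparent size ~ scale / distance, so holding that
        ratio constant gives a scale proportional to distance.
        """
        return base_scale * (new_distance / base_distance)
    ```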
  • Publication number: 20220414999
    Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
    Type: Application
    Filed: September 2, 2022
    Publication date: December 29, 2022
    Applicant: Meta Platforms Technologies, LLC
    Inventors: Jonathan RAVASZ, Etienne PINCHON, Adam Tibor VARGA, Jasper STEVENS, Robert ELLIS, Jonah JONES, Evgenii KRIVORUCHKO
  • Patent number: 11468644
    Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
    Type: Grant
    Filed: September 9, 2020
    Date of Patent: October 11, 2022
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
  • Patent number: 11422669
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises a head-mounted display configured to output artificial reality content; a stylus; a stylus action detector configured to detect movement of the stylus, detect a stylus selection action, and after detecting the stylus selection action, detect further movement of the stylus; a UI engine configured to generate stylus movement content in response to detecting movement of the stylus, and generate a UI input element in response to detecting the stylus selection action; and a rendering engine configured to render the stylus movement content and the UI input element as overlays to the artificial reality content, and update the stylus movement content based on the further movement of the stylus.
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: August 23, 2022
    Assignee: FACEBOOK TECHNOLOGIES, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
  • Publication number: 20220244834
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises an image capture device configured to capture image data representative of a physical environment; a head-mounted display (HMD) configured to output artificial reality content; a gesture detector configured to identify, from the image data, a gesture comprising a motion of two fingers from a hand to form a pinching configuration and a subsequent pulling motion while in the pinching configuration; a user interface (UI) engine configured to generate a UI input element in response to identifying the gesture; and a rendering engine configured to render the UI input element as an overlay to at least some of the artificial reality content.
    Type: Application
    Filed: April 12, 2022
    Publication date: August 4, 2022
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
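    The gesture this abstract identifies (two fingers forming a pinching configuration followed by a pulling motion while pinched) can be sketched as a simple per-frame classifier: a pinch is inferred when thumb and index fingertips come within a small distance, and a pull is a sufficient displacement of the pinch midpoint while pinched. The thresholds and function names are illustrative assumptions, not values from the patent.

    ```python
    PINCH_DIST = 0.02  # metres between fingertips to count as pinching (assumed)
    PULL_DIST = 0.10   # metres of midpoint travel while pinched to count as a pull


    def detect_pinch_pull(frames) -> bool:
        """frames: sequence of (thumb_tip, index_tip) 3-D points, one pair per frame.

        Returns True once a pinch followed by a sufficient pull is observed.
        """
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

        pinch_start = None  # midpoint where the current pinch began
        for thumb, index in frames:
            pinching = dist(thumb, index) <= PINCH_DIST
            mid = tuple((t + i) / 2 for t, i in zip(thumb, index))
            if pinching and pinch_start is None:
                pinch_start = mid
            elif pinching and dist(mid, pinch_start) >= PULL_DIST:
                return True
            elif not pinching:
                pinch_start = None  # pinch released; reset
        return False
    ```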
  • Publication number: 20220229524
    Abstract: In some embodiments, an electronic device selectively performs operations in response to user inputs depending on whether the inputs are preceded by detecting a ready state. In some embodiments, an electronic device processes user inputs based on an attention zone associated with the user. In some embodiments, an electronic device enhances interactions with user interface elements at different distances and/or angles with respect to a gaze of a user. In some embodiments, an electronic device enhances interactions with user interface elements for mixed direct and indirect interaction modes. In some embodiments, an electronic device manages inputs from two of the user's hands and/or presents visual indications of user inputs. In some embodiments, an electronic device enhances interactions with user interface elements in a three-dimensional environment using visual indications of such interactions. In some embodiments, an electronic device redirects a selection input from one user interface element to another.
    Type: Application
    Filed: January 20, 2022
    Publication date: July 21, 2022
    Inventors: Christopher D. MCKENZIE, Pol PLA I CONESA, Stephen O. LEMAY, William A. SORRENTINO, III, Shih-Sang CHIU, Jonathan RAVASZ, Benjamin Hunter BOESEL, Kristi E. BAUERLY
  • Patent number: 11334212
    Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises an image capture device configured to capture image data representative of a physical environment; a head-mounted display (HMD) configured to output artificial reality content; a gesture detector configured to identify, from the image data, a gesture comprising a motion of two fingers from a hand to form a pinching configuration and a subsequent pulling motion while in the pinching configuration; a user interface (UI) engine configured to generate a UI input element in response to identifying the gesture; and a rendering engine configured to render the UI input element as an overlay to at least some of the artificial reality content.
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: May 17, 2022
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
  • Publication number: 20220130121
    Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
    Type: Application
    Filed: January 10, 2022
    Publication date: April 28, 2022
    Inventors: Jonathan RAVASZ, Etienne PINCHON, Adam Tibor VARGA, Jasper STEVENS, Robert ELLIS, Jonah JONES, Evgenii KRIVORUCHKO
  • Publication number: 20220121344
    Abstract: In some embodiments, an electronic device enhances interactions with virtual objects in a three-dimensional environment. In some embodiments, an electronic device enhances interactions with selectable user interface elements. In some embodiments, an electronic device enhances interactions with slider user interface elements. In some embodiments, an electronic device moves virtual objects in a three-dimensional environment and facilitates accessing actions associated with virtual objects.
    Type: Application
    Filed: September 25, 2021
    Publication date: April 21, 2022
    Inventors: Israel PASTRANA VICENTE, Jonathan R. DASCOLA, Wesley M. HOLDER, Alexis Henri PALANGIE, Aaron Mackay BURNS, Pol PLA I CONESA, William A. SORRENTINO, III, Stephen O. LEMAY, Christopher D. MCKENZIE, Shih-Sang Chiu, Benjamin Hunter Boesel, Jonathan Ravasz
  • Patent number: 11257295
    Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
    Type: Grant
    Filed: March 24, 2021
    Date of Patent: February 22, 2022
    Assignee: Facebook Technologies, LLC
    Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko