Patents by Inventor Evgenii Krivoruchko

Evgenii Krivoruchko has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250117079
    Abstract: In some embodiments, an electronic device facilitates cursor interactions in different regions in a three-dimensional environment. In some embodiments, an electronic device facilitates cursor interactions in content. In some embodiments, an electronic device facilitates cursor movement. In some embodiments, an electronic device facilitates interaction with multiple input devices. In some embodiments, a computer system facilitates cursor movement based on movement of a hand of a user of the computer system and a location of a gaze of the user in the three-dimensional environment. In some embodiments, a computer system facilitates cursor selection and scrolling of content in the three-dimensional environment.
    Type: Application
    Filed: October 15, 2024
    Publication date: April 10, 2025
    Inventors: Shih-Sang CHIU, Christopher D. MCKENZIE, Pol PLA I CONESA, Jonathan RAVASZ, Benjamin H. BOESEL, Evgenii KRIVORUCHKO, Zoey C. TAYLOR
  • Patent number: 12271531
    Abstract: Techniques for controlling and/or moving a cursor, such as by using air gestures, are described.
    Type: Grant
    Filed: January 2, 2024
    Date of Patent: April 8, 2025
    Assignee: Apple Inc.
    Inventors: Evgenii Krivoruchko, Benjamin H. Boesel, Jia Wang
  • Publication number: 20250103133
    Abstract: Techniques and user interfaces for interacting with virtual objects using gaze in an extended reality environment.
    Type: Application
    Filed: August 20, 2024
    Publication date: March 27, 2025
    Inventors: Israel PASTRANA VICENTE, Evgenii KRIVORUCHKO
  • Publication number: 20250044911
    Abstract: A computer system displays a first user interface object in a first view of a three-dimensional environment at a first position in the three-dimensional environment and with a first spatial arrangement relative to a respective portion of the user. The computer system detects movement of a viewpoint of the user from a first location to a second location in a physical environment. In accordance with a determination that the movement does not satisfy a threshold amount of movement, the computer system maintains display of the first user interface object at the first position. In accordance with a determination that the movement satisfies the threshold amount of movement, the computer system ceases to display the first user interface object at the first position and displays the first user interface object at a second position having the first spatial arrangement relative to the respective portion of the user.
    Type: Application
    Filed: October 17, 2024
    Publication date: February 6, 2025
    Inventors: Evgenii Krivoruchko, Israel Pastrana Vicente, Stephen O. Lemay, Christopher D. McKenzie, Zoey C. Taylor
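The behavior described in the abstract above — a UI object that keeps its position under small viewpoint movement and is re-placed with the same spatial arrangement relative to the user once movement passes a threshold — can be sketched roughly as follows. This is an illustrative sketch only; the function names, the distance threshold, and the anchor offset are assumptions, not details from the filing.

```python
# Illustrative sketch of threshold-gated repositioning: the object keeps its
# world position until the viewpoint has moved far enough, then re-anchors at
# a fixed offset from the user. All names and constants are assumptions.
import math

FOLLOW_THRESHOLD = 1.0  # meters of viewpoint movement before re-anchoring (assumed)
ANCHOR_OFFSET = (0.0, 0.0, -1.5)  # object sits 1.5 m in front of the user (assumed)

def update_object_position(object_pos, anchor_viewpoint, current_viewpoint):
    """Return (new object position, new anchor viewpoint).

    The object stays at object_pos while movement from the last anchor
    viewpoint is below FOLLOW_THRESHOLD; once the threshold is exceeded,
    it is re-placed with the same arrangement relative to the new viewpoint.
    """
    moved = math.dist(anchor_viewpoint, current_viewpoint)
    if moved < FOLLOW_THRESHOLD:
        # Maintain display of the object at its first position.
        return object_pos, anchor_viewpoint
    # Re-anchor: same spatial offset, now relative to the new viewpoint.
    new_pos = tuple(v + o for v, o in zip(current_viewpoint, ANCHOR_OFFSET))
    return new_pos, current_viewpoint
```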
  • Publication number: 20240428539
    Abstract: While a first view of a three-dimensional environment is visible, a computer system detects a first input meeting selection criteria. If, when the first input was detected, a user was directing attention to a first portion of the first view that has a spatial relationship to a viewport through which the three-dimensional environment is visible, the computer system displays a user interface object including affordances for accessing functions of the computer system; otherwise, the computer system forgoes displaying the user interface object. While a different view of the three-dimensional environment is visible, the computer system detects a second input meeting the selection criteria. If, when the second input was detected, the user was directing attention to a second portion of the different view that has the same spatial relationship to the viewport, the computer system displays the user interface object; otherwise, the computer system forgoes displaying the user interface object.
    Type: Application
    Filed: May 29, 2024
    Publication date: December 26, 2024
    Inventors: Matan Stauber, Israel Pastrana Vicente, Stephen O. Lemay, William A. Sorrentino, III, Zoey C. Taylor, Kristi E. Bauerly, Daniel M. Golden, Christopher B. Fleizach, Evgenii Krivoruchko, Amy E. DeDonato
  • Publication number: 20240419294
    Abstract: The present disclosure generally relates to methods and user interfaces for positioning a virtual keyboard in a three-dimensional environment, for displaying various types of virtual keyboards, for switching between virtual keyboards, and/or for displaying a virtual keyboard based on a position of a user.
    Type: Application
    Filed: May 29, 2024
    Publication date: December 19, 2024
    Inventors: Evgenii KRIVORUCHKO, Torsten BECKER, Shuxin YU, Stephen O. LEMAY, Zoey C. TAYLOR, Laura C. MADEYA, Emily K. VAN HAREN, Ting-Yuan WU
  • Publication number: 20240402792
    Abstract: Various implementations disclosed herein include devices, systems, and methods that enable multi-mode interactions with elements in a three-dimensional (3D) environment based on cursor movement associated with tracking user hand motion. For example, a process may include presenting an extended reality (XR) environment comprising a virtual element and a cursor. The process may further include obtaining hand data corresponding to 3D movement of a hand in a 3D environment. The process may further include operating in first mode where the 3D motion of the hand is converted to two-dimensional (2D) motion and detecting a 3D user input criteria. In response to the 3D user input criteria a mode of operation is modified to a second mode where the 3D motion of the hand is maintained without conversion to the 2D motion.
    Type: Application
    Filed: May 16, 2024
    Publication date: December 5, 2024
    Inventors: Jack H. Lawrence, Mark A. Ebbole, Evgenii Krivoruchko
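The two operating modes described in the abstract above — 3D hand motion converted to 2D cursor motion in the first mode, and full 3D motion preserved after a 3D user-input criterion is met — might be sketched like this. The pinch flag as the mode-switch criterion and the plane projection are illustrative assumptions, not details from the filing.

```python
# Illustrative sketch of the two-mode cursor model: mode 1 flattens 3D hand
# deltas onto a 2D content plane; once an assumed 3D-input criterion fires
# (here, a pinch flag), mode 2 passes the 3D motion through unchanged.

class CursorController:
    def __init__(self):
        self.mode = 1  # start in the 2D-projection mode

    def handle_hand_delta(self, delta_3d, pinch_active):
        """Map a 3D hand-movement delta to cursor movement for the current mode."""
        if pinch_active:
            self.mode = 2  # 3D user-input criterion met: keep motion in 3D
        if self.mode == 1:
            dx, dy, _ = delta_3d  # drop the depth component
            return (dx, dy)  # 2D motion on the content plane
        return delta_3d  # mode 2: 3D motion maintained without conversion
```

Note that the sketch is stateful: once the criterion fires, subsequent motion stays in the 3D mode until some reset (not modeled here) returns it to the 2D mode.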
  • Publication number: 20240402901
    Abstract: The present disclosure generally relates to methods and user interfaces for scrolling content using various inputs and/or for changing an appearance of a scrolling indicator.
    Type: Application
    Filed: March 28, 2024
    Publication date: December 5, 2024
    Inventors: Evgenii KRIVORUCHKO, Danielle M. PRICE, Stephen O. LEMAY, Zoey C. TAYLOR, Hugo D. VERWEIJ, Cedric BRAY, Tom ADRIAENSSEN
  • Publication number: 20240361901
    Abstract: In some embodiments, a computer system enables a user to invoke display of transport controls (and/or other controls associated with controlling playback of content) using gaze inputs, gesture inputs, or a combination of these. In some embodiments, in response to detecting a first user input, the computer system displays a first set of controls in a reduced-prominence state (e.g., in a manner that is not unduly distracting to the user), and in response to detecting a second user input, the computer system displays a second set of controls in an increased-prominence state (e.g., in a more visually prominent state). The second set of controls optionally includes more controls than the first set of controls.
    Type: Application
    Filed: January 30, 2024
    Publication date: October 31, 2024
    Inventors: Jonathan RAVASZ, Angel Suet Yan CHEUNG, Ashwin Kumar ASOKA KUMAR SHENOI, Leah M. GUM, Zoey C. TAYLOR, Evgenii KRIVORUCHKO, Christopher D. MCKENZIE, Matan STAUBER, Yonghyun A. KIM, Gregory T. SCOTT, Lucio MORENO RUFO, Fredric R. VINNA, Brian K. SHIRAISHI, So TANAKA
  • Patent number: 12124674
    Abstract: A computer system detects whether the user satisfies attention criteria with respect to a first user interface object displayed in a first view of a three-dimensional environment. In response to detecting that the user does not satisfy the attention criteria with respect to the first user interface object, the computer system displays the first user interface object with a modified appearance. The computer system detects a first movement of a viewpoint of the user relative to a physical environment and detects that the user satisfies the attention criteria with respect to the first user interface object. In response, the computer system displays the first user interface object in a second view of the three-dimensional environment, including displaying the first user interface object with an appearance that emphasizes the first user interface object more than when the first user interface object was displayed with the modified appearance.
    Type: Grant
    Filed: September 19, 2022
    Date of Patent: October 22, 2024
    Assignee: Apple Inc.
    Inventors: Evgenii Krivoruchko, Israel Pastrana Vicente, Stephen O. Lemay, Christopher D. McKenzie, Zoey C. Taylor
  • Publication number: 20240281108
    Abstract: In some embodiments, a computer system displays a user interface object in a three-dimensional environment based on a change in spatial arrangement of a first portion of a user relative to the user interface object. In some embodiments, a computer system displays a user interface object in a three-dimensional environment in response to detecting a change in a spatial arrangement of a viewpoint of a user relative to the three-dimensional environment.
    Type: Application
    Filed: January 24, 2024
    Publication date: August 22, 2024
    Inventors: Evgenii KRIVORUCHKO, Wesley M. HOLDER, Matan STAUBER, Stephen O. LEMAY, Zoey C. TAYLOR, Christopher D. MCKENZIE
  • Publication number: 20240272782
    Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
    Type: Application
    Filed: September 22, 2023
    Publication date: August 15, 2024
    Inventors: Israel PASTRANA VICENTE, Evgenii KRIVORUCHKO, Danielle M. PRICE, Jonathan R. DASCOLA, Kristi E. BAUERLY, Marcos ALONSO, Hugo D. VERWEIJ, Lorena S. PAZMINO, Jonathan RAVASZ, Zoey C. TAYLOR, Miquel ESTANY RODRIGUEZ, James J. OWEN
  • Publication number: 20240256049
    Abstract: Techniques for controlling and/or moving a cursor, such as by using air gestures, are described.
    Type: Application
    Filed: January 2, 2024
    Publication date: August 1, 2024
    Inventors: Evgenii KRIVORUCHKO, Benjamin H. BOESEL, Jia WANG
  • Publication number: 20240248678
    Abstract: An example process includes: receiving a first input corresponding to a request to initiate a digital assistant; and in response to receiving the first input, initiating a first instance of a digital assistant session, including: in accordance with a determination that a set of display criteria is satisfied, displaying a digital assistant indicator at a first location in an extended reality (XR) environment; and in accordance with a determination that the set of display criteria is not satisfied, displaying the digital assistant indicator at a second location in the XR environment, wherein the second location is different from the first location.
    Type: Application
    Filed: November 6, 2023
    Publication date: July 25, 2024
    Inventors: Evgenii KRIVORUCHKO, Jay MOON, Lynn I. STREJA, Garrett L. WEINBERG, Pedro MARI
  • Publication number: 20240201493
    Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
    Type: Application
    Filed: February 26, 2024
    Publication date: June 20, 2024
    Inventors: Jonathan RAVASZ, Etienne PINCHON, Adam Tibor VARGA, Jasper STEVENS, Robert ELLIS, Jonah JONES, Evgenii KRIVORUCHKO
  • Publication number: 20240152245
    Abstract: A computer system displays a first object that includes at least a first portion of the first object and a second portion of the first object and detects a first gaze input that meets first criteria, wherein the first criteria require that the first gaze input is directed to the first portion of the first object in order for the first criteria to be met. In response, the computer system displays a first control element that corresponds to a first operation associated with the first object, wherein the first control element was not displayed prior to detecting that the first gaze input met the first criteria, and detects a first user input directed to the first control element. In response to detecting the first user input directed to the first control element, the computer system performs the first operation with respect to the first object.
    Type: Application
    Filed: September 21, 2023
    Publication date: May 9, 2024
    Inventors: Lee S. Broughton, Israel Pastrana Vicente, Matan Stauber, Miquel Estany Rodriguez, James J. Owen, Jonathan R. Dascola, Stephen O. Lemay, Christian Schnorr, Zoey C. Taylor, Jay Moon, Benjamin H. Boesel, Benjamin Hylak, Richard D. Lyons, William A. Sorrentino, III, Lynn I. Streja, Jonathan Ravasz, Nathan Gitter, Peter D. Anton, Michael J. Rockwell, Peter L. Hajas, Evgenii Krivoruchko, Mark A. Ebbole, James Magahern, Andrew J. Sawyer, Christopher D. McKenzie, Michael E. Buerli, Olivier D. R. Gutknecht
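The gaze-gated control described in the abstract above — a control element that appears only after gaze meets criteria on a designated portion of an object, after which input directed to the control performs the associated operation — reduces to a small state machine. A rough sketch follows; the dwell-time criterion and all names are assumptions, not details from the filing.

```python
# Illustrative sketch: a control element is revealed only after gaze has
# dwelt on the object's first portion long enough, and the operation runs
# only when input targets the revealed control. The 0.5 s dwell criterion
# and all names are assumptions.

DWELL_SECONDS = 0.5  # assumed gaze criterion

class GazeRevealedControl:
    def __init__(self):
        self.visible = False
        self._dwell = 0.0

    def on_gaze(self, on_first_portion, dt):
        """Accumulate gaze dwell on the first portion; reveal the control when met."""
        self._dwell = self._dwell + dt if on_first_portion else 0.0
        if self._dwell >= DWELL_SECONDS:
            self.visible = True
        return self.visible

    def on_input(self, targets_control):
        """Perform the operation only if the control is shown and targeted."""
        return self.visible and targets_control
```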
  • Patent number: 11947111
    Abstract: The present technology relates to artificial reality systems. Such systems provide projections a user can create to specify object interactions. For example, when a user wishes to interact with an object outside her immediate reach, she can use a projection to select, move, or otherwise interact with the distant object. The present technology also includes object selection techniques for identifying and disambiguating between objects, allowing a user to select objects both near and distant from the user. Yet further aspects of the present technology include techniques for interpreting various bimanual (two-handed) gestures for interacting with objects. The present technology further includes a model for differentiating between global and local modes for, e.g., providing different input modalities or interpretations of user gestures.
    Type: Grant
    Filed: September 2, 2022
    Date of Patent: April 2, 2024
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Jonathan Ravasz, Etienne Pinchon, Adam Tibor Varga, Jasper Stevens, Robert Ellis, Jonah Jones, Evgenii Krivoruchko
  • Publication number: 20240103676
    Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: Israel PASTRANA VICENTE, Danielle M. PRICE, Evgenii KRIVORUCHKO, Jonathan R. DASCOLA, Jonathan RAVASZ, Marcos ALONSO, Hugo D. VERWEIJ, Zoey C. TAYLOR, Lorena S. PAZMINO
  • Publication number: 20240103701
    Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: Israel PASTRANA VICENTE, Evgenii KRIVORUCHKO, Kristi E. BAUERLY, Lorena S. PAZMINO
  • Publication number: 20240103803
    Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: Evgenii KRIVORUCHKO, Kristi E. BAUERLY, Zoey C. TAYLOR, Miquel ESTANY RODRIGUEZ, James J. OWEN, Jose Antonio CHECA OLORIZ, Jay MOON, Pedro MARI, Lorena S. PAZMINO