Patents by Inventor Israel Pastrana Vicente

Israel Pastrana Vicente has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11995285
    Abstract: In some embodiments, an electronic device emphasizes and/or deemphasizes user interfaces based on the gaze of a user. In some embodiments, an electronic device defines levels of immersion for different user interfaces independently of one another. In some embodiments, an electronic device resumes display of a user interface at a previously-displayed level of immersion after (e.g., temporarily) reducing the level of immersion associated with the user interface. In some embodiments, an electronic device allows objects, people, and/or portions of an environment to be visible through a user interface displayed by the electronic device. In some embodiments, an electronic device reduces the level of immersion associated with a user interface based on characteristics of the electronic device and/or physical environment of the electronic device.
    Type: Grant
    Filed: September 15, 2022
    Date of Patent: May 28, 2024
    Assignee: Apple Inc.
    Inventors: Nicholas W. Henderson, Ieyuki Kawashima, Stephen O. Lemay, Israel Pastrana Vicente, Wesley M. Holder, Jeffrey M. Faulkner, William A. Sorrentino, III, Peter D. Anton
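    One mechanism described in this abstract is resuming a user interface at its previously displayed level of immersion after a temporary reduction. The Swift sketch below models only that bookkeeping; the ImmersionController type, the string keys, and the numeric levels are illustrative assumptions, not details from the patent.

    ```swift
    // Hypothetical per-interface immersion bookkeeping (names and values are illustrative only).
    struct ImmersionController {
        private var levels: [String: Double] = [:]          // current level per user interface
        private var storedLevels: [String: Double] = [:]    // level to restore after a temporary reduction

        mutating func setLevel(_ level: Double, for ui: String) {
            levels[ui] = level
        }

        // Temporarily reduce immersion, remembering the previously displayed level.
        mutating func temporarilyReduce(ui: String, to reduced: Double) {
            storedLevels[ui] = levels[ui] ?? 0
            levels[ui] = reduced
        }

        // Resume display at the previously displayed level of immersion.
        mutating func resume(ui: String) {
            if let previous = storedLevels.removeValue(forKey: ui) {
                levels[ui] = previous
            }
        }

        func level(for ui: String) -> Double { levels[ui] ?? 0 }
    }

    var controller = ImmersionController()
    controller.setLevel(0.8, for: "meditationApp")
    controller.temporarilyReduce(ui: "meditationApp", to: 0.2)  // e.g. something in the physical environment
    controller.resume(ui: "meditationApp")
    print(controller.level(for: "meditationApp"))               // 0.8 — the previously displayed level
    ```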
  • Publication number: 20240152245
    Abstract: A computer system displays a first object that includes at least a first portion of the first object and a second portion of the first object and detects a first gaze input that meets first criteria, wherein the first criteria require that the first gaze input is directed to the first portion of the first object in order for the first criteria to be met. In response, the computer system displays a first control element that corresponds to a first operation associated with the first object, wherein the first control element was not displayed prior to detecting that the first gaze input met the first criteria, and detects a first user input directed to the first control element. In response to detecting the first user input directed to the first control element, the computer system performs the first operation with respect to the first object.
    Type: Application
    Filed: September 21, 2023
    Publication date: May 9, 2024
    Inventors: Lee S. Broughton, Israel Pastrana Vicente, Matan Stauber, Miquel Estany Rodriguez, James J. Owen, Jonathan R. Dascola, Stephen O. Lemay, Christian Schnorr, Zoey C. Taylor, Jay Moon, Benjamin H. Boesel, Benjamin Hylak, Richard D. Lyons, William A. Sorrentino, III, Lynn I. Streja, Jonathan Ravasz, Nathan Gitter, Peter D. Anton, Michael J. Rockwell, Peter L. Hajas, Evgenii Krivoruchko, Mark A. Ebbole, James Magahern, Andrew J. Sawyer, Christopher D. McKenzie, Michael E. Buerli, Olivier D. R. Gutknecht
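    This abstract turns on a two-step interaction: a gaze input meeting criteria directed at a specific portion of an object reveals a control element, and a later input on that control performs the associated operation. A minimal Swift sketch of that flow follows; the dwell-time criterion and all type names are assumptions for illustration, not the claimed criteria.

    ```swift
    import Foundation

    enum ObjectPortion { case first, second }

    struct GazeInput {
        let target: ObjectPortion
        let dwell: TimeInterval    // how long gaze has stayed on that portion (assumed criterion)
    }

    final class ControlledObject {
        private(set) var controlVisible = false

        // "First criteria": gaze must be directed to the first portion of the object.
        // The minimum dwell time is an assumption, not a criterion from the filing.
        func handle(gaze: GazeInput, minimumDwell: TimeInterval = 0.3) {
            if gaze.target == .first && gaze.dwell >= minimumDwell {
                controlVisible = true          // display the control element
            }
        }

        // A user input directed to the now-visible control performs the associated operation.
        func handleInputOnControl(_ operation: () -> Void) {
            guard controlVisible else { return }   // control was not displayed before the criteria were met
            operation()
        }
    }
    ```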
  • Publication number: 20240152244
    Abstract: While displaying an application user interface, a device detects a first input to an input device of the one or more input devices, the input device provided on a housing of the device that includes the one or more display generation components. In response to detecting the first input, the device replaces display of at least a portion of the application user interface by displaying a home menu user interface via the one or more display generation components. While displaying the home menu user interface, the device detects a second input to the input device provided on the housing of the device; and in response to detecting the second input to the input device provided on the housing of the device: the device dismisses the home menu user interface.
    Type: Application
    Filed: September 18, 2023
    Publication date: May 9, 2024
    Inventors: Amy E. DeDonato, Israel Pastrana Vicente, Nathan Gitter, Christopher D. McKenzie, Stephen O. Lemay, Zoey C. Taylor, Vitalii Kramar, Benjamin Hylak, Sanket S. Dave, Deepak Iyer, Lauren A. Hastings, Madhur Ahuja, Natalia A. Fornshell, Christopher J. Romney, Joaquim Goncola Lobo Ferreira da Silva, Shawna M. Spain
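    At its core, the behavior in this abstract is a toggle driven by an input device on the housing: one press brings up the home menu over the application user interface, and the next press dismisses it. A minimal Swift sketch with hypothetical names:

    ```swift
    struct DisplayState {
        var homeMenuVisible = false

        // One press shows the home menu over the application user interface;
        // the next press dismisses it again.
        mutating func handleHousingButtonPress() {
            homeMenuVisible.toggle()
        }
    }

    var state = DisplayState()
    state.handleHousingButtonPress()   // home menu replaces part of the application UI
    state.handleHousingButtonPress()   // home menu is dismissed
    ```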
  • Patent number: 11954242
    Abstract: A computer system presents first computer-generated content. While presenting the first computer-generated content, the computer system detects first movement of a first user in a physical environment, and in response: in accordance with a determination that the first movement changes a spatial relationship between the first user and a second user in the physical environment from a first spatial relationship to a second spatial relationship and that the change in spatial relationship meets first criteria, the computer system changes one or more output properties of the first computer-generated content; and in accordance with the determination that the first movement changes the spatial relationship from the first spatial relationship to the second spatial relationship and that the change in spatial relationship does not meet the first criteria, the computer system presents the first computer-generated content without changing the one or more output properties of the first computer-generated content.
    Type: Grant
    Filed: December 28, 2021
    Date of Patent: April 9, 2024
    Assignee: APPLE INC.
    Inventors: Jonathan R. Dascola, Israel Pastrana Vicente, Peter D. Anton, Stephen O. Lemay, William A. Sorrentino, III, Kristi E. S. Bauerly, Philipp Rockel, Dorian D. Dargan
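    The granted claims above hinge on whether a change in the spatial relationship between two users meets certain criteria. The Swift sketch below substitutes a simple distance threshold for those criteria and audio/visual ducking for the changed output properties; both are illustrative assumptions, not the patent's actual tests.

    ```swift
    struct User { var x, y, z: Double }               // position in the physical environment

    struct ContentOutput {
        var volume: Double = 1.0
        var visualOpacity: Double = 1.0
    }

    func distance(_ a: User, _ b: User) -> Double {
        let dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z
        return (dx * dx + dy * dy + dz * dz).squareRoot()
    }

    // A simple proximity threshold stands in for the patent's "first criteria".
    func updateOutput(_ output: inout ContentOutput,
                      firstUser: User, secondUser: User,
                      proximityThreshold: Double = 1.0) {
        if distance(firstUser, secondUser) < proximityThreshold {
            output.volume = 0.3          // change output properties: duck the audio
            output.visualOpacity = 0.5   // and let the other user show through
        }
        // Otherwise the content keeps playing with its output properties unchanged.
    }
    ```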
  • Publication number: 20240103701
    Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: Israel PASTRANA VICENTE, Evgenii KRIVORUCHKO, Kristi E. BAUERLY, Lorena S. PAZMINO
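    This publication (and the related publications below that share the same abstract) covers several attention-driven behaviors; one of them is a visual indicator showing progress toward selecting a virtual object while criteria are met. A minimal Swift sketch of a dwell-based progress model, with the dwell criterion and all names assumed for illustration:

    ```swift
    import Foundation

    struct DwellSelector {
        let requiredDwell: TimeInterval
        private(set) var elapsed: TimeInterval = 0

        // Progress toward selection in 0...1, suitable for driving a visual indicator.
        var progress: Double { min(elapsed / requiredDwell, 1.0) }
        var isSelected: Bool { elapsed >= requiredDwell }

        // Call once per frame with the frame duration and whether attention is still on the object.
        mutating func update(deltaTime: TimeInterval, attentionOnObject: Bool) {
            elapsed = attentionOnObject ? elapsed + deltaTime : 0   // reset if attention moves away
        }
    }
    ```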
  • Publication number: 20240103712
    Abstract: A computer system detects a gaze input directed to a region in an environment and, while detecting the gaze input, detects a touch input. In response, the computer system displays a focus indicator at a location corresponding to the region. The computer system detects a continuation of the touch input that includes movement of the touch input along an input surface while being maintained on the input surface. In response, the computer system moves the focus indicator in accordance with the movement of the touch input: within a user interface of an application, if the movement corresponds to a request to move the focus indicator within the user interface; and within the user interface without moving the focus indicator outside of the boundary of the user interface, if the movement corresponds to a request to move the focus indicator outside of a boundary of the user interface.
    Type: Application
    Filed: September 19, 2023
    Publication date: March 28, 2024
    Inventors: Jonathan Ravasz, Israel Pastrana Vicente, Stephen O. Lemay, Kristi E.S. Bauerly, Zoey C. Taylor
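    A central behavior in this abstract is keeping the focus indicator inside the user interface: movement that would carry it past the boundary instead moves it within the boundary. The Swift sketch below shows that clamping idea with hypothetical Rect and Point types; it is not the filing's implementation.

    ```swift
    struct Rect { var minX, minY, maxX, maxY: Double }
    struct Point { var x, y: Double }

    struct FocusIndicator {
        var position: Point
        let bounds: Rect                 // boundary of the application user interface

        // Move in accordance with the touch movement, but never outside the boundary.
        mutating func move(by delta: Point) {
            position.x = min(max(position.x + delta.x, bounds.minX), bounds.maxX)
            position.y = min(max(position.y + delta.y, bounds.minY), bounds.maxY)
        }
    }
    ```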
  • Publication number: 20240103681
    Abstract: A computer system displays a first user interface object and a first control element with a first appearance that is associated with performing a first operation with respect to the first user interface object, in a first view of a three-dimensional environment. The computer system detects a first gaze input that is directed to the first control element, and in response, updates an appearance of the first control element to a second appearance that is different from the first appearance. While displaying the first control element with the second appearance, the computer system detects a first user input directed to the first control element, and in accordance with a determination that the first user input meets first criteria, updates the appearance of the first control element from the second appearance to a third appearance that is different from the first appearance and the second appearance.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Lee S. Broughton, Israel Pastrana Vicente, Matan Stauber, Jonathan R. Dascola, Stephen O. Lemay, Zoey C. Taylor
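    The abstract above reads naturally as a three-state appearance machine for a control element: an initial appearance, a gaze-highlighted appearance, and an activated appearance once an input meets the first criteria. A minimal Swift sketch under that reading, with all names assumed:

    ```swift
    enum ControlAppearance { case first, second, third }

    struct ControlElement {
        private(set) var appearance: ControlAppearance = .first

        // A gaze input directed to the control updates it to the second appearance.
        mutating func gazeArrived() {
            if appearance == .first { appearance = .second }
        }

        // A further input that meets the first criteria updates it to the third appearance.
        mutating func userInput(meetsFirstCriteria: Bool) {
            if appearance == .second && meetsFirstCriteria { appearance = .third }
        }
    }
    ```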
  • Publication number: 20240103616
    Abstract: Gaze enrollment, including displaying an enrollment progress user indicator, animating movement of user interface elements, changing the appearances of user interface elements, and/or moving a user interface element over time, enable a computer system to more accurately track the gaze of a user of the computer system.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Giancarlo YERKES, Adam L. AMADIO, Kaely COON, Amy E. DEDONATO, Stephen O. LEMAY, Israel PASTRANA VICENTE, William A. SORRENTINO, III, Lynn I. STREJA
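    One way to picture the enrollment flow described above (and in publication 20240103617 below, which shares this abstract) is a sequence of target positions that the user fixates in turn while a progress indicator advances. The Swift sketch below is only that skeleton; the positions, progress model, and names are illustrative assumptions.

    ```swift
    struct GazeEnrollment {
        let targetPositions: [(x: Double, y: Double)]   // where the moving target will appear
        private(set) var completed = 0

        var progress: Double {
            targetPositions.isEmpty ? 1 : Double(completed) / Double(targetPositions.count)
        }
        var currentTarget: (x: Double, y: Double)? {
            completed < targetPositions.count ? targetPositions[completed] : nil
        }

        // Advance when the tracker confirms the user fixated the current target.
        mutating func recordFixationOnCurrentTarget() {
            if completed < targetPositions.count { completed += 1 }
        }
    }
    ```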
  • Publication number: 20240103676
    Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: Israel PASTRANA VICENTE, Danielle M. PRICE, Evgenii KRIVORUCHKO, Jonathan R. DASCOLA, Jonathan RAVASZ, Marcos ALONSO, Hugo D. VERWEIJ, Zoey C. TAYLOR, Lorena S. PAZMINO
  • Publication number: 20240103617
    Abstract: Gaze enrollment, including displaying an enrollment progress user indicator, animating movement of user interface elements, changing the appearances of user interface elements, and/or moving a user interface element over time, enable a computer system to more accurately track the gaze of a user of the computer system.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Giancarlo YERKES, Adam L. AMADIO, Amy E. DEDONATO, Kirsty KEATCH, Stephen O. LEMAY, Israel PASTRANA VICENTE, Danielle M. PRICE, William A. SORRENTINO, III, Lynn I. STREJA, Hugo D. VERWEIJ, Hana Z. WANG
  • Publication number: 20240103680
    Abstract: A computer system displays a user interface that includes a first region and a second region separated by a third region and displays a focus indicator within the first region. In response to detecting an input to move the focus indicator relative to the user interface, the input being associated with movement toward the second region, if the input meets respective criteria based on the movement associated with the input, the computer system moves the focus indicator from the first region to the second region in accordance with the movement associated with the input without displaying the focus indicator in the third region; and, if the input does not meet the respective criteria, the computer system changes an appearance of the focus indicator in accordance with the movement associated with the input while continuing to display at least a portion of the focus indicator within the first region.
    Type: Application
    Filed: September 19, 2023
    Publication date: March 28, 2024
    Inventors: Jonathan Ravasz, Israel Pastrana Vicente, Stephen O. Lemay, Kristi E.S. Bauerly, Zoey C. Taylor
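    The behavior above distinguishes two outcomes for an input toward the second region: if movement criteria are met, the focus indicator crosses without being displayed in the separating region; otherwise its appearance changes while it stays in the first region. A Swift sketch of that branch, with a made-up distance threshold standing in for the respective criteria:

    ```swift
    struct RegionFocus {
        enum Region { case first, second }
        var region: Region = .first
        var stretch: Double = 0          // visual deformation while the move is resisted

        // `distance` is how far the input has moved toward the second region;
        // the threshold stands in for the filing's "respective criteria".
        mutating func handleMoveTowardSecondRegion(distance: Double, threshold: Double = 40) {
            if distance >= threshold {
                region = .second         // cross without being displayed in the separating region
                stretch = 0
            } else {
                stretch = distance / threshold   // change appearance but stay in the first region
            }
        }
    }
    ```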
  • Publication number: 20240104873
    Abstract: A computer system, while displaying, a first object at a first position in a first view of a three-dimensional environment, displays a first set of one or more control objects, wherein a respective control object of the first set of one or more control objects corresponds to a respective operation applicable to the first object. In response to detecting a first user input that corresponds to a request to move the first object in the three-dimensional environment, the computer system: moves the first object from the first position to a second position and, while moving the first object from the first position to the second position, visually deemphasizes relative to the first object, at least one of the first set of one or more control objects that corresponds to the respective operation that is applicable to the first object.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Israel Pastrana Vicente, Matan Stauber, Zoey C. Taylor
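    The key behavior above is visually deemphasizing an object's control objects while the object itself is being moved. A minimal Swift sketch, treating the deemphasis as an opacity change (an assumption; the filing does not specify how the deemphasis is rendered):

    ```swift
    struct MovableObject {
        var position: (x: Double, y: Double)
        var controlOpacity: Double = 1.0      // opacity of the associated control objects

        mutating func beginMove() { controlOpacity = 0.3 }    // deemphasize controls during the move
        mutating func move(to newPosition: (x: Double, y: Double)) { position = newPosition }
        mutating func endMove() { controlOpacity = 1.0 }      // restore controls once the move ends
    }
    ```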
  • Publication number: 20240103716
    Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: Israel PASTRANA VICENTE, Evgenii KRIVORUCHKO, Kristi E. BAUERLY
  • Publication number: 20240103704
    Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: Israel PASTRANA VICENTE, Evgenii KRIVORUCHKO, Kristi E. BAUERLY, Zoey C. TAYLOR
  • Publication number: 20240104861
    Abstract: While displaying an application user interface, in response to detecting a first input to an input device, a computer system, in accordance with a determination that the application user interface is in a first mode of display, wherein the first mode of display includes an immersive mode in which only content of the application user interface is displayed, displays via the display generation component the application user interface in a second mode of display, wherein the second mode of display includes a non-immersive mode in which respective content of the application user interface and other content are concurrently displayed, and in accordance with a determination that the application user interface is in the second mode of display, the computer system replaces display of at least a portion of the application user interface by displaying a home menu user interface via the display generation component.
    Type: Application
    Filed: September 18, 2023
    Publication date: March 28, 2024
    Inventors: Amy E. DeDonato, Israel Pastrana Vicente, Nathan Gitter, Stephen O. Lemay, Zoey C. Taylor
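    The logic above depends on the current display mode: the same input exits an immersive presentation when one is active, and otherwise brings up the home menu. A minimal Swift sketch of that branch, with hypothetical names:

    ```swift
    enum DisplayMode { case immersive, nonImmersive }

    struct SystemUI {
        var mode: DisplayMode = .immersive
        var homeMenuVisible = false

        mutating func handleButtonPress() {
            switch mode {
            case .immersive:
                mode = .nonImmersive       // first press: leave the immersive mode
            case .nonImmersive:
                homeMenuVisible = true     // next press: show the home menu over the app UI
            }
        }
    }
    ```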
  • Publication number: 20240103687
    Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: Israel PASTRANA VICENTE, Evgenii KRIVORUCHKO, Kristi E. BAUERLY, Zoey C. TAYLOR
  • Patent number: 11934569
    Abstract: A computer system displays a first and second user interface object in a three-dimensional environment. The first and second user interface objects have a first and second spatial relationship to a first and second anchor position corresponding to a location of a user's hand in a physical environment, respectively. While displaying the first and second user interface objects in the three-dimensional environment, the computer system detects movement of the user's hand in the physical environment, corresponding to a translational movement and a rotational movement of the user's hand relative to a viewpoint, and in response, translates the first and second user interface objects relative to the viewpoint in accordance with the translational movement of the user's hand, and rotates the first user interface object relative to the viewpoint in accordance with the rotational movement of the user's hand without rotating the second user interface object.
    Type: Grant
    Filed: September 19, 2022
    Date of Patent: March 19, 2024
    Assignee: APPLE INC.
    Inventors: Israel Pastrana Vicente, Jonathan R. Dascola, Christopher D. McKenzie, Jesse Chand, Stephen O. Lemay, Kristi E. S. Bauerly, Zoey C. Taylor
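    The granted claims above split a hand movement into its translational and rotational components: both hand-anchored objects follow the translation, but only the first follows the rotation. The Swift sketch below shows that split with a simplified one-axis rotation; the types and fields are illustrative only.

    ```swift
    struct AnchoredObject {
        var x, y, z: Double      // position relative to the viewpoint
        var rotation: Double     // rotation about a single axis, in radians (simplified)
    }

    func applyHandMovement(dx: Double, dy: Double, dz: Double, rotationDelta: Double,
                           first: inout AnchoredObject, second: inout AnchoredObject) {
        // Both objects follow the translational movement of the hand.
        first.x += dx;  first.y += dy;  first.z += dz
        second.x += dx; second.y += dy; second.z += dz
        // Only the first object follows the rotational movement.
        first.rotation += rotationDelta
    }
    ```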
  • Publication number: 20240053859
    Abstract: A computer system displays, in a simulated three-dimensional space, an object with a user interface displayed at a pose corresponding to a pose of the object in the simulated space, the object's pose corresponding to a pose of an input device in a physical environment. In response to detecting a movement input via the input device: if the movement input corresponds to input device movement, relative to the physical environment, meeting pose criteria requiring that a parameter of change in the input device pose meet a set of one or more thresholds, the computer system displays the user interface away from the object; and, if the movement input corresponds to input device movement not meeting the pose criteria, the computer system updates the object's pose in the simulated space based on the input device movement, while maintaining display of the user interface at a pose corresponding to the object's pose.
    Type: Application
    Filed: October 26, 2023
    Publication date: February 15, 2024
    Inventors: Jeffrey M. Faulkner, Wesley M. Holder, Giancarlo Yerkes, Israel Pastrana Vicente, William A. Sorrentino, III, Stephen O. Lemay
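    The publication above distinguishes movements of the input device by whether a parameter of the pose change meets a threshold: if it does, the user interface is displayed away from the object; if not, the object and interface keep moving together. The Swift sketch below uses an assumed rate-of-change threshold as that parameter; it is not the claimed criterion.

    ```swift
    struct TrackedObject { var pose: Double }    // simplified one-axis pose

    struct AttachedUI {
        var attachedToObject = true
        var pose: Double = 0

        // `changeRate` stands in for the filing's "parameter of change" in the device pose.
        mutating func handleMovement(object: inout TrackedObject,
                                     poseChange: Double, changeRate: Double,
                                     rateThreshold: Double = 5.0) {
            if changeRate >= rateThreshold {
                attachedToObject = false         // display the user interface away from the object
            } else {
                object.pose += poseChange        // update the object's pose in the simulated space
                if attachedToObject { pose = object.pose }   // UI keeps a pose matching the object
            }
        }
    }
    ```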
  • Publication number: 20240028177
    Abstract: The present disclosure generally relates to user interfaces for electronic devices, including user interfaces for viewing and interacting with media items.
    Type: Application
    Filed: September 28, 2023
    Publication date: January 25, 2024
    Inventors: Israel PASTRANA VICENTE, Benjamin H. BOESEL, Shih-Sang CHIU, Graham R. CLARKE, Miquel ESTANY RODRIGUEZ, Chia Yang LIN, James J. OWEN, Jonathan RAVASZ, William A. SORRENTINO, III
  • Patent number: 11861136
    Abstract: A computer system displays a view of at least a portion of a simulated three-dimensional space, and a view of a user interface object located within the simulated three-dimensional space. The user interface object is a representation of a computing device that has a non-immersive display environment that provides access to a plurality of different applications. The user interface object includes a first user interface that corresponds to the non-immersive display environment of the computing device and is responsive to touch inputs from a user on the input device, and a pose of the user interface object in the simulated three-dimensional space corresponds to a pose of the input device in a physical space surrounding the input device. In response to a touch input that corresponds to a respective location in the first user interface, an appearance of the first user interface is updated.
    Type: Grant
    Filed: September 21, 2018
    Date of Patent: January 2, 2024
    Assignee: APPLE INC.
    Inventors: Jeffrey M. Faulkner, Wesley M. Holder, Giancarlo Yerkes, Israel Pastrana Vicente, William A. Sorrentino, III, Stephen O. Lemay
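    The granted patent above describes forwarding touches on a physical input device to the corresponding location in the first user interface shown on the simulated device, and updating the appearance there. A minimal Swift sketch of that mapping, with normalized touch coordinates assumed for illustration:

    ```swift
    struct TouchPoint { var x, y: Double }       // normalized 0...1 on the input device surface

    struct SimulatedDeviceUI {
        var width: Double
        var height: Double
        private(set) var highlighted: [TouchPoint] = []

        // Map the physical touch to the respective location in the first user interface
        // and record it so the appearance at that location can be updated.
        mutating func handleTouch(_ touch: TouchPoint) {
            highlighted.append(TouchPoint(x: touch.x * width, y: touch.y * height))
        }
    }
    ```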