Patents by Inventor James J. Owen

James J. Owen is named as an inventor on the following patent filings. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12373081
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Grant
    Filed: June 10, 2024
    Date of Patent: July 29, 2025
    Assignee: Apple Inc.
    Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie, James J. Owen
  • Publication number: 20250173979
    Abstract: A computer system displays a user interface object in a three-dimensional environment with background content behind the user interface object. A first portion of the user interface object has an appearance based on the appearance of the background content behind that portion, and content is displayed between the first portion and a front of the user interface object. When the content within the user interface object is moved in response to a user request, the content remains between the first portion and the front of the user interface object, and is displayed with a visual effect that is applied to the content based on a simulated thickness of the user interface object.
    Type: Application
    Filed: January 14, 2025
    Publication date: May 29, 2025
    Inventors: Miquel Estany Rodriguez, Wan Si Wan, Gregory M. Apodaca, William A. Sorrentino, III, James J. Owen, Pol Pla I. Conesa, Alan C. Dye, Stephen O. Lemay, Giancarlo Yerkes
  • Publication number: 20250118038
    Abstract: In some embodiments, a computer system changes a visual prominence of a respective virtual object in response to detecting a threshold amount of overlap between a first virtual object and a second virtual object. In some embodiments, a computer system changes a visual prominence of a respective virtual object based on a change in spatial location of a first virtual object with respect to a second virtual object. In some embodiments, a computer system applies visual effects to representations of physical objects, virtual environments, and/or physical environments. In some embodiments, a computer system changes a visual prominence of a virtual object relative to a three-dimensional environment based on display of overlapping objects of different types in the three-dimensional environment. In some embodiments, a computer system changes a level of opacity of a first virtual object overlapping a second virtual object in response to movement of the first virtual object.
    Type: Application
    Filed: December 19, 2024
    Publication date: April 10, 2025
    Inventors: William A. Sorrentino, III, Benjamin Hylak, Christopher D. McKenzie, Stephen O. Lemay, Zoey C. Taylor, Miquel Estany Rodriguez, James J. Owen
  • Patent number: 12236540
    Abstract: A computer system concurrently displays a view of a physical environment; and a computer-generated user interface element overlaid on the view of the physical environment. An appearance of the computer-generated user interface element is based on an appearance of the view of the physical environment on which the computer-generated user interface element is overlaid. In response to an appearance of the physical environment changing, the appearance of the computer-generated user interface element is updated at a first time based on a graphical composition of the appearance of one or more portions of the physical environment at different times prior to the first time, including: an appearance of a first portion of the physical environment at a second time that is before the first time; and an appearance of a second portion of the physical environment at a third time that is before the second time.
    Type: Grant
    Filed: September 21, 2022
    Date of Patent: February 25, 2025
    Assignee: Apple Inc.
    Inventors: Miquel Estany Rodriguez, Wan Si Wan, Gregory M. Apodaca, William A. Sorrentino, III, James J. Owen, Pol Pla I. Conesa, Alan C. Dye
  • Publication number: 20240411421
    Abstract: A computer system detects an input to invoke a home menu user interface. In response to detecting the input, the computer system displays, via one or more display generation components, the home menu user interface in a three-dimensional environment, including: if a viewpoint of a user in the three-dimensional environment had a first elevation relative to a reference plane in the three-dimensional environment, displaying the home menu user interface at a first height in the three-dimensional environment; and, if the viewpoint of the user in the three-dimensional environment had a second elevation relative to the reference plane in the three-dimensional environment, the second elevation being different from the first elevation, displaying the home menu user interface at a second height in the three-dimensional environment, the second height being different from the first height.
    Type: Application
    Filed: May 14, 2024
    Publication date: December 12, 2024
    Inventors: Israel Pastrana Vicente, Amy E. DeDonato, Marcos Alonso Ruiz, Lee S. Broughton, Richard D. Lyons, William A. Sorrentino, III, Stephen O. Lemay, James J. Owen, Miquel Estany Rodriguez, Jesse Chand, Jonathan R. Dascola, Christian Schnorr, Zoey C. Taylor, Jonathan Ravasz, Harlan B. Haskins, Vinay Chawda, Benjamin H. Boesel, Ieyuki Kawashima, Christopher D. McKenzie, Benjamin Hylak, Nathan Gitter, Nahckjoon Kim, Owen Monsma, Matan Stauber, Danielle M. Price
  • Publication number: 20240402889
    Abstract: The present disclosure generally relates to techniques and user interfaces for logging and/or interacting with health information.
    Type: Application
    Filed: May 21, 2024
    Publication date: December 5, 2024
    Inventors: Lindsey Maratta, Dima Badawi, Jose Antonio Checa Oloriz, Pablo F. Caro, Marie E. Dommenget, Bradley W. Griffin, James J. Owen, Stacie R. Terhaar
  • Publication number: 20240404189
    Abstract: While a view of a three-dimensional environment is visible, a computer system displays a user interface object with a first orientation in the three-dimensional environment and displays a simulated shadow, corresponding to the user interface object, at a first shadow position in the three-dimensional environment. The simulated shadow at the first shadow position has a first spatial relationship to the user interface object. In response to detecting a user input directed to the user interface object, the computer system changes an orientation of the user interface object from the first orientation to a different, second orientation, including: displaying the user interface object with the second orientation; and displaying the simulated shadow at a second shadow position in the three-dimensional environment, different from the first shadow position, at which the simulated shadow has a second spatial relationship to the user interface object that is different from the first spatial relationship.
    Type: Application
    Filed: May 28, 2024
    Publication date: December 5, 2024
    Inventors: James J. Owen, Miquel Estany Rodriguez, William A. Sorrentino, III
  • Publication number: 20240402870
    Abstract: The present disclosure generally relates to user interfaces for electronic devices, including user interfaces for presenting content.
    Type: Application
    Filed: March 25, 2024
    Publication date: December 5, 2024
    Inventors: Chia Yang Lin, Timothy T. Chong, Danielle M. Price, Israel Pastrana Vicente, William A. Sorrentino, III, Hugo D. Verweij, James J. Owen, Miquel Estany Rodriguez
  • Publication number: 20240403997
    Abstract: An example process includes: while displaying a portion of an extended reality (XR) environment representing a current field of view of a user: detecting, with the one or more sensors, a user input to invoke a digital assistant; in response to detecting a user input to invoke the digital assistant, displaying a user interface element associated with the digital assistant; distorting the display of a first portion of the current field of view behind the user interface element; detecting a change in the position of the user interface element relative to the current field of view; and distorting the display of a second portion of the current field of view behind the user interface element.
    Type: Application
    Filed: March 21, 2024
    Publication date: December 5, 2024
    Inventors: Jose Antonio Checa Oloriz, Miquel Estany Rodriguez, Arjun Kaul, Pedro Mari, Christopher C. Niederauer, James J. Owen, William A. Sorrentino, III
  • Publication number: 20240329797
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Application
    Filed: June 10, 2024
    Publication date: October 3, 2024
    Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie, James J. Owen
  • Publication number: 20240272782
    Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
    Type: Application
    Filed: September 22, 2023
    Publication date: August 15, 2024
    Inventors: Israel Pastrana Vicente, Evgenii Krivoruchko, Danielle M. Price, Jonathan R. Dascola, Kristi E. Bauerly, Marcos Alonso, Hugo D. Verweij, Lorena S. Pazmino, Jonathan Ravasz, Zoey C. Taylor, Miquel Estany Rodriguez, James J. Owen
  • Publication number: 20240177424
    Abstract: Systems and processes for operating an intelligent automated assistant within a computer-generated reality (CGR) environment are provided. For example, a user input invoking a digital assistant session is received, and in response, a digital assistant session is initiated. Initiating the digital assistant session includes positioning a digital assistant object at a first location within the CGR environment but outside of the currently-displayed portion of the CGR environment at a first time, and providing a first output indicating the location of the digital assistant object.
    Type: Application
    Filed: February 6, 2024
    Publication date: May 30, 2024
    Inventors: Brad K. Herman, Garrett L. Weinberg, Isar Arason, Pedro Mari, Shiraz Akmal, Stephen O. Lemay, James J. Owen, Miquel Estany Rodriguez, Jay Moon, William A. Sorrentino, III, Jose Antonio Checa Oloriz, Lynn I. Streja
  • Publication number: 20240152245
    Abstract: A computer system displays a first object that includes at least a first portion of the first object and a second portion of the first object and detects a first gaze input that meets first criteria, wherein the first criteria require that the first gaze input is directed to the first portion of the first object in order for the first criteria to be met. In response, the computer system displays a first control element that corresponds to a first operation associated with the first object, wherein the first control element was not displayed prior to detecting that the first gaze input met the first criteria, and detects a first user input directed to the first control element. In response to detecting the first user input directed to the first control element, the computer system performs the first operation with respect to the first object.
    Type: Application
    Filed: September 21, 2023
    Publication date: May 9, 2024
    Inventors: Lee S. Broughton, Israel Pastrana Vicente, Matan Stauber, Miquel Estany Rodriguez, James J. Owen, Jonathan R. Dascola, Stephen O. Lemay, Christian Schnorr, Zoey C. Taylor, Jay Moon, Benjamin H. Boesel, Benjamin Hylak, Richard D. Lyons, William A. Sorrentino, III, Lynn I. Streja, Jonathan Ravasz, Nathan Gitter, Peter D. Anton, Michael J. Rockwell, Peter L. Hajas, Evgenii Krivoruchko, Mark A. Ebbole, James Magahern, Andrew J. Sawyer, Christopher D. McKenzie, Michael E. Buerli, Olivier D. R. Gutknecht
  • Publication number: 20240103803
    Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: Evgenii Krivoruchko, Kristi E. Bauerly, Zoey C. Taylor, Miquel Estany Rodriguez, James J. Owen, Jose Antonio Checa Oloriz, Jay Moon, Pedro Mari, Lorena S. Pazmino
  • Publication number: 20240087256
    Abstract: In some embodiments, a computer system facilitates depth conflict mitigation for a virtual object that is in contact with one or more physical objects in a three-dimensional environment by changing visual properties of one or more portions of the virtual object. In some embodiments, a computer system facilitates depth conflict mitigation for a virtual object that is in contact with one or more physical objects in a three-dimensional environment by displaying the virtual object in a virtual environment within the three-dimensional environment. In some embodiments, a computer system facilitates depth conflict mitigation for a virtual object that is in contact with one or more physical objects in a three-dimensional environment by variably changing visual properties of one or more portions of the virtual object and/or by variably displaying the virtual object in a virtual environment based on one or more characteristics of the depth conflict.
    Type: Application
    Filed: September 14, 2023
    Publication date: March 14, 2024
    Inventors: Benjamin Hylak, William A. Sorrentino, III, Christopher D. McKenzie, James J. Owen, Zoey C. Taylor, Miquel Estany Rodriguez
  • Patent number: D1068805
    Type: Grant
    Filed: June 4, 2023
    Date of Patent: April 1, 2025
    Assignee: Apple Inc.
    Inventors: Gregory M. Apodaca, Jesse Chand, Thalia Jimena Echevarria Fiol, Miquel Estany Rodriguez, Stephen O. Lemay, Richard D. Lyons, James J. Owen, Lorena S. Pazmino, William A. Sorrentino, III, Matan Stauber, Wan Si Wan
  • Patent number: D1068810
    Type: Grant
    Filed: June 4, 2023
    Date of Patent: April 1, 2025
    Assignee: Apple Inc.
    Inventors: Gregory M. Apodaca, Jesse Chand, Jonathan R. Dascola, Thalia Jimena Echevarria Fiol, Miquel Estany Rodriguez, Stephen O. Lemay, Richard D. Lyons, James J. Owen, Israel Pastrana Vicente, Lorena S. Pazmino, William A. Sorrentino, III, Matan Stauber, Wan Si Wan
  • Patent number: D1078761
    Type: Grant
    Filed: June 5, 2023
    Date of Patent: June 10, 2025
    Assignee: Apple Inc.
    Inventors: Jesse Chand, Jonathan R. Dascola, Miquel Estany Rodriguez, Stephen O. Lemay, Richard D. Lyons, James J. Owen, Israel Pastrana Vicente, Lorena S. Pazmino, William A. Sorrentino, III, Matan Stauber
  • Patent number: D1087121
    Type: Grant
    Filed: June 4, 2023
    Date of Patent: August 5, 2025
    Assignee: Apple Inc.
    Inventors: Gregory M. Apodaca, Jesse Chand, Miquel Estany Rodriguez, Stephen O. Lemay, James J. Owen, Israel Pastrana Vicente, Lorena S. Pazmino, William A. Sorrentino, III, Matan Stauber, Wan Si Wan
  • Patent number: D1087123
    Type: Grant
    Filed: June 4, 2023
    Date of Patent: August 5, 2025
    Assignee: Apple Inc.
    Inventors: Gregory M. Apodaca, Lee S. Broughton, Jesse Chand, Miquel Estany Rodriguez, Stephen O. Lemay, Richard D. Lyons, James J. Owen, Israel Pastrana Vicente, William A. Sorrentino, III, Matan Stauber, Wan Si Wan