Patents by Inventor Ieyuki Kawashima

Ieyuki Kawashima has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12271658
    Abstract: An example process includes: displaying, on a display of an electronic device, an extended reality (XR) environment corresponding to a copresence session including the electronic device and a second electronic device; while displaying the XR environment: sampling, with a microphone of the electronic device, a first audio input; determining whether the first audio input is intended for a first digital assistant operating on an external electronic device; and in accordance with a determination that the first audio input is intended for the first digital assistant: causing the first digital assistant to provide an audible response to the first audio input, where the audible response is not transmitted to the second electronic device over a shared communication channel for the copresence session.
    Type: Grant
    Filed: February 23, 2022
    Date of Patent: April 8, 2025
    Assignee: Apple Inc.
    Inventors: Jessica J. Peck, James N. Jones, Ieyuki Kawashima, Lynn I. Streja, Stephen O. Lemay
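    Illustrative sketch: the abstract above describes a routing decision — audio judged to be intended for the personal digital assistant is answered locally and kept off the shared copresence channel. The Swift below is a minimal, hypothetical sketch of that branch; the types (AudioInput, DigitalAssistant, CopresenceSession) and the intent check are placeholders, not Apple APIs.
    ```swift
    // Hypothetical stand-ins for the entities named in the abstract; not Apple APIs.
    struct AudioInput { let transcript: String }

    protocol DigitalAssistant {
        func audibleResponse(to input: AudioInput) -> String
    }

    final class CopresenceSession {
        private(set) var sharedChannelLog: [String] = []
        // Audio forwarded to the second electronic device in the session.
        func transmitToSharedChannel(_ utterance: String) {
            sharedChannelLog.append(utterance)
        }
    }

    // Mirrors the branch in the abstract: audio intended for the first digital
    // assistant is answered locally and never transmitted over the shared
    // communication channel; other speech is treated as ordinary copresence audio.
    func route(_ input: AudioInput,
               assistant: DigitalAssistant,
               session: CopresenceSession,
               isIntendedForAssistant: (AudioInput) -> Bool) -> String? {
        if isIntendedForAssistant(input) {
            return assistant.audibleResponse(to: input)   // private, local response
        } else {
            session.transmitToSharedChannel(input.transcript)
            return nil
        }
    }
    ```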
  • Patent number: 12265364
    Abstract: An electronic device, with a display, a touch-sensitive surface, one or more processors and memory, displays a first representation of a first controllable external device, where the first controllable external device is situated at a location. The device detects a first user input corresponding to a selection of the first representation of the first controllable external device. The device, after detecting the first user input, adds data identifying the first controllable external device and a first state of the first controllable external device to a scene profile.
    Type: Grant
    Filed: March 26, 2020
    Date of Patent: April 1, 2025
    Assignee: Apple Inc.
    Inventors: Patrick L. Coffman, Arian Behzadi, Christopher Patrick Foss, Cyrus Daniel Irani, Ieyuki Kawashima, Stephen O. Lemay, Christopher D. Soli, Christopher Wilson
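    Illustrative sketch: the abstract describes building a scene profile by capturing, on selection, data identifying a controllable external device together with a state of that device. The Swift below is a minimal, hypothetical model of that data structure; the type names are placeholders and not HomeKit or any other Apple API.
    ```swift
    // Hypothetical model types; not HomeKit or any other Apple API.
    struct AccessoryState { let isOn: Bool }

    struct ControllableExternalDevice {
        let identifier: String          // data identifying the device
        let location: String            // where the device is situated
        var state: AccessoryState       // the "first state" captured into the scene
    }

    // A scene profile accumulates (device identifier, captured state) pairs: after
    // the selection input is detected, data identifying the device and a state of
    // that device are added to the profile.
    struct SceneProfile {
        private(set) var entries: [(deviceID: String, state: AccessoryState)] = []

        mutating func add(_ device: ControllableExternalDevice) {
            entries.append((deviceID: device.identifier, state: device.state))
        }
    }

    // Usage: selecting the displayed representation of a lamp adds it to the scene.
    var goodNight = SceneProfile()
    goodNight.add(ControllableExternalDevice(identifier: "lamp-01",
                                             location: "Bedroom",
                                             state: AccessoryState(isOn: false)))
    ```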
  • Patent number: 12242707
    Abstract: The present disclosure generally relates to selecting and opening applications. An electronic device includes a display and a rotatable input mechanism rotatable around a rotation axis substantially perpendicular to a normal axis that is normal to a face of the display. The device detects a user input, and in response to detecting the user input, displays a first subset of application views of a set of application views. The first subset of application views is displayed along a first dimension of the display substantially perpendicular to both the rotation axis and the normal axis. The device detects a rotation of the rotatable input mechanism, and in response to detecting the rotation, displays a second subset of application views of the set of application views. Displaying the second subset of application views includes moving the set of application views on the display along the first dimension of the display.
    Type: Grant
    Filed: October 16, 2017
    Date of Patent: March 4, 2025
    Assignee: Apple Inc.
    Inventors: Matthew J. Sundstrom, Taylor G. Carrigan, Christopher Patrick Foss, Ieyuki Kawashima, Stephen O. Lemay, Marco Triverio
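    Illustrative sketch: the abstract describes showing a subset of application views and shifting that subset along one display dimension in response to rotation of the input mechanism. The Swift below is a minimal, hypothetical paging model of that behaviour; the type and its fields are assumptions, not an Apple API.
    ```swift
    // Hypothetical sketch of the paging behaviour in the abstract: rotating the
    // input mechanism shifts which contiguous subset of the set of application
    // views is shown along the display dimension perpendicular to both the
    // rotation axis and the display normal.
    struct ApplicationViewPager {
        let allViews: [String]          // the full set of application views
        let visibleCount: Int           // how many views fit along that dimension
        private(set) var firstVisibleIndex = 0

        var visibleSubset: ArraySlice<String> {
            let end = min(firstVisibleIndex + visibleCount, allViews.count)
            return allViews[firstVisibleIndex..<end]
        }

        // One detent of rotation moves the set of views by one position along the
        // display dimension; the index is clamped to the ends of the set.
        mutating func handleRotation(detents: Int) {
            let upperBound = max(allViews.count - visibleCount, 0)
            firstVisibleIndex = min(max(firstVisibleIndex + detents, 0), upperBound)
        }
    }
    ```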
  • Patent number: 12169395
    Abstract: An electronic device, with a display, a touch-sensitive surface, one or more processors and memory, displays a first representation of a first controllable external device, where the first controllable external device is situated at a location. The device detects a first user input corresponding to a selection of the first representation of the first controllable external device. The device, after detecting the first user input, adds data identifying the first controllable external device and a first state of the first controllable external device to a scene profile.
    Type: Grant
    Filed: November 22, 2022
    Date of Patent: December 17, 2024
    Assignee: Apple Inc.
    Inventors: Patrick L. Coffman, Arian Behzadi, Christopher Patrick Foss, Cyrus Daniel Irani, Ieyuki Kawashima, Stephen O. Lemay, Christopher D. Soli, Christopher Wilson
  • Publication number: 20240411421
    Abstract: A computer system detects an input to invoke a home menu user interface. In response to detecting the input, the computer system displays, via one or more display generation components, the home menu user interface in a three-dimensional environment, including: if a viewpoint of a user in the three-dimensional environment had a first elevation relative to a reference plane in the three-dimensional environment, displaying the home menu user interface at a first height in the three-dimensional environment; and, if the viewpoint of the user in the three-dimensional environment had a second elevation relative to the reference plane in the three-dimensional environment, the second elevation being different from the first elevation, displaying the home menu user interface at a second height in the three-dimensional environment, the second height being different from the first height.
    Type: Application
    Filed: May 14, 2024
    Publication date: December 12, 2024
    Inventors: Israel Pastrana Vicente, Amy E. DeDonato, Marcos Alonso Ruiz, Lee S. Broughton, Richard D. Lyons, William A. Sorrentino, III, Stephen O. Lemay, James J. Owen, Miquel Estany Rodriguez, Jesse Chand, Jonathan R. Dascola, Christian Schnorr, Zoey C. Taylor, Jonathan Ravasz, Harlan B. Haskins, Vinay Chawda, Benjamin H. Boesel, Ieyuki Kawashima, Christopher D. McKenzie, Benjamin Hylak, Nathan Gitter, Nahckjoon Kim, Owen Monsma, Matan Stauber, Danielle M. Price
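    Illustrative sketch: the abstract states only that different viewpoint elevations relative to the reference plane produce different home-menu heights. The Swift below is a hypothetical sketch of one such mapping; the linear relationship and the offset are assumptions, not taken from the application.
    ```swift
    // Hypothetical sketch: the home menu is placed at a height derived from the
    // elevation of the user's viewpoint above a reference plane (e.g. the floor),
    // so two different viewpoint elevations yield two different menu heights.
    struct Viewpoint { let elevation: Double }   // metres above the reference plane

    // The offset and the linear mapping are assumptions; the abstract only states
    // that different elevations produce different heights.
    func homeMenuHeight(for viewpoint: Viewpoint, verticalOffset: Double = -0.1) -> Double {
        viewpoint.elevation + verticalOffset
    }

    // A seated viewpoint (1.2 m) and a standing viewpoint (1.7 m) would therefore
    // place the menu at 1.1 m and 1.6 m respectively.
    ```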
  • Patent number: 12112009
    Abstract: In some embodiments, an electronic device changes the immersion level of a virtual environment and/or spatial effect in a three-dimensional environment based on the geometry of the physical environment around the device. In some embodiments, an electronic device modifies the virtual environment and/or spatial effect in response to detecting a movement of the device. In some embodiments, an electronic device moves a user interface of an application into and/or out of a virtual environment. In some embodiments, an electronic device selectively changes the display of a simulated environment and/or atmospheric effect in a three-dimensional environment based on movement of an object associated with a viewpoint of a user. In some embodiments, an electronic device provides feedback to a user in response to a user moving a virtual object to and/or into a simulated environment.
    Type: Grant
    Filed: April 13, 2022
    Date of Patent: October 8, 2024
    Assignee: Apple Inc.
    Inventors: James M. Dessero, Benjamin Hylak, William A. Sorrentino, III, Stephen O. Lemay, Ieyuki Kawashima, Katherine W. Kolombatovich, Jeffrey S. Allen
  • Publication number: 20240319959
    Abstract: An example process includes: displaying, on a display of an electronic device, an extended reality (XR) environment corresponding to a copresence session including the electronic device and a second electronic device; while displaying the XR environment: sampling, with a microphone of the electronic device, a first audio input; determining whether the first audio input is intended for a first digital assistant operating on an external electronic device; and in accordance with a determination that the first audio input is intended for the first digital assistant: causing the first digital assistant to provide an audible response to the first audio input, where the audible response is not transmitted to the second electronic device over a shared communication channel for the copresence session.
    Type: Application
    Filed: February 23, 2022
    Publication date: September 26, 2024
    Inventors: Jessica J. PECK, James N. JONES, Ieyuki KAWASHIMA, Lynn I. STREJA, Stephen O. LEMAY
  • Publication number: 20240310971
    Abstract: In some embodiments, an electronic device emphasizes and/or deemphasizes user interfaces based on the gaze of a user. In some embodiments, an electronic device defines levels of immersion for different user interfaces independently of one another. In some embodiments, an electronic device resumes display of a user interface at a previously-displayed level of immersion after (e.g., temporarily) reducing the level of immersion associated with the user interface. In some embodiments, an electronic device allows objects, people, and/or portions of an environment to be visible through a user interface displayed by the electronic device. In some embodiments, an electronic device reduces the level of immersion associated with a user interface based on characteristics of the electronic device and/or physical environment of the electronic device.
    Type: Application
    Filed: May 22, 2024
    Publication date: September 19, 2024
    Inventors: Ieyuki KAWASHIMA, Stephen O. LEMAY, William A. SORRENTINO, III, Jeffrey M. FAULKNER, Israel PASTRANA VICENTE, Gary Ian BUTCHER, Kristi E. BAUERLY, Shih-Sang CHIU, Benjamin Hunter BOESEL, David H. HUANG, Dorian D. DARGAN
  • Patent number: 12045437
    Abstract: An example process includes: while displaying a user interface different from a digital assistant user interface, receiving a user input; in accordance with a determination that the user input satisfies a criterion for initiating a digital assistant: displaying, over the user interface, the digital assistant user interface, the digital assistant user interface including: a digital assistant indicator displayed at a first portion of the display; and a response affordance displayed at a second portion of the display, where: a portion of the user interface remains visible at a third portion of the display; and the third portion is between the first portion and the second portion.
    Type: Grant
    Filed: April 9, 2021
    Date of Patent: July 23, 2024
    Assignee: Apple Inc.
    Inventors: Neal Ellis, Oluwatomiwa B. Alabi, Robert Burton, Richard R. Dellinger, Thomas R. Fazio, Craig M. Federighi, Rebecca P. Fish, Nikrouz Ghotbi, James N. Jones, Ieyuki Kawashima, Stephen O. Lemay, Pedro Mari, Aaron Musengo, James E. Palmer, Patchaya Beam Seilaudom, Andrea Valentina Simes, Tyler McClay Smith, Trungtin Tran, Robert A. Walker, II
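    Illustrative sketch: the abstract partitions the display into an assistant indicator portion, a response affordance portion, and a third portion between them where the prior user interface remains visible. The Swift below is a minimal, hypothetical layout model of that partitioning; the normalized coordinates and example values are assumptions.
    ```swift
    // Hypothetical sketch of the screen partitioning in the abstract. Positions
    // are normalized: 0 = top edge, 1 = bottom edge of the display.
    struct DigitalAssistantLayout {
        let responseAffordance: ClosedRange<Double>   // second portion, e.g. 0.0...0.30
        let assistantIndicator: ClosedRange<Double>   // first portion, e.g. 0.85...1.0

        // Third portion, between the first and the second, where the underlying
        // user interface stays visible.
        var visibleUnderlyingUI: ClosedRange<Double> {
            responseAffordance.upperBound...assistantIndicator.lowerBound
        }
    }

    let layout = DigitalAssistantLayout(responseAffordance: 0.0...0.30,
                                        assistantIndicator: 0.85...1.0)
    // layout.visibleUnderlyingUI == 0.3...0.85
    ```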
  • Publication number: 20240241627
    Abstract: An electronic device with a touch-sensitive display, one or more processors, and memory detects a first user input. In response to detecting the first user input, the device displays on the touch-sensitive display a user interface screen including a first affordance and a second affordance. The device detects a second user input including a contact on the touch-sensitive display. In accordance with a determination that the contact corresponds to selection of the first affordance, the electronic device is caused to turn off. In accordance with a determination that the contact corresponds to selection of the second affordance, the device causes initiation of a telephone call to a determined number.
    Type: Application
    Filed: March 29, 2024
    Publication date: July 18, 2024
    Inventors: Aled Hywel WILLIAMS, Jonathan P. IVE, Bronwyn JONES, Ieyuki KAWASHIMA, Kevin LYNCH, Natalia MARIC, Andreas E. SCHOBEL, Molly Pray WIEBE
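    Illustrative sketch: the abstract describes a screen with two affordances, where selecting the first turns the device off and selecting the second initiates a call to a determined number. The Swift below is a minimal, hypothetical sketch of that two-way branch; the enum names are placeholders, not an Apple API.
    ```swift
    // Hypothetical sketch of the two-affordance screen in the abstract: a first
    // user input brings up the screen, and the second input's contact either
    // powers the device off or initiates a call to a determined number.
    enum Affordance { case powerOff, emergencyCall }
    enum DeviceAction { case turnOff, initiateCall(number: String) }

    func handleSelection(of affordance: Affordance, determinedNumber: String) -> DeviceAction {
        switch affordance {
        case .powerOff:
            return .turnOff                                 // first affordance selected
        case .emergencyCall:
            return .initiateCall(number: determinedNumber)  // second affordance selected
        }
    }
    ```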
  • Publication number: 20240231558
    Abstract: An example process includes while displaying, on a display, an extended reality (XR) environment: receiving a user input; sampling, with a microphone, a user speech input; in accordance with a determination that the user input satisfies a criterion for initiating a digital assistant, initiating the digital assistant, including: displaying, within the XR environment, a digital assistant indicator at a first location of the XR environment; and while displaying the digital assistant indicator at the first location, providing, by the digital assistant, a response to the user speech input; after providing the response, ceasing to display the digital assistant indicator at the first location; and in accordance with ceasing to display the digital assistant indicator at the first location, displaying the digital assistant indicator at a second location of the XR environment, the second location corresponding to a physical location of an external electronic device implementing a second digital assistant.
    Type: Application
    Filed: February 22, 2022
    Publication date: July 11, 2024
    Inventors: Jessica J. PECK, James N. JONES, Ieyuki KAWASHIMA, Lynn I. STREJA, Stephen O. LEMAY, William A. SORRENTINO
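    Illustrative sketch: the abstract describes the assistant indicator appearing at a first location in the XR environment, then ceasing to be displayed there and reappearing at a second location corresponding to the physical location of an external device running a second digital assistant. The Swift below is a minimal, hypothetical sketch of that hand-off; the types are placeholders, not Apple APIs.
    ```swift
    // Hypothetical sketch of the indicator hand-off in the abstract.
    struct Point3D { var x: Double, y: Double, z: Double }

    final class AssistantIndicatorPresenter {
        private(set) var displayedLocation: Point3D?

        // Initiating the digital assistant: display the indicator at the first location.
        func present(at firstLocation: Point3D) {
            displayedLocation = firstLocation
        }

        // After the response has been provided: cease display at the first location
        // and display the indicator at the second location, which corresponds to the
        // physical location of the external device (e.g. a smart speaker in the room).
        func handOff(toExternalDeviceAt secondLocation: Point3D) {
            displayedLocation = secondLocation
        }
    }
    ```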
  • Publication number: 20240211108
    Abstract: An example process includes: at an electronic device with one or more processors, a memory, and a display: receiving a natural language input; initiating the digital assistant; in accordance with initiating the digital assistant, obtaining a response package responsive to the natural language input; after receiving the natural language input, selecting, based on context information associated with the electronic device, a first response mode of the digital assistant from a plurality of digital assistant response modes; and in response to selecting the first response mode, presenting, by the digital assistant, the response package according to the first response mode.
    Type: Application
    Filed: March 6, 2024
    Publication date: June 27, 2024
    Inventors: Neal ELLIS, Rebecca P. FISH, James E. PALMER, Patchaya Beam SEILAUDOM, Richard R. DELLINGER, Pedro MARI, Ieyuki KAWASHIMA, Andrew P. TENNANT, Peter BOHAC, Ping Fang ZHAO, David PADAWER, Tyler C. LEPPEK, Laurie J. VERTELNEY, Shweta D. GRAMPUROHIT
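    Illustrative sketch: the abstract describes choosing one of several digital assistant response modes from context associated with the device and then presenting the response package according to the chosen mode. The Swift below is a hypothetical sketch of that selection step; the specific modes, context fields, and policy are assumptions, since the abstract does not specify them.
    ```swift
    // Hypothetical sketch of the mode selection in the abstract.
    enum ResponseMode { case silent, audible, mixed }

    struct DeviceContext {
        let displayIsVisibleToUser: Bool
        let headphonesConnected: Bool
        let ringerIsMuted: Bool
    }

    // The policy below is an assumption; the abstract only says the mode is
    // selected "based on context information associated with the electronic device".
    func selectResponseMode(for context: DeviceContext) -> ResponseMode {
        if context.ringerIsMuted && context.displayIsVisibleToUser { return .silent }
        if context.headphonesConnected && !context.displayIsVisibleToUser { return .audible }
        return .mixed
    }

    // Presenting the response package according to the selected mode.
    func present(responsePackage: String, in mode: ResponseMode) {
        switch mode {
        case .silent:  print("display only: \(responsePackage)")
        case .audible: print("speak: \(responsePackage)")
        case .mixed:   print("display and speak: \(responsePackage)")
        }
    }
    ```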
  • Publication number: 20240203066
    Abstract: In some embodiments, a computer system displays virtual content illustrating an area of likely interaction, and displays immersive virtual content. In some embodiments, a computer system reduces visual prominence of immersive virtual content and displays an area of likely interaction. In some embodiments, a computer system generates alerts associated with physical objects in an environment of a user. In some embodiments, a computer system changes the visual prominence of people in a three-dimensional environment based on one or more attention-related factors.
    Type: Application
    Filed: September 23, 2023
    Publication date: June 20, 2024
    Inventors: Ieyuki KAWASHIMA, Benjamin HYLAK, Garrett L. WEINBERG, Joshua R. FORD, Maxime UZAN, Pavel V. DUDRENOV, Katherine W. KOLOMBATOVICH, Giancarlo YERKES, Lee S. BROUGHTON
  • Patent number: 11995285
    Abstract: In some embodiments, an electronic device emphasizes and/or deemphasizes user interfaces based on the gaze of a user. In some embodiments, an electronic device defines levels of immersion for different user interfaces independently of one another. In some embodiments, an electronic device resumes display of a user interface at a previously-displayed level of immersion after (e.g., temporarily) reducing the level of immersion associated with the user interface. In some embodiments, an electronic device allows objects, people, and/or portions of an environment to be visible through a user interface displayed by the electronic device. In some embodiments, an electronic device reduces the level of immersion associated with a user interface based on characteristics of the electronic device and/or physical environment of the electronic device.
    Type: Grant
    Filed: September 15, 2022
    Date of Patent: May 28, 2024
    Assignee: Apple Inc.
    Inventors: Nicholas W. Henderson, Ieyuki Kawashima, Stephen O. Lemay, Israel Pastrana Vicente, Wesley M. Holder, Jeffrey M. Faulkner, William A. Sorrentino, III, Peter D. Anton
  • Publication number: 20240134492
    Abstract: An example process includes while displaying, on a display, an extended reality (XR) environment: receiving a user input; sampling, with a microphone, a user speech input; in accordance with a determination that the user input satisfies a criterion for initiating a digital assistant, initiating the digital assistant, including: displaying, within the XR environment, a digital assistant indicator at a first location of the XR environment; and while displaying the digital assistant indicator at the first location, providing, by the digital assistant, a response to the user speech input; after providing the response, ceasing to display the digital assistant indicator at the first location; and in accordance with ceasing to display the digital assistant indicator at the first location, displaying the digital assistant indicator at a second location of the XR environment, the second location corresponding to a physical location of an external electronic device implementing a second digital assistant.
    Type: Application
    Filed: February 22, 2022
    Publication date: April 25, 2024
    Inventors: Jessica J. PECK, James N. JONES, Ieyuki KAWASHIMA, Lynn I. STREJA
  • Patent number: 11947784
    Abstract: An electronic device with a touch-sensitive display, one or more processors, and memory detects a first user input. In response to detecting the first user input, the device displays on the touch-sensitive display a user interface screen including a first affordance and a second affordance. The device detects a second user input including a contact on the touch-sensitive display. In accordance with a determination that the contact corresponds to selection of the first affordance, the electronic device is caused to turn off. In accordance with a determination that the contact corresponds to selection of the second affordance, the device causes initiation of a telephone call to a determined number.
    Type: Grant
    Filed: June 16, 2021
    Date of Patent: April 2, 2024
    Assignee: Apple Inc.
    Inventors: Aled Hywel Williams, Jonathan P. Ive, Bronwyn Jones, Ieyuki Kawashima, Kevin Lynch, Natalia Maric, Andreas E. Schobel, Molly Pray Wiebe
  • Publication number: 20230082492
    Abstract: An electronic device, with a display, a touch-sensitive surface, one or more processors and memory, displays a first representation of a first controllable external device, where the first controllable external device is situated at a location. The device detects a first user input corresponding to a selection of the first representation of the first controllable external device. The device, after detecting the first user input, adds data identifying the first controllable external device and a first state of the first controllable external device to a scene profile.
    Type: Application
    Filed: November 22, 2022
    Publication date: March 16, 2023
    Inventors: Patrick L. COFFMAN, Arian BEHZADI, Christopher Patrick FOSS, Cyrus Daniel IRANI, Ieyuki KAWASHIMA, Stephen O. LEMAY, Christopher D. SOLI, Christopher WILSON
  • Patent number: D989809
    Type: Grant
    Filed: July 19, 2021
    Date of Patent: June 20, 2023
    Assignee: Apple Inc.
    Inventors: Gary Butcher, Joseph Chan, Bronwyn Anne Jones, Ieyuki Kawashima, Stephen O. Lemay, Aled Williams
  • Patent number: D1035691
    Type: Grant
    Filed: February 17, 2023
    Date of Patent: July 16, 2024
    Assignee: Apple Inc.
    Inventors: Gorm Halfdan Amand, Daniel Joseph Billett, Joseph Chan, Elizabeth Caroline Cranfill, James Nicholas Jones, Ieyuki Kawashima, Vincent M. Lane, Chia Yang Lin, Cecilia S. Zhou
  • Patent number: D1059397
    Type: Grant
    Filed: February 7, 2024
    Date of Patent: January 28, 2025
    Assignee: Apple Inc.
    Inventors: Gorm Halfdan Amand, Daniel Joseph Billett, Joseph Chan, Elizabeth Caroline Cranfill, James Nicholas Jones, Ieyuki Kawashima, Vincent M. Lane, Chia Yang Lin, Cecilia S. Zhou