Patents by Inventor Devin W. Chalmers

Devin W. Chalmers has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11714540
    Abstract: The present disclosure relates generally to remote touch detection. In some examples, a first electronic device obtains first image data and second image data about an input, and performs an operation based on the input in accordance with a determination that a set of one or more criteria is met based on the first image data and the second image data. In some examples, a first electronic device causes emission of infrared light by an infrared source of a second electronic device, obtains image data about an input, and performs an operation based on the input in accordance with a determination that a set of one or more criteria is met based on the image data.
    Type: Grant
    Filed: September 22, 2020
    Date of Patent: August 1, 2023
    Assignee: Apple Inc.
    Inventors: Samuel L. Iglesias, Devin W. Chalmers, Rohit Sethi, Lejing Wang
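The "set of one or more criteria" in this abstract can be pictured as a simple two-camera agreement check. The function below is a minimal illustrative sketch, not the patented method: the threshold values, the idea of comparing per-camera fingertip-to-surface distances, and all names are assumptions.

```python
def detect_remote_touch(dist_cam1_mm, dist_cam2_mm,
                        touch_threshold_mm=8.0, agreement_mm=5.0):
    """Return True when both camera-derived fingertip-to-surface
    distances indicate contact and roughly agree with each other."""
    close_enough = (dist_cam1_mm <= touch_threshold_mm and
                    dist_cam2_mm <= touch_threshold_mm)
    consistent = abs(dist_cam1_mm - dist_cam2_mm) <= agreement_mm
    return close_enough and consistent
```

Requiring agreement between the two image sources is one plausible way a criteria set over "first image data and second image data" could reject spurious single-camera detections.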
  • Patent number: 11714592
    Abstract: In an exemplary process for interacting with user interface objects using an eye gaze, an affordance associated with a first object is displayed. A gaze direction or a gaze depth is determined. While the gaze direction or the gaze depth is determined to correspond to a gaze at the affordance, a first input representing user instruction to take action on the affordance is received, and the affordance is selected responsive to receiving the first input.
    Type: Grant
    Filed: September 27, 2021
    Date of Patent: August 1, 2023
    Assignee: Apple Inc.
    Inventors: Avi Bar-Zeev, Ryan S. Burgoyne, Devin W. Chalmers, Luis R. Deliz Centeno, Rahul Nair, Timothy R. Oriol, Alexis H. Palangie
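The gaze-then-confirm interaction in this abstract can be sketched as an angular gaze test gated by a confirmation input. Everything below (the 3-degree cone, unit-vector representation, function names) is an illustrative assumption, not the claimed implementation.

```python
import math

def gaze_hits_affordance(gaze_dir, affordance_dir, max_angle_deg=3.0):
    """True when the angle between the gaze ray and the direction to the
    affordance is within a small acceptance cone (inputs: unit vectors)."""
    dot = sum(g * a for g, a in zip(gaze_dir, affordance_dir))
    return dot >= math.cos(math.radians(max_angle_deg))

def handle_input(gaze_dir, affordance_dir, confirm_pressed):
    """Select the affordance only while it is gazed at AND a first input
    (e.g. a button press) confirms the user's intent."""
    if gaze_hits_affordance(gaze_dir, affordance_dir) and confirm_pressed:
        return "selected"
    return "ignored"
```

Separating the gaze test from the confirming input mirrors the abstract's structure: gaze establishes the target, a distinct first input triggers selection.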
  • Publication number: 20230102686
    Abstract: In one implementation, a method of localizing a device is performed at a device including one or more processors and non-transitory memory. The method includes obtaining an estimate of a pose of the device in an environment. The method includes obtaining an environmental model of the environment including a spatial feature in the environment defined by a first spatial feature location. The method includes determining a second spatial feature location of the spatial feature based on the estimate of the pose of the device. The method includes determining an updated estimate of the pose of the device based on the first spatial feature location and the second spatial feature location.
    Type: Application
    Filed: June 28, 2022
    Publication date: March 30, 2023
    Inventors: Anna L. Brewer, Devin W. Chalmers, Thomas G. Salter
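The pose update described here (reconciling a mapped spatial-feature location with where the current pose estimate places that feature) can be illustrated in its simplest translation-only 2D form. This is a didactic sketch under strong simplifying assumptions; a real localizer would solve for rotation as well and use many features.

```python
def refine_pose(pose_xy, mapped_feature_xy, observed_feature_xy):
    """Translation-only refinement: shift the pose estimate by the
    offset between where the environmental model says the feature is
    (first location) and where the current pose estimate places it
    (second location)."""
    dx = mapped_feature_xy[0] - observed_feature_xy[0]
    dy = mapped_feature_xy[1] - observed_feature_xy[1]
    return (pose_xy[0] + dx, pose_xy[1] + dy)
```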
  • Publication number: 20220291743
    Abstract: Various implementations disclosed herein include devices, systems, and methods that determine that a user is interested in audio content by determining that a movement (e.g., a user's head bob) has a time-based relationship with detected audio content (e.g., the beat of music playing in the background). Some implementations involve obtaining first sensor data and second sensor data corresponding to a physical environment, the first sensor data corresponding to audio in the physical environment and the second sensor data corresponding to a body movement in the physical environment. A time-based relationship between one or more elements of the audio and one or more aspects of the body movement is identified based on the first sensor data and the second sensor data. An interest in content of the audio is identified based on identifying the time-based relationship. Various actions may be performed proactively based on identifying the interest in the content.
    Type: Application
    Filed: March 8, 2022
    Publication date: September 15, 2022
    Inventors: Brian W. Temple, Devin W. Chalmers, Thomas G. Salter
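The "time-based relationship" between audio elements and body movement can be approximated by checking how many movement peaks coincide with detected beats. The tolerance and fraction thresholds below are invented for illustration; the patent does not specify them.

```python
def movement_matches_beat(beat_times_s, nod_times_s, tol_s=0.15,
                          min_fraction=0.7):
    """Interest heuristic: a large enough fraction of head-nod peaks
    land within `tol_s` seconds of a detected audio beat."""
    if not nod_times_s:
        return False
    hits = sum(
        any(abs(n - b) <= tol_s for b in beat_times_s)
        for n in nod_times_s
    )
    return hits / len(nod_times_s) >= min_fraction
```

A positive result would correspond to the abstract's "identifying the interest in the content", which could then trigger a proactive action.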
  • Patent number: 11297371
    Abstract: A method includes obtaining a video based on images detected with cameras mounted on a vehicle and displaying a portion of the video corresponding to a time offset and a viewing angle.
    Type: Grant
    Filed: April 23, 2019
    Date of Patent: April 5, 2022
    Assignee: Apple Inc.
    Inventors: Devin W. Chalmers, Sean B. Kelly
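Selecting "a portion of the video corresponding to a time offset and a viewing angle" can be sketched as two lookups: choose the camera whose mounting angle best matches the requested view, then index back in time within its frame buffer. The data layout and parameters are assumptions for illustration only.

```python
def select_frame(frames, fps, time_offset_s, viewing_angle_deg):
    """Pick the recorded frame `time_offset_s` seconds back in time from
    the camera whose mounting angle is closest to the requested view.
    `frames` maps camera angle (deg) -> list of frames, newest last."""
    camera = min(frames, key=lambda a: abs(a - viewing_angle_deg))
    track = frames[camera]
    index = max(0, len(track) - 1 - round(time_offset_s * fps))
    return camera, track[index]
```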
  • Publication number: 20220100590
    Abstract: An electronic device that is in communication with one or more wearable audio output devices detects occurrence of an event while the one or more wearable audio output devices are being worn by a user. In response to detecting the occurrence of the event, the electronic device outputs, via the one or more wearable audio output devices, one or more audio notifications corresponding to the event, including: in accordance with a determination that the user of the electronic device is currently engaged in a conversation, delaying outputting the one or more audio notifications corresponding to the event until the conversation has ended; and, in accordance with a determination that the user of the electronic device is not currently engaged in a conversation, outputting the one or more audio notifications corresponding to the event without delaying the outputting.
    Type: Application
    Filed: December 13, 2021
    Publication date: March 31, 2022
    Inventors: Devin W. Chalmers, Sean B. Kelly, Karlin Y. Bark
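The delay-while-talking behavior in this abstract amounts to queuing notifications during a conversation and flushing the queue when it ends. The sketch below is a minimal single-function model of that policy; the tick-based input format is an assumption for demonstration.

```python
import collections

def deliver_notifications(events, in_conversation):
    """Simulate the delivery policy over time steps: `events[i]` is an
    event name or None at step i, `in_conversation[i]` says whether the
    wearer is talking then. Notifications raised mid-conversation are
    held and played once the conversation ends."""
    queue = collections.deque()
    played = []
    for event, talking in zip(events, in_conversation):
        if event is not None:
            queue.append(event)
        if not talking:
            while queue:
                played.append(queue.popleft())
    return played
```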
  • Patent number: 11231975
    Abstract: An electronic device is in communication with one or more wearable audio output devices. The electronic device detects occurrence of an event and outputs, via the one or more wearable audio output devices, one or more audio notifications corresponding to the event. After beginning to output the one or more audio notifications, the electronic device detects an input directed to the one or more wearable audio output devices. In response, if the input is detected within a predefined time period with respect to the one or more audio notifications corresponding to the event, the electronic device performs a first operation associated with the one or more audio notifications corresponding to the event; and, if the input is detected after the predefined time period has elapsed, the electronic device performs a second operation not associated with the one or more audio notifications corresponding to the event.
    Type: Grant
    Filed: September 18, 2019
    Date of Patent: January 25, 2022
    Assignee: Apple Inc.
    Inventors: Devin W. Chalmers, Sean B. Kelly, Karlin Y. Bark
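The predefined-time-period disambiguation in this abstract maps naturally onto a timestamp comparison: a tap shortly after a notification acts on it, a later tap falls back to a default control. The five-second window and operation names below are illustrative assumptions.

```python
def handle_headphone_tap(tap_time_s, notification_time_s, window_s=5.0):
    """Within `window_s` of a notification, the tap performs the
    notification-related (first) operation; after the window elapses,
    or with no notification at all, it performs the unrelated (second)
    operation, e.g. a media control."""
    if notification_time_s is not None and \
            0 <= tap_time_s - notification_time_s <= window_s:
        return "act_on_notification"
    return "default_media_control"
```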
  • Publication number: 20220012002
    Abstract: In an exemplary process for interacting with user interface objects using an eye gaze, an affordance associated with a first object is displayed. A gaze direction or a gaze depth is determined. While the gaze direction or the gaze depth is determined to correspond to a gaze at the affordance, a first input representing user instruction to take action on the affordance is received, and the affordance is selected responsive to receiving the first input.
    Type: Application
    Filed: September 27, 2021
    Publication date: January 13, 2022
    Inventors: Avi Bar-Zeev, Ryan S. Burgoyne, Devin W. Chalmers, Luis R. Deliz Centeno, Rahul Nair, Timothy R. Oriol, Alexis H. Palangie
  • Publication number: 20210397407
    Abstract: A method performed by an audio system comprising a headset. The method sends a playback signal containing user-desired audio content to drive a speaker of the headset that is being worn by a user, receives a microphone signal from a microphone that is arranged to capture sounds within an ambient environment in which the user is located, performs a speech detection algorithm upon the microphone signal to detect speech contained therein, in response to a detection of speech, determines that the user intends to engage in a conversation with a person who is located within the ambient environment, and, in response to determining that the user intends to engage in the conversation, adjusts the playback signal based on the user-desired audio content.
    Type: Application
    Filed: May 17, 2021
    Publication date: December 23, 2021
    Inventors: Christopher T. Eubank, Devin W. Chalmers, Kirill Kalinichev, Rahul Nair, Thomas G. Salter
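The response described here (detect speech, decide the wearer is engaging in conversation, then adjust playback) can be sketched as a ducking rule. The probability threshold, ducked volume level, and the idea of a scalar "user is speaking" score are assumptions made for this example, not details from the patent.

```python
def adjust_playback(volume, speech_detected, user_speaking_prob,
                    engage_threshold=0.8, ducked_volume=0.2):
    """Duck the user-desired audio when detected ambient speech is
    judged to be the wearer entering a conversation; otherwise leave
    the playback signal unchanged."""
    if speech_detected and user_speaking_prob >= engage_threshold:
        return min(volume, ducked_volume)
    return volume
```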
  • Publication number: 20210365116
    Abstract: One exemplary implementation provides an improved user experience on a device by using physiological data to initiate a user interaction for the user experience based on an identified interest or intention of a user. For example, a sensor may obtain physiological data (e.g., pupil diameter) of a user during a user experience in which content is displayed on a display. The physiological data varies over time during the user experience and a pattern is detected. The detected pattern is used to identify an interest of the user in the content or an intention of the user regarding the content. The user interaction is then initiated based on the identified interest or the identified intention.
    Type: Application
    Filed: August 6, 2021
    Publication date: November 25, 2021
    Inventors: Avi Bar-Zeev, Devin W. Chalmers, Fletcher R. Rothkopf, Grant H. Mulliken, Holly E. Gerhard, Lilli I. Jonsson
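The pattern detection over physiological data (the abstract's pupil-diameter example) can be illustrated as a sustained-dilation check against a baseline. The 0.4 mm dilation and three-sample persistence are invented thresholds; real interest classification would be far more elaborate.

```python
def interest_detected(pupil_mm, baseline_mm, dilation_mm=0.4,
                      min_consecutive=3):
    """Flag interest when pupil diameter stays at least `dilation_mm`
    above baseline for `min_consecutive` consecutive samples."""
    run = 0
    for d in pupil_mm:
        run = run + 1 if d - baseline_mm >= dilation_mm else 0
        if run >= min_consecutive:
            return True
    return False
```

A True result would correspond to the detected pattern that, per the abstract, triggers initiation of a user interaction.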
  • Publication number: 20210325960
    Abstract: Aspects of the subject technology relate to gaze-based control of an electronic device. The gaze-based control can include enabling an option to provide user authorization when it is determined that the user has viewed and/or read text associated with a request for the user authorization. The gaze-based control can also include modifying a user interface or a user interface element based on user views and/or reads. The gaze-based control can be based on determining whether a user has viewed and/or read an electronic document and/or a physical document.
    Type: Application
    Filed: February 26, 2021
    Publication date: October 21, 2021
    Inventors: Samuel L. Iglesias, Mark A. Ebbole, Andrew P. Richardson, Tyler R. Calderone, Michael E. Buerli, Devin W. Chalmers
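The gating of user authorization on the user having actually viewed the request text can be modeled as a per-line gaze-dwell check. The per-line dwell representation and the 300 ms minimum are assumptions for illustration.

```python
def authorization_enabled(dwell_ms_per_line, min_dwell_ms=300):
    """Enable the consent control only after gaze tracking records at
    least `min_dwell_ms` of dwell time on every line of the request
    text (one entry per line; an empty record never enables it)."""
    return bool(dwell_ms_per_line) and all(
        t >= min_dwell_ms for t in dwell_ms_per_line
    )
```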
  • Patent number: 11137967
    Abstract: In an exemplary process for interacting with user interface objects using an eye gaze, an affordance associated with a first object is displayed. A gaze direction or a gaze depth is determined. While the gaze direction or the gaze depth is determined to correspond to a gaze at the affordance, a first input representing user instruction to take action on the affordance is received, and the affordance is selected responsive to receiving the first input.
    Type: Grant
    Filed: March 24, 2020
    Date of Patent: October 5, 2021
    Assignee: Apple Inc.
    Inventors: Avi Bar-Zeev, Ryan S. Burgoyne, Devin W. Chalmers, Luis R. Deliz Centeno, Rahul Nair
  • Patent number: 11132162
    Abstract: In an exemplary process for interacting with user interface objects using an eye gaze, an affordance associated with a first object is displayed. A gaze direction or a gaze depth is determined. While the gaze direction or the gaze depth is determined to correspond to a gaze at the affordance, a first input representing user instruction to take action on the affordance is received, and the affordance is selected responsive to receiving the first input.
    Type: Grant
    Filed: March 24, 2020
    Date of Patent: September 28, 2021
    Assignee: Apple Inc.
    Inventors: Avi Bar-Zeev, Ryan S. Burgoyne, Devin W. Chalmers, Luis R. Deliz Centeno, Rahul Nair, Timothy R. Oriol, Alexis H. Palangie
  • Patent number: 11119573
    Abstract: One exemplary implementation provides an improved user experience on a device by using physiological data to initiate a user interaction for the user experience based on an identified interest or intention of a user. For example, a sensor may obtain physiological data (e.g., pupil diameter) of a user during a user experience in which content is displayed on a display. The physiological data varies over time during the user experience and a pattern is detected. The detected pattern is used to identify an interest of the user in the content or an intention of the user regarding the content. The user interaction is then initiated based on the identified interest or the identified intention.
    Type: Grant
    Filed: September 12, 2019
    Date of Patent: September 14, 2021
    Assignee: Apple Inc.
    Inventors: Avi Bar-Zeev, Devin W. Chalmers, Fletcher R. Rothkopf, Grant H. Mulliken, Holly E. Gerhard, Lilli I. Jonsson
  • Publication number: 20210004133
    Abstract: The present disclosure relates generally to remote touch detection. In some examples, a first electronic device obtains first image data and second image data about an input, and performs an operation based on the input in accordance with a determination that a set of one or more criteria is met based on the first image data and the second image data. In some examples, a first electronic device causes emission of infrared light by an infrared source of a second electronic device, obtains image data about an input, and performs an operation based on the input in accordance with a determination that a set of one or more criteria is met based on the image data.
    Type: Application
    Filed: September 22, 2020
    Publication date: January 7, 2021
    Inventors: Samuel L. Iglesias, Devin W. Chalmers, Rohit Sethi, Lejing Wang
  • Patent number: 10809910
    Abstract: The present disclosure relates generally to remote touch detection. In some examples, a first electronic device obtains first image data and second image data about an input, and performs an operation based on the input in accordance with a determination that a set of one or more criteria is met based on the first image data and the second image data. In some examples, a first electronic device causes emission of infrared light by an infrared source of a second electronic device, obtains image data about an input, and performs an operation based on the input in accordance with a determination that a set of one or more criteria is met based on the image data.
    Type: Grant
    Filed: August 28, 2019
    Date of Patent: October 20, 2020
    Assignee: Apple Inc.
    Inventors: Samuel L. Iglesias, Devin W. Chalmers, Rohit Sethi, Lejing Wang
  • Publication number: 20200225747
    Abstract: In an exemplary process for interacting with user interface objects using an eye gaze, an affordance associated with a first object is displayed. A gaze direction or a gaze depth is determined. While the gaze direction or the gaze depth is determined to correspond to a gaze at the affordance, a first input representing user instruction to take action on the affordance is received, and the affordance is selected responsive to receiving the first input.
    Type: Application
    Filed: March 24, 2020
    Publication date: July 16, 2020
    Inventors: Avi Bar-Zeev, Ryan S. Burgoyne, Devin W. Chalmers, Luis R. Deliz Centeno, Rahul Nair
  • Publication number: 20200225746
    Abstract: In an exemplary process for interacting with user interface objects using an eye gaze, an affordance associated with a first object is displayed. A gaze direction or a gaze depth is determined. While the gaze direction or the gaze depth is determined to correspond to a gaze at the affordance, a first input representing user instruction to take action on the affordance is received, and the affordance is selected responsive to receiving the first input.
    Type: Application
    Filed: March 24, 2020
    Publication date: July 16, 2020
    Inventors: Avi Bar-Zeev, Ryan S. Burgoyne, Devin W. Chalmers, Luis R. Deliz Centeno, Rahul Nair, Timothy R. Oriol, Alexis H. Palangie
  • Publication number: 20200104194
    Abstract: An electronic device is in communication with one or more wearable audio output devices. The electronic device detects occurrence of an event and outputs, via the one or more wearable audio output devices, one or more audio notifications corresponding to the event. After beginning to output the one or more audio notifications, the electronic device detects an input directed to the one or more wearable audio output devices. In response, if the input is detected within a predefined time period with respect to the one or more audio notifications corresponding to the event, the electronic device performs a first operation associated with the one or more audio notifications corresponding to the event; and, if the input is detected after the predefined time period has elapsed, the electronic device performs a second operation not associated with the one or more audio notifications corresponding to the event.
    Type: Application
    Filed: September 18, 2019
    Publication date: April 2, 2020
    Inventors: Devin W. Chalmers, Sean B. Kelly, Karlin Y. Bark
  • Publication number: 20200104025
    Abstract: The present disclosure relates generally to remote touch detection. In some examples, a first electronic device obtains first image data and second image data about an input, and performs an operation based on the input in accordance with a determination that a set of one or more criteria is met based on the first image data and the second image data. In some examples, a first electronic device causes emission of infrared light by an infrared source of a second electronic device, obtains image data about an input, and performs an operation based on the input in accordance with a determination that a set of one or more criteria is met based on the image data.
    Type: Application
    Filed: August 28, 2019
    Publication date: April 2, 2020
    Inventors: Samuel L. Iglesias, Devin W. Chalmers, Rohit Sethi, Lejing Wang