Patents by Inventor Justin D. Stoyles

Justin D. Stoyles has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230393796
    Abstract: In some exemplary processes for controlling an external device using a computer-generated reality interface, a view of a physical environment that includes the external device is provided through the display, and information specifying a function of the external device is received from the external device. While the view of the physical environment is being provided, an affordance corresponding to the function is displayed at a position on the display that overlays at least a portion of the external device.
    Type: Application
    Filed: August 18, 2023
    Publication date: December 7, 2023
    Inventors: Justin D. STOYLES, Michael KUHN
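The overlay step this abstract describes can be sketched in a few lines: given a device's position relative to the camera, a pinhole projection yields the display coordinates at which to draw the affordance over the device. This is an illustrative sketch only; the function names, focal length, and data shapes are assumptions, not details from the patent.

```python
# Hypothetical sketch: placing a function affordance over an external
# device seen through a display. Pinhole projection and all names are
# illustrative assumptions.

def project_to_display(point_3d, focal_px, display_center):
    """Project a 3-D point in camera space to 2-D display coordinates
    using a simple pinhole model (x/z, y/z scaling)."""
    x, y, z = point_3d
    u = display_center[0] + focal_px * x / z
    v = display_center[1] + focal_px * y / z
    return (u, v)

def affordance_for(device):
    """Build an affordance descriptor for the function the device reports,
    positioned where the device appears on the display."""
    pos = project_to_display(device["position"], focal_px=800,
                             display_center=(960, 540))
    return {"label": device["function"], "position": pos}

lamp = {"function": "Toggle power", "position": (0.5, -0.25, 2.0)}
a = affordance_for(lamp)
# a["position"] == (1160.0, 440.0): drawn where the lamp appears on screen
```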
  • Publication number: 20230376261
    Abstract: In an exemplary process for accessing a function of an external device through a computer-generated reality interface, one or more external devices are detected. Image data of a physical environment captured by an image sensor is obtained. The process determines whether the image data includes a representation of a first external device of the one or more detected external devices. In accordance with determining that the image data includes a representation of the first external device, the process causes a display to concurrently display a representation of the physical environment according to the image data, and an affordance corresponding to a function of the first external device, wherein detecting user activation of the displayed affordance causes the first external device to perform an action corresponding to the function.
    Type: Application
    Filed: August 1, 2023
    Publication date: November 23, 2023
    Inventors: Justin D. STOYLES, Michael KUHN
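The detect-then-display flow in this abstract can be illustrated with a small stand-in: decide whether a detected external device appears in the frame, and if so, present the pass-through image together with an affordance whose activation triggers the device's action. Real detection would use computer vision; here it is faked by matching names against recognized objects, and every identifier is hypothetical.

```python
# Illustrative sketch of the detection-then-affordance flow. Detection is
# a stand-in (name matching against recognized objects in the frame).

def find_device_in_frame(recognized_objects, detected_devices):
    """Return the first detected external device whose representation
    appears among the frame's recognized objects, else None."""
    for device in detected_devices:
        if device["name"] in recognized_objects:
            return device
    return None

def build_ui(frame, detected_devices):
    """Concurrently present the camera frame and, if a device is visible,
    an affordance bound to that device's function."""
    device = find_device_in_frame(frame["objects"], detected_devices)
    ui = {"background": frame["image"], "affordances": []}
    if device:
        ui["affordances"].append(
            {"label": device["function"],
             "on_activate": lambda d=device: d["perform"]()})
    return ui

log = []
tv = {"name": "tv", "function": "Play", "perform": lambda: log.append("tv: play")}
ui = build_ui({"image": "frame-001", "objects": {"tv", "sofa"}}, [tv])
ui["affordances"][0]["on_activate"]()   # user taps the affordance
# log == ["tv: play"]
```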
  • Patent number: 11762619
    Abstract: In some exemplary processes for controlling an external device using a computer-generated reality interface, information specifying a function of the external device is received from the external device. First image data of a physical environment that includes the external device is obtained with one or more image sensors. A representation of the physical environment according to the first image data is displayed on a display. While displaying the representation of the physical environment, second image data identifying a gesture occurring between the display and the external device in the physical environment is obtained with the one or more image sensors. A determination is made as to whether the identified gesture satisfies one or more predetermined criteria associated with the function. In accordance with determining that the identified gesture satisfies one or more predetermined criteria associated with the function, the external device is caused to perform the function.
    Type: Grant
    Filed: August 4, 2021
    Date of Patent: September 19, 2023
    Assignee: Apple Inc.
    Inventors: Justin D. Stoyles, Michael Kuhn
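The core check in this abstract, i.e. a recognized gesture is compared against predetermined criteria associated with a function, and the command is issued only on a match, can be sketched as below. The criteria table, gesture labels, and function names are assumptions for illustration, not the patent's actual representation.

```python
# Hedged sketch of the gesture-criteria check: the identified gesture must
# satisfy the criteria associated with the function before the external
# device is commanded. CRITERIA contents are illustrative.

CRITERIA = {
    "toggle_power": {"allowed_gestures": {"tap", "point"}},
    "volume_up":    {"allowed_gestures": {"swipe_up"}},
}

def satisfies_criteria(gesture, function):
    """True if the identified gesture meets the predetermined criteria
    associated with the named function."""
    rule = CRITERIA.get(function)
    return rule is not None and gesture in rule["allowed_gestures"]

def handle_gesture(gesture, function, send_command):
    """Cause the external device to perform the function only when the
    gesture satisfies the function's criteria."""
    if satisfies_criteria(gesture, function):
        send_command(function)
        return True
    return False

sent = []
handle_gesture("swipe_up", "volume_up", sent.append)   # criteria met
handle_gesture("tap", "volume_up", sent.append)        # criteria not met
# sent == ["volume_up"]
```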
  • Patent number: 11762620
    Abstract: In an exemplary process for accessing a function of an external device through a computer-generated reality interface, one or more external devices are detected. Image data of a physical environment captured by an image sensor is obtained. The process determines whether the image data includes a representation of a first external device of the one or more detected external devices. In accordance with determining that the image data includes a representation of the first external device, the process causes a display to concurrently display a representation of the physical environment according to the image data, and an affordance corresponding to a function of the first external device, wherein detecting user activation of the displayed affordance causes the first external device to perform an action corresponding to the function.
    Type: Grant
    Filed: November 23, 2021
    Date of Patent: September 19, 2023
    Assignee: Apple Inc.
    Inventors: Justin D. Stoyles, Michael Kuhn
  • Patent number: 11302086
    Abstract: The present disclosure relates to providing a software feature of an electronic product in an augmented reality (AR) environment. In some embodiments, images are obtained using one or more image sensors, a determination is made whether the obtained images include printed media depicting the electronic product, when the obtained images include the printed media depicting the electronic product, a virtual object corresponding to the electronic product is displayed in the AR environment, and the software feature of the electronic product is provided with the virtual object.
    Type: Grant
    Filed: January 5, 2021
    Date of Patent: April 12, 2022
    Assignee: Apple Inc.
    Inventors: Justin D. Stoyles, Michael Kuhn
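The conditional flow in this abstract, i.e. only when the captured images contain printed media depicting the product is a virtual object shown and the software feature provided, can be sketched as follows. Recognition is faked with pre-tagged detections; the tag format, product names, and feature table are all hypothetical.

```python
# Minimal illustration: when captured frames contain printed media
# depicting the product, return a virtual object that carries the
# product's software feature. Detection tags are illustrative.

FEATURES = {"smart_speaker": "voice_assistant_demo"}

def scan_frames(frames, product):
    """Return a virtual-object descriptor if any frame contains printed
    media depicting the product, else None."""
    for frame in frames:
        if ("printed_media", product) in frame["detections"]:
            return {"model": product, "feature": FEATURES[product]}
    return None

frames = [
    {"detections": {("poster", "car")}},
    {"detections": {("printed_media", "smart_speaker")}},
]
obj = scan_frames(frames, "smart_speaker")
# obj == {"model": "smart_speaker", "feature": "voice_assistant_demo"}
```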
  • Publication number: 20220083303
    Abstract: In an exemplary process for accessing a function of an external device through a computer-generated reality interface, one or more external devices are detected. Image data of a physical environment captured by an image sensor is obtained. The process determines whether the image data includes a representation of a first external device of the one or more detected external devices. In accordance with determining that the image data includes a representation of the first external device, the process causes a display to concurrently display a representation of the physical environment according to the image data, and an affordance corresponding to a function of the first external device, wherein detecting user activation of the displayed affordance causes the first external device to perform an action corresponding to the function.
    Type: Application
    Filed: November 23, 2021
    Publication date: March 17, 2022
    Inventors: Justin D. STOYLES, Michael KUHN
  • Patent number: 11227494
    Abstract: The present disclosure relates to providing transit information in an augmented reality environment. In some embodiments, images are obtained using one or more image sensors, a determination is made whether the obtained images include a map, and, in accordance with a set of one or more conditions being satisfied, transit information is displayed in the augmented reality environment. A location of the displayed transit information in the augmented reality environment may correspond to a respective feature of the map.
    Type: Grant
    Filed: September 24, 2018
    Date of Patent: January 18, 2022
    Assignee: Apple Inc.
    Inventors: Justin D. Stoyles, Michael Kuhn
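The anchoring behavior this abstract describes, i.e. transit information displayed at a location corresponding to a feature of the detected map, subject to a set of conditions, can be sketched as below. The condition flag, feature coordinates, and info strings are illustrative assumptions.

```python
# Sketch of anchoring transit information to detected map features: each
# overlay is positioned at the feature (e.g. a station) it describes,
# and nothing is shown unless the condition set is satisfied.

def place_transit_overlays(map_features, transit_info, conditions_met=True):
    """For a detected map, return overlays positioned at the map features
    they describe, subject to the condition set."""
    if not conditions_met:
        return []
    overlays = []
    for feature in map_features:
        info = transit_info.get(feature["name"])
        if info:
            overlays.append({"position": feature["position"], "text": info})
    return overlays

features = [{"name": "Central Station", "position": (120, 80)},
            {"name": "Airport", "position": (400, 310)}]
arrivals = {"Central Station": "Next train: 4 min"}
overlays = place_transit_overlays(features, arrivals)
# one overlay, anchored at the map feature it describes: (120, 80)
```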
  • Patent number: 11188286
    Abstract: In an exemplary process for accessing a function of an external device through a computer-generated reality interface, one or more external devices are detected. Image data of a physical environment captured by an image sensor is obtained. The process determines whether the image data includes a representation of a first external device of the one or more detected external devices. In accordance with determining that the image data includes a representation of the first external device, the process causes a display to concurrently display a representation of the physical environment according to the image data, and an affordance corresponding to a function of the first external device, wherein detecting user activation of the displayed affordance causes the first external device to perform an action corresponding to the function.
    Type: Grant
    Filed: February 26, 2020
    Date of Patent: November 30, 2021
    Assignee: Apple Inc.
    Inventors: Justin D. Stoyles, Michael Kuhn
  • Publication number: 20210365228
    Abstract: In some exemplary processes for controlling an external device using a computer-generated reality interface, information specifying a function of the external device is received from the external device. First image data of a physical environment that includes the external device is obtained with one or more image sensors. A representation of the physical environment according to the first image data is displayed on a display. While displaying the representation of the physical environment, second image data identifying a gesture occurring between the display and the external device in the physical environment is obtained with the one or more image sensors. A determination is made as to whether the identified gesture satisfies one or more predetermined criteria associated with the function. In accordance with determining that the identified gesture satisfies one or more predetermined criteria associated with the function, the external device is caused to perform the function.
    Type: Application
    Filed: August 4, 2021
    Publication date: November 25, 2021
    Inventors: Justin D. STOYLES, Michael KUHN
  • Patent number: 11120600
    Abstract: Systems and methods for generating a video of an emoji that has been puppeted using image, depth, and audio inputs. The inputs can capture a user's facial expressions, including eye, eyebrow, mouth, and head movements. A pose held by the user can be detected and used to generate supplemental animation. The emoji can further be animated using physical properties associated with the emoji and the captured movements. An emoji of a dog can have its ears move in response to an up-and-down movement or a shaking of the head. The video can be sent in a message to one or more recipients. A sending device can render the puppeted video in accordance with the hardware and software capabilities of a recipient's device.
    Type: Grant
    Filed: February 14, 2019
    Date of Patent: September 14, 2021
    Assignee: Apple Inc.
    Inventors: Justin D. Stoyles, Alexandre R. Moha, Nicolas V. Scapel, Guillaume P. Barlier, Aurelio Guzman, Bruno M. Sommer, Nina Damasky, Thibaut Weise, Thomas Goossens, Hoan Pham, Brian Amberg
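The physics-driven secondary animation this abstract mentions (a dog emoji's ears reacting to head motion) can be modeled loosely as a damped spring excited by head velocity. This is one possible interpretation, sketched under assumed constants; the patent does not specify this model or these values.

```python
# Loose sketch of physics-based secondary animation: ear deflection is a
# damped spring driven by vertical head velocity, so the ears swing while
# the head moves and settle back afterwards. All constants are
# illustrative assumptions.

def animate_ears(head_y_positions, dt=1/30, stiffness=40.0,
                 damping=8.0, gain=2.0):
    """Return per-frame ear deflection angles for a sequence of vertical
    head positions, using a spring-damper excited by head velocity."""
    angle, velocity = 0.0, 0.0
    prev_y = head_y_positions[0]
    angles = []
    for y in head_y_positions:
        head_velocity = (y - prev_y) / dt
        prev_y = y
        # Spring pulls the ear back to rest; head motion kicks it.
        accel = -stiffness * angle - damping * velocity + gain * head_velocity
        velocity += accel * dt
        angle += velocity * dt
        angles.append(angle)
    return angles

angles = animate_ears([0.0, 0.0, 0.05, 0.10, 0.10, 0.10])  # a quick nod
# ears deflect while the head moves, then relax back toward rest
```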
  • Patent number: 11086581
    Abstract: In some exemplary processes for controlling an external device using a computer-generated reality interface, information specifying a function of the external device is received from the external device. First image data of a physical environment that includes the external device is obtained with one or more image sensors. A representation of the physical environment according to the first image data is displayed on a display. While displaying the representation of the physical environment, second image data identifying a gesture occurring between the display and the external device in the physical environment is obtained with the one or more image sensors. A determination is made as to whether the identified gesture satisfies one or more predetermined criteria associated with the function. In accordance with determining that the identified gesture satisfies one or more predetermined criteria associated with the function, the external device is caused to perform the function.
    Type: Grant
    Filed: February 26, 2020
    Date of Patent: August 10, 2021
    Assignee: Apple Inc.
    Inventors: Justin D. Stoyles, Michael Kuhn
  • Patent number: 11087559
    Abstract: The present disclosure relates to managing augmented reality content created on a first electronic device and viewed at a second electronic device. In some embodiments, the first electronic device determines its physical location, receives input representing user-generated augmented reality content, displays an augmented reality environment including the user-generated augmented reality content overlaid on a live view of the physical location, and sends the user-generated augmented reality content to an external storage repository. The second electronic device can then receive the user-generated augmented reality content, determine whether it is at the physical location, and display the user-generated augmented reality content when it is at the physical location.
    Type: Grant
    Filed: November 3, 2020
    Date of Patent: August 10, 2021
    Assignee: Apple Inc.
    Inventors: Michael Kuhn, Justin D. Stoyles
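The two-device flow in this abstract can be sketched end to end: the first device publishes its user-generated content tagged with the physical location where it was created, and the second device shows stored content only when it determines it is at that location. The repository shape, coordinate system, and proximity radius are assumptions for illustration.

```python
# Sketch of location-gated AR content sharing: content is stored with the
# location where it was authored, and a second device sees it only when
# within an assumed proximity radius of that location.

def publish(repository, content, location):
    """First device: upload user-generated AR content tagged with the
    physical location where it was created."""
    repository.append({"content": content, "location": location})

def visible_content(repository, device_location, radius=25.0):
    """Second device: return stored content whose tagged location is
    within `radius` metres of the device's current position."""
    def close(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 <= radius
    return [e["content"] for e in repository
            if close(e["location"], device_location)]

repo = []
publish(repo, "virtual graffiti", (100.0, 200.0))
here = visible_content(repo, (105.0, 204.0))   # a few metres away: shown
away = visible_content(repo, (500.0, 900.0))   # far away: hidden
# here == ["virtual graffiti"], away == []
```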
  • Patent number: 11087558
    Abstract: The present disclosure relates to managing augmented reality content created on a first electronic device and viewed at a second electronic device. In some embodiments, the first electronic device determines its physical location, receives input representing user-generated augmented reality content, displays an augmented reality environment including the user-generated augmented reality content overlaid on a live view of the physical location, and sends the user-generated augmented reality content to an external storage repository. The second electronic device can then receive the user-generated augmented reality content, determine whether it is at the physical location, and display the user-generated augmented reality content when it is at the physical location.
    Type: Grant
    Filed: September 21, 2020
    Date of Patent: August 10, 2021
    Assignee: Apple Inc.
    Inventors: Michael Kuhn, Justin D. Stoyles
  • Patent number: 10891800
    Abstract: The present disclosure relates to providing a software feature of an electronic product in an augmented reality (AR) environment. In some embodiments, images are obtained using one or more image sensors, a determination is made whether the obtained images include printed media depicting the electronic product, when the obtained images include the printed media depicting the electronic product, a virtual object corresponding to the electronic product is displayed in the AR environment, and the software feature of the electronic product is provided with the virtual object.
    Type: Grant
    Filed: June 10, 2020
    Date of Patent: January 12, 2021
    Assignee: Apple Inc.
    Inventors: Justin D. Stoyles, Michael Kuhn
  • Patent number: 10820795
    Abstract: In one implementation, a method includes: determining an interpupillary distance (IPD) measurement for a user based on a function of depth data obtained by the depth sensor and image data obtained by the image sensor; and calibrating a head-mounted device (HMD) provided to deliver augmented reality/virtual reality (AR/VR) content by setting one or more presentation parameters of the HMD based on the IPD measurement in order to tailor one or more AR/VR displays of the HMD to a field-of-view of the user.
    Type: Grant
    Filed: June 22, 2018
    Date of Patent: November 3, 2020
    Assignee: Apple Inc.
    Inventors: Thibaut Weise, Justin D. Stoyles, Michael Kuhn, Reinhard Klapfer, Stefan Misslinger
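The measurement this abstract describes, i.e. an IPD derived as a function of depth data and image data, then used to set presentation parameters of the HMD, can be sketched as below: back-project each pupil's pixel location to 3-D using its depth, take the distance between the two points, and map it to per-eye display offsets. The intrinsics and the half-IPD offset mapping are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of an IPD measurement from image plus depth data: pupil
# pixel locations are unprojected with per-pixel depth, and the distance
# between them is the IPD, which then sets assumed per-eye offsets.

def back_project(pixel, depth_m, focal_px=600.0, center=(320.0, 240.0)):
    """Unproject a pixel with known depth into camera-space metres."""
    u, v = pixel
    x = (u - center[0]) * depth_m / focal_px
    y = (v - center[1]) * depth_m / focal_px
    return (x, y, depth_m)

def measure_ipd(left_pupil_px, right_pupil_px, left_depth, right_depth):
    """Distance in metres between the back-projected pupil centres."""
    l = back_project(left_pupil_px, left_depth)
    r = back_project(right_pupil_px, right_depth)
    return sum((a - b) ** 2 for a, b in zip(l, r)) ** 0.5

def set_hmd_offsets(ipd_m):
    """Map the IPD to per-eye display offsets (half the IPD each side)."""
    return {"left_eye_offset_m": -ipd_m / 2, "right_eye_offset_m": ipd_m / 2}

ipd = measure_ipd((281.0, 240.0), (359.0, 240.0), 0.5, 0.5)
params = set_hmd_offsets(ipd)
# ipd ≈ 0.065 m (65 mm at 0.5 m depth with these assumed intrinsics)
```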
  • Publication number: 20200201444
    Abstract: In some exemplary processes for controlling an external device using a computer-generated reality interface, information specifying a function of the external device is received from the external device. First image data of a physical environment that includes the external device is obtained with one or more image sensors. A representation of the physical environment according to the first image data is displayed on a display. While displaying the representation of the physical environment, second image data identifying a gesture occurring between the display and the external device in the physical environment is obtained with the one or more image sensors. A determination is made as to whether the identified gesture satisfies one or more predetermined criteria associated with the function. In accordance with determining that the identified gesture satisfies one or more predetermined criteria associated with the function, the external device is caused to perform the function.
    Type: Application
    Filed: February 26, 2020
    Publication date: June 25, 2020
    Inventors: Justin D. STOYLES, Michael KUHN
  • Publication number: 20200192622
    Abstract: In an exemplary process for accessing a function of an external device through a computer-generated reality interface, one or more external devices are detected. Image data of a physical environment captured by an image sensor is obtained. The process determines whether the image data includes a representation of a first external device of the one or more detected external devices. In accordance with determining that the image data includes a representation of the first external device, the process causes a display to concurrently display a representation of the physical environment according to the image data, and an affordance corresponding to a function of the first external device, wherein detecting user activation of the displayed affordance causes the first external device to perform an action corresponding to the function.
    Type: Application
    Filed: February 26, 2020
    Publication date: June 18, 2020
    Inventors: Justin D. STOYLES, Michael KUHN
  • Publication number: 20190251728
    Abstract: Systems and methods for generating a video of an emoji that has been puppeted using image, depth, and audio inputs. The inputs can capture a user's facial expressions, including eye, eyebrow, mouth, and head movements. A pose held by the user can be detected and used to generate supplemental animation. The emoji can further be animated using physical properties associated with the emoji and the captured movements. An emoji of a dog can have its ears move in response to an up-and-down movement or a shaking of the head. The video can be sent in a message to one or more recipients. A sending device can render the puppeted video in accordance with the hardware and software capabilities of a recipient's device.
    Type: Application
    Filed: February 14, 2019
    Publication date: August 15, 2019
    Inventors: Justin D. STOYLES, Alexandre R. MOHA, Nicolas V. SCAPEL, Guillaume P. BARLIER, Aurelio GUZMAN, Bruno M. SOMMER, Nina DAMASKY, Thibaut WEISE, Thomas GOOSSENS, Hoan PHAM, Brian AMBERG
  • Patent number: 10210648
    Abstract: Systems and methods for generating a video of an emoji that has been puppeted using image, depth, and audio inputs. The inputs can capture a user's facial expressions, including eye, eyebrow, mouth, and head movements. A pose held by the user can be detected and used to generate supplemental animation. The emoji can further be animated using physical properties associated with the emoji and the captured movements. An emoji of a dog can have its ears move in response to an up-and-down movement or a shaking of the head. The video can be sent in a message to one or more recipients. A sending device can render the puppeted video in accordance with the hardware and software capabilities of a recipient's device.
    Type: Grant
    Filed: November 10, 2017
    Date of Patent: February 19, 2019
    Assignee: Apple Inc.
    Inventors: Justin D. Stoyles, Alexandre R. Moha, Nicolas V. Scapel, Guillaume P. Barlier, Aurelio Guzman, Bruno M. Sommer, Nina Damasky, Thibaut Weise, Thomas Goossens, Hoan Pham, Brian Amberg
  • Publication number: 20180336714
    Abstract: Systems and methods for generating a video of an emoji that has been puppeted using image, depth, and audio inputs. The inputs can capture a user's facial expressions, including eye, eyebrow, mouth, and head movements. A pose held by the user can be detected and used to generate supplemental animation. The emoji can further be animated using physical properties associated with the emoji and the captured movements. An emoji of a dog can have its ears move in response to an up-and-down movement or a shaking of the head. The video can be sent in a message to one or more recipients. A sending device can render the puppeted video in accordance with the hardware and software capabilities of a recipient's device.
    Type: Application
    Filed: November 10, 2017
    Publication date: November 22, 2018
    Inventors: Justin D. Stoyles, Alexandre R. Moha, Nicolas V. Scapel, Guillaume P. Barlier, Aurelio Guzman, Bruno M. Sommer, Nina Damasky, Thibaut Weise, Thomas Goossens, Hoan Pham, Brian Amberg