Patents by Inventor Aaron Mackay Burns

Aaron Mackay Burns has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240112649
    Abstract: Exemplary processes are described, including processes to move and/or resize user interface elements in a computer-generated reality environment.
    Type: Application
    Filed: December 14, 2023
    Publication date: April 4, 2024
    Inventors: Aaron Mackay BURNS, Alexis H. PALANGIE, Pol PLA I CONESA, David M. SCHATTEL
  • Publication number: 20240062487
    Abstract: In an exemplary process, a computer-generated reality environment comprising a virtual object is presented and user movement that occurs in a physical environment is detected. In response to determining that the detected user movement is toward the virtual object and that the virtual object obstructs a real object from the physical environment, a determination is made whether the detected user movement is directed to the virtual object or the real object. In accordance with a determination that the detected user movement is directed to the real object, a visual appearance of the virtual object is modified, where modifying the visual appearance of the virtual object comprises presenting at least a portion of the real object. In accordance with a determination that the detected user movement is directed to the virtual object, the presentation of the virtual object is maintained to obstruct the real object.
    Type: Application
    Filed: October 31, 2023
    Publication date: February 22, 2024
    Inventors: Alexis PALANGIE, Aaron Mackay BURNS
  • Patent number: 11893964
    Abstract: In accordance with some embodiments, an exemplary process for dynamically controlling the size of a display based on movement of a visual object meeting a criterion in a computer-generated reality (CGR) environment is described.
    Type: Grant
    Filed: January 20, 2023
    Date of Patent: February 6, 2024
    Assignee: Apple Inc.
    Inventors: Aaron Mackay Burns, Alexis H. Palangie, Pol Pla I Conesa, David M. Schattel
  • Patent number: 11842449
    Abstract: In an exemplary process, a computer-generated reality environment comprising a virtual object is presented and user movement that occurs in a physical environment is detected. In response to determining that the detected user movement is toward the virtual object and that the virtual object obstructs a real object from the physical environment, a determination is made whether the detected user movement is directed to the virtual object or the real object. In accordance with a determination that the detected user movement is directed to the real object, a visual appearance of the virtual object is modified, where modifying the visual appearance of the virtual object comprises presenting at least a portion of the real object. In accordance with a determination that the detected user movement is directed to the virtual object, the presentation of the virtual object is maintained to obstruct the real object.
    Type: Grant
    Filed: August 25, 2020
    Date of Patent: December 12, 2023
    Assignee: Apple Inc.
    Inventors: Alexis Palangie, Aaron Mackay Burns
  • Publication number: 20230334793
    Abstract: In accordance with some embodiments, an exemplary process for dynamically controlling the size of a display based on movement of a visual object meeting a criterion in a computer-generated reality (CGR) environment is described.
    Type: Application
    Filed: January 20, 2023
    Publication date: October 19, 2023
    Inventors: Aaron Mackay BURNS, Alexis H. PALANGIE, Pol PLA I CONESA, David M. SCHATTEL
  • Patent number: 11768546
    Abstract: In one implementation, a method for visually indicating positional/rotational information of a finger-wearable device. The method includes: determining a set of translational values and a set of rotational values for the finger-wearable device, wherein the finger-wearable device is worn on a finger of a user; displaying, via a display, a visual representation of a location of the finger-wearable device based on the set of translational values; generating a visual representation of a grasp region of the user based on the set of translational values and the set of rotational values; and concurrently displaying, via the display, the visual representation of the grasp region with the visual representation of the location of the finger-wearable device.
    Type: Grant
    Filed: August 22, 2021
    Date of Patent: September 26, 2023
    Assignee: Apple Inc.
    Inventors: Adam Gabriel Poulos, Benjamin Rolf Blachnitzky, Nicolai Philip Georg, Arun Rakesh Yoganandan, Aaron Mackay Burns
  • Publication number: 20230035941
    Abstract: Systems and processes for speech interpretation based on environmental context are provided. For example, a user gaze direction is detected, and a speech input is received from a first user of an electronic device. In accordance with a determination that the user gaze is directed at a digital assistant object, the speech input is processed by the digital assistant. In accordance with a determination that the user gaze is not directed at a digital assistant object, contextual information associated with the electronic device is obtained, wherein the contextual information includes speech from a second user. A determination is made whether the speech input is directed to a digital assistant of the electronic device. In accordance with a determination that the speech input is directed to a digital assistant of the electronic device, the speech input is processed by the digital assistant.
    Type: Application
    Filed: October 13, 2021
    Publication date: February 2, 2023
    Inventors: Brad Kenneth HERMAN, Shiraz AKMAL, Aaron Mackay BURNS
  • Patent number: 11521581
    Abstract: In accordance with some embodiments, an exemplary process for dynamically controlling the size of a display based on movement of a visual object meeting a criterion in a computer-generated reality (CGR) environment is described.
    Type: Grant
    Filed: August 9, 2021
    Date of Patent: December 6, 2022
    Assignee: Apple Inc.
    Inventors: Aaron Mackay Burns, Alexis H. Palangie, Pol Pla I Conesa, David M. Schattel
  • Publication number: 20220121344
    Abstract: In some embodiments, an electronic device enhances interactions with virtual objects in a three-dimensional environment. In some embodiments, an electronic device enhances interactions with selectable user interface elements. In some embodiments, an electronic device enhances interactions with slider user interface elements. In some embodiments, an electronic device moves virtual objects in a three-dimensional environment and facilitates accessing actions associated with virtual objects.
    Type: Application
    Filed: September 25, 2021
    Publication date: April 21, 2022
    Inventors: Israel PASTRANA VICENTE, Jonathan R. DASCOLA, Wesley M. HOLDER, Alexis Henri PALANGIE, Aaron Mackay BURNS, Pol PLA I CONESA, William A. SORRENTINO, III, Stephen O. LEMAY, Christopher D. MCKENZIE, Shih-Sang Chiu, Benjamin Hunter Boesel, Jonathan Ravasz
  • Publication number: 20210365108
    Abstract: In accordance with some embodiments, an exemplary process for controlling representations of virtual objects based on respective user contexts that each correspond to different respective locations in a computer-generated reality (CGR) environment is described.
    Type: Application
    Filed: August 6, 2021
    Publication date: November 25, 2021
    Inventors: Aaron Mackay BURNS, Nathan GITTER, Alexis H. PALANGIE, Pol PLA I CONESA, David M. SCHATTEL
  • Publication number: 20210366440
    Abstract: In accordance with some embodiments, an exemplary process for dynamically controlling the size of a display based on movement of a visual object meeting a criterion in a computer-generated reality (CGR) environment is described.
    Type: Application
    Filed: August 9, 2021
    Publication date: November 25, 2021
    Inventors: Aaron Mackay BURNS, Alexis H. PALANGIE, Pol PLA I CONESA, David M. SCHATTEL
  • Publication number: 20210216146
    Abstract: A method includes detecting, via a first one of a plurality of input devices, a primary input directed to a first candidate virtual spatial location of a computer-generated reality (CGR) environment. The first candidate virtual spatial location is an output of an extremity tracking function based on the primary input. The method includes detecting, via a second one of the plurality of input devices, a secondary input directed to a second candidate virtual spatial location of the CGR environment. The second candidate virtual spatial location is an output of an eye tracking function based on the secondary input. The method includes positioning a user-controlled spatial selector to a virtual spatial location of the CGR environment as a function of the first and second candidate virtual spatial locations.
    Type: Application
    Filed: January 13, 2021
    Publication date: July 15, 2021
    Inventors: Aaron Mackay Burns, Jordan Alexander Cazamias, Nicolai Philip Georg
  • Publication number: 20210097729
    Abstract: In one implementation, a method of resolving focal conflict in a computer-generated reality (CGR) environment is performed by a device including a processor, non-transitory memory, an image sensor, and a display. The method includes capturing, using the image sensor, an image of a scene including a real object in a particular direction at a first distance from the device. The method includes displaying, on the display, a CGR environment including a virtual object in the particular direction at a second distance from the device. In accordance with a determination that the second distance is less than the first distance, the CGR environment includes the virtual object overlaid on the scene. In accordance with a determination that the second distance is greater than the first distance, the CGR environment includes the virtual object with an obfuscation area that obfuscates at least a portion of the real object within the obfuscation area.
    Type: Application
    Filed: June 23, 2020
    Publication date: April 1, 2021
    Inventors: Alexis Henri Palangie, Shih Sang Chiu, Bruno M. Sommer, Connor Alexander Smith, Aaron Mackay Burns
  • Publication number: 20210097766
    Abstract: In an exemplary process, a computer-generated reality environment comprising a virtual object is presented and user movement that occurs in a physical environment is detected. In response to determining that the detected user movement is toward the virtual object and that the virtual object obstructs a real object from the physical environment, a determination is made whether the detected user movement is directed to the virtual object or the real object. In accordance with a determination that the detected user movement is directed to the real object, a visual appearance of the virtual object is modified, where modifying the visual appearance of the virtual object comprises presenting at least a portion of the real object. In accordance with a determination that the detected user movement is directed to the virtual object, the presentation of the virtual object is maintained to obstruct the real object.
    Type: Application
    Filed: August 25, 2020
    Publication date: April 1, 2021
    Inventors: Alexis PALANGIE, Aaron Mackay BURNS
  • Patent number: 10754496
    Abstract: Examples are disclosed herein that relate to receiving virtual reality input. An example provides a head-mounted display device comprising a sensor system, a display, a logic machine, and a storage machine holding instructions executable by the logic machine. The instructions are executable to execute a 3D virtual reality experience on the head-mounted display device, track, via the sensor system, a touch-sensitive input device, render, on the display, in a 3D location in the 3D virtual reality experience based on the tracking of the touch-sensitive input device, a user interface, receive, via a touch sensor of the touch-sensitive input device, a user input, and, in response to receiving the user input, control the 3D virtual reality experience to thereby vary visual content being rendered on the display.
    Type: Grant
    Filed: August 24, 2017
    Date of Patent: August 25, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Liam Kiemele, Michael Robert Thomas, Alexandre Da Veiga, Christian Michael Sadak, Bryant Daniel Hawthorne, Aaron D. Krauss, Aaron Mackay Burns
  • Patent number: 10545900
    Abstract: In various embodiments, methods and systems are provide for detecting a physical configuration of a device based on sensor data from one or more configuration sensors. The physical configuration includes a position of a first display region of the device with respect to a second display region of the device, where the position is physically adjustable. A configuration profile is selected from a plurality of configuration profiles based on the detected physical configuration of the device. Each configuration profile is a representation of at least one respective physical configuration of the device. An interaction mode corresponding to the selected configuration profile is activated, where the interaction mode includes a set of mode input/output (I/O) features available while the interaction mode is active. Device interfaces of the device are managed using at least some mode I/O features in the set of mode I/O features based on the activating of the interaction mode.
    Type: Grant
    Filed: September 23, 2016
    Date of Patent: January 28, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Aaron Mackay Burns, Riccardo Giraldi, Christian Klein, Roger Sebastian Kevin Sylvan, John Benjamin George Hesketh, Scott G Wade
  • Publication number: 20190251884
    Abstract: In many computing scenarios, multiple users share a display to view and/or interact with content. Typically, one user provides input that interacts with the content; a second user can interact with the view only if the first user cedes control. Some interfaces permit split views, but typically support only a single input device that manipulates both panes. In the present disclosure, when a second user desires a different view of content during the first user's interaction, the device inserts a second view of the content into the display. Each view is associated with and manipulated by a particular user (e.g., via input devices associated with individual users) without altering the views of other users. The device may automatically manage the concurrent views, such as positioning and resizing; reflecting each user's perspective in other users' views; merging content changes; and terminating a view due to idleness or merging with another view.
    Type: Application
    Filed: February 14, 2018
    Publication date: August 15, 2019
    Inventors: Aaron Mackay Burns, John Benjamin HESKETH, Donna Katherine LONG, Jamie Ruth CABACCANG, Kathleen Patricia MULCAHY, Timothy David KVIZ
  • Publication number: 20190065026
    Abstract: Examples are disclosed herein that relate to receiving virtual reality input. An example provides a head-mounted display device comprising a sensor system, a display, a logic machine, and a storage machine holding instructions executable by the logic machine. The instructions are executable to execute a 3D virtual reality experience on the head-mounted display device, track, via the sensor system, a touch-sensitive input device, render, on the display, in a 3D location in the 3D virtual reality experience based on the tracking of the touch-sensitive input device, a user interface, receive, via a touch sensor of the touch-sensitive input device, a user input, and, in response to receiving the user input, control the 3D virtual reality experience to thereby vary visual content being rendered on the display.
    Type: Application
    Filed: August 24, 2017
    Publication date: February 28, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Liam KIEMELE, Michael Robert THOMAS, Alexandre DA VEIGA, Christian Michael SADAK, Bryant Daniel HAWTHORNE, Aaron D. KRAUSS, Aaron Mackay BURNS
  • Publication number: 20180348518
    Abstract: Tracking a user head position detects a change to a new head position and, in response, a remote camera is instructed to move to a next camera position. A camera image frame, having an indication of camera position, is received from the camera. Upon the camera position not aligning with the next camera position, an assembled image frame is formed, using image data from past views, and rendered to appear to the user as if the camera moved in 1:1 alignment with the user's head to the next camera position.
    Type: Application
    Filed: June 5, 2017
    Publication date: December 6, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Alexandre DA VEIGA, Roger Sebastian Kevin SYLVAN, Kenneth Liam KIEMELE, Nikolai Michael FAALAND, Aaron Mackay BURNS
  • Patent number: 10139631
    Abstract: Tracking a user head position detects a change to a new head position and, in response, a remote camera is instructed to move to a next camera position. A camera image frame, having an indication of camera position, is received from the camera. Upon the camera position not aligning with the next camera position, an assembled image frame is formed, using image data from past views, and rendered to appear to the user as if the camera moved in 1:1 alignment with the user's head to the next camera position.
    Type: Grant
    Filed: June 5, 2017
    Date of Patent: November 27, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alexandre Da Veiga, Roger Sebastian Kevin Sylvan, Kenneth Liam Kiemele, Nikolai Michael Faaland, Aaron Mackay Burns
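The abstract for publication 20210097729 (resolving focal conflict) describes a simple distance comparison: when a virtual object and a real object lie in the same direction, the nearer virtual object is overlaid on the scene, while a farther virtual object gets an obfuscation area over the real object. A minimal sketch of that decision, with illustrative names and distances not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    distance: float  # meters from the device along a shared direction

def resolve_focal_conflict(virtual: SceneObject, real: SceneObject) -> str:
    """Choose how to composite a virtual object that shares a viewing
    direction with a real object, per the distance comparison in the
    abstract: overlay if the virtual object is nearer, otherwise
    obfuscate the real object behind it."""
    if virtual.distance < real.distance:
        # Virtual object is nearer: draw it over the captured scene.
        return "overlay"
    # Virtual object is farther: obfuscate at least a portion of the
    # real object within the obfuscation area.
    return "obfuscate"
```

The string labels stand in for whatever rendering path a real implementation would take; the patent specifies only the distance-based branching.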
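The abstract for publication 20210216146 positions a user-controlled spatial selector "as a function of" two candidate locations, one from extremity tracking and one from eye tracking. The patent does not specify the function; a weighted blend is one plausible reading, sketched here with an assumed `gaze_weight` parameter:

```python
def position_spatial_selector(
    extremity_pos: tuple[float, float, float],
    gaze_pos: tuple[float, float, float],
    gaze_weight: float = 0.5,
) -> tuple[float, float, float]:
    """Blend the candidate location from extremity (hand) tracking with
    the candidate location from eye tracking. The linear weighting is an
    illustrative assumption; the abstract only requires that the final
    location be a function of both candidates."""
    w = max(0.0, min(1.0, gaze_weight))  # clamp to [0, 1]
    return tuple(
        (1.0 - w) * e + w * g for e, g in zip(extremity_pos, gaze_pos)
    )
```

With `gaze_weight=0.0` the selector follows the hand alone; with `1.0` it follows gaze alone, which matches the patent's framing of two input devices feeding one selector.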
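Patent 11842449 (and its related publication 20240062487) branches on whether detected user movement is directed to the obstructing virtual object or to the real object behind it. That two-way decision can be sketched as follows; the `directed_to` labels and return strings are illustrative, not taken from the patent:

```python
def handle_user_movement(directed_to: str) -> str:
    """Given that detected movement is toward a virtual object that
    obstructs a real object, pick the presentation described in the
    abstract. `directed_to` is 'real' or 'virtual'."""
    if directed_to == "real":
        # Modify the virtual object's appearance so at least a portion
        # of the real object is presented to the user.
        return "reveal real object"
    # Movement aimed at the virtual object itself: keep presenting it,
    # still obstructing the real object.
    return "maintain virtual object"
```

A real system would feed this branch from motion-intent classification; the patent claims the branching behavior, not any particular classifier.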