Patents by Inventor Timothy R. Oriol

Timothy R. Oriol has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12141414
    Abstract: Various implementations disclosed herein include devices, systems, and methods that provide a CGR environment in which virtual objects from one or more apps are included. User interactions with the virtual objects are detected and interpreted by a system that is separate from the apps that provide the virtual objects. The system detects user interactions received via one or more input modalities and interprets those user interactions as events. These events provide higher-level, input-modality-independent abstractions of the lower-level, input-modality-dependent user interactions that are detected. The system uses UI capability data provided by the apps to interpret user interactions with respect to the virtual objects provided by the apps. For example, the UI capability data can identify whether a virtual object is moveable, actionable, hover-able, etc., and the system interprets user interactions at or near the virtual object accordingly.
    Type: Grant
    Filed: July 3, 2023
    Date of Patent: November 12, 2024
    Assignee: Apple Inc.
    Inventors: Edwin Iskandar, Ittinop Dumnernchanvanit, Samuel L. Iglesias, Timothy R. Oriol
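To make the abstract above more concrete, the following is a minimal Swift sketch of the general idea of translating low-level, modality-specific input into higher-level events based on per-object UI capability data. All type and function names are illustrative assumptions by the editor, not taken from the patent.

```swift
import Foundation

// Hypothetical low-level interactions, each tied to a specific input modality.
enum RawInteraction {
    case pinch(distance: Float)          // hand-tracking modality
    case dwellGaze(seconds: Double)      // eye-gaze modality
    case controllerTrigger(pressed: Bool)
}

// Capability data an app might declare for one of its virtual objects.
struct UICapabilities {
    var isActionable = false
    var isMovable = false
    var isHoverable = false
}

// Higher-level, modality-independent events delivered back to the app.
enum InteractionEvent {
    case activated
    case hoverBegan
    case moveRequested
}

// System-side interpretation: the same raw interaction maps to different
// events (or to none) depending on what the target object supports.
func interpret(_ raw: RawInteraction, target: UICapabilities) -> InteractionEvent? {
    switch raw {
    case .pinch, .controllerTrigger(pressed: true):
        if target.isActionable { return .activated }
        if target.isMovable { return .moveRequested }
        return nil
    case .dwellGaze(let seconds) where seconds > 0.5:
        return target.isHoverable ? .hoverBegan : nil
    default:
        return nil
    }
}

// Example: a gaze dwell over a hoverable object becomes a hover event.
let button = UICapabilities(isActionable: true, isMovable: false, isHoverable: true)
if let event = interpret(.dwellGaze(seconds: 0.8), target: button) {
    print(event)    // hoverBegan
}
```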
  • Patent number: 12086919
    Abstract: A mixed reality system includes a device and a base station that communicate via a wireless connection. The device may include sensors that collect information about the user's environment and about the user. The information collected by the sensors may be transmitted to the base station via the wireless connection. The base station renders frames or slices based at least in part on the sensor information received from the device, encodes the frames or slices, and transmits the compressed frames or slices to the device for decoding and display. The base station may provide more computing power than conventional stand-alone systems, and the wireless connection does not tether the device to the base station as in conventional tethered systems. The system may implement methods and apparatus to maintain a target frame rate through the wireless link and to minimize latency in frame rendering, transmittal, and display.
    Type: Grant
    Filed: June 29, 2023
    Date of Patent: September 10, 2024
    Assignee: Apple Inc.
    Inventors: Arthur Y. Zhang, Ray L. Chang, Timothy R. Oriol, Ling Su, Gurjeet S. Saund, Guy Cote, Jim C. Chou, Hao Pan, Tobias Eble, Avi Bar-Zeev, Sheng Zhang, Justin A. Hensley, Geoffrey Stahl
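The following is a simplified Swift sketch of the device/base-station split described above: the device sends sensor data, the base station renders and encodes a frame, and the device checks whether the round trip fits the target frame budget. All names and the 90 Hz target are illustrative assumptions, not details from the patent.

```swift
import Foundation

// Hypothetical data exchanged over the wireless link (names are illustrative).
struct SensorSample { let headPose: [Float]; let timestamp: TimeInterval }
struct EncodedSlice { let data: Data; let renderTimestamp: TimeInterval }

// Base-station side: render and encode a frame (or slice) from sensor input.
func renderAndEncode(_ sample: SensorSample) -> EncodedSlice {
    // A real renderer would use the pose to draw the scene; here we fake a payload.
    let payload = Data(repeating: 0, count: 1024)
    return EncodedSlice(data: payload, renderTimestamp: sample.timestamp)
}

// Device side: track whether the round trip stays inside the frame budget,
// so the system could reduce quality or reuse a previous frame when it does not.
struct FrameBudget {
    let targetFrameRate: Double            // e.g. 90 Hz
    var budget: TimeInterval { 1.0 / targetFrameRate }

    func isOnTime(sentAt: TimeInterval, displayedAt: TimeInterval) -> Bool {
        displayedAt - sentAt <= budget
    }
}

// One simplified round trip.
let sentAt = Date().timeIntervalSince1970
let sample = SensorSample(headPose: Array(repeating: 0, count: 16), timestamp: sentAt)
let slice = renderAndEncode(sample)            // happens on the base station
let displayedAt = Date().timeIntervalSince1970 // device decodes and displays
let budget = FrameBudget(targetFrameRate: 90)
print("on time:", budget.isOnTime(sentAt: sentAt, displayedAt: displayedAt), "bytes:", slice.data.count)
```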
  • Publication number: 20240211279
    Abstract: The present disclosure relates to providing a computer-generated reality (CGR) platform for generating CGR environments including virtual and augmented reality environments. In some embodiments, information related to an object to be simulated and rendered in the CGR environment is provided to the CGR platform, and a three-dimensional representation of the object is displayed in the CGR environment.
    Type: Application
    Filed: January 12, 2024
    Publication date: June 27, 2024
    Inventors: Helmut GARSTENAUER, Martin GARSTENAUER, Edwin ISKANDAR, Timothy R. ORIOL, Geoffrey STAHL, Cody J. WHITE
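As a rough illustration of the platform model described above, this Swift sketch shows an app handing an object description to a platform-side service that owns simulation and rendering. The types and API shape are assumptions for illustration only, not the patented design.

```swift
import Foundation

// Hypothetical description an app could hand to the platform for an object
// it wants simulated and rendered in the CGR environment.
struct CGRObjectDescription {
    let meshName: String
    let position: (x: Float, y: Float, z: Float)
    let isPhysicsEnabled: Bool
}

// Hypothetical platform-side service that owns simulation and rendering.
final class CGRPlatform {
    private var scene: [CGRObjectDescription] = []

    // The app only describes the object; the platform decides how to simulate
    // and draw its three-dimensional representation.
    func add(_ description: CGRObjectDescription) {
        scene.append(description)
    }

    func renderFrame() {
        for object in scene {
            print("rendering \(object.meshName) at \(object.position)")
        }
    }
}

// App-side usage: describe a cube and let the platform display it.
let platform = CGRPlatform()
platform.add(CGRObjectDescription(meshName: "cube", position: (0, 1, -0.5), isPhysicsEnabled: true))
platform.renderFrame()
```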
  • Patent number: 12010155
    Abstract: A device for providing operating system managed group communication sessions may include a memory and at least one processor. The at least one processor may be configured to receive, by an operating system level process executing on the device and from an application process executing on the device, a request to initiate a group session between a user associated with the device and another user. The at least one processor may be further configured to identify, by the operating system level process, another device associated with the other user. The at least one processor may be further configured to initiate, by the operating system level process, the group session with the other user via the other device. The at least one processor may be further configured to manage, by the operating system level process, the group session.
    Type: Grant
    Filed: April 6, 2021
    Date of Patent: June 11, 2024
    Assignee: Apple Inc.
    Inventors: Geoffrey Stahl, Jeffrey S. Norris, Timothy R. Oriol, Joel N. Kerr, Srinivas Vedula, Bruno Sommer
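A minimal Swift sketch of the flow described above, assuming hypothetical types: an OS-level session manager receives a request originating from an application process, resolves the other user to a device, and initiates and tracks the group session. None of these names come from the patent.

```swift
import Foundation

// Hypothetical identifiers; none of these names come from the patent.
struct UserID: Hashable { let raw: String }
struct DeviceID: Hashable { let raw: String }

// OS-level process: receives a request forwarded from an application process,
// resolves the other user to a device, initiates the session, and manages it.
final class GroupSessionManager {
    private var userDirectory: [UserID: DeviceID] = [:]
    private var activeSessions: [UUID: (UserID, UserID)] = [:]

    func register(user: UserID, device: DeviceID) {
        userDirectory[user] = device
    }

    // Called with the request originating from the application process.
    func initiateSession(from localUser: UserID, with otherUser: UserID) -> UUID? {
        guard let otherDevice = userDirectory[otherUser] else { return nil }
        let sessionID = UUID()
        activeSessions[sessionID] = (localUser, otherUser)
        print("initiating session \(sessionID) via device \(otherDevice.raw)")
        return sessionID
    }

    func end(_ sessionID: UUID) {
        activeSessions[sessionID] = nil
    }
}

// Usage: the application asks the OS-level manager to start a group session.
let manager = GroupSessionManager()
manager.register(user: UserID(raw: "bob"), device: DeviceID(raw: "bobs-phone"))
_ = manager.initiateSession(from: UserID(raw: "alice"), with: UserID(raw: "bob"))
```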
  • Patent number: 11875162
    Abstract: The present disclosure relates to providing a computer-generated reality (CGR) platform for generating CGR environments including virtual and augmented reality environments. In some embodiments, information related to an object to be simulated and rendered in the CGR environment is provided to the CGR platform, and a three-dimensional representation of the object is displayed in the CGR environment.
    Type: Grant
    Filed: June 21, 2022
    Date of Patent: January 16, 2024
    Assignee: Apple Inc.
    Inventors: Helmut Garstenauer, Martin Garstenauer, Edwin Iskandar, Timothy R. Oriol, Geoffrey Stahl, Cody J. White
  • Publication number: 20230394170
    Abstract: A first device coupled with a first display and an image sensor receives output data from a second device having a second display different from the first display. The output data represents content displayable by the second device on the second display. The first device determines, using the image sensor, a position of the second display relative to the first device and causes the first display to display content based on the output data received from the second device and the determined position of the second display relative to the first device.
    Type: Application
    Filed: August 16, 2023
    Publication date: December 7, 2023
    Inventors: Clément Pierre Nicolas BOISSIÈRE, Timothy R. ORIOL
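The following Swift sketch illustrates the basic flow in the abstract above: the first device receives content from the second device, determines the second display's pose with its image sensor, and anchors the content at that pose. The pose representation and all names are simplifying assumptions.

```swift
import Foundation

// Hypothetical stand-in for the pose the image sensor recovers for the
// second display, relative to the first device.
struct DisplayPose { let x: Float; let y: Float; let z: Float; let yawDegrees: Float }

// Output data received from the second device: what it would show on its own display.
struct RemoteContent { let title: String; let pixels: [UInt8] }

// First-device logic: anchor the received content where the second display
// was detected, so it appears attached to (or alongside) that display.
func placeContent(_ content: RemoteContent, at pose: DisplayPose) -> String {
    "displaying \"\(content.title)\" at (\(pose.x), \(pose.y), \(pose.z)), yaw \(pose.yawDegrees) degrees"
}

let detectedPose = DisplayPose(x: 0.2, y: 0.0, z: -0.6, yawDegrees: 12)
let content = RemoteContent(title: "Remote window", pixels: [])
print(placeContent(content, at: detectedPose))
```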
  • Publication number: 20230351672
    Abstract: A mixed reality system includes a device and a base station that communicate via a wireless connection. The device may include sensors that collect information about the user's environment and about the user. The information collected by the sensors may be transmitted to the base station via the wireless connection. The base station renders frames or slices based at least in part on the sensor information received from the device, encodes the frames or slices, and transmits the compressed frames or slices to the device for decoding and display. The base station may provide more computing power than conventional stand-alone systems, and the wireless connection does not tether the device to the base station as in conventional tethered systems. The system may implement methods and apparatus to maintain a target frame rate through the wireless link and to minimize latency in frame rendering, transmittal, and display.
    Type: Application
    Filed: June 29, 2023
    Publication date: November 2, 2023
    Applicant: Apple Inc.
    Inventors: Arthur Y. Zhang, Ray L. Chang, Timothy R. Oriol, Ling Su, Gurjeet S. Saund, Guy Cote, Jim C. Chou, Hao Pan, Tobias Eble, Avi Bar-Zeev, Sheng Zhang, Justin A. Hensley, Geoffrey Stahl
  • Publication number: 20230350538
    Abstract: Various implementations disclosed herein include devices, systems, and methods that provide a CGR environment in which virtual objects from one or more apps are included. User interactions with the virtual objects are detected and interpreted by a system that is separate from the apps that provide the virtual objects. The system detects user interactions received via one or more input modalities and interprets those user interactions as events. These events provide higher-level, input-modality-independent abstractions of the lower-level, input-modality-dependent user interactions that are detected. The system uses UI capability data provided by the apps to interpret user interactions with respect to the virtual objects provided by the apps. For example, the UI capability data can identify whether a virtual object is moveable, actionable, hover-able, etc., and the system interprets user interactions at or near the virtual object accordingly.
    Type: Application
    Filed: July 3, 2023
    Publication date: November 2, 2023
    Inventors: Edwin Iskandar, Ittinop Dumnernchanvanit, Samuel L. Iglesias, Timothy R. Oriol
  • Publication number: 20230325140
    Abstract: In an exemplary process for interacting with user interface objects using an eye gaze, a placement point for a selected object is designated at a first position based on a gaze position. In response to a user input, the placement point is moved to a second position that is not based on the gaze position, and the object is placed at the second position.
    Type: Application
    Filed: June 14, 2023
    Publication date: October 12, 2023
    Inventors: Avi BAR-ZEEV, Ryan S. BURGOYNE, Devin W. CHALMERS, Luis R. DELIZ CENTENO, Rahul NAIR, Timothy R. ORIOL, Alexis H. PALANGIE
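A small Swift sketch of the interaction described above, with hypothetical names: the placement point starts at the gaze position, then a subsequent user input moves it independently of gaze before the object is placed.

```swift
import Foundation

struct Point3D { var x: Float; var y: Float; var z: Float }

// Hypothetical placement flow: the point is designated at the gaze position,
// then a later user input moves it independently of where the user is looking.
struct PlacementSession {
    private(set) var placementPoint: Point3D

    // First position: designated from the current gaze position.
    init(gazePosition: Point3D) {
        placementPoint = gazePosition
    }

    // Second position: adjusted by explicit input, no longer tied to gaze.
    mutating func nudge(dx: Float, dy: Float, dz: Float) {
        placementPoint.x += dx
        placementPoint.y += dy
        placementPoint.z += dz
    }
}

var session = PlacementSession(gazePosition: Point3D(x: 0, y: 1.2, z: -1))
session.nudge(dx: 0.1, dy: 0, dz: 0)       // user input refines the point
print("object placed at", session.placementPoint)
```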
  • Patent number: 11768956
    Abstract: A first device coupled with a first display and an image sensor receives output data from a second device having a second display different from the first display. The output data represents content displayable by the second device on the second display. The first device determines, using the image sensor, a position of the second display relative to the first device and causes the first display to display content based on the output data received from the second device and the determined position of the second display relative to the first device.
    Type: Grant
    Filed: December 23, 2021
    Date of Patent: September 26, 2023
    Assignee: Apple Inc.
    Inventors: Clément Pierre Nicolas Boissière, Timothy R. Oriol
  • Patent number: 11733824
    Abstract: Various implementations disclosed herein include devices, systems, and methods that provide a CGR environment in which virtual objects from one or more apps are included. User interactions with the virtual objects are detected and interpreted by a system that is separate from the apps that provide the virtual objects. The system detects user interactions received via one or more input modalities and interprets those user interactions as events. These events provide higher-level, input-modality-independent abstractions of the lower-level, input-modality-dependent user interactions that are detected. The system uses UI capability data provided by the apps to interpret user interactions with respect to the virtual objects provided by the apps. For example, the UI capability data can identify whether a virtual object is moveable, actionable, hover-able, etc., and the system interprets user interactions at or near the virtual object accordingly.
    Type: Grant
    Filed: June 13, 2019
    Date of Patent: August 22, 2023
    Assignee: Apple Inc.
    Inventors: Edwin Iskandar, Ittinop Dumnernchanvanit, Samuel L. Iglesias, Timothy R. Oriol
  • Patent number: 11727619
    Abstract: A mixed reality system includes a device and a base station that communicate via a wireless connection. The device may include sensors that collect information about the user's environment and about the user. The information collected by the sensors may be transmitted to the base station via the wireless connection. The base station renders frames or slices based at least in part on the sensor information received from the device, encodes the frames or slices, and transmits the compressed frames or slices to the device for decoding and display. The base station may provide more computing power than conventional stand-alone systems, and the wireless connection does not tether the device to the base station as in conventional tethered systems. The system may implement methods and apparatus to maintain a target frame rate through the wireless link and to minimize latency in frame rendering, transmittal, and display.
    Type: Grant
    Filed: June 18, 2021
    Date of Patent: August 15, 2023
    Assignee: Apple Inc.
    Inventors: Arthur Y. Zhang, Ray L. Chang, Timothy R. Oriol, Ling Su, Gurjeet S. Saund, Guy Cote, Jim C. Chou, Hao Pan, Tobias Eble, Avi Bar-Zeev, Sheng Zhang, Justin A. Hensley, Geoffrey Stahl
  • Patent number: 11714592
    Abstract: In an exemplary process for interacting with user interface objects using an eye gaze, an affordance associated with a first object is displayed. A gaze direction or a gaze depth is determined. While the gaze direction or the gaze depth is determined to correspond to a gaze at the affordance, a first input representing user instruction to take action on the affordance is received, and the affordance is selected responsive to receiving the first input.
    Type: Grant
    Filed: September 27, 2021
    Date of Patent: August 1, 2023
    Assignee: Apple Inc.
    Inventors: Avi Bar-Zeev, Ryan S. Burgoyne, Devin W. Chalmers, Luis R. Deliz Centeno, Rahul Nair, Timothy R. Oriol, Alexis H. Palangie
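Sketched in Swift, the selection rule in the abstract above reduces to a conjunction: the gaze must correspond to the affordance, and a separate confirming input must arrive while it does. The types below are illustrative assumptions, not the patented mechanism.

```swift
import Foundation

// Hypothetical affordance type; the name is illustrative only.
struct Affordance { let identifier: String }

// Selection requires both conditions from the abstract: the gaze direction or
// depth corresponds to the affordance, and a first input confirms the intent.
func select(_ affordance: Affordance,
            gazeHitsAffordance: Bool,
            confirmingInputReceived: Bool) -> Bool {
    gazeHitsAffordance && confirmingInputReceived
}

let playButton = Affordance(identifier: "play")
let selected = select(playButton, gazeHitsAffordance: true, confirmingInputReceived: true)
print(selected ? "selected \(playButton.identifier)" : "not selected")
```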
  • Publication number: 20230206912
    Abstract: Systems and processes for operating a digital assistant are provided. An example method includes, at an electronic device with one or more processors and memory, while an application is open on the electronic device: receiving a spoken input including a command, determining whether the command matches at least a portion of a metadata associated with an action of the application, and in accordance with a determination that the command matches at least the portion of the metadata associated with the action of the application, associating the command with the action, storing the association of the command with the action for subsequent use with the application by the digital assistant, and executing the action with the application.
    Type: Application
    Filed: February 27, 2023
    Publication date: June 29, 2023
    Inventors: Kurt W. PIERSOL, Cedric BRAY, Keith S. BRISSON, Helmut GARSTENAUER, Timothy R. ORIOL, Jessica J. PECK, Lewis N. PERKINS, Luca SIMONELLI, Nathan D. TAYLOR
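The following Swift sketch illustrates the matching-and-storing step described above, under assumed data shapes: a spoken command is compared against per-action metadata, and a successful match is remembered for later use with the application. Names are hypothetical.

```swift
import Foundation

// Hypothetical metadata an application could expose for one of its actions.
struct ActionMetadata {
    let actionName: String
    let phrases: [String]          // phrases that should trigger the action
}

// Match the spoken command against action metadata; on a match, remember the
// association so the assistant can reuse it with this application later.
final class AssistantCommandStore {
    private var learnedCommands: [String: String] = [:]   // command -> action

    func handle(command: String, actions: [ActionMetadata]) -> String? {
        let normalized = command.lowercased()
        guard let match = actions.first(where: { meta in
            meta.phrases.contains { normalized.contains($0.lowercased()) }
        }) else { return nil }
        learnedCommands[normalized] = match.actionName     // store for later use
        return match.actionName                            // caller executes this action
    }
}

let actions = [ActionMetadata(actionName: "startWorkout", phrases: ["start my run", "begin workout"])]
let store = AssistantCommandStore()
print(store.handle(command: "Start my run", actions: actions) ?? "no match")
```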
  • Publication number: 20220318035
    Abstract: The present disclosure relates to providing a computer-generated reality (CGR) platform for generating CGR environments including virtual and augmented reality environments. In some embodiments, information related to an object to be simulated and rendered in the CGR environment is provided to the CGR platform, and a three-dimensional representation of the object is displayed in the CGR environment.
    Type: Application
    Filed: June 21, 2022
    Publication date: October 6, 2022
    Inventors: Helmut GARSTENAUER, Martin GARSTENAUER, Edwin ISKANDAR, Timothy R. ORIOL, Geoffrey STAHL, Cody J. WHITE
  • Publication number: 20220269338
    Abstract: Implementations use a first device (e.g., an HMD) to provide a CGR environment that augments the input and output capabilities of a second device, e.g., a laptop, smart speaker, etc. In some implementations, the first device communicates with a second device in its proximate physical environment to exchange input or output data. For example, an HMD may capture an image of a physical environment that includes a laptop. The HMD may detect the laptop, send a request for the laptop's content, receive content from the laptop (e.g., the content that the laptop is currently displaying and additional content), identify the location of the laptop, and display a virtual object with the received content in the CGR environment on or near the laptop. The size, shape, orientation, or position of the virtual object (e.g., a virtual monitor or monitor extension) may also be configured to provide a better user experience.
    Type: Application
    Filed: May 12, 2022
    Publication date: August 25, 2022
    Inventors: Adam M. O'HERN, Eddie G. MENDOZA, Mohamed Selim BEN HIMANE, Timothy R. ORIOL
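As a rough Swift illustration of the flow above, assuming hypothetical types: the HMD detects the laptop, requests its content, and places a virtual monitor extension beside the physical screen.

```swift
import Foundation

// Hypothetical pieces of the described flow: a detected device in the captured
// image, and the content (current plus additional) received from it.
struct DetectedDevice { let name: String; let position: (x: Float, y: Float, z: Float) }
struct DeviceContent { let currentScreen: String; let extendedContent: String }

func requestContent(from device: DetectedDevice) -> DeviceContent {
    // In a real system this would be a network exchange with the laptop.
    DeviceContent(currentScreen: "IDE window", extendedContent: "Build log")
}

// Size and position the virtual monitor relative to the detected laptop so the
// extra content appears beside the physical screen.
func placeVirtualMonitor(for device: DetectedDevice, content: DeviceContent) -> String {
    let offsetX = device.position.x + 0.4     // to the right of the laptop
    return "virtual monitor showing \"\(content.extendedContent)\" at x=\(offsetX)"
}

let laptop = DetectedDevice(name: "laptop", position: (0, 0.8, -0.7))
print(placeVirtualMonitor(for: laptop, content: requestContent(from: laptop)))
```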
  • Publication number: 20220245906
    Abstract: Various implementations disclosed herein include devices, systems, and methods that enable a device to provide a view of virtual elements and a physical environment where the presentation of the virtual elements is based on positioning relative to the physical environment. In one example, a device is configured to detect a change in positioning of a virtual element, for example, when a virtual element is added, moved, or the physical environment around the virtual element is changed. The location of the virtual element in the physical environment is used to detect an attribute of the physical environment upon which the presentation of the virtual element depends. Thus, the device is further configured to detect an attribute (e.g., surface, table, mid-air, etc.) of the physical environment based on the placement of the virtual element and present the virtual element based on the detected attribute.
    Type: Application
    Filed: April 22, 2022
    Publication date: August 4, 2022
    Inventors: Aaron M. Burns, Bruno M. Sommer, Timothy R. Oriol
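A brief Swift sketch of the idea above, with assumed names: the system classifies the attribute of the physical environment at the virtual element's placement (surface versus mid-air) and adapts the element's presentation accordingly.

```swift
import Foundation

// Hypothetical attributes of the physical environment at the element's location.
enum PlacementAttribute {
    case horizontalSurface   // e.g. a table
    case verticalSurface     // e.g. a wall
    case midAir
}

struct VirtualElement { var name: String; var castsShadow: Bool; var floats: Bool }

// Re-run whenever the element is added or moved, or the environment changes:
// classify where the element sits, then adapt its presentation accordingly.
func present(_ element: VirtualElement, on attribute: PlacementAttribute) -> VirtualElement {
    var updated = element
    switch attribute {
    case .horizontalSurface, .verticalSurface:
        updated.castsShadow = true      // anchor it visually to the surface
        updated.floats = false
    case .midAir:
        updated.castsShadow = false
        updated.floats = true           // present it as hovering instead
    }
    return updated
}

let lamp = VirtualElement(name: "lamp", castsShadow: false, floats: false)
print(present(lamp, on: .horizontalSurface))
```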
  • Patent number: 11379033
    Abstract: Implementations use a first device (e.g., an HMD) to provide a CGR environment that augments the input and output capabilities of a second device, e.g., a laptop, smart speaker, etc. In some implementations, the first device communicates with a second device in its proximate physical environment to exchange input or output data. For example, an HMD may capture an image of a physical environment that includes a laptop. The HMD may detect the laptop, send a request for the laptop's content, receive content from the laptop (e.g., the content that the laptop is currently displaying and additional content), identify the location of the laptop, and display a virtual object with the received content in the CGR environment on or near the laptop. The size, shape, orientation, or position of the virtual object (e.g., a virtual monitor or monitor extension) may also be configured to provide a better user experience.
    Type: Grant
    Filed: September 14, 2020
    Date of Patent: July 5, 2022
    Assignee: Apple Inc.
    Inventors: Adam M. O'Hern, Eddie G. Mendoza, Mohamed Selim Ben Himane, Timothy R. Oriol
  • Patent number: 11372655
    Abstract: The present disclosure relates to providing a computer-generated reality (CGR) platform for generating CGR environments including virtual and augmented reality environments. In some embodiments, the platform includes an operating-system-level (OS-level) process that simulates and renders content in the CGR environment, and one or more application-level processes that provide information related to the content to be simulated and rendered to the OS-level process.
    Type: Grant
    Filed: February 25, 2020
    Date of Patent: June 28, 2022
    Assignee: Apple Inc.
    Inventors: Helmut Garstenauer, Martin Garstenauer, Edwin Iskandar, Timothy R. Oriol, Geoffrey Stahl, Cody J. White
  • Patent number: 11348316
    Abstract: Various implementations disclosed herein include devices, systems, and methods that enable a device to provide a view of virtual elements and a physical environment where the presentation of the virtual elements is based on positioning relative to the physical environment. In one example, a device is configured to detect a change in positioning of a virtual element, for example, when a virtual element is added, moved, or the physical environment around the virtual element is changed. The location of the virtual element in the physical environment is used to detect an attribute of the physical environment upon which the presentation of the virtual element depends. Thus, the device is further configured to detect an attribute (e.g., surface, table, mid-air, etc.) of the physical environment based on the placement of the virtual element and present the virtual element based on the detected attribute.
    Type: Grant
    Filed: August 23, 2019
    Date of Patent: May 31, 2022
    Assignee: Apple Inc.
    Inventors: Aaron M. Burns, Bruno M. Sommer, Timothy R. Oriol