Patents by Inventor Michael J. Rockwell

Michael J. Rockwell has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240152245
    Abstract: A computer system displays a first object that includes at least a first portion of the first object and a second portion of the first object and detects a first gaze input that meets first criteria, wherein the first criteria require that the first gaze input is directed to the first portion of the first object in order for the first criteria to be met. In response, the computer system displays a first control element that corresponds to a first operation associated with the first object, wherein the first control element was not displayed prior to detecting that the first gaze input met the first criteria, and detects a first user input directed to the first control element. In response to detecting the first user input directed to the first control element, the computer system performs the first operation with respect to the first object.
    Type: Application
    Filed: September 21, 2023
    Publication date: May 9, 2024
    Inventors: Lee S. Broughton, Israel Pastrana Vicente, Matan Stauber, Miquel Estany Rodriguez, James J. Owen, Jonathan R. Dascola, Stephen O. Lemay, Christian Schnorr, Zoey C. Taylor, Jay Moon, Benjamin H. Boesel, Benjamin Hylak, Richard D. Lyons, William A. Sorrentino, III, Lynn I. Streja, Jonathan Ravasz, Nathan Gitter, Peter D. Anton, Michael J. Rockwell, Peter L. Hajas, Evgenii Krivoruchko, Mark A. Ebbole, James Magahern, Andrew J. Sawyer, Christopher D. McKenzie, Michael E. Buerli, Olivier D. R. Gutknecht
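The gaze-gated interaction in the abstract above can be sketched as a small state machine: the control element stays hidden until gaze on the right portion satisfies the first criteria, and only then can user input trigger the first operation. This is a minimal illustration, not Apple's implementation; the portion names, dwell threshold, and input labels are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class GazeRevealedControl:
    """Control element revealed only after gaze meets the first criteria."""
    control_visible: bool = False
    operation_performed: bool = False

    def on_gaze(self, portion: str, dwell_ms: int, min_dwell_ms: int = 300) -> None:
        # First criteria: gaze directed at the first portion, held long enough.
        if portion == "first" and dwell_ms >= min_dwell_ms:
            self.control_visible = True

    def on_user_input(self, target: str) -> None:
        # The first operation runs only against the revealed control element.
        if target == "control" and self.control_visible:
            self.operation_performed = True

ui = GazeRevealedControl()
ui.on_user_input("control")   # ignored: the control has not been revealed yet
ui.on_gaze("second", 500)     # gaze on the second portion: criteria not met
ui.on_gaze("first", 500)      # criteria met: the control element appears
ui.on_user_input("control")   # performs the first operation on the object
```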
  • Publication number: 20240144533
    Abstract: Various implementations disclosed herein include devices, systems, and methods that track a movement of an input device. For example, an example process may include determining a pose of a tracking device in a physical environment based on first sensor data from an image sensor. The process then may receive, from the tracking device, first positional data corresponding to a first relative positioning between the tracking device and an input device in the physical environment, where the first positional data is determined based on second sensor data obtained via a sensor on the tracking device. The process then may track movement of the input device in the physical environment based at least in part on the first positional data and the pose of the tracking device. The process then may determine an input for the electronic device based at least in part on tracking the movement of the input device.
    Type: Application
    Filed: January 10, 2024
    Publication date: May 2, 2024
    Inventors: Jeffrey S. Norris, Michael J. Rockwell, Tony Kobayashi, William D. Lindmeier
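The two-stage tracking in this abstract amounts to composing poses: the tracker's world pose (from the image sensor) is combined with the input device's offset in the tracker's local frame (from the tracker's own sensor). Below is a 2-D sketch of that composition under assumed conventions (x, y, heading in radians); it is illustrative only, not the patented method.

```python
import math

def input_device_world_position(tracker_pose, relative_offset):
    """World-space position of the input device, composed from the tracking
    device's pose and the input device's offset in the tracker's local frame."""
    tx, ty, theta = tracker_pose
    ox, oy = relative_offset
    # Rotate the local offset into the world frame, then translate by the pose.
    wx = tx + ox * math.cos(theta) - oy * math.sin(theta)
    wy = ty + ox * math.sin(theta) + oy * math.cos(theta)
    return wx, wy

# Tracker at (1, 2) facing +y; the input device sits 1 unit ahead of it.
wx, wy = input_device_world_position((1.0, 2.0, math.pi / 2), (1.0, 0.0))
```

A full implementation would use 3-D rigid transforms (rotation matrices or quaternions), but the composition step is the same.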
  • Publication number: 20240112391
    Abstract: Techniques are disclosed relating to display devices. In some embodiments, a display device includes a display system configured to display three-dimensional content to a user. The display device is configured to discover, via a network interface, one or more compute nodes operable to facilitate rendering the three-dimensional content and receive information identifying abilities of the one or more compute nodes to facilitate the rendering. Based on the received information, the display device evaluates a set of tasks to identify one or more of the tasks to offload to the one or more compute nodes for facilitating the rendering and distributes, via the network interface, the identified one or more tasks to the one or more compute nodes for processing by the one or more compute nodes.
    Type: Application
    Filed: October 10, 2023
    Publication date: April 4, 2024
    Inventors: Ranjit Desai, Michael J. Rockwell
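The evaluation step described above — matching rendering tasks against discovered nodes' advertised abilities — can be sketched as a greedy planner. The `supports`/`capacity` fields and the first-fit policy are assumptions for illustration, not details from the patent.

```python
def plan_offload(tasks, node_abilities):
    """Assign each task to the first discovered node that supports its kind
    and still has capacity; unassigned tasks stay on the display device."""
    assignments = {}                           # task name -> node name
    load = {name: 0 for name in node_abilities}
    for task, kind in tasks.items():
        for node, info in node_abilities.items():
            if kind in info["supports"] and load[node] < info["capacity"]:
                assignments[task] = node
                load[node] += 1
                break
    return assignments

nodes = {"mac": {"supports": {"shade", "mesh"}, "capacity": 1},
         "tv":  {"supports": {"mesh"}, "capacity": 2}}
tasks = {"t1": "shade", "t2": "mesh", "t3": "physics"}
plan = plan_offload(tasks, nodes)
# "t3" has no capable node, so it is rendered locally on the display device.
```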
  • Publication number: 20240111162
    Abstract: A method includes sensing a plurality of light superposition characteristic values associated with ambient light from a physical environment. The ambient light emanates from the physical environment towards one side of a translucent display. The plurality of light superposition characteristic values quantifies interactions with the ambient light. The method includes determining a plurality of display correction values associated with the electronic device based on a function of the plurality of light superposition characteristic values and predetermined display characteristics of a computer-generated reality (CGR) object. The method includes changing one or more display operating parameters associated with the electronic device in accordance with the plurality of display correction values in order to satisfy the predetermined display characteristics of the CGR object within a performance threshold.
    Type: Application
    Filed: December 13, 2023
    Publication date: April 4, 2024
    Inventors: Anselm Grundhoefer, Michael J. Rockwell
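The correction described above compensates for ambient light superposing with rendered content on a translucent display. A minimal sketch, assuming a linear model where a fixed fraction of ambient light leaks through each channel; the transmission factor and clamping are illustrative, not the patent's actual function.

```python
def translucent_display_correction(ambient_levels, target_levels,
                                   transmission=0.3, max_drive=1.0):
    """Per-channel drive levels for a translucent display: the ambient
    contribution seen through the display is subtracted from each channel's
    target so the perceived CGR object matches its intended appearance."""
    corrected = []
    for ambient, target in zip(ambient_levels, target_levels):
        leak = ambient * transmission            # ambient light passing through
        drive = min(max(target - leak, 0.0), max_drive)
        corrected.append(round(drive, 3))
    return corrected

# Bright ambient red forces a lower red drive level to hold the target color.
levels = translucent_display_correction([1.0, 0.2, 0.0], [0.8, 0.8, 0.8])
```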
  • Publication number: 20240103682
    Abstract: A computer system displays a first application user interface at a first location in a three-dimensional environment. While displaying the first application user interface at the first location in the three-dimensional environment, the computer system detects, at a first time, a first input corresponding to a request to close the first application user interface. In response to detecting the first input corresponding to a request to close the first application user interface: the computer system closes the first application user interface, including ceasing to display the first application user interface in the three-dimensional environment; and, in accordance with a determination that respective criteria are met, the computer system displays a home menu user interface at a respective home menu position that is determined based on the first location of the first application user interface in the three-dimensional environment.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Stephen O. Lemay, Zoey C. Taylor, Benjamin Hylak, William A. Sorrentino, III, Jonathan Ravasz, Peter D. Anton, Michael J. Rockwell
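The close-then-show-home-menu behavior can be sketched as a small placement rule. The abstract does not spell out the "respective criteria", so the criterion below (no other window remains) is an illustrative guess, as is the idea that the menu inherits the window's position directly.

```python
def close_app_window(windows, name, default_home=None):
    """Close an app window; when the assumed criteria are met, return the
    home-menu position derived from the closed window's location."""
    position = windows.pop(name)    # cease displaying the app user interface
    if not windows:                 # assumed criteria: nothing else is open
        return position             # home menu appears based on that position
    return default_home             # otherwise no home menu is shown here

home = close_app_window({"notes": (0.4, 1.2, -2.0)}, "notes")
```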
  • Publication number: 20240103707
    Abstract: In some embodiments, a computer system applies a time of day setting to a virtual environment. In some embodiments, the time of day setting is updated based on an event. In some embodiments, a computer system displays content in an expanded display mode. In some embodiments, computer systems join a communication session while maintaining display of respective environments. In some embodiments, a computer system moves a portal based on user movement. In some embodiments, computer systems share a virtual environment. Computer systems can display media with simulated lighting. Computer systems can share an environment. In some embodiments, a computer system selects a position relative to content. A computer system can present representations of communication session participants based on content. A computer system can present user interfaces to control visual appearances of an environment including media. Computer systems can change an appearance of an environment based on environmental modes.
    Type: Application
    Filed: September 24, 2023
    Publication date: March 28, 2024
    Inventors: Nicholas W. HENDERSON, James M. DESSERO, Matan SAUBER, Stephen O. LEMAY, Jeffrey S. ALLEN, Michael A. DUNKLEY, Michael J. ROCKWELL, William A. SORRENTINO, III, Hugh A. SIDER, Magnus DANIELSSON
  • Publication number: 20240098234
    Abstract: A head-mounted device is provided that includes one or more cameras configured to acquire a raw video feed and one or more displays configured to present a passthrough video feed to a user. Generation of the passthrough video feed can involve processing the raw video feed using an image signal processor and auxiliary compute blocks. One or more of the auxiliary compute blocks can be bypassed in response to detecting one or more failures associated with the auxiliary compute blocks. Configured and operated in this way, the head-mounted device can fall back to a more reliable passthrough video feed without having to power cycle the head-mounted device when a failure occurs.
    Type: Application
    Filed: September 6, 2023
    Publication date: March 21, 2024
    Inventors: Michael C. Friedman, Russell L. Jones, Kaushik Raghunath, Venu M. Duggineni, Ranjit Desai, Manjunath M. Venkatesh, Michael J. Rockwell, Arun Kannan, Saul H. Weiss
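The graceful-degradation idea above — bypass a failed auxiliary compute block rather than power-cycling the device — can be sketched as a pipeline with per-stage fault isolation. Stage names and the string-based "frames" are hypothetical stand-ins for real image processing.

```python
def passthrough_pipeline(frame, aux_blocks):
    """Run the raw feed through baseline ISP processing, then through a chain
    of auxiliary compute blocks; any block that fails is bypassed so the
    passthrough feed keeps flowing in a more reliable, degraded form."""
    frame = f"isp({frame})"                  # baseline image signal processing
    for name, block in aux_blocks:
        try:
            frame = block(frame)
        except Exception:
            continue                         # failure detected: bypass this block
    return frame

def warp(f): return f"warp({f})"
def broken_denoise(f): raise RuntimeError("hardware fault")

out = passthrough_pipeline("raw", [("denoise", broken_denoise), ("warp", warp)])
# out == "warp(isp(raw))" — the failed denoise block is skipped, not fatal
```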
  • Publication number: 20240071075
    Abstract: An exemplary process for identifying a type of a physical environment amongst a plurality of types of physical environments is provided. The process includes obtaining, using one or more cameras, image data corresponding to a physical environment. The process further includes identifying at least one portion of an entity in the physical environment based on the image data; determining, based on the identified at least one portion of the entity, whether the entity is an entity of a first type; determining a type of the physical environment if the entity is an entity of the first type; and presenting one or more virtual objects and a representation of the entity.
    Type: Application
    Filed: September 11, 2023
    Publication date: February 29, 2024
    Inventors: Peter MEIER, Michael J. ROCKWELL
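The inference chain above (detect entity portion → check entity type → infer environment type) reduces to a lookup once the entity is recognized. The entity and environment labels below are hypothetical examples, not taken from the patent.

```python
def classify_environment(detected_entities):
    """Infer the environment type from identified entities: when a detected
    entity is of a known 'first type', the type follows from a lookup."""
    ENTITY_TO_ENVIRONMENT = {"desk": "office", "stove": "kitchen",
                             "treadmill": "gym"}
    for entity in detected_entities:
        if entity in ENTITY_TO_ENVIRONMENT:    # entity of the first type
            return ENTITY_TO_ENVIRONMENT[entity]
    return "unknown"
```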
  • Patent number: 11914780
    Abstract: A finger-mounted device may include finger-mounted units. The finger-mounted units may each have a body that serves as a support structure for components such as force sensors, accelerometers, and other sensors and for haptic output devices. The body may have sidewall portions coupled by a portion that rests adjacent to a user's fingernail. The body may be formed from deformable material such as metal or may be formed from adjustable structures such as sliding body portions that are coupled to each other using magnetic attraction, springs, or other structures. The body of each finger-mounted unit may have a U-shaped cross-sectional profile that leaves the finger pad of each finger exposed when the body is coupled to a fingertip of a user's finger. Control circuitry may gather finger press input, lateral finger movement input, and finger tap input using the sensors and may provide haptic output using the haptic output device.
    Type: Grant
    Filed: August 11, 2022
    Date of Patent: February 27, 2024
    Assignee: Apple Inc.
    Inventors: Paul X. Wang, Alex J. Lehmann, Michael J. Rockwell, Michael Y. Cheung, Ray L. Chang, Hongcheng Sun, Ian M. Bullock, Kyle J. Nekimken, Madeleine S. Cordier, Seung Wook Kim, David H. Bloom, Scott G. Johnston
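The control circuitry described above distinguishes finger presses, lateral movement, and taps from the unit's sensors. A rough classifier sketch follows; the thresholds, units, and event names are illustrative assumptions, not values from the patent.

```python
def classify_finger_event(force, lateral_accel, accel_spike):
    """Classify one reading from a finger-mounted unit: a sharp accelerometer
    transient reads as a tap, a sustained force-sensor reading as a press,
    and sideways acceleration as lateral finger movement."""
    if accel_spike > 2.0:          # sharp transient from the accelerometer
        return "tap"
    if force > 0.5:                # sustained force-sensor reading
        return "press"
    if abs(lateral_accel) > 0.3:   # sideways motion of the fingertip
        return "lateral"
    return "idle"
```

In a real device each classified event could also drive the unit's haptic output device for feedback.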
  • Patent number: 11892632
    Abstract: A method includes sensing a plurality of light superposition characteristic values associated with ambient light from a physical environment. The ambient light emanates from the physical environment towards one side of a translucent display. The plurality of light superposition characteristic values quantifies interactions with the ambient light. The method includes determining a plurality of display correction values associated with the electronic device based on a function of the plurality of light superposition characteristic values and predetermined display characteristics of a computer-generated reality (CGR) object. The method includes changing one or more display operating parameters associated with the electronic device in accordance with the plurality of display correction values in order to satisfy the predetermined display characteristics of the CGR object within a performance threshold.
    Type: Grant
    Filed: September 22, 2020
    Date of Patent: February 6, 2024
    Assignee: Apple Inc.
    Inventors: Anselm Grundhoefer, Michael J. Rockwell
  • Publication number: 20240022705
    Abstract: An electronic device may include a lenticular display. The lenticular display may have a lenticular lens film formed over an array of pixels. The lenticular lenses may be configured to enable stereoscopic viewing of the display such that a viewer perceives three-dimensional images. The display may have a number of independently controllable viewing zones. An eye and/or head tracking system may use a camera to capture images of a viewer of the display. Control circuitry in the electronic device may use the captured images from the eye and/or head tracking system to determine which viewing zones are occupied by the viewer's eyes. The control circuitry may disable or dim viewing zones that are not occupied by the viewer's eyes in order to conserve power. An unoccupied viewing zone and an adjacent, occupied viewing zone may display the same image to increase sharpness in the display.
    Type: Application
    Filed: July 20, 2023
    Publication date: January 18, 2024
    Inventors: Yi-Pai Huang, Felipe Bacim De Araujo E Silva, Ping-Yen Chou, ByoungSuk Kim, Chaohao Wang, Yi Huang, Michael J. Rockwell, Seung Wook Kim, Pretesh A. Mascarenhas
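The power-saving policy above maps cleanly to a per-zone brightness rule: zones covering an eye stay on, zones adjacent to an eye stay on showing the same image (for sharpness), and everything else is dimmed. The on/off levels and zone count below are illustrative.

```python
def zone_brightness(num_zones, eye_zones):
    """Per-zone brightness for a lenticular display, given the set of zones
    the eye/head tracker reports as occupied by the viewer's eyes."""
    levels = []
    for z in range(num_zones):
        if z in eye_zones:
            levels.append(1.0)                        # occupied: full brightness
        elif any(abs(z - e) == 1 for e in eye_zones):
            levels.append(1.0)                        # adjacent: same image, kept on
        else:
            levels.append(0.0)                        # unoccupied: dimmed/disabled
    return levels

# Eyes tracked in zones 2 and 5 of an 8-zone display.
levels = zone_brightness(8, {2, 5})
```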
  • Publication number: 20230413472
    Abstract: Aspects of the subject technology provide for dynamic noise control for electronic devices. For example, a dynamically adjustable limit on component noise may be generated based on ambient noise, based on other device-generated sound, based on a state of an audio output device, and/or based on an application being actively utilized at an electronic device. As one example, an electronic device may increase a limit on a sound-generating component of an electronic device when the electronic device determines that a user of the electronic device is engaged in a sound-tolerant activity. As another example, an electronic device may decrease a limit on a sound-generating component of an electronic device when the electronic device determines that a user of the electronic device is engaged in a sound-sensitive activity.
    Type: Application
    Filed: April 14, 2023
    Publication date: December 21, 2023
    Inventors: Laura M. CAMPO, David A. SUMBERG, Michael J. ROCKWELL
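The dynamic limit described above can be sketched as a simple policy combining the user's activity with ambient masking. The activity labels, decibel values, and masking slope are all illustrative assumptions.

```python
def component_noise_limit(activity, ambient_db, base_limit_db=30.0):
    """Dynamically adjustable noise limit for a sound-generating component
    (e.g. a fan): raised for sound-tolerant activities, lowered for
    sound-sensitive ones, and relaxed further when ambient noise would
    mask the component anyway."""
    limit = base_limit_db
    if activity == "gaming":        # sound-tolerant activity
        limit += 10.0
    elif activity == "recording":   # sound-sensitive activity
        limit -= 10.0
    # Ambient noise above a quiet-room floor masks component noise.
    limit += max(0.0, (ambient_db - 40.0) * 0.5)
    return limit
```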
  • Publication number: 20230386149
    Abstract: In one implementation, a method includes: instantiating a first objective-effectuator (OE) associated with first attributes and a second OE associated with second attributes into a synthesized reality (SR) setting, wherein the first OE is encapsulated within the second OE; providing a first objective to the first OE based on the first and second attributes; providing a second objective to the second OE based on the second attributes, wherein the first and second objectives are associated with a time period between a first and second temporal points; generating a first set of actions for the first OE based on the first objective and a second set of actions for the second OE based on the second objective; and rendering for display the SR setting for the time period including the first set of actions performed by the first OE and the second set of actions performed by the second OE.
    Type: Application
    Filed: August 14, 2023
    Publication date: November 30, 2023
    Inventors: Ian M. Richter, Michael J. Rockwell, Amritpal Singh Saini, Olivier Soares
  • Patent number: 11816776
    Abstract: Techniques are disclosed relating to display devices. In some embodiments, a display device includes a display system configured to display three-dimensional content to a user. The display device is configured to discover, via a network interface, one or more compute nodes operable to facilitate rendering the three-dimensional content and receive information identifying abilities of the one or more compute nodes to facilitate the rendering. Based on the received information, the display device evaluates a set of tasks to identify one or more of the tasks to offload to the one or more compute nodes for facilitating the rendering and distributes, via the network interface, the identified one or more tasks to the one or more compute nodes for processing by the one or more compute nodes.
    Type: Grant
    Filed: February 28, 2022
    Date of Patent: November 14, 2023
    Assignee: Apple Inc.
    Inventors: Ranjit Desai, Michael J. Rockwell
  • Patent number: 11769305
    Abstract: In one implementation, a method includes: instantiating a first objective-effectuator (OE) associated with first attributes and a second OE associated with second attributes into a synthesized reality (SR) setting, wherein the first OE is encapsulated within the second OE; providing a first objective to the first OE based on the first and second attributes; providing a second objective to the second OE based on the second attributes, wherein the first and second objectives are associated with a time period between a first and second temporal points; generating a first set of actions for the first OE based on the first objective and a second set of actions for the second OE based on the second objective; and rendering for display the SR setting for the time period including the first set of actions performed by the first OE and the second set of actions performed by the second OE.
    Type: Grant
    Filed: December 21, 2021
    Date of Patent: September 26, 2023
    Assignee: Apple Inc.
    Inventors: Ian M. Richter, Michael J. Rockwell, Amritpal Singh Saini, Olivier Soares
  • Publication number: 20230293998
    Abstract: Various implementations disclosed herein include devices, systems, and methods that provide shared extended reality (XR) experiences in which two or more users interact with their own sets of physical objects (e.g., cards, game pieces, dice, chips, etc.) during the shared experiences. Each user may have multiple physical objects, each of those physical objects having the same generic shape, size, etc., but having a unique fiducial marker. The unique fiducial marker of each physical object can be assigned to represent one of multiple virtual content items.
    Type: Application
    Filed: March 17, 2023
    Publication date: September 21, 2023
    Inventors: Geoffrey Stahl, Michael J. Rockwell
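The marker-to-content assignment described above is essentially a mapping from unique fiducial IDs to virtual items, so two physically identical pieces can render as different shared XR content. The marker IDs and item names below are illustrative.

```python
def assign_markers(detected_markers, content_items):
    """Assign each unique fiducial marker a virtual content item to represent
    during the shared experience."""
    if len(detected_markers) > len(content_items):
        raise ValueError("more markers detected than virtual content items")
    return dict(zip(detected_markers, content_items))

mapping = assign_markers(["fid_07", "fid_12"], ["knight", "rook"])
```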
  • Patent number: 11756294
    Abstract: An exemplary process for identifying a type of a physical environment amongst a plurality of types of physical environments is provided. The process includes obtaining, using one or more cameras, image data corresponding to a physical environment. The process further includes identifying at least one portion of an entity in the physical environment based on the image data; determining, based on the identified at least one portion of the entity, whether the entity is an entity of a first type; determining a type of the physical environment if the entity is an entity of the first type; and presenting one or more virtual objects and a representation of the entity.
    Type: Grant
    Filed: August 9, 2021
    Date of Patent: September 12, 2023
    Assignee: Apple Inc.
    Inventors: Peter Meier, Michael J. Rockwell
  • Patent number: 11750795
    Abstract: An electronic device may include a lenticular display. The lenticular display may have a lenticular lens film formed over an array of pixels. The lenticular lenses may be configured to enable stereoscopic viewing of the display such that a viewer perceives three-dimensional images. The display may have a number of independently controllable viewing zones. An eye and/or head tracking system may use a camera to capture images of a viewer of the display. Control circuitry in the electronic device may use the captured images from the eye and/or head tracking system to determine which viewing zones are occupied by the viewer's eyes. The control circuitry may disable or dim viewing zones that are not occupied by the viewer's eyes in order to conserve power. An unoccupied viewing zone and an adjacent, occupied viewing zone may display the same image to increase sharpness in the display.
    Type: Grant
    Filed: April 15, 2021
    Date of Patent: September 5, 2023
    Assignee: Apple Inc.
    Inventors: Yi-Pai Huang, Felipe Bacim De Araujo E Silva, Ping-Yen Chou, Byoungsuk Kim, Chaohao Wang, Yi Huang, Michael J. Rockwell, Seung Wook Kim, Pretesh A. Mascarenhas
  • Publication number: 20230095816
    Abstract: Aspects of the subject technology provide electronic devices that operate, in part, based on enrolled user characteristics, and that can be operated by a guest user that has not been enrolled. For example, upon determining that a current user of an electronic device storing a first physical model of a primary user is a guest user different from the primary user, the electronic device may obtain initial physical characteristic data for the guest user and generate a guest physical model of the guest user based on the initial physical characteristic data. In one or more implementations, the electronic device may operate based on guest user inputs and the guest physical model of the guest user, while updating the guest physical model based on the guest user inputs.
    Type: Application
    Filed: September 16, 2022
    Publication date: March 30, 2023
    Inventors: David COHEN, Kyle C. BROGLE, Michael J. ROCKWELL, Ranjit DESAI, Joel N. KERR, Amy E. DEDONATO, Joaquim Gonçalo LOBO FERREIRA DA SILVA, Tyler R. CALDERONE, Charilaos PAPADOPOULOS
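The guest-handling scheme above keeps the enrolled primary user's physical model untouched while building and refining a temporary model for the guest. A minimal sketch follows, where the "physical model" is reduced to a single running-average measurement purely for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GuestAwareDevice:
    """Device with an enrolled primary-user model and a separate, temporary
    guest model built from initial characteristic data and refined with
    each guest input."""
    primary_model: float = 1.70            # e.g. enrolled eye height in meters
    guest_model: Optional[float] = None
    _guest_samples: int = 0

    def start_guest_session(self, initial_measurement: float) -> None:
        self.guest_model = initial_measurement
        self._guest_samples = 1

    def on_guest_input(self, measurement: float) -> None:
        # Incrementally update the guest model; the primary model is untouched.
        self._guest_samples += 1
        self.guest_model += (measurement - self.guest_model) / self._guest_samples

device = GuestAwareDevice()
device.start_guest_session(1.60)
device.on_guest_input(1.70)    # guest model drifts toward the new measurement
```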
  • Publication number: 20220382376
    Abstract: A finger-mounted device may include finger-mounted units. The finger-mounted units may each have a body that serves as a support structure for components such as force sensors, accelerometers, and other sensors and for haptic output devices. The body may have sidewall portions coupled by a portion that rests adjacent to a user's fingernail. The body may be formed from deformable material such as metal or may be formed from adjustable structures such as sliding body portions that are coupled to each other using magnetic attraction, springs, or other structures. The body of each finger-mounted unit may have a U-shaped cross-sectional profile that leaves the finger pad of each finger exposed when the body is coupled to a fingertip of a user's finger. Control circuitry may gather finger press input, lateral finger movement input, and finger tap input using the sensors and may provide haptic output using the haptic output device.
    Type: Application
    Filed: August 11, 2022
    Publication date: December 1, 2022
    Inventors: Paul X. Wang, Alex J. Lehmann, Michael J. Rockwell, Michael Y. Cheung, Ray L. Chang, Hongcheng Sun, Ian M. Bullock, Kyle J. Nekimken, Madeleine S. Cordier, Seung Wook Kim, David H. Bloom, Scott G. Johnston