Patents by Inventor Kristi E. S. BAUERLY
Kristi E. S. BAUERLY has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11954242
Abstract: A computer system presents first computer-generated content. While presenting the first computer-generated content, the computer system detects first movement of a first user in a physical environment, and in response: in accordance with a determination that the first movement changes a spatial relationship between the first user and a second user in the physical environment from a first spatial relationship to a second spatial relationship and that the change in spatial relationship meets first criteria, the computer system changes one or more output properties of the first computer-generated content; and in accordance with the determination that the first movement changes the spatial relationship from the first spatial relationship to the second spatial relationship and that the change in spatial relationship does not meet the first criteria, the computer system presents the first computer-generated content without changing the one or more output properties of the first computer-generated content.
Type: Grant
Filed: December 28, 2021
Date of Patent: April 9, 2024
Assignee: APPLE INC.
Inventors: Jonathan R. Dascola, Israel Pastrana Vicente, Peter D. Anton, Stephen O. Lemay, William A. Sorrentino, III, Kristi E. S. Bauerly, Philipp Rockel, Dorian D. Dargan
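The branching this abstract describes (adjust the content's output properties only when the user's movement changes the spatial relationship in a way that meets the first criteria) can be sketched as a minimal illustration. The `volume` property and the `meets_criteria` callback below are hypothetical, not taken from the patent:

```python
def adjust_for_movement(old_rel, new_rel, meets_criteria, content):
    """Hypothetical sketch of the abstract's two branches: if the user's
    movement changed the spatial relationship and the change meets the
    first criteria, change an output property of the content; otherwise
    present the content unchanged."""
    if new_rel != old_rel and meets_criteria(old_rel, new_rel):
        content["volume"] *= 0.5  # illustrative output-property change
    return content
```

The point of the two branches is that the same movement can either alter or preserve the presentation, depending solely on whether the change in spatial relationship satisfies the criteria.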
-
Publication number: 20240103712
Abstract: A computer system detects a gaze input directed to a region in an environment and, while detecting the gaze input, detects a touch input. In response, the computer system displays a focus indicator at a location corresponding to the region. The computer system detects a continuation of the touch input that includes movement of the touch input along an input surface while being maintained on the input surface. In response, the computer system moves the focus indicator in accordance with the movement of the touch input: within a user interface of an application, if the movement corresponds to a request to move the focus indicator within the user interface; and within the user interface without moving the focus indicator outside of the boundary of the user interface, if the movement corresponds to a request to move the focus indicator outside of a boundary of the user interface.
Type: Application
Filed: September 19, 2023
Publication date: March 28, 2024
Inventors: Jonathan Ravasz, Israel Pastrana Vicente, Stephen O. Lemay, Kristi E.S. Bauerly, Zoey C. Taylor
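The containment behavior in this abstract (the focus indicator follows the touch movement but is never moved outside the user interface's boundary) amounts to a standard clamp. A minimal sketch in hypothetical Python, with illustrative names:

```python
def move_focus(position, delta, bounds):
    """Move a focus indicator by delta, clamping to the UI bounds.

    position: (x, y) current indicator location
    delta: (dx, dy) movement derived from the touch input
    bounds: (x_min, y_min, x_max, y_max) of the user interface
    """
    x = position[0] + delta[0]
    y = position[1] + delta[1]
    x_min, y_min, x_max, y_max = bounds
    # The indicator tracks the touch movement inside the user interface,
    # but a request to move it past the boundary leaves it at the edge.
    x = min(max(x, x_min), x_max)
    y = min(max(y, y_min), y_max)
    return (x, y)
```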
-
Publication number: 20240103680
Abstract: A computer system displays a user interface that includes a first region and a second region separated by a third region and displays a focus indicator within the first region. In response to detecting an input to move the focus indicator relative to the user interface, the input being associated with movement toward the second region, if the input meets respective criteria based on the movement associated with the input, the computer system moves the focus indicator from the first region to the second region in accordance with the movement associated with the input without displaying the focus indicator in the third region; and, if the input does not meet the respective criteria, the computer system changes an appearance of the focus indicator in accordance with the movement associated with the input while continuing to display at least a portion of the focus indicator within the first region.
Type: Application
Filed: September 19, 2023
Publication date: March 28, 2024
Inventors: Jonathan Ravasz, Israel Pastrana Vicente, Stephen O. Lemay, Kristi E.S. Bauerly, Zoey C. Taylor
-
Patent number: 11934569
Abstract: A computer system displays a first and second user interface object in a three-dimensional environment. The first and second user interface objects have a first and second spatial relationship to a first and second anchor position corresponding to a location of a user's hand in a physical environment, respectively. While displaying the first and second user interface objects in the three-dimensional environment, the computer system detects movement of the user's hand in the physical environment, corresponding to a translational movement and a rotational movement of the user's hand relative to a viewpoint, and in response, translates the first and second user interface objects relative to the viewpoint in accordance with the translational movement of the user's hand, and rotates the first user interface object relative to the viewpoint in accordance with the rotational movement of the user's hand without rotating the second user interface object.
Type: Grant
Filed: September 19, 2022
Date of Patent: March 19, 2024
Assignee: APPLE INC.
Inventors: Israel Pastrana Vicente, Jonathan R. Dascola, Christopher D. McKenzie, Jesse Chand, Stephen O. Lemay, Kristi E. S. Bauerly, Zoey C. Taylor
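The asymmetric update this abstract describes (both hand-anchored objects follow the hand's translation, but only the first follows its rotation) can be sketched in 2D. The dict representation of an object below is an assumption made for illustration only:

```python
def update_anchored_objects(obj1, obj2, translation, rotation):
    """Apply the hand's translation to both objects, but its rotation
    only to the first object; the second keeps its orientation.

    obj1, obj2: dicts with 'position' (x, y) and 'angle' (radians)
    translation: (dx, dy); rotation: change in angle, in radians
    """
    for obj in (obj1, obj2):
        obj["position"] = (obj["position"][0] + translation[0],
                           obj["position"][1] + translation[1])
    # Only the first object follows the rotational movement of the hand.
    obj1["angle"] += rotation
    return obj1, obj2
```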
-
Publication number: 20240061547
Abstract: While a view of a three-dimensional environment is visible via a display generation component of a computer system, the computer system receives, from a user, one or more first user inputs corresponding to selection of a respective direction in the three-dimensional environment relative to a reference point associated with the user, and displays a ray extending from the reference point in the respective direction. While displaying the ray, the system displays a selection cursor moving along the ray independently of user input. When the selection cursor is at a respective position along the ray, the system receives one or more second user inputs corresponding to a request to stop the movement of the selection cursor along the ray and, in response, sets a target location for a next user interaction to a location in the three-dimensional environment that corresponds to the respective position of the selection cursor along the ray.
Type: Application
Filed: August 14, 2023
Publication date: February 22, 2024
Inventors: Christopher B. Fleizach, James T. Turner, Daniel M. Golden, Kristi E.S. Bauerly, John M. Nefulda
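The selection scheme in this abstract, a cursor advancing along a ray on its own until a second input stops it, reduces to evaluating a point on the ray at the stop time. A hypothetical sketch (the constant-speed model and the function names are assumptions, not claimed by the patent):

```python
def cursor_position(origin, direction, speed, t):
    """Position of a selection cursor that moves along a ray over time,
    independently of user input.

    origin: (x, y, z) reference point associated with the user
    direction: unit vector of the user-selected direction
    speed: cursor speed along the ray; t: elapsed time
    """
    d = speed * t  # distance travelled along the ray so far
    return tuple(o + d * c for o, c in zip(origin, direction))

def stop_cursor(origin, direction, speed, t_stop):
    """When the second input stops the cursor, its current position
    along the ray becomes the target for the next interaction."""
    return cursor_position(origin, direction, speed, t_stop)
```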
-
Patent number: 11875013
Abstract: A computer system detects a wrist. In accordance with a determination that first criteria that require an inner side of the wrist to face toward a viewpoint are met, the computer system displays a first user interface object including a plurality of representations of different applications at a first position corresponding to a first location on the wrist. While displaying the first user interface object, the computer system detects that the wrist's position or orientation has changed to satisfy second criteria that require an outer side of the wrist to face toward the viewpoint. In response, the computer system switches from displaying the first user interface object at the first position to displaying a second user interface object including a plurality of controls for controlling functions at a second position corresponding to a location on a back of a hand attached to the wrist.
Type: Grant
Filed: December 10, 2020
Date of Patent: January 16, 2024
Assignee: APPLE INC.
Inventors: Stephen O. Lemay, Gregg S. Suzuki, Matthew J. Sundstrom, Jonathan Ive, Robert T. Tilton, Jeffrey M. Faulkner, Jonathan R. Dascola, William A. Sorrentino, III, Kristi E. S. Bauerly, Giancarlo Yerkes, Peter D. Anton
-
Publication number: 20230418432
Abstract: An electronic device, while displaying a three-dimensional environment, including one or more virtual objects, detects gaze of a user directed toward a first virtual object in the three-dimensional environment. The gaze meets first criteria and the first virtual object is responsive to at least one gesture input. In response to detecting the gaze that meets the first criteria, in accordance with a determination that a hand is in a predefined ready state for providing gesture inputs, the electronic device displays an indication of one or more interaction options available for the first virtual object in the three-dimensional environment; and in accordance with a determination that the hand is not in the predefined ready state for providing gesture inputs, the electronic device forgoes displaying the indication of one or more interaction options available for the first virtual object.
Type: Application
Filed: September 11, 2023
Publication date: December 28, 2023
Inventors: Jeffrey M. Faulkner, Israel Pastrana Vicente, Philipp Rockel, Robert T. Tilton, Stephen O. Lemay, William A. Sorrentino, III, Kristi E.S. Bauerly, Peter D. Anton
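The gating condition in this abstract (show the interaction indication only when the gaze meets the criteria, the object responds to gesture input, and the hand is in the ready state) is a conjunction of three predicates. A trivial hypothetical sketch:

```python
def should_show_interaction_hint(gaze_meets_criteria, object_is_interactive,
                                 hand_in_ready_state):
    """Show the indication of available interaction options only when
    all three conditions from the abstract hold; otherwise forgo it."""
    return gaze_meets_criteria and object_is_interactive and hand_in_ready_state
```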
-
Patent number: 11810244
Abstract: A computer system displays a first view of a three-dimensional environment, including a first user interface object, while a first user is at a first location in a first physical environment. A respective position of the first user interface object in the three-dimensional environment corresponds to a respective location of the first object in a second physical environment. The computer system detects movement of the first user in the first physical environment or movement of the first object in the second physical environment, and in response, displays a second view corresponding to a second viewpoint, and the first user interface object in the second view. The first user interface object is displayed at a first or second display position, in accordance with a determination that the respective position of the first user interface object is more or less than a threshold distance from the second viewpoint, respectively.
Type: Grant
Filed: December 29, 2022
Date of Patent: November 7, 2023
Assignee: APPLE INC.
Inventors: Philipp Rockel, Nicholas W. Henderson, Kristi E. S. Bauerly
-
Patent number: 11733769
Abstract: In some embodiments, a computer system receives data representing a pose of at least a first portion of a user and causes presentation of an avatar that includes a respective avatar feature corresponding to the first portion of the user and presented having a variable display characteristic that is indicative of a certainty of the pose of the first portion of the user. In some embodiments, a computer system receives data indicating current activity of one or more users is activity of a first type and, in response, updates a representation of a user having a first appearance based on a first appearance template. The system receives second data indicating current activity of the one or more users and, in response, updates the appearance of the representation of the first user based on the current activity of the one or more users using the first or a second appearance template.
Type: Grant
Filed: June 2, 2021
Date of Patent: August 22, 2023
Assignee: Apple Inc.
Inventors: Gary Ian Butcher, Dorian D. Dargan, Nicolas Scapel, Rupert Burton, Nicholas W. Henderson, Jason Rickwald, Giancarlo Yerkes, Kristi E. S. Bauerly
-
Publication number: 20230186578
Abstract: A computer system, in response to a request to present first computer-generated content including first visual content and first audio content corresponding to the first visual content, displays the first visual content within the first portion of the three-dimensional environment and outputs the first audio content using a first audio output mode if the request is to present the first computer-generated content with a first level of immersion, and displays the first visual content within the second portion of the three-dimensional environment and outputs the first audio content using a second audio output mode if the request is to present the first computer-generated content with a second level of immersion greater than the first level of immersion, wherein using the second audio output mode instead of the first audio output mode changes a level of immersion of the first audio content.
Type: Application
Filed: February 8, 2023
Publication date: June 15, 2023
Inventors: Jeffrey M. Faulkner, Stephen O. Lemay, William A. Sorrentino, III, Jonathan Ive, Kristi E.S. Bauerly
-
Publication number: 20230147148
Abstract: A computer system displays a first view of a three-dimensional environment, including a first user interface object, while a first user is at a first location in a first physical environment. A respective position of the first user interface object in the three-dimensional environment corresponds to a respective location of the first object in a second physical environment. The computer system detects movement of the first user in the first physical environment or movement of the first object in the second physical environment, and in response, displays a second view corresponding to a second viewpoint, and the first user interface object in the second view. The first user interface object is displayed at a first or second display position, in accordance with a determination that the respective position of the first user interface object is more or less than a threshold distance from the second viewpoint, respectively.
Type: Application
Filed: December 29, 2022
Publication date: May 11, 2023
Inventors: Philipp Rockel, Nicholas W. Henderson, Kristi E.S. Bauerly
-
Publication number: 20230100610
Abstract: A computer system displays a first and second user interface object in a three-dimensional environment. The first and second user interface objects have a first and second spatial relationship to a first and second anchor position corresponding to a location of a user's hand in a physical environment, respectively. While displaying the first and second user interface objects in the three-dimensional environment, the computer system detects movement of the user's hand in the physical environment, corresponding to a translational movement and a rotational movement of the user's hand relative to a viewpoint, and in response, translates the first and second user interface objects relative to the viewpoint in accordance with the translational movement of the user's hand, and rotates the first user interface object relative to the viewpoint in accordance with the rotational movement of the user's hand without rotating the second user interface object.
Type: Application
Filed: September 19, 2022
Publication date: March 30, 2023
Inventors: Israel Pastrana Vicente, Jonathan R. Dascola, Christopher D. McKenzie, Jesse Chand, Stephen O. Lemay, Kristi E.S. Bauerly, Dorian D. Dargan
-
Patent number: 11615596
Abstract: A computer system, while displaying a view of a computer-generated environment, detects movement of a physical object, and in response: in accordance with a determination that a user is within a threshold distance of a first portion of the physical object and that the physical object meets preset criteria, the computer system changes an appearance of virtual content displayed at a position corresponding to a current location of the physical object's first portion, without changing an appearance of virtual content displayed at a position corresponding to the physical object's second portion; and in accordance with a determination that the user is within the threshold distance and that the physical object does not meet the preset criteria, the computer system changes an appearance of virtual content displayed at a position corresponding to a current location of the physical object's first portion.
Type: Grant
Filed: September 23, 2021
Date of Patent: March 28, 2023
Assignee: APPLE INC.
Inventors: Jeffrey M. Faulkner, Stephen O. Lemay, William A. Sorrentino, III, Jonathan Ive, Kristi E. S. Bauerly
-
Publication number: 20220214743
Abstract: A computer system presents first computer-generated content. While presenting the first computer-generated content, the computer system detects first movement of a first user in a physical environment, and in response: in accordance with a determination that the first movement changes a spatial relationship between the first user and a second user in the physical environment from a first spatial relationship to a second spatial relationship and that the change in spatial relationship meets first criteria, the computer system changes one or more output properties of the first computer-generated content; and in accordance with the determination that the first movement changes the spatial relationship from the first spatial relationship to the second spatial relationship and that the change in spatial relationship does not meet the first criteria, the computer system presents the first computer-generated content without changing the one or more output properties of the first computer-generated content.
Type: Application
Filed: December 28, 2021
Publication date: July 7, 2022
Inventors: Jonathan R. Dascola, Israel Pastrana Vicente, Peter D. Anton, Stephen O. Lemay, William A. Sorrentino, III, Pol Pla I Conesa, Kristi E.S. Bauerly
-
Patent number: 11340756
Abstract: While displaying a three-dimensional environment, a computer system detects a hand at a first position that corresponds to a portion of the three-dimensional environment. In response to detecting the hand at the first position: in accordance with a determination that the hand is being held in a first predefined configuration, the computer system displays a visual indication of a first operation context for gesture input using hand gestures in the three-dimensional environment; and in accordance with a determination that the hand is not being held in the first predefined configuration, the computer system forgoes display of the visual indication.
Type: Grant
Filed: September 23, 2020
Date of Patent: May 24, 2022
Assignee: APPLE INC.
Inventors: Jeffrey M. Faulkner, Israel Pastrana Vicente, Pol Pla I. Conesa, Stephen O. Lemay, Robert T. Tilton, William A. Sorrentino, III, Kristi E. S. Bauerly
-
Patent number: 9900518
Abstract: The present disclosure generally relates to multifunction physical buttons. An electronic device detects activation of a physical button. The device determines whether a set of one or more criteria are met. If the set of one or more criteria are met, the device captures an image using an image sensor. If the set of one or more criteria are not met, the device turns off the display of the device. Thus, the same physical button can perform different functions based on whether the set of one or more criteria are met.
Type: Grant
Filed: September 29, 2015
Date of Patent: February 20, 2018
Assignee: APPLE INC.
Inventors: Kenneth Kocienda, Imran Chaudhri, Daniel Max Strongwater, Kristi E. S. Bauerly, Roberto G. Yepez, Justin S. Titi
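The dispatch described here (one physical button, two functions selected by a set of criteria) can be sketched with hypothetical callbacks; none of the names below come from the patent:

```python
def handle_button_press(criteria_met, capture_image, turn_off_display):
    """Dispatch a physical-button activation to one of two functions,
    depending on whether the set of one or more criteria are met."""
    if criteria_met:
        # e.g. a camera interface is currently in the foreground
        return capture_image()
    else:
        return turn_off_display()
```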
-
Publication number: 20160373631
Abstract: The present disclosure generally relates to multifunction physical buttons. An electronic device detects activation of a physical button. The device determines whether a set of one or more criteria are met. If the set of one or more criteria are met, the device captures an image using an image sensor. If the set of one or more criteria are not met, the device turns off the display of the device. Thus, the same physical button can perform different functions based on whether the set of one or more criteria are met.
Type: Application
Filed: September 29, 2015
Publication date: December 22, 2016
Inventors: Kenneth KOCIENDA, Imran CHAUDHRI, Daniel Max STRONGWATER, Kristi E. S. BAUERLY, Roberto G. YEPEZ, Justin S. TITI