Patents by Inventor Kristi E. Bauerly
Kristi E. Bauerly has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240103704
Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
Type: Application
Filed: September 22, 2023
Publication date: March 28, 2024
Inventors: Israel PASTRANA VICENTE, Evgenii KRIVORUCHKO, Kristi E. BAUERLY, Zoey C. TAYLOR
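The abstract's last step, a visual indicator showing progress toward selecting a virtual object once attention criteria are met, can be illustrated with a dwell-timer sketch. This is a minimal hypothetical implementation, not the application's actual method; the class name, the 1.5-second threshold, and the per-frame `update` API are all illustrative assumptions.

```python
import time

class DwellSelector:
    """Tracks sustained attention on one object and reports selection
    progress in [0.0, 1.0]; selection fires when progress reaches 1.0.
    Names and the dwell threshold are illustrative, not from the patent."""

    def __init__(self, dwell_seconds=1.5):
        self.dwell_seconds = dwell_seconds
        self.target = None        # object currently receiving attention
        self.dwell_start = None   # timestamp when attention landed on target

    def update(self, gazed_object, now=None):
        """Feed the object under the user's gaze each frame.
        Returns (progress, selected)."""
        now = time.monotonic() if now is None else now
        if gazed_object != self.target:
            # Attention moved: restart the dwell timer on the new object.
            self.target = gazed_object
            self.dwell_start = now
        if self.target is None:
            return 0.0, False
        progress = min((now - self.dwell_start) / self.dwell_seconds, 1.0)
        return progress, progress >= 1.0
```

The returned `progress` value is what a UI layer could render as the abstract's visual progress indicator.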
-
Publication number: 20240103701
Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
Type: Application
Filed: September 22, 2023
Publication date: March 28, 2024
Inventors: Israel PASTRANA VICENTE, Evgenii KRIVORUCHKO, Kristi E. BAUERLY, Lorena S. PAZMINO
-
Publication number: 20240103716
Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
Type: Application
Filed: September 22, 2023
Publication date: March 28, 2024
Inventors: Israel PASTRANA VICENTE, Evgenii KRIVORUCHKO, Kristi E. BAUERLY
-
Publication number: 20240103643
Abstract: A computer system can include an input device having a housing defining an internal volume. The housing can include a grip portion and a base portion defining an aperture. The computer system can also include a tilt sensor disposed in the internal volume, a position sensor disposed at the aperture, and a processor. The processor can be electrically coupled to the position sensor, the tilt sensor, and a memory component storing electronic instructions that, when executed by the processor, cause the processor to receive a first input from the tilt sensor, receive a second input from the position sensor, and determine, based on the first and second inputs, whether the base is in contact with a support surface and an angle of the base relative to the support surface. The processor can also output a signal based on the angle if the base is in contact with the support surface.
Type: Application
Filed: September 21, 2023
Publication date: March 28, 2024
Inventors: Megan M. Sapp, Brian T. Gleeson, Steven J. Taylor, David H. Bloom, Maio He, Seung Wook Kim, Evangelos Christodoulou, Kristi E. Bauerly, Geng Luo, Bart K. Andre
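The fusion step this abstract describes, combining a tilt reading with a base-proximity reading to decide contact and report an angle, can be sketched as a small decision function. The thresholds and units here are illustrative guesses, not values from the application.

```python
def base_state(tilt_deg, surface_distance_mm,
               contact_threshold_mm=1.0, max_reported_tilt_deg=30.0):
    """Fuse a tilt-sensor reading and a base position/proximity reading
    into (in_contact, angle_deg_or_None). A hypothetical sketch: the
    thresholds are assumptions, not from the patent application."""
    in_contact = surface_distance_mm <= contact_threshold_mm
    if not in_contact:
        # Per the abstract, the angle-based signal is only output while
        # the base touches the support surface.
        return False, None
    angle = max(0.0, min(tilt_deg, max_reported_tilt_deg))
    return True, angle
```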
-
Publication number: 20240103642
Abstract: A three-dimensional control system includes an input device, a computing device, and a tracking assembly. The input device can include an input sensor, an inertial measurement unit sensor, and an ultrasonic speaker. The tracking assembly can include a plurality of ultrasonic microphones and an inertial measurement unit disposed on or with the computing device. The plurality of ultrasonic microphones can include three microphones in a first plane and at least one other ultrasonic microphone disposed out of the first plane. The ultrasonic microphones can be configured to detect ultrasonic waves output by the speaker of the input device, and the computing device can triangulate the position of the input device relative to the computing device in space.
Type: Application
Filed: September 21, 2023
Publication date: March 28, 2024
Inventors: Bart K. Andre, Brian T. Gleeson, Kristi E. Bauerly, William D. Lindmeier, Matthew J. Sundstrom, Geng Luo, Seung Wook Kim, Evangelos Christodoulou, Megan M. Sapp, Kainoa Kwon-Perez, John B. Morrell
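The geometry behind this abstract, recovering a 3-D position from distances to four microphones (three in one plane, one out of it), is the classic trilateration problem: subtracting one sphere equation from the others linearizes the system. The sketch below is a minimal pure-Python illustration under that textbook formulation, not the application's actual method; it assumes distances are already derived from ultrasonic time of flight.

```python
def trilaterate(mics, dists):
    """Solve for the 3-D source position given four microphone positions
    and the measured distances to each. Linearizes by subtracting the
    first sphere equation, then solves the 3x3 system by Gaussian
    elimination. Fails (division by zero) if the mics are coplanar in a
    degenerate way, matching the abstract's out-of-plane requirement."""
    (x0, y0, z0), d0 = mics[0], dists[0]
    A, b = [], []
    for (xi, yi, zi), di in zip(mics[1:], dists[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0), 2 * (zi - z0)])
        b.append(d0**2 - di**2
                 + xi**2 + yi**2 + zi**2 - x0**2 - y0**2 - z0**2)
    # Forward elimination with partial pivoting.
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    p = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        p[r] = (b[r] - sum(A[r][c] * p[c] for c in range(r + 1, 3))) / A[r][r]
    return tuple(p)
```

With the fourth microphone out of the plane of the first three, the linear system is full rank and the position is unique, which is why the abstract requires that arrangement.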
-
Publication number: 20240103645
Abstract: A computer input system includes a mouse including a housing having an interior surface defining an internal volume and a sensor assembly disposed in the internal volume. A processor is electrically coupled to the sensor assembly and a memory component having electronic instructions stored thereon that, when executed by the processor, cause the processor to determine an orientation of the mouse relative to a hand based on a touch input from the hand detected by the sensor assembly. The mouse can also have a circular array of touch sensors or lights that detect hand position and provide orientation information to the user.
Type: Application
Filed: September 21, 2023
Publication date: March 28, 2024
Inventors: Bart K. Andre, Brian T. Gleeson, Kristi E. Bauerly, William D. Lindmeier, Matthew J. Sundstrom, Geng Luo, Seung Wook Kim, Evangelos Christodoulou, Megan M. Sapp, Kainoa Kwon-Perez, David H. Bloom, Steven J. Taylor, John B. Morrell, Maio He, Hamza Kashif
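One standard way to turn a circular array of touch sensors into a single hand-orientation angle, as this abstract describes, is a weighted circular mean of the active sensors. This is a generic sketch of that technique, not the application's disclosed algorithm; the sensor count and evenly spaced layout are assumptions.

```python
import math

def grip_angle(readings):
    """Estimate the hand's angular position around the housing from a
    circular touch-sensor array via a weighted circular mean.

    readings: one touch intensity per sensor, sensor i centered at
    360*i/len(readings) degrees. Returns degrees in [0, 360), or None
    if no sensor is touched. Layout is an illustrative assumption."""
    n = len(readings)
    sx = sum(r * math.cos(2 * math.pi * i / n) for i, r in enumerate(readings))
    sy = sum(r * math.sin(2 * math.pi * i / n) for i, r in enumerate(readings))
    if sx == 0 and sy == 0:
        return None  # nothing touched: orientation undefined
    return math.degrees(math.atan2(sy, sx)) % 360
```

Averaging on the unit circle (rather than averaging raw angles) handles the wrap-around at 0/360 degrees correctly when the grip spans that boundary.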
-
Publication number: 20240103656
Abstract: An input device, such as a mouse, can include a housing defining an exterior grip portion and an internal volume, a sensor assembly disposed in the internal volume, and an emitter electrically coupled to the sensor assembly. In response to the sensor assembly detecting a first touch input on the housing, the emitter sends a first signal including information regarding an angular position of the grip portion. In response to the sensor assembly detecting a second touch input on the housing, the emitter sends a second signal including information regarding a direction of a force exerted on the housing from the second touch input.
Type: Application
Filed: September 21, 2023
Publication date: March 28, 2024
Inventors: Bart K. Andre, Brian T. Gleeson, Kristi E. Bauerly, William D. Lindmeier, Matthew J. Sundstrom, Geng Luo, Seung Wook Kim, Evangelos Christodoulou, Megan M. Sapp, Kainoa Kwon-Perez, David H. Bloom, Steven J. Taylor
-
Publication number: 20240103803
Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
Type: Application
Filed: September 22, 2023
Publication date: March 28, 2024
Inventors: Evgenii KRIVORUCHKO, Kristi E. BAUERLY, Zoey C. TAYLOR, Miquel ESTANY RODRIGUEZ, James J. OWEN, Jose Antonio CHECA OLORIZ, Jay MOON, Pedro MARI, Lorena S. PAZMINO
-
Publication number: 20240104859
Abstract: The present disclosure generally relates to managing live communication sessions. A computer system optionally displays an option to invite the respective user to join the ongoing communication session. A computer system optionally displays one or more options to modify an appearance of an avatar representing the user of the computer system. A computer system optionally transitions a communication session from a spatial communication session to a non-spatial communication session. A computer system optionally displays information about a participant in a communication session.
Type: Application
Filed: September 12, 2023
Publication date: March 28, 2024
Inventors: Jesse CHAND, Kristi E. BAUERLY, Shih-Sang CHIU, Jonathan R. DASCOLA, Amy E. DEDONATO, Karen EL ASMAR, Wesley M. HOLDER, Stephen O. LEMAY, Lorena S. PAZMINO, Jason D. RICKWALD, Giancarlo YERKES
-
Publication number: 20240094832
Abstract: An input device can include a housing defining an internal volume and a lower portion, the lower portion defining an aperture, an input sensor disposed in the internal volume, and a haptic assembly disposed in the internal volume. The haptic assembly can include an actuator and a foot coupled to the actuator and aligned with the aperture. The actuator can be configured to selectively extend the foot through the aperture to vary a sliding resistance of the input device on a support surface.
Type: Application
Filed: September 21, 2023
Publication date: March 21, 2024
Inventors: Bart K. Andre, Brian T. Gleeson, Kristi E. Bauerly, William D. Lindmeier, Matthew J. Sundstrom, Geng Luo, Seung Wook Kim, Evangelos Christodoulou, Megan M. Sapp, Kainoa Kwon-Perez, John B. Morrell, John S. Camp
-
Publication number: 20240077937
Abstract: The present disclosure generally relates to techniques and user interfaces for controlling and displaying representations of a user in environments, such as during a live communication session and/or a live collaboration session.
Type: Application
Filed: September 1, 2023
Publication date: March 7, 2024
Inventors: Jason D. RICKWALD, Andrew R. BACON, Kristi E. BAUERLY, Rupert BURTON, Jordan A. CAZAMIAS, Tong CHEN, Shih-Sang CHIU, Jonathan PERRON, Giancarlo YERKES
-
Publication number: 20230384860
Abstract: In some examples, a computer system provides non-visual feedback based on a determination that information about a user has been captured. In some examples, a computer system displays different portions of a representation of a user as the user and/or the computer system move relative to one another. In some examples, a computer system displays different three-dimensional content associated with different steps of an enrollment process. In some examples, a computer system displays visual elements at different simulated depths that move with simulated parallax to facilitate alignment of the user and the computer system. In some examples, a computer system prompts a user to make facial expressions and displays a progress bar indicating an amount of progress toward making the facial expressions. In some examples, a computer system adjusts dynamic audio output to indicate an amount of progress toward completing a step of an enrollment process.
Type: Application
Filed: April 6, 2023
Publication date: November 30, 2023
Inventors: Amy E. DEDONATO, Kristi E. BAUERLY, Wesley M. HOLDER, Seung Wook KIM, Jay MOON, Danielle M. PRICE, Jason D. RICKWALD, William A. SORRENTINO, III, Alfred B. HUERGO WAGNER, Giancarlo YERKES
-
Publication number: 20230375843
Abstract: A facial interface for a head-mounted display, which is to be worn on a head of a user, includes an upper portion and a lower portion. The upper portion engages an upper facial region above eyes of the user. The lower portion engages a lower facial region below the eyes of the user. The lower portion has a lower shear compliance that is greater than an upper shear compliance of the upper portion.
Type: Application
Filed: August 2, 2023
Publication date: November 23, 2023
Inventors: Dustin A. Hatfield, Kristi E. Bauerly
-
Publication number: 20230350214
Abstract: A head-mounted display includes a display unit and a facial interface. The display unit displays graphical content to the user. The facial interface is removably coupleable to the display unit and engages a facial engagement region of a face of the user whereby the display unit is supported on the face of the user. The facial interface includes a physiological sensor for sensing a physiological condition of the user in the facial engagement region.
Type: Application
Filed: July 7, 2023
Publication date: November 2, 2023
Inventors: Daniel M. Strongwater, Kristi E. Bauerly, Dustin A. Hatfield
-
Publication number: 20230333595
Abstract: A head-mounted display includes a display unit and a facial interface. The display unit includes a display for displaying graphical content to a user. The facial interface is coupled to the display unit and configured to engage a face of the user to support the display unit thereon. The facial interface includes an upper portion that engages a forehead of the user and side portions that engage temple regions of the user. The facial interface converts forward force applied to the upper portion by the forehead into inward force applied by side portions to the temple regions.
Type: Application
Filed: June 27, 2023
Publication date: October 19, 2023
Applicant: APPLE INC.
Inventors: Dustin A. Hatfield, Daniel M. Strongwater, Kristi E. Bauerly
-
Patent number: 11782480
Abstract: A head-mounted display includes a display unit, a head support, and a balancing mass. The display unit displays graphical content to a user. The head support is coupled to the display to support the display unit on a head of the user for displaying the graphical content thereto. The balancing mass is movable relative to the display unit to offset changes in torque induced by the display unit when tilting the head-mounted display.
Type: Grant
Filed: January 14, 2020
Date of Patent: October 10, 2023
Assignee: APPLE INC.
Inventors: Dustin A. Hatfield, Daniel M. Strongwater, Kristi E. Bauerly
-
Patent number: 11762206
Abstract: A facial interface for a head-mounted display, which is to be worn on a head of a user, includes an upper portion and a lower portion. The upper portion engages an upper facial region above eyes of the user. The lower portion engages a lower facial region below the eyes of the user. The lower portion has a lower shear compliance that is greater than an upper shear compliance of the upper portion.
Type: Grant
Filed: July 15, 2020
Date of Patent: September 19, 2023
Assignee: APPLE INC.
Inventors: Dustin A. Hatfield, Kristi E. Bauerly
-
Patent number: 11762421
Abstract: A head-mounted display includes a display unit and a facial interface. The display unit includes a display for displaying graphical content to a user. The facial interface is coupled to the display unit and configured to engage a face of the user to support the display unit thereon. The facial interface includes an upper portion that engages a forehead of the user and side portions that engage temple regions of the user. The facial interface converts forward force applied to the upper portion by the forehead into inward force applied by side portions to the temple regions.
Type: Grant
Filed: January 14, 2020
Date of Patent: September 19, 2023
Assignee: APPLE INC.
Inventors: Dustin A. Hatfield, Daniel M. Strongwater, Kristi E. Bauerly
-
Patent number: 11740475
Abstract: A head-mounted display includes a display unit and a facial interface. The display unit displays graphical content to the user. The facial interface is removably coupleable to the display unit and engages a facial engagement region of a face of the user whereby the display unit is supported on the face of the user. The facial interface includes a physiological sensor for sensing a physiological condition of the user in the facial engagement region.
Type: Grant
Filed: January 14, 2020
Date of Patent: August 29, 2023
Assignee: APPLE INC.
Inventors: Daniel M. Strongwater, Kristi E. Bauerly, Dustin A. Hatfield
-
Publication number: 20230259265
Abstract: In some embodiments, a computer system scrolls scrollable content in response to a variety of user inputs. In some embodiments, a computer system enters text into a text entry field in response to voice inputs. In some embodiments, a computer system facilitates interactions with a soft keyboard. In some embodiments, a computer system facilitates interactions with a cursor. In some embodiments, a computer system facilitates deletion of text. In some embodiments, a computer system facilitates interactions with hardware input devices.
Type: Application
Filed: January 3, 2023
Publication date: August 17, 2023
Inventors: Evgenii KRIVORUCHKO, Kristi E. BAUERLY, Stephen O. LEMAY, Israel PASTRANA VICENTE