Patents by Inventor Julian K. Shutzberg
Julian K. Shutzberg has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240103613
Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is associated only with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
Type: Application
Filed: September 11, 2023
Publication date: March 28, 2024
Inventors: Vinay Chawda, Mehmet N. Agaoglu, Leah M. Gum, Paul A. Lacey, Julian K. Shutzberg, Tim H. Cornelissen, Alexander G. Birardino
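The core idea of this abstract, associating a pinch gesture with whatever the user was gazing at around the same moment, can be illustrated with a minimal sketch. The function name, the sample data, and the 250 ms association window are illustrative assumptions, not details from the patent.

```python
def associate_pinch_with_gaze(pinch_time, gaze_samples, window=0.25):
    """Associate a pinch gesture with the UI element the user was gazing at
    around the same time. gaze_samples is a list of (timestamp, element_id)
    tuples; element_id is None when gaze was not on any component. The window
    (seconds) bounds how far apart the pinch and gaze may be."""
    best = None
    best_dt = window
    for t, element in gaze_samples:
        dt = abs(t - pinch_time)
        if element is not None and dt <= best_dt:
            best, best_dt = element, dt
    return best  # the component the pinch acts upon, or None

# A pinch at t=1.05 while gaze dwelled on a button around t=1.0-1.1:
samples = [(0.90, None), (1.00, "button_ok"), (1.10, "button_ok"), (1.40, None)]
print(associate_pinch_with_gaze(1.05, samples))  # → button_ok
```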
-
Publication number: 20240103634
Abstract: Techniques for mapping a user input motion include detecting an input motion by a user, determining an origin for the input motion in a user-centric spherical coordinate system, determining an arc length for the input motion based on the determined origin, mapping the arc length of the input motion to a 2D plane of a user input component, and presenting a movement of the user input component on the 2D plane in accordance with the mapping.
Type: Application
Filed: September 22, 2023
Publication date: March 28, 2024
Inventors: Vinay Chawda, Chase B. Lortie, Daniel J. Brewer, Julian K. Shutzberg, Leah M. Gum, Yirong Tang, Alexander T. Wing
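The arc-length idea in this abstract can be sketched as follows: treat the hand motion as travel on a sphere centred at a user-centric origin and convert angular travel (radius × angle) into 2D displacement. This is a minimal geometric illustration under assumed conventions (azimuth/elevation axes, a sensitivity gain), not the claimed implementation.

```python
import math

def spherical_angles(origin, p):
    """Azimuth/elevation of point p as seen from the user-centric origin."""
    x, y, z = (a - o for a, o in zip(p, origin))
    azimuth = math.atan2(x, z)                    # left/right angle
    elevation = math.atan2(y, math.hypot(x, z))   # up/down angle
    return azimuth, elevation

def map_motion_to_plane(origin, start, end, gain=1.0):
    """Map a 3D hand motion (start -> end) onto a 2D displacement: the arc
    length along each angular axis (radius x angle delta), scaled by a gain."""
    r = math.dist(origin, start)                  # radius at motion start
    az0, el0 = spherical_angles(origin, start)
    az1, el1 = spherical_angles(origin, end)
    return gain * r * (az1 - az0), gain * r * (el1 - el0)
```

A motion sweeping 45 degrees sideways at arm's length thus produces a horizontal displacement of r·π/4, independent of how close the hand is to the body once the radius is accounted for.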
-
Publication number: 20240103705
Abstract: Various implementations disclosed herein facilitate interactions with a user interface in a 3D environment in which a user interface element is moved based on a user movement such that the user interface element appears to lag behind or follow a portion of the user (e.g., the user's fingertip). The user interface element may be moved so that it converges with, and thus catches up to, the portion of the user. Such convergence may be based on the speed of the movement of the portion of the user. No convergence may occur when the portion of the user is not moving or is moving below a threshold speed. When the portion of the user is moving (e.g., above a threshold speed), the user interface component may converge with the portion of the user, and the rate of convergence may increase with faster speeds.
Type: Application
Filed: September 12, 2023
Publication date: March 28, 2024
Inventors: Vinay Chawda, Julian K. Shutzberg, Chase B. Lortie, Daniel J. Brewer, David J. Meyer, Leah M. Gum
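The speed-dependent convergence described here resembles a speed-gated exponential follow. The sketch below shows that behavior under assumed parameters (the speed threshold and base rate are illustrative, not values from the patent): below a threshold speed the element holds position; above it, the fraction of the gap closed per frame grows with fingertip speed.

```python
def follow_fingertip(ui_pos, finger_pos, finger_speed, dt,
                     min_speed=0.05, base_rate=4.0):
    """Move a lagging UI element toward the fingertip. No convergence below
    min_speed; above it, the convergence rate grows with finger_speed."""
    if finger_speed < min_speed:
        return ui_pos                    # fingertip (nearly) still: no catch-up
    rate = base_rate * finger_speed      # faster motion -> faster convergence
    alpha = min(1.0, rate * dt)          # fraction of the gap closed this frame
    return tuple(u + alpha * (f - u) for u, f in zip(ui_pos, finger_pos))
```

Called once per frame with the frame time `dt`, this yields the lag-then-catch-up feel the abstract describes: a stationary fingertip leaves the element where it is, while a fast sweep pulls the element in quickly.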
-
Publication number: 20240103635
Abstract: Suppressing a hand gesture upon detecting peripheral events on a peripheral device includes determining a first hand pose for a first hand and a second hand pose for a second hand in response to a detected peripheral event on a peripheral device; determining, based on the first hand pose and the second hand pose, that at least one of the first hand and the second hand is in a peripheral use mode; detecting an input gesture from a hand determined to be in the peripheral use mode; and rejecting the input gesture in a user input pipeline in accordance with the determination that the hand is in the peripheral use mode. The presence of a peripheral device is confirmed by activating a computer vision system in response to determining that a peripheral use condition is satisfied.
Type: Application
Filed: September 22, 2023
Publication date: March 28, 2024
Inventors: Daniel J. Brewer, Ashwin Kumar Asoka Kumar Shenoi, Tian Qiu, Leah M. Gum, David J. Meyer, Julian K. Shutzberg, Yirong Tang
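The rejection logic in this abstract can be sketched as a filter at the front of an input pipeline. The pose labels ("typing", "on_mouse") and data shapes below are illustrative assumptions; the patent does not specify them.

```python
def filter_gestures(gestures, hand_poses, peripheral_event):
    """Reject input gestures from hands judged to be in peripheral use mode.
    hand_poses maps a hand id to a coarse pose label; after a peripheral
    event, hands whose pose suggests keyboard/mouse use enter peripheral use
    mode and their gestures are dropped by the input pipeline."""
    in_peripheral_use = {
        hand for hand, pose in hand_poses.items()
        if peripheral_event and pose in ("typing", "on_mouse")
    }
    return [g for g in gestures if g["hand"] not in in_peripheral_use]

# After a keypress event, the typing hand's pinch is suppressed;
# the raised hand's pinch still goes through.
gestures = [{"hand": "left", "type": "pinch"}, {"hand": "right", "type": "pinch"}]
poses = {"left": "typing", "right": "raised"}
print(filter_gestures(gestures, poses, peripheral_event=True))
```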
-
Publication number: 20240103618
Abstract: Methods and apparatus for correcting the gaze direction and the origin (entrance pupil) in gaze tracking systems. During enrollment, after an eye model is obtained, the pose of the eye when looking at a target prompt is determined. This information is used to estimate the true visual axis of the eye. The visual axis may then be used to correct the point of view (PoV) with respect to the display during use. If a clip-on lens is present, a corrected gaze axis may be calculated based on the known optical characteristics and pose of the clip-on lens. A clip-on corrected entrance pupil may then be estimated by firing two or more virtual rays through the clip-on lens to determine the intersection between the rays and the corrected gaze axis.
Type: Application
Filed: September 19, 2023
Publication date: March 28, 2024
Applicant: Apple Inc.
Inventors: Julia Benndorf, Qichao Fan, Julian K. Shutzberg, Paul A. Lacey, Hua Gao
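The last step, finding where virtual rays meet the corrected gaze axis, is a classic closest-approach problem: refracted rays rarely intersect a line exactly, so one practical reading is to take each ray's point of closest approach on the axis and average them. This is a geometric sketch of that reading, not the patent's method; the ray tracing through the lens itself is omitted.

```python
def closest_point_on_axis(axis_o, axis_d, ray_o, ray_d):
    """Point on the gaze axis (axis_o + t*axis_d) closest to the given ray,
    via the standard two-line closest-approach solution."""
    sub = lambda p, q: tuple(a - b for a, b in zip(p, q))
    dot = lambda p, q: sum(a * b for a, b in zip(p, q))
    w0 = sub(axis_o, ray_o)
    a, b, c = dot(axis_d, axis_d), dot(axis_d, ray_d), dot(ray_d, ray_d)
    d, e = dot(axis_d, w0), dot(ray_d, w0)
    t = (b * e - c * d) / (a * c - b * b)   # parameter along the axis
    return tuple(o + t * k for o, k in zip(axis_o, axis_d))

def estimate_entrance_pupil(axis_o, axis_d, rays):
    """Average the rays' closest-approach points on the corrected gaze axis,
    as an estimate of the clip-on corrected entrance pupil."""
    pts = [closest_point_on_axis(axis_o, axis_d, o, d) for o, d in rays]
    n = len(pts)
    return tuple(sum(c) / n for c in zip(*pts))
```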
-
Publication number: 20240094882
Abstract: In some embodiments, a computer system facilitates user input for displaying a selection refinement user interface object in an object within a three-dimensional environment, wherein the selection refinement user interface object indicates a location in the object at which the computer system performs a selection operation in response to further user input. In some embodiments, a computer system facilitates user input for activating a selection refinement mode in the three-dimensional environment during which an interaction point for a selection operation is movable based on movement of a hand of the user of the computer system.
Type: Application
Filed: September 21, 2023
Publication date: March 21, 2024
Inventors: Daniel J. BREWER, Julian K. SHUTZBERG, David J. MEYER, Chase B. LORTIE
-
Publication number: 20240094825
Abstract: Aspects of the subject technology provide improved techniques for gesture recognition. Improved techniques may include detecting and/or classifying an interaction between a body part and another object in a scan of the body part, and then controlling recognition of a gesture based on the interaction. In an aspect, recognition parameters may be selected, based on the interaction classification, that disable recognition of one or more gestures while leaving recognition of other gestures enabled.
Type: Application
Filed: September 15, 2023
Publication date: March 21, 2024
Inventors: Lailin CHEN, Ashwin Kumar ASOKA KUMAR SHENOI, Daniel J. BREWER, Eslam A. MOSTAFA, Itay BAR YOSEF, Julian K. SHUTZBERG, Leah M. GUM, Martin MELOUN, Minhaeng LEE, Victor BELYAEV
-
Publication number: 20230333665
Abstract: Techniques for managing an engagement zone include tracking, by a system, a hand of a user and determining that a height of the hand of the user satisfies a first threshold height. In accordance with determining that the height of the hand of the user satisfies the first threshold height, the techniques also include initiating a UI engagement state, wherein the system monitors the user for user input during the UI engagement state, and determining user input into the system based on a user motion detected while the hand is tracked. The threshold height is associated with a boundary of a UI engagement zone and is modifiable based on user activity.
Type: Application
Filed: April 19, 2023
Publication date: October 19, 2023
Inventors: Ashwin Kumar Asoka Kumar Shenoi, Julian K. Shutzberg, Leah M. Gum, Daniel J. Brewer, Chia-Ling Li
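The engagement-zone state machine described here is small enough to sketch directly: cross the height threshold to enter the UI engagement state, and shift the threshold as user activity warrants. Class and method names, and the default threshold, are illustrative assumptions.

```python
class EngagementTracker:
    """Track whether a hand's height has crossed the engagement-zone
    boundary; the boundary is modifiable based on observed user activity."""

    def __init__(self, threshold=1.0):
        self.threshold = threshold   # boundary of the UI engagement zone
        self.engaged = False

    def update(self, hand_height):
        # Entering the zone initiates the UI engagement state, during which
        # the system monitors the tracked hand for user input.
        self.engaged = hand_height >= self.threshold
        return self.engaged

    def adjust_threshold(self, delta):
        """Shift the zone boundary, e.g. for a user who habitually gestures
        lower or higher than the default."""
        self.threshold += delta
```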
-
Patent number: 10871847
Abstract: Disclosed are electronic devices and methods of their operation that use force sensors to detect user applied forces on an input surface and determine their locations on the input surface, using only force sensors. The locations may be determined using weighted averages of the positions of the force sensors and their values. The methods may compare dynamically updated baseline force values to received force sensor values to distinguish user applied forces from changes in the force sensor values caused by other sources. After detection of a user applied force, the baseline force values are frozen, and the force sensor values are used to find the location on the input surface where the user applied the force. The electronic device can operate according to a state space model, with a first state in which there is no user applied force, and a second state in which there is user applied force.
Type: Grant
Filed: September 26, 2018
Date of Patent: December 22, 2020
Assignee: Apple Inc.
Inventors: Julian K. Shutzberg, Baboo V. Gowreesunker, Collin R. Petty
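The weighted-average localization the abstract describes can be shown in a few lines: each sensor's force above its (frozen) baseline weights that sensor's position, and the weighted mean gives the touch location. The function name and data shapes are illustrative assumptions; the patent's actual state-space handling is not reproduced here.

```python
def force_location(sensor_positions, sensor_values, baselines):
    """Locate a user-applied force as the weighted average of the force
    sensors' positions, weighting each sensor by its force above baseline."""
    deltas = [max(0.0, v - b) for v, b in zip(sensor_values, baselines)]
    total = sum(deltas)
    if total == 0:
        return None  # no force above baseline: stay in the no-touch state
    x = sum(p[0] * d for p, d in zip(sensor_positions, deltas)) / total
    y = sum(p[1] * d for p, d in zip(sensor_positions, deltas)) / total
    return (x, y)

# Two sensors pressed equally hard localize the touch halfway between them:
print(force_location([(0.0, 0.0), (1.0, 0.0)], [2.0, 2.0], [1.0, 1.0]))  # → (0.5, 0.0)
```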
-
Patent number: 10866683
Abstract: A device includes a housing defining part of an interior volume and an opening to the interior volume; a cover mounted to the housing to cover the opening and further define the interior volume; a display mounted within the interior volume and viewable through the cover; and a system in package (SiP) mounted within the interior volume. The SiP includes a self-capacitance sense pad adjacent a first surface of the SiP; a set of solder structures attached to a second surface of the SiP, the second surface opposite the first surface; and an IC coupled to the self-capacitance sense pad and configured to output, at one or more solder structures in the set of solder structures, a digital value related to a measured capacitance of the self-capacitance sense pad. The SiP is mounted within the interior volume with the first surface positioned closer to the cover than the second surface.
Type: Grant
Filed: August 23, 2019
Date of Patent: December 15, 2020
Assignee: Apple Inc.
Inventors: Pavan O. Gupta, Andrew W. Joyce, Benedict Drevniok, Mo Li, David S. Graff, Albert Lin, Julian K. Shutzberg, Hojjat Seyed Mousavi
-
Publication number: 20200064952
Abstract: A device includes a housing defining part of an interior volume and an opening to the interior volume; a cover mounted to the housing to cover the opening and further define the interior volume; a display mounted within the interior volume and viewable through the cover; and a system in package (SiP) mounted within the interior volume. The SiP includes a self-capacitance sense pad adjacent a first surface of the SiP; a set of solder structures attached to a second surface of the SiP, the second surface opposite the first surface; and an IC coupled to the self-capacitance sense pad and configured to output, at one or more solder structures in the set of solder structures, a digital value related to a measured capacitance of the self-capacitance sense pad. The SiP is mounted within the interior volume with the first surface positioned closer to the cover than the second surface.
Type: Application
Filed: August 23, 2019
Publication date: February 27, 2020
Inventors: Pavan O. Gupta, Andrew W. Joyce, Benedict Drevniok, Mo Li, David S. Graff, Albert Lin, Julian K. Shutzberg, Hojjat Seyed Mousavi
-
Publication number: 20190102031
Abstract: Disclosed are electronic devices and methods of their operation that use force sensors to detect user applied forces on an input surface and determine their locations on the input surface, using only force sensors. The locations may be determined using weighted averages of the positions of the force sensors and their values. The methods may compare dynamically updated baseline force values to received force sensor values to distinguish user applied forces from changes in the force sensor values caused by other sources. After detection of a user applied force, the baseline force values are frozen, and the force sensor values are used to find the location on the input surface where the user applied the force. The electronic device can operate according to a state space model, with a first state in which there is no user applied force, and a second state in which there is user applied force.
Type: Application
Filed: September 26, 2018
Publication date: April 4, 2019
Inventors: Julian K. Shutzberg, Baboo V. Gowreesunker, Collin R. Petty