Patents by Inventor Mark A. Ebbole
Mark A. Ebbole has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240402792
Abstract: Various implementations disclosed herein include devices, systems, and methods that enable multi-mode interactions with elements in a three-dimensional (3D) environment based on cursor movement associated with tracking user hand motion. For example, a process may include presenting an extended reality (XR) environment comprising a virtual element and a cursor. The process may further include obtaining hand data corresponding to 3D movement of a hand in a 3D environment. The process may further include operating in a first mode, in which the 3D motion of the hand is converted to two-dimensional (2D) motion, and detecting a 3D user input criterion. In response to the 3D user input criterion, the mode of operation is modified to a second mode, in which the 3D motion of the hand is maintained without conversion to 2D motion.
Type: Application
Filed: May 16, 2024
Publication date: December 5, 2024
Inventors: Jack H. Lawrence, Mark A. Ebbole, Evgenii Krivoruchko
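As a rough illustration of the mode switch this abstract describes, the sketch below models a cursor controller that flattens 3D hand motion to 2D by default and passes 3D motion through unchanged once a 3D input criterion is met. The class name, the depth-based criterion, and the threshold are all invented for illustration; the patent does not specify them.

```python
# Hypothetical sketch: 2D (converted) mode by default; switch to a 3D
# (pass-through) mode when a 3D user-input criterion is detected.
class CursorController:
    def __init__(self, depth_threshold=0.1):
        self.mode = "2d"                     # first mode: convert 3D -> 2D
        self.depth_threshold = depth_threshold

    def update(self, hand_delta):
        """hand_delta is (dx, dy, dz) hand motion since the last frame."""
        dx, dy, dz = hand_delta
        # Invented 3D criterion: significant motion along the depth axis.
        if self.mode == "2d" and abs(dz) > self.depth_threshold:
            self.mode = "3d"                 # second mode: keep 3D motion
        if self.mode == "2d":
            return (dx, dy)                  # depth dropped: 2D cursor motion
        return (dx, dy, dz)                  # 3D motion maintained
```

Once switched, the controller stays in the second mode, matching the one-way mode change the abstract describes.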
-
Publication number: 20240402872
Abstract: Some techniques are described herein for integrating a 2D framework with a 3D framework. Such techniques use a concept referred to as a hidden entity to link the two frameworks together. Other techniques are described herein for translating gestures from a first type to a second type in certain situations.
Type: Application
Filed: March 29, 2024
Publication date: December 5, 2024
Inventors: Andrew P. Richardson, Christian A. Niles, Collin R. Russell, Abhinay Ashutosh, Mark A. Ebbole
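One plausible reading of the "hidden entity" concept is an invisible node that lives in the 3D scene graph and forwards its transform to an attached 2D view, linking the two frameworks. The classes and the projection below are invented illustrations, not the patented design.

```python
# Illustrative sketch (names invented): a hidden entity renders nothing
# itself; it only keeps an attached 2D view in sync with its 3D position.
class View2D:
    def __init__(self):
        self.origin = (0.0, 0.0)   # position in the 2D framework's space

class HiddenEntity:
    def __init__(self, view):
        self.view = view
        self.position = (0.0, 0.0, 0.0)

    def set_position(self, x, y, z):
        self.position = (x, y, z)
        # Forward the 3D placement into the 2D framework's coordinates
        # (here, a trivial drop of the depth axis for illustration).
        self.view.origin = (x, y)
```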
-
Publication number: 20240370155
Abstract: Out-of-process hit-testing can provide increased privacy and efficiency in computer user input systems. In one aspect, an application specifies a control style for a UI window that is to be managed separately from the application, such as by a system process running outside the application process. When a user input is received at a location corresponding to a part of the application UI that is separate from the UI window for which the control style has been specified, the user input may be redirected to that UI window.
Type: Application
Filed: July 16, 2024
Publication date: November 7, 2024
Inventors: Mark A. Ebbole, Andrew T. Finke, Abhinay Ashutosh
-
Publication number: 20240290052
Abstract: Implementations of the subject technology provide virtual anchoring for extended reality (XR) display devices. A device may generate an XR environment that includes computer-generated (CG) content for display relative to various physical objects in a physical environment. In order to position the CG content, an XR application may request a physical anchor object to which the CG content can be anchored. In circumstances in which the physical anchor object is not available in the physical environment, a virtual anchor and/or a virtual anchor object corresponding to the physical anchor object can be provided instead.
Type: Application
Filed: May 7, 2024
Publication date: August 29, 2024
Inventors: Michael E. Buerli, Samuel L. Iglesias, Tyler R. Calderone, Mark A. Ebbole, Andrew P. Richardson
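The fallback described here is simple to sketch: look for a detected physical object of the requested type, and if none exists, hand back a virtual stand-in to anchor against. The function and the dictionary shapes are invented for illustration.

```python
# Sketch of the virtual-anchoring fallback (data shapes invented).
def resolve_anchor(requested_type, detected_objects):
    """Return a physical anchor if available, else a virtual stand-in."""
    for obj in detected_objects:
        if obj["type"] == requested_type:
            return {"kind": "physical", "object": obj}
    # Physical anchor unavailable in the environment: provide a virtual
    # anchor object corresponding to the requested physical object.
    return {"kind": "virtual", "object": {"type": requested_type}}
```

Either way, the caller gets something it can anchor CG content to, which is the point of the fallback.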
-
Patent number: 12056344
Abstract: Out-of-process hit-testing can provide increased privacy and efficiency in computer user input systems. In one aspect, an application specifies a control style for a UI window that is to be managed separately from the application, such as by a system process running outside the application process. When a user input is received at a location corresponding to a part of the application UI that is separate from the UI window for which the control style has been specified, the user input may be redirected to that UI window.
Type: Grant
Filed: June 30, 2023
Date of Patent: August 6, 2024
Assignee: Apple Inc.
Inventors: Mark A. Ebbole, Andrew T. Finke, Abhinay Ashutosh
-
Publication number: 20240241615
Abstract: Various implementations disclosed herein include devices, systems, and methods that transport a cursor between surfaces of objects within an XR environment. For example, a process may include displaying movement of a cursor across a first surface of a first object in a view of a three-dimensional (3D) environment. The process may further include determining that the movement of the cursor approaches or intersects a boundary of the first surface at a first position. The process may further include determining a second position on a second surface of a second object in the 3D environment based on a path of the cursor, and moving the cursor from the first position to the second position.
Type: Application
Filed: January 11, 2024
Publication date: July 18, 2024
Inventors: Jack H. Lawrence, Mark A. Ebbole
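Reduced to one axis, the cursor-transport idea looks like this: when the cursor's path crosses the first surface's boundary, the overshoot is carried onto the second surface along the same path. The interval geometry is a deliberate simplification; the patent covers general 3D surfaces.

```python
# Simplified 1D sketch (geometry invented): two surfaces modeled as
# intervals on one axis, with the second starting at `second_start`.
def transport_cursor(x, dx, first_end, second_start):
    """x: cursor position, dx: motion this frame."""
    new_x = x + dx
    if x <= first_end < new_x:           # path crosses the first boundary
        overshoot = new_x - first_end
        return second_start + overshoot  # re-enter on the second surface
    return new_x
```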
-
Patent number: 12020379
Abstract: Implementations of the subject technology provide virtual anchoring for extended reality (XR) display devices. A device may generate an XR environment that includes computer-generated (CG) content for display relative to various physical objects in a physical environment. In order to position the CG content, an XR application may request a physical anchor object to which the CG content can be anchored. In circumstances in which the physical anchor object is not available in the physical environment, a virtual anchor and/or a virtual anchor object corresponding to the physical anchor object can be provided instead.
Type: Grant
Filed: February 24, 2021
Date of Patent: June 25, 2024
Assignee: Apple Inc.
Inventors: Michael E. Buerli, Samuel L. Iglesias, Tyler R. Calderone, Mark A. Ebbole, Andrew P. Richardson
-
Publication number: 20240152245
Abstract: A computer system displays a first object that includes at least a first portion and a second portion, and detects a first gaze input that meets first criteria, wherein the first criteria require that the first gaze input be directed to the first portion of the first object. In response, the computer system displays a first control element that corresponds to a first operation associated with the first object, wherein the first control element was not displayed prior to detecting that the first gaze input met the first criteria, and detects a first user input directed to the first control element. In response to detecting the first user input directed to the first control element, the computer system performs the first operation with respect to the first object.
Type: Application
Filed: September 21, 2023
Publication date: May 9, 2024
Inventors: Lee S. Broughton, Israel Pastrana Vicente, Matan Stauber, Miquel Estany Rodriguez, James J. Owen, Jonathan R. Dascola, Stephen O. Lemay, Christian Schnorr, Zoey C. Taylor, Jay Moon, Benjamin H. Boesel, Benjamin Hylak, Richard D. Lyons, William A. Sorrentino, III, Lynn I. Streja, Jonathan Ravasz, Nathan Gitter, Peter D. Anton, Michael J. Rockwell, Peter L. Hajas, Evgenii Krivoruchko, Mark A. Ebbole, James Magahern, Andrew J. Sawyer, Christopher D. McKenzie, Michael E. Buerli, Olivier D. R. Gutknecht
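The gaze-revealed control described here reduces to a small state machine: the control element becomes visible only after gaze lands on the first portion, and only then can input on the control trigger its operation. The class and its state flags are invented for illustration.

```python
# Minimal behavioral sketch (structure invented): gaze on the first
# portion reveals a control; input on the visible control performs
# the associated operation.
class ObjectWithControl:
    def __init__(self):
        self.control_visible = False
        self.operation_performed = False

    def on_gaze(self, portion):
        # First criteria: gaze must be directed to the first portion.
        if portion == "first":
            self.control_visible = True

    def on_input_at_control(self):
        # The control cannot be activated before it was revealed.
        if self.control_visible:
            self.operation_performed = True
```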
-
Publication number: 20240069688
Abstract: A head-mounted device is provided that includes displays configured to display an image and to simultaneously display a magnifying window that presents a magnified portion of the image. The magnifying window lies in a magnification plane that is fixed relative to a user's head. One or more processors in the head-mounted device can be used to perform a first ray cast operation to identify an input point where a detected user input intersects the magnifying window, to obtain a remapped point from the input point, to compute a directional vector based on the remapped point and a reference point associated with the user's head, to obtain a shifted point by shifting the remapped point from the magnification plane to another plane parallel to the magnification plane, and to perform a second ray cast operation using the shifted point and the directional vector.
Type: Application
Filed: August 17, 2023
Publication date: February 29, 2024
Inventors: Daniel M. Golden, John M. Nefulda, Joaquim Goncalo Lobo Ferreira da Silva, Anuj Bhatnagar, Mark A. Ebbole, Andrew A. Haas, Seyedkoosha Mirhosseini, Colin D. Munro
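The geometric pipeline in this abstract (remap, direction, shift, second ray) can be sketched concretely. The remap below undoes a uniform magnification about the window center, and the coordinate conventions, parameter names, and the z-axis plane shift are all assumptions; the actual remapping in the patent may differ.

```python
# Geometric sketch (conventions invented): build the inputs for the
# second ray cast from the first ray cast's hit point.
def second_ray(input_pt, window_center, magnification, head_pt, plane_shift):
    # Remap: undo the magnification about the window center, per axis.
    remapped = tuple(c + (p - c) / magnification
                     for p, c in zip(input_pt, window_center))
    # Directional vector from the head reference point through the
    # remapped point.
    direction = tuple(r - h for r, h in zip(remapped, head_pt))
    # Shift the remapped point to a plane parallel to the magnification
    # plane (assumed here to be a constant-z plane).
    shifted = (remapped[0], remapped[1], remapped[2] + plane_shift)
    return shifted, direction
```

The second ray cast would then start at `shifted` and travel along `direction` to find the true target behind the magnified view.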
-
Publication number: 20240036699
Abstract: While a view of an environment is visible via a display generation component of a computer system, the computer system detects a gaze input directed to a first location, corresponding to a first user interface element, in the environment. In response to detecting the gaze input: if a user's hand is in a predefined configuration during the gaze input, the computer system provides, to the first user interface element, information about the gaze input; then, in response to detecting the gaze input moving to a different, second location in the environment while the user's hand is maintained in the predefined configuration, it provides, to a second user interface element that corresponds to the second location, information about the gaze input. If the user's hand is not in the predefined configuration during the gaze input, the computer system forgoes providing, to the first user interface element, information about the gaze input.
Type: Application
Filed: July 25, 2023
Publication date: February 1, 2024
Inventors: Mark A. Ebbole, Leah M. Gum, Chia-Ling Li, Ashwin Kumar Asoka Kumar Shenoi
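The gating described here is a one-line rule: gaze information reaches the element under the gaze only while the hand is in the predefined configuration. The function signature below is an invented illustration of that rule.

```python
# Behavioral sketch (names invented): deliver gaze info to the element
# at the gaze location only while the hand configuration is held.
def deliver_gaze(element_at, gaze_location, hand_in_configuration):
    """Return the element receiving the gaze information, or None."""
    if not hand_in_configuration:
        return None                 # forgo providing gaze information
    return element_at(gaze_location)
```

Calling this on each gaze sample naturally handles the moving-gaze case too: as the location changes while the hand configuration is held, successive calls deliver the information to successive elements.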
-
Publication number: 20240004538
Abstract: Out-of-process hit-testing can provide increased privacy and efficiency in computer user input systems. In one aspect, an application specifies a control style for a UI window that is to be managed separately from the application, such as by a system process running outside the application process. When a user input is received at a location corresponding to a part of the application UI that is separate from the UI window for which the control style has been specified, the user input may be redirected to that UI window.
Type: Application
Filed: June 30, 2023
Publication date: January 4, 2024
Inventors: Mark A. Ebbole, Andrew T. Finke, Abhinay Ashutosh
-
Publication number: 20230298267
Abstract: Various implementations disclosed herein more accurately or efficiently determine to which of multiple potential virtual objects user input should be directed in a 3D graphical environment. In some implementations, this involves using a rule that accounts for the types of the virtual objects to which a particular event may correspond. For example, a direction of intent may be identified and a rule used to determine with which of multiple potential virtual objects to associate an event.
Type: Application
Filed: December 17, 2022
Publication date: September 21, 2023
Inventors: Charilaos Papadopoulos, Aaron M. Burns, Alexis H. Palangie, Andrew P. Richardson, Bruno M. Sommer, Charles Magahern, Joseph P. Cerra, Justin T. Voss, Luis R. Deliz Centeno, Mark A. Ebbole, Martin Garstenauer, Peter L. Hajas, Samuel L. Iglesias
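One concrete form a type-aware disambiguation rule could take: among candidates near the direction of intent, prefer an interactive object over passive decoration when their angular distances are comparable. The specific rule, types, and tolerance below are invented; the patent only says the rule accounts for object types.

```python
# Sketch of a type-aware target rule (rule and tolerance invented).
def pick_target(candidates, angle_tolerance=5.0):
    """candidates: list of (object_type, angular_distance_degrees)."""
    best = min(candidates, key=lambda c: c[1])
    # Prefer interactive UI objects when their angular distance is
    # within tolerance of the geometrically closest candidate.
    for obj_type, angle in candidates:
        if obj_type == "ui" and angle - best[1] <= angle_tolerance:
            return (obj_type, angle)
    return best
```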
-
Publication number: 20210325960
Abstract: Aspects of the subject technology relate to gaze-based control of an electronic device. The gaze-based control can include enabling an option to provide user authorization when it is determined that the user has viewed and/or read text associated with the request for authorization. The gaze-based control can also include modifying a user interface or a user interface element based on what the user views and/or reads, and can be based on determining whether a user has viewed and/or read an electronic document and/or a physical document.
Type: Application
Filed: February 26, 2021
Publication date: October 21, 2021
Inventors: Samuel L. Iglesias, Mark A. Ebbole, Andrew P. Richardson, Tyler R. Calderone, Michael E. Buerli, Devin W. Chalmers
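A minimal version of "enable authorization only after the text was viewed" is a check that enough gaze samples fell inside the text region. The sample count threshold and the rectangular region model are invented stand-ins for whatever viewed/read determination the patent actually uses.

```python
# Sketch (thresholds invented): enable the authorization option only
# once gaze samples indicate the request text was viewed.
def authorize_enabled(gaze_samples, text_region, min_samples=3):
    """gaze_samples: (x, y) points; text_region: (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = text_region
    viewed = sum(1 for x, y in gaze_samples
                 if x0 <= x <= x1 and y0 <= y <= y1)
    return viewed >= min_samples
```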
-
Publication number: 20210327146
Abstract: Implementations of the subject technology provide virtual anchoring for extended reality (XR) display devices. A device may generate an XR environment that includes computer-generated (CG) content for display relative to various physical objects in a physical environment. In order to position the CG content, an XR application may request a physical anchor object to which the CG content can be anchored. In circumstances in which the physical anchor object is not available in the physical environment, a virtual anchor and/or a virtual anchor object corresponding to the physical anchor object can be provided instead.
Type: Application
Filed: February 24, 2021
Publication date: October 21, 2021
Inventors: Michael E. Buerli, Samuel L. Iglesias, Tyler R. Calderone, Mark A. Ebbole, Andrew P. Richardson
-
Publication number: 20210326094
Abstract: Implementations of the subject technology provide continuous transfer of content editing and/or control between various devices in an extended reality system. The extended reality system includes at least one device that is capable of determining the locations of other devices in the system. This device can manage continuous transfer of control between the other devices in the system in response to three-dimensional, location-based user inputs, and/or between one or more of the other devices and itself.
Type: Application
Filed: February 26, 2021
Publication date: October 21, 2021
Inventors: Michael E. Buerli, Andrew P. Richardson, Samuel L. Iglesias, Tyler R. Calderone, Mark A. Ebbole
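A simple way to picture the location-based handoff: the managing device compares a 3D user input against the known device locations and transfers control to the nearest one. The nearest-device rule is an invented illustration; the patent describes location-based transfer generally, not this specific policy.

```python
# Sketch (policy invented): hand control to the device nearest a
# 3D location-based user input.
def transfer_control(input_point, device_locations):
    """device_locations: {name: (x, y, z)}; returns the device taking control."""
    def dist2(p, q):
        # Squared Euclidean distance; ordering is the same as for
        # true distance, so the square root is unnecessary.
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return min(device_locations,
               key=lambda d: dist2(input_point, device_locations[d]))
```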