Patents by Inventor James T. Turner
James T. Turner has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
Publication number: 20250028546
Abstract: Remote user interface (UI) rendering effects provide increased privacy and efficiency in computer user input systems. In an aspect, an application specifies remote UI effects to be managed and rendered on UI elements separately from the application, such as by an effects component running outside of the application's operating system process. When user input indicates a preliminary interaction with a UI element, the remote UI effect can be rendered without the application's knowledge of the preliminary interaction, hence preserving the user's privacy from the application with respect to preliminary UI interactions.
Type: Application
Filed: October 2, 2024
Publication date: January 23, 2025
Inventors: Stephen E. Pinto, Andrew T. Finke, Abhinay Ashutosh, Cedric Bray, Peter L. Hajas, Andrew P. Richardson, Yidi Zhu, James T. Turner
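As a rough illustration only (not the patented implementation), the closest public analogue to this description is the system-managed hover effect in SwiftUI, which on visionOS is applied without the app being told where the user is looking. The sketch below assumes that modifier as a stand-in for the "remote UI effect" the abstract describes.

```swift
import SwiftUI

// Minimal sketch: the app opts a UI element into a system-managed effect.
// The highlight for a preliminary interaction (hover/gaze) is rendered by the
// system rather than the app, so the app only learns about the committed
// interaction (the tap).
struct RemoteEffectButton: View {
    var body: some View {
        Button("Open") {
            // The app is notified of the activation only.
            print("Button activated")
        }
        .hoverEffect(.highlight) // effect specified by the app, rendered by the system
    }
}
```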
Publication number: 20240404228
Abstract: Some techniques are described herein for managing computer-generated environments, including methods for managing the size of virtual objects and managing an experience.
Type: Application
Filed: March 28, 2024
Publication date: December 5, 2024
Inventors: Owen Monsma, Peter L. Hajas, James T. Turner
Publication number: 20240402891
Abstract: Some techniques are described herein for managing requests for placement of a user interface object within an environment, and some techniques are described herein for making such placement requests.
Type: Application
Filed: March 27, 2024
Publication date: December 5, 2024
Inventors: Florentin Bekier, Raffael Hannemann, Peter L. Hajas, James T. Turner
Patent number: 12131170
Abstract: Remote user interface (UI) rendering effects provide increased privacy and efficiency in computer user input systems. In an aspect, an application specifies remote UI effects to be managed and rendered on UI elements separately from the application, such as by an effects component running outside of the application's operating system process. When user input indicates a preliminary interaction with a UI element, the remote UI effect can be rendered without the application's knowledge of the preliminary interaction, hence preserving the user's privacy from the application with respect to preliminary UI interactions.
Type: Grant
Filed: June 23, 2023
Date of Patent: October 29, 2024
Assignee: Apple Inc.
Inventors: Andrew P. Richardson, Abhinay Ashutosh, James T. Turner, Przemyslaw M. Iwanowski, Yidi Zhu
Publication number: 20240221301
Abstract: Various implementations disclosed herein provide augmentations in extended reality (XR) using sensor data from a user-worn device. The sensor data may be used to understand that a user's state is associated with providing user assistance; e.g., a user's appearance or behavior, or an understanding of the environment, may be used to recognize a need or desire for user assistance. The augmentations may assist the user by enhancing or supplementing the user's abilities, e.g., providing guidance or other information about an environment to a disabled or impaired person.
Type: Application
Filed: December 28, 2023
Publication date: July 4, 2024
Inventors: Aaron M. Burns, Benjamin R. Blachnitzky, Laura Sugden, Charilaos Papadopoulos, James T. Turner
Publication number: 20240220069
Abstract: Aspects of the subject technology provide for constrained access to scene information by applications running on an electronic device. A system process of an electronic device may assign a region of a physical environment to an application. A user interface of the application may be displayed in the assigned region. The system process may provide scene information and/or user information to the application only when the scene information and/or user information occurs and/or originates within the assigned region, in one or more implementations.
Type: Application
Filed: March 18, 2024
Publication date: July 4, 2024
Inventors: James T. Turner, Peter L. Hajas
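As a rough illustration of the gating logic this abstract describes, the sketch below shows a hypothetical broker that forwards scene events to an application only when they originate inside that application's assigned region. All type and method names are invented for illustration; none of this is an Apple API.

```swift
// Hypothetical sketch: a system-side broker assigns each app a region of the
// physical environment and forwards scene/user events only when the event
// occurs within that app's assigned region.
struct SceneEvent {
    let position: SIMD3<Float>   // where the event occurred, in world coordinates
    let payload: String          // e.g., detected plane, hand pose, gaze target
}

struct AssignedRegion {
    let minCorner: SIMD3<Float>
    let maxCorner: SIMD3<Float>

    func contains(_ p: SIMD3<Float>) -> Bool {
        p.x >= minCorner.x && p.x <= maxCorner.x &&
        p.y >= minCorner.y && p.y <= maxCorner.y &&
        p.z >= minCorner.z && p.z <= maxCorner.z
    }
}

final class SceneAccessBroker {
    private var regions: [String: AssignedRegion] = [:]   // appID -> assigned region

    func assign(_ region: AssignedRegion, to appID: String) {
        regions[appID] = region
    }

    // The app's handler is invoked only for events inside its assigned region.
    func deliver(_ event: SceneEvent, to appID: String, handler: (SceneEvent) -> Void) {
        guard let region = regions[appID], region.contains(event.position) else { return }
        handler(event)
    }
}
```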
Publication number: 20240211053
Abstract: Aspects of the subject technology provide for intention-based user interface control for electronic devices. For example, an electronic device may utilize multiple indirect engagement indicators performed by a user of the electronic device to confirm which of several displayed user interfaces the user intends to engage with. Once the electronic device determines which of the multiple user interfaces the user intends to engage with, the electronic device may provide a user input to the application or other process underlying that user interface. The user input may be based, in whole or in part, on one or more of the multiple indirect engagement indicators.
Type: Application
Filed: March 6, 2024
Publication date: June 27, 2024
Inventors: James T. Turner, Peter L. Hajas
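The disambiguation step this abstract describes can be sketched as a scoring pass over the displayed user interfaces: several indirect engagement indicators are combined per UI, and input is routed to the best-scoring UI above a confidence threshold. The indicator names, weights, and threshold below are assumptions for illustration, not an actual API or the patented method.

```swift
// Hypothetical sketch: combine indirect engagement indicators into a per-UI
// score and pick the UI the user most plausibly intends to engage with.
struct EngagementIndicators {
    var gazeDwell: Double        // 0...1, normalized time the gaze rested on the UI
    var handProximity: Double    // 0...1, closeness of the hand to the UI
    var facingAlignment: Double  // 0...1, how directly the user faces the UI
}

struct DisplayedUI {
    let id: String
    let indicators: EngagementIndicators
}

/// Returns the id of the UI the user most plausibly intends to engage with,
/// or nil if no UI exceeds the confidence threshold.
func intendedTarget(among uis: [DisplayedUI], threshold: Double = 0.6) -> String? {
    let scored = uis.map { ui -> (String, Double) in
        let i = ui.indicators
        // Simple weighted combination; weights are illustrative only.
        let score = 0.5 * i.gazeDwell + 0.3 * i.handProximity + 0.2 * i.facingAlignment
        return (ui.id, score)
    }
    guard let best = scored.max(by: { $0.1 < $1.1 }), best.1 >= threshold else { return nil }
    return best.0
}
```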
Patent number: 11972088
Abstract: Aspects of the subject technology provide for constrained access to scene information by applications running on an electronic device. A system process of an electronic device may assign a region of a physical environment to an application. A user interface of the application may be displayed in the assigned region. The system process may provide scene information and/or user information to the application only when the scene information and/or user information occurs and/or originates within the assigned region, in one or more implementations.
Type: Grant
Filed: March 22, 2023
Date of Patent: April 30, 2024
Assignee: Apple Inc.
Inventors: James T. Turner, Peter L. Hajas
Patent number: 11947731
Abstract: Aspects of the subject technology provide for intention-based user interface control for electronic devices. For example, an electronic device may utilize multiple indirect engagement indicators performed by a user of the electronic device to confirm which of several displayed user interfaces the user intends to engage with. Once the electronic device determines which of the multiple user interfaces the user intends to engage with, the electronic device may provide a user input to the application or other process underlying that user interface. The user input may be based, in whole or in part, on one or more of the multiple indirect engagement indicators.
Type: Grant
Filed: March 22, 2023
Date of Patent: April 2, 2024
Assignee: Apple Inc.
Inventors: James T. Turner, Peter L. Hajas
Publication number: 20240061547
Abstract: While a view of a three-dimensional environment is visible via a display generation component of a computer system, the computer system receives, from a user, one or more first user inputs corresponding to selection of a respective direction in the three-dimensional environment relative to a reference point associated with the user, and displays a ray extending from the reference point in the respective direction. While displaying the ray, the system displays a selection cursor moving along the ray independently of user input. When the selection cursor is at a respective position along the ray, the system receives one or more second user inputs corresponding to a request to stop the movement of the selection cursor along the ray and, in response, sets a target location for a next user interaction to a location in the three-dimensional environment that corresponds to the respective position of the selection cursor along the ray.
Type: Application
Filed: August 14, 2023
Publication date: February 22, 2024
Inventors: Christopher B. Fleizach, James T. Turner, Daniel M. Golden, Kristi E.S. Bauerly, John M. Nefulda
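The two-step targeting flow this abstract describes (pick a direction, then let a cursor travel along the resulting ray until a second input stops it) can be sketched as follows. The types and the constant cursor speed are assumptions for illustration only, not the patented implementation.

```swift
// Hypothetical sketch: a cursor advances along a user-selected ray on its own;
// a second user input stops it, and the stop point becomes the target location
// for the next interaction.
struct Ray {
    let origin: SIMD3<Float>
    let direction: SIMD3<Float>   // assumed normalized

    func point(at distance: Float) -> SIMD3<Float> {
        origin + direction * distance
    }
}

final class RaySelectionCursor {
    private let ray: Ray
    private let speed: Float              // meters per second along the ray
    private var distance: Float = 0
    private(set) var targetLocation: SIMD3<Float>?

    init(ray: Ray, speed: Float = 0.5) {
        self.ray = ray
        self.speed = speed
    }

    // Called every frame; the cursor moves independently of user input.
    func update(deltaTime: Float) {
        guard targetLocation == nil else { return }
        distance += speed * deltaTime
    }

    // Called when the second user input ("stop") arrives.
    func stop() {
        targetLocation = ray.point(at: distance)
    }
}
```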
Publication number: 20240004678
Abstract: Remote user interface (UI) rendering effects provide increased privacy and efficiency in computer user input systems. In an aspect, an application specifies remote UI effects to be managed and rendered on UI elements separately from the application, such as by an effects component running outside of the application's operating system process. When user input indicates a preliminary interaction with a UI element, the remote UI effect can be rendered without the application's knowledge of the preliminary interaction, hence preserving the user's privacy from the application with respect to preliminary UI interactions.
Type: Application
Filed: June 23, 2023
Publication date: January 4, 2024
Inventors: Andrew P. Richardson, Abhinay Ashutosh, James T. Turner, Przemyslaw M. Iwanowski, Yidi Zhu
Publication number: 20230229281
Abstract: Aspects of the subject technology provide for constrained access to scene information by applications running on an electronic device. A system process of an electronic device may assign a region of a physical environment to an application. A user interface of the application may be displayed in the assigned region. The system process may provide scene information and/or user information to the application only when the scene information and/or user information occurs and/or originates within the assigned region, in one or more implementations.
Type: Application
Filed: March 22, 2023
Publication date: July 20, 2023
Inventors: James T. Turner, Peter L. Hajas
Publication number: 20230229241
Abstract: Aspects of the subject technology provide for intention-based user interface control for electronic devices. For example, an electronic device may utilize multiple indirect engagement indicators performed by a user of the electronic device to confirm which of several displayed user interfaces the user intends to engage with. Once the electronic device determines which of the multiple user interfaces the user intends to engage with, the electronic device may provide a user input to the application or other process underlying that user interface. The user input may be based, in whole or in part, on one or more of the multiple indirect engagement indicators.
Type: Application
Filed: March 22, 2023
Publication date: July 20, 2023
Inventors: James T. Turner, Peter L. Hajas
Patent number: 10268647
Abstract: Systems and methods are disclosed for authoring, deploying, and executing layer stack images for applications directed to a plurality of target devices. Resources to implement the layer stack images are compiled into an asset catalog database for each image in each layer stack image for each target device. Derivative resource products, such as a flattened version of the layer stack images and a “blurred” version of the layer stack images, can be generated and stored in the asset catalog at compile and build time. Three-dimensional effects implemented using the layer stack images can be exposed through an application programming interface that accepts legacy two-dimensional images and can also be used to receive the layer stack images. A platform framework implements logic that detects whether the type of image requested via the API is a layer stack image or a conventional flat image. Third-party layer stack images can be received and displayed at run time or compile time.
Type: Grant
Filed: September 30, 2015
Date of Patent: April 23, 2019
Assignee: Apple Inc.
Inventors: Patrick O. Heynen, Jonathan J. Hess, Blake R. Seely, James T. Turner
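A rough sketch of the build-time pipeline this abstract describes is given below, under the assumption that a layer stack is simply an ordered list of images and that flattening and blurring are supplied as callbacks. The types are invented for illustration and are not the actual asset catalog format or any Apple API.

```swift
import CoreGraphics

// Hypothetical sketch: compile each image into a catalog entry with derivative
// products (a flattened composite and a blurred variant), and resolve layer
// stacks to their flattened form when a caller uses a flat-image API.
struct LayerStackImage {
    let name: String
    let layers: [CGImage]     // back-to-front layers used for 3D/parallax effects
}

enum CatalogImage {
    case flat(CGImage)
    case layerStack(LayerStackImage)
}

struct CompiledAsset {
    let original: CatalogImage
    let flattened: CGImage?   // derivative: layers composited into one image
    let blurred: CGImage?     // derivative: pre-blurred version
}

// Build-time step: generate derivatives for an image destined for one target device.
func compile(_ image: CatalogImage,
             flatten: (LayerStackImage) -> CGImage,
             blur: (CGImage) -> CGImage) -> CompiledAsset {
    switch image {
    case .flat(let cg):
        return CompiledAsset(original: image, flattened: nil, blurred: blur(cg))
    case .layerStack(let stack):
        let flat = flatten(stack)
        return CompiledAsset(original: image, flattened: flat, blurred: blur(flat))
    }
}

// Run-time step: callers written against a flat-image API still work because the
// framework detects whether the requested image is a layer stack or a flat image.
func resolveForLegacyAPI(_ asset: CompiledAsset) -> CGImage? {
    switch asset.original {
    case .flat(let cg): return cg
    case .layerStack:   return asset.flattened
    }
}
```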
Publication number: 20160358356
Abstract: Systems and methods are disclosed for authoring, deploying, and executing layer stack images for applications directed to a plurality of target devices. Resources to implement the layer stack images are compiled into an asset catalog database for each image in each layer stack image for each target device. Derivative resource products, such as a flattened version of the layer stack images and a “blurred” version of the layer stack images, can be generated and stored in the asset catalog at compile and build time. Three-dimensional effects implemented using the layer stack images can be exposed through an application programming interface that accepts legacy two-dimensional images and can also be used to receive the layer stack images. A platform framework implements logic that detects whether the type of image requested via the API is a layer stack image or a conventional flat image. Third-party layer stack images can be received and displayed at run time or compile time.
Type: Application
Filed: September 30, 2015
Publication date: December 8, 2016
Inventors: Patrick O. Heynen, Jonathan J. Hess, Blake R. Seely, James T. Turner
Patent number: 4265252
Abstract: An implantable transensor device containing a passive RF resonant circuit having a natural frequency influenced by the pressure of the sensor's environment in a body cavity of a living entity. The circuit of the transensor includes an inductor and a capacitor, at least one of which varies in value in direct relation to variation of environmental pressure to change the resonant frequency of the circuit. The circuit can be externally interrogated to determine the resonant frequency thereof at any point in time by the imposition thereon of swept frequency electromagnetic radiation provided by a monitoring device which determines when some of the radiation is absorbed as a result of the frequency of the radiation being the same as the resonant frequency of the transensor circuit. An imposed relationship exists between the sensed environmental pressure, and the reactance of the reactive components of the circuit.
Type: Grant
Filed: April 19, 1978
Date of Patent: May 5, 1981
Assignee: The Johns Hopkins University
Inventors: John G. Chubbuck, James T. Turner
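For context, the governing relation is the standard resonant frequency of an LC circuit (textbook physics, not text quoted from the patent): if, for example, the capacitance varies with the sensed pressure, the interrogator's frequency sweep recovers the pressure from the frequency at which absorption peaks.

```latex
% Standard LC resonance relation (not quoted from the patent). Assuming a
% pressure-dependent capacitance C(P) (the inductance could vary instead),
% the absorption peak found by the external frequency sweep occurs at f_r,
% from which the pressure P can be inferred.
f_r = \frac{1}{2\pi\sqrt{L\,C(P)}}
```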