Patents by Inventor Peter L. Hajas
Peter L. Hajas has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250028546
Abstract: Remote user interface (UI) rendering effects provide increased privacy and efficiency in computer user input systems. In an aspect, an application specifies remote UI effects to be managed and rendered on UI elements separately from the application, such as by an effects component running outside of the application's operating system process. When user input indicates a preliminary interaction with a UI element, the remote UI effect can be rendered without the application's knowledge of the preliminary interaction, hence keeping preliminary UI interactions private from the application.
Type: Application
Filed: October 2, 2024
Publication date: January 23, 2025
Inventors: Stephen E. PINTO, Andrew T. FINKE, Abhinay ASHUTOSH, Cedric BRAY, Peter L. HAJAS, Andrew P. RICHARDSON, Yidi ZHU, James T. TURNER
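The mechanism described above can be illustrated with a minimal sketch. All class and method names here are invented for illustration; the essential point from the abstract is that a system-side effects component renders the effect on preliminary (e.g. hover) input without ever notifying the application, and only committed input reaches the application.

```python
class EffectsComponent:
    """Runs outside the application's operating system process."""
    def __init__(self):
        self.registered = {}       # element_id -> declarative effect spec
        self.active_effects = {}   # element_id -> currently rendered effect

    def register(self, element_id, effect):
        # The application declares the effect up front, then steps aside.
        self.registered[element_id] = effect

    def handle_preliminary_input(self, element_id):
        # Render the effect system-side; the app is never notified, so
        # preliminary interactions stay private from the application.
        if element_id in self.registered:
            self.active_effects[element_id] = self.registered[element_id]
            return True
        return False

class App:
    def __init__(self, effects):
        self.received_events = []
        effects.register("buy_button", effect="highlight")

    def on_commit_input(self, element_id):
        # Only committed interactions (e.g. a tap) reach the application.
        self.received_events.append(element_id)

effects = EffectsComponent()
app = App(effects)
effects.handle_preliminary_input("buy_button")  # hover: effect shown, app unaware
app.on_commit_input("buy_button")               # tap: app receives the event
```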
-
Publication number: 20240428527
Abstract: A device may include a processor configured to receive, by a system process and from an application process, a visibility preference for an object type and segment one or more physical objects associated with the object type from an image of a physical environment. The processor is also configured to determine, by the system process, whether to display the one or more segmented physical objects based at least in part on the visibility preference. In response to a determination to display the one or more segmented physical objects, the processor is configured to display at least a portion of the image corresponding to the one or more segmented physical objects. In response to a determination not to display the one or more segmented physical objects, the processor is configured to forgo displaying the at least a portion of the image corresponding to the one or more segmented physical objects.
Type: Application
Filed: October 19, 2023
Publication date: December 26, 2024
Inventors: Peter L. HAJAS, Sebastien METROT, Michael E. BUERLI, Michael A. REITER, Diego TRAZZI, Conner J. BROOKS, Jacob WILSON
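A compact sketch of the gating logic described above (function and parameter names are invented): a system process decides, per segmented object type, whether to display or forgo the corresponding image portion based on the application's visibility preference.

```python
def compose_frame(image_regions, segmented_types, visibility_prefs):
    """image_regions: {object_type: image portion}; segmented_types:
    object types segmented from the frame; visibility_prefs:
    {object_type: bool} supplied by the application process."""
    shown = []
    for obj_type in segmented_types:
        # Display the portion only if the preference allows it;
        # otherwise forgo displaying that portion of the image.
        if visibility_prefs.get(obj_type, True):
            shown.append(image_regions[obj_type])
    return shown

prefs = {"plant": True, "person": False}
regions = {"plant": "plant_pixels", "person": "person_pixels"}
shown = compose_frame(regions, ["plant", "person"], prefs)  # only the plant
```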
-
Publication number: 20240404228
Abstract: Some techniques are described herein for managing computer-generated environments, including methods for managing the size of virtual objects and managing an experience.
Type: Application
Filed: March 28, 2024
Publication date: December 5, 2024
Inventors: Owen MONSMA, Peter L. HAJAS, James T. TURNER
-
Publication number: 20240403076
Abstract: Aspects of the subject technology provide volumetric interface layers for an electronic device. The volumetric interface layers may be discrete distance layers, each having a respective distance from the electronic device, in which user interfaces can be displayed by the electronic device. Applications running on the electronic device may be provided with the ability to request display of a user interface for the application in one of the discrete distance layers. In one or more implementations, the discrete distance layers may be semantically labeled, and may be requested by an application using their semantic labels.
Type: Application
Filed: November 15, 2023
Publication date: December 5, 2024
Inventors: Peter L. HAJAS, Jason M. CAHILL, Raffael HANNEMANN
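The semantic-layer request model above can be sketched as follows. The specific labels, distances, and API names are invented for the sketch; the point is that applications ask for a layer by semantic label rather than by raw distance.

```python
# Discrete, semantically labeled distance layers (values illustrative).
SEMANTIC_LAYERS = {
    "near": 0.5,   # meters from the device
    "mid": 1.0,
    "far": 2.0,
}

class LayerServer:
    """System-side service that resolves semantic labels to distances."""
    def __init__(self):
        self.placements = {}  # app_id -> (label, distance)

    def request_layer(self, app_id, label):
        # Apps request display in a layer by semantic label only.
        if label not in SEMANTIC_LAYERS:
            raise ValueError(f"unknown layer: {label}")
        distance = SEMANTIC_LAYERS[label]
        self.placements[app_id] = (label, distance)
        return distance

server = LayerServer()
distance = server.request_layer("notes_app", "mid")
```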
-
Publication number: 20240402891
Abstract: Some techniques are described herein for managing requests for placement of a user interface object within an environment. Some techniques are described herein for requesting placement of a user interface object within an environment.
Type: Application
Filed: March 27, 2024
Publication date: December 5, 2024
Inventors: Florentin BEKIER, Raffael HANNEMANN, Peter L. HAJAS, James T. TURNER
-
Publication number: 20240394993
Abstract: In one implementation, a method of recentering an application is performed by a device including a display, one or more processors, and non-transitory memory. The method includes obtaining a transform between a three-dimensional application coordinate system and a three-dimensional world coordinate system. The method includes determining a location of a virtual object in the three-dimensional application coordinate system. The method includes displaying, on the display, the virtual object at a location in a two-dimensional display coordinate system based on the location of the virtual object in the three-dimensional application coordinate system, the transform, and a first pose of the device. The method includes detecting a recentering trigger. The method includes, in response to detecting the recentering trigger, updating the transform to an updated transform based on a second pose of the device.
Type: Application
Filed: May 23, 2024
Publication date: November 28, 2024
Inventors: John R. Hass, Peter L. Hajas, Raffael Hannemann, Reinhard Klapfer
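A translation-only simplification of the recentering step described above (full implementations would use rigid-body transforms with rotation; all names here are illustrative): on a recentering trigger, the app-to-world transform is updated so the app content keeps the same offset relative to the device's new pose.

```python
def add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1])

def recenter(world_from_app, old_device_pose, new_device_pose):
    # Offset of the app origin as seen from the device at the old pose...
    offset = sub(world_from_app, old_device_pose)
    # ...re-applied at the new pose to produce the updated transform.
    return add(new_device_pose, offset)

world_from_app = (2.0, 0.0)   # app origin in world coordinates
old_pose = (0.0, 0.0)         # first pose of the device
new_pose = (5.0, 1.0)         # second pose; recentering trigger fires
updated = recenter(world_from_app, old_pose, new_pose)  # (7.0, 1.0)
```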
-
Publication number: 20240220069
Abstract: Aspects of the subject technology provide for constrained access to scene information by applications running on an electronic device. A system process of an electronic device may assign a region of a physical environment to an application. A user interface of the application may be displayed in the assigned region. The system process may provide scene information and/or user information to the application only when the scene information and/or user information occurs and/or originates within the assigned region, in one or more implementations.
Type: Application
Filed: March 18, 2024
Publication date: July 4, 2024
Inventors: James T. TURNER, Peter L. HAJAS
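The region-gating behavior described above (and in the related entries 11972088 and 20230229281 below) can be sketched with an axis-aligned region check; the class and field names are invented for the sketch.

```python
def in_region(point, region):
    """region: (xmin, ymin, xmax, ymax) in world coordinates."""
    xmin, ymin, xmax, ymax = region
    x, y = point
    return xmin <= x <= xmax and ymin <= y <= ymax

class SystemProcess:
    def __init__(self):
        self.assigned = {}  # app_id -> region of the physical environment

    def assign_region(self, app_id, region):
        self.assigned[app_id] = region

    def deliver_scene_event(self, app_id, event):
        # Forward scene/user information only if it originates
        # within the application's assigned region.
        region = self.assigned.get(app_id)
        if region and in_region(event["origin"], region):
            return event
        return None

system = SystemProcess()
system.assign_region("app1", (0, 0, 2, 2))
delivered = system.deliver_scene_event("app1", {"origin": (1, 1)})
withheld = system.deliver_scene_event("app1", {"origin": (3, 3)})
```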
-
Publication number: 20240211053
Abstract: Aspects of the subject technology provide for intention-based user interface control for electronic devices. For example, an electronic device may utilize multiple indirect engagement indicators performed by a user of the electronic device to confirm with which of several displayed user interfaces the user intends to engage. Once the electronic device determines which of the multiple user interfaces the user intends to engage with, the electronic device may provide a user input to the application or other process underlying that user interface. The user input may be based, in whole or in part, on one or more of the multiple indirect engagement indicators.
Type: Application
Filed: March 6, 2024
Publication date: June 27, 2024
Inventors: James T. TURNER, Peter L. HAJAS
-
Publication number: 20240152245
Abstract: A computer system displays a first object that includes at least a first portion of the first object and a second portion of the first object and detects a first gaze input that meets first criteria, wherein the first criteria require that the first gaze input is directed to the first portion of the first object in order for the first criteria to be met. In response, the computer system displays a first control element that corresponds to a first operation associated with the first object, wherein the first control element was not displayed prior to detecting that the first gaze input met the first criteria, and detects a first user input directed to the first control element. In response to detecting the first user input directed to the first control element, the computer system performs the first operation with respect to the first object.
Type: Application
Filed: September 21, 2023
Publication date: May 9, 2024
Inventors: Lee S. Broughton, Israel Pastrana Vicente, Matan Stauber, Miquel Estany Rodriguez, James J. Owen, Jonathan R. Dascola, Stephen O. Lemay, Christian Schnorr, Zoey C. Taylor, Jay Moon, Benjamin H. Boesel, Benjamin Hylak, Richard D. Lyons, Willliam A. Sorrentino, III, Lynn I. Streja, Jonathan Ravasz, Nathan Gitter, Peter D. Anton, Michael J. Rockwell, Peter L. Hajas, Evgenii Krivoruchko, Mark A. Ebbole, James Magahern, Andrew J. Sawyer, Christopher D. McKenzie, Michael E. Buerli, Olivier D. R. Gutknecht
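The gaze-revealed control flow above can be modeled as a small state machine. The dwell threshold and all names are invented for illustration; the abstract specifies only that gaze directed at the first portion must meet the first criteria before the control element appears.

```python
DWELL_THRESHOLD = 0.5  # seconds of sustained gaze (illustrative value)

class GazeRevealedControl:
    def __init__(self):
        self.visible = False  # control element not displayed initially
        self._dwell = 0.0

    def on_gaze_sample(self, target, dt):
        # First criteria: gaze must be directed to the first portion.
        if target == "first_portion":
            self._dwell += dt
            if self._dwell >= DWELL_THRESHOLD:
                self.visible = True  # reveal the first control element
        else:
            self._dwell = 0.0        # gaze moved away; reset dwell

    def on_user_input(self):
        # Input directed to the revealed control performs the operation.
        return "first_operation" if self.visible else None

control = GazeRevealedControl()
for _ in range(6):                   # 0.6 s of gaze on the first portion
    control.on_gaze_sample("first_portion", 0.1)
```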
-
Patent number: 11972088
Abstract: Aspects of the subject technology provide for constrained access to scene information by applications running on an electronic device. A system process of an electronic device may assign a region of a physical environment to an application. A user interface of the application may be displayed in the assigned region. The system process may provide scene information and/or user information to the application only when the scene information and/or user information occurs and/or originates within the assigned region, in one or more implementations.
Type: Grant
Filed: March 22, 2023
Date of Patent: April 30, 2024
Assignee: Apple Inc.
Inventors: James T. Turner, Peter L. Hajas
-
Patent number: 11947731
Abstract: Aspects of the subject technology provide for intention-based user interface control for electronic devices. For example, an electronic device may utilize multiple indirect engagement indicators performed by a user of the electronic device to confirm with which of several displayed user interfaces the user intends to engage. Once the electronic device determines which of the multiple user interfaces the user intends to engage with, the electronic device may provide a user input to the application or other process underlying that user interface. The user input may be based, in whole or in part, on one or more of the multiple indirect engagement indicators.
Type: Grant
Filed: March 22, 2023
Date of Patent: April 2, 2024
Assignee: Apple Inc.
Inventors: James T. Turner, Peter L. Hajas
-
Publication number: 20230298267
Abstract: Various implementations disclosed herein more accurately or efficiently determine to which of multiple potential virtual objects user input should be directed in a 3D graphical environment. In some implementations, this involves using a rule that accounts for the types of the virtual objects to which a particular event may correspond. For example, a direction of intent may be identified and a rule used to determine to which of multiple potential virtual objects to associate an event.
Type: Application
Filed: December 17, 2022
Publication date: September 21, 2023
Inventors: Charilaos PAPADOPOULOS, Aaron M. BURNS, Alexis H. PALANGIE, Andrew P. RICHARDSON, Bruno M. SOMMER, Charles MAGAHERN, Joseph P. CERRA, Justin T. VOSS, Luis R. DELIZ CENTENO, Mark A. EBBOLE, Martin GARSTENAUER, Peter L. HAJAS, Samuel L. IGLESIAS
-
Publication number: 20230229281
Abstract: Aspects of the subject technology provide for constrained access to scene information by applications running on an electronic device. A system process of an electronic device may assign a region of a physical environment to an application. A user interface of the application may be displayed in the assigned region. The system process may provide scene information and/or user information to the application only when the scene information and/or user information occurs and/or originates within the assigned region, in one or more implementations.
Type: Application
Filed: March 22, 2023
Publication date: July 20, 2023
Inventors: James T. TURNER, Peter L. HAJAS
-
Publication number: 20230229241
Abstract: Aspects of the subject technology provide for intention-based user interface control for electronic devices. For example, an electronic device may utilize multiple indirect engagement indicators performed by a user of the electronic device to confirm with which of several displayed user interfaces the user intends to engage. Once the electronic device determines which of the multiple user interfaces the user intends to engage with, the electronic device may provide a user input to the application or other process underlying that user interface. The user input may be based, in whole or in part, on one or more of the multiple indirect engagement indicators.
Type: Application
Filed: March 22, 2023
Publication date: July 20, 2023
Inventors: James T. TURNER, Peter L. HAJAS
-
Publication number: 20230221830
Abstract: Aspects of the subject technology provide for various user interface modes for a user interface of an application. The user interface modes may include one or more bounded modes, a single application mode such as an exclusive mode, and/or one or more full screen modes. In one or more implementations, access to various types of information by the application may be constrained based on the user interface mode of the user interface.
Type: Application
Filed: March 22, 2023
Publication date: July 13, 2023
Inventors: Olivier GUTKNECHT, Peter L. HAJAS, Raffael HANNEMANN, Michael E. BUERLI, Mark L. MA
-
Publication number: 20220092847
Abstract: A device implementing a system for managing multi-modal rendering of application content includes at least one processor configured to receive content, provided by an application running on a device, for display. The at least one processor is further configured to determine that the content corresponds to two-dimensional content. The at least one processor is further configured to identify a portion of the two-dimensional content for enhancement by a three-dimensional renderer. The at least one processor is further configured to enhance, in response to the determining, the portion of the two-dimensional content by the three-dimensional renderer. The at least one processor is further configured to provide for display of the enhanced portion of the two-dimensional content on a display of the device.
Type: Application
Filed: December 1, 2021
Publication date: March 24, 2022
Inventors: Timothy R. ORIOL, Peter L. HAJAS, Daniel T. KURTZ, Edwin ISKANDAR, Charles MAGAHERN, Jeremy G. BRIDON, Naveen K. VEMURI
-
Patent number: 11195323
Abstract: A device implementing a system for managing multi-modal rendering of application content includes at least one processor configured to receive content, provided by an application running on a device, for displaying in a three-dimensional display mode. The at least one processor is further configured to determine that the content corresponds to two-dimensional content. The at least one processor is further configured to identify a portion of the two-dimensional content for enhancement by a three-dimensional renderer. The at least one processor is further configured to enhance, in response to the determining, the portion of the two-dimensional content by the three-dimensional renderer. The at least one processor is further configured to provide for display of the enhanced portion of the two-dimensional content on a display of the device in the three-dimensional display mode.
Type: Grant
Filed: September 2, 2020
Date of Patent: December 7, 2021
Assignee: Apple Inc.
Inventors: Timothy R. Oriol, Peter L. Hajas, Daniel T. Kurtz, Edwin Iskandar, Charles Magahern, Jeremy G. Bridon, Naveen K. Vemuri
-
Patent number: 11182017
Abstract: An electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts with the touch-sensitive surface displays a first user interface of a first software application, detects an input on the touch-sensitive surface while displaying the first user interface, and, in response to detecting the input while displaying the first user interface, performs a first operation in accordance with a determination that the input satisfies intensity input criteria, including that the input satisfies a first intensity threshold and the input remains on the touch-sensitive surface for a first predefined time period, and performs a second operation in accordance with a determination that the input satisfies tap criteria, including that the input ceases to remain on the touch-sensitive surface during the first predefined time period.
Type: Grant
Filed: September 28, 2015
Date of Patent: November 23, 2021
Assignee: Apple Inc.
Inventors: Chanaka G. Karunamuni, Marcos Alonso Ruiz, Jonathan R. Dascola, Olivier D. R. Gutknecht, Peter L. Hajas, Kenneth L. Kocienda, Kevin E. Ridsdale, Sophia Teutschler
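The two-way discrimination described above reduces to a simple classifier over contact intensity and duration. The threshold values below are invented for illustration; the abstract only requires that a press satisfy an intensity threshold and persist through a predefined period, while a tap end within that period.

```python
INTENSITY_THRESHOLD = 0.6   # normalized contact force (illustrative)
TIME_PERIOD = 0.3           # first predefined time period, seconds (illustrative)

def classify_input(peak_intensity, contact_duration):
    # First operation: intensity criteria met and the contact remains
    # on the surface for the predefined period.
    if peak_intensity >= INTENSITY_THRESHOLD and contact_duration >= TIME_PERIOD:
        return "first_operation"
    # Second operation: tap criteria, i.e. the contact ceases to remain
    # on the surface during the predefined period.
    if contact_duration < TIME_PERIOD:
        return "second_operation"
    return None  # long but light contact matches neither set of criteria

classify_input(0.8, 0.5)  # deep, sustained press -> "first_operation"
classify_input(0.2, 0.1)  # quick, light tap      -> "second_operation"
```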
-
Publication number: 20210065436Abstract: A device implementing a system for managing multi-modal rendering of application content includes at least one processor configured to receive content, provided by an application running on a device, for displaying in a three-dimensional display mode. The at least one processor is further configured to determine that the content corresponds to two-dimensional content. The at least one processor is further configured to identify a portion of the two-dimensional content for enhancement by a three-dimensional render. The at least one processor is further configured to enhance, in response to the determining, the portion of the two-dimensional content by the three-dimensional renderer. The at least one processor is further configured to provide for display of the enhanced portion of the two-dimensional content on a display of the device in the three-dimensional display mode.Type: ApplicationFiled: September 2, 2020Publication date: March 4, 2021Inventors: Timothy R. ORIOL, Peter L. HAJAS, Daniel T. KURTZ, Edwin ISKANDAR, Charles MAGAHERN, Jeremy G. BRIDON, Naveen K. VEMURI
-
Patent number: 10895954
Abstract: The subject technology provides rendering an image in a first view including a plurality of tiles, each tile comprising image data corresponding to a portion of the image. The subject technology, responsive to detecting an initiation of touch input corresponding to the image, copies the image data from the plurality of tiles to a graphical canvas. The subject technology displays the image data in the graphical canvas in a second view, the graphical canvas being overlaid over at least a portion of the rendered image. The subject technology receives input stroke data corresponding to the second view, the input stroke data being continuous with the touch input. The subject technology, responsive to detecting that the touch input has ended, copies the input stroke data to the plurality of tiles of the first view. Further, the subject technology displays the input stroke data and the image in the plurality of tiles of the first view.
Type: Grant
Filed: September 29, 2017
Date of Patent: January 19, 2021
Assignee: Apple Inc.
Inventors: William J. Thimbleby, Peter L. Hajas, Jennifer P. Chen
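The tile/canvas handoff described above can be sketched with a tiny pixel model: tile data is copied to one contiguous overlay canvas at touch-down, stroke data lands on the canvas, and at touch-up everything is copied back into the tiles. Tile size, grid shape, and function names are invented for the sketch.

```python
TILE = 2  # 2x2-pixel tiles, purely for illustration

def tiles_to_canvas(tiles, rows, cols):
    # On touch-down: flatten the tile grid into one contiguous canvas.
    canvas = [[0] * (cols * TILE) for _ in range(rows * TILE)]
    for (r, c), tile in tiles.items():
        for y in range(TILE):
            for x in range(TILE):
                canvas[r * TILE + y][c * TILE + x] = tile[y][x]
    return canvas

def canvas_to_tiles(canvas, rows, cols):
    # On touch-up: copy image plus stroke data back into per-tile storage.
    return {
        (r, c): [[canvas[r * TILE + y][c * TILE + x] for x in range(TILE)]
                 for y in range(TILE)]
        for r in range(rows) for c in range(cols)
    }

tiles = {(0, 0): [[1, 1], [1, 1]], (0, 1): [[2, 2], [2, 2]]}
canvas = tiles_to_canvas(tiles, rows=1, cols=2)
canvas[0][1] = 9                        # stroke drawn on the overlay canvas
tiles = canvas_to_tiles(canvas, 1, 2)   # touch ended: copy back to tiles
```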