Patents by Inventor Pol Pla

Pol Pla has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents that have been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20210279966
    Abstract: Various implementations disclosed herein include devices, systems, and methods that enable presenting environments comprising visual representations of multiple applications. In one implementation, a method includes presenting a view of an environment on a display of an electronic device. The environment comprises visual representations corresponding to a plurality of applications. A first application among the plurality of applications is designated as an elevated application. The elevated application is provided with access to a control parameter configured to modify an ambience of the environment. Other applications of the plurality of applications are restricted from accessing the control parameter while the first application is designated as the elevated application.
    Type: Application
    Filed: February 17, 2021
    Publication date: September 9, 2021
    Inventors: Aaron M. Burns, Alexis H. Palangie, Nathan Gitter, Pol Pla I. Conesa
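The access-control idea in this abstract, that exactly one "elevated" application may touch a shared ambience parameter while all others are locked out, can be sketched as follows. All class and method names here are illustrative, not taken from the patent.

```python
class Environment:
    """Minimal model of an environment with one elevated application."""

    def __init__(self):
        self.ambience = 1.0       # e.g. an ambient light level
        self.elevated_app = None  # at most one app holds elevation

    def designate_elevated(self, app_id):
        self.elevated_app = app_id

    def set_ambience(self, app_id, value):
        # Other applications are restricted from the control parameter
        # while one application is designated as elevated.
        if app_id != self.elevated_app:
            raise PermissionError(f"{app_id} is not the elevated application")
        self.ambience = value


env = Environment()
env.designate_elevated("media_player")
env.set_ambience("media_player", 0.2)  # allowed: dims the environment
print(env.ambience)                    # → 0.2
```

Any other application calling `set_ambience` while `media_player` holds elevation raises `PermissionError`, mirroring the restriction described above.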
  • Publication number: 20210240331
    Abstract: In an exemplary process for selecting a text input field using an eye gaze, a graphical object including the text input field is displayed. The text input field is associated with one or more respective locations on one or more displays. Characteristics of an eye gaze are determined using gaze sensors, and a gaze location is determined using the characteristics. Input is received from an input device corresponding to one or more text characters. If the gaze location corresponds to the one or more respective locations, then the one or more text characters are displayed in the text input field. If the gaze location does not correspond to the one or more respective locations, then the one or more text characters are not displayed in the text input field.
    Type: Application
    Filed: April 24, 2019
    Publication date: August 5, 2021
    Inventors: Earl M. Olson, Pol Pla I. Conesa, Aaron P. Thompson
  • Publication number: 20210096726
    Abstract: While displaying a three-dimensional environment, a computer system detects a hand at a first position that corresponds to a portion of the three-dimensional environment. In response to detecting the hand at the first position: in accordance with a determination that the hand is being held in a first predefined configuration, the computer system displays a visual indication of a first operation context for gesture input using hand gestures in the three-dimensional environment; and in accordance with a determination that the hand is not being held in the first predefined configuration, the computer system forgoes display of the visual indication.
    Type: Application
    Filed: September 23, 2020
    Publication date: April 1, 2021
    Inventors: Jeffrey M. Faulkner, Israel Pastrana Vicente, Pol Pla I. Conesa, Stephen O. Lemay
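The branch described in this abstract, showing a visual indication of the gesture-input context only when the detected hand matches a predefined configuration, can be sketched as a single conditional. The configuration name and return values are illustrative assumptions.

```python
# Hypothetical predefined hand configuration that arms gesture input.
READY_CONFIGURATION = "thumb_and_index_pinched"


def indication_for(hand_configuration):
    """Return the indication to display, or None to forgo display."""
    if hand_configuration == READY_CONFIGURATION:
        return "show gesture-context indicator"
    return None  # hand not in the predefined configuration


print(indication_for("thumb_and_index_pinched"))  # indicator shown
print(indication_for("open_palm"))                # → None (forgo display)
```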
  • Publication number: 20210097776
    Abstract: While displaying a three-dimensional scene that includes, at a first location, a first virtual object displayed with a first value of a first display property corresponding to a first portion of the first virtual object and a second value of the first display property corresponding to a second portion of the first virtual object, and, at a second location, a first physical surface, a computer system generates a first visual effect at the second location of the three-dimensional scene. Generating the first visual effect includes modifying a visual appearance of a first portion of the first physical surface in accordance with the first value of the first display property, and modifying a visual appearance of a second portion of the first physical surface in accordance with the second value of the first display property.
    Type: Application
    Filed: September 23, 2020
    Publication date: April 1, 2021
    Inventors: Jeffrey M. Faulkner, Israel Pastrana Vicente, Philipp Rockel, Wesley M. Holder, Pol Pla I. Conesa, Nicholas W. Henderson, Robert T. Tilton, Stephen O. Lemay
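The per-portion effect this abstract describes, where each portion of a physical surface is modified according to the display-property value of the corresponding portion of the virtual object (think of a glowing object casting light onto a nearby tabletop), can be sketched with a simple linear blend. The blend formula is an illustrative assumption, not the patented rendering model.

```python
def apply_visual_effect(surface_portions, object_values, strength=0.5):
    """Mix each surface portion's brightness toward the corresponding
    virtual-object display-property value (hypothetical model)."""
    return [
        (1 - strength) * surface + strength * value
        for surface, value in zip(surface_portions, object_values)
    ]


# The object's first portion glows brightly (0.9), its second dimly (0.1);
# the two matching portions of the surface are modified accordingly.
print(apply_visual_effect([0.5, 0.5], [0.9, 0.1]))  # → [0.7, 0.3]
```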
  • Patent number: 10948980
    Abstract: An electronic device may have a housing configured to be worn on a user's body or held in a user's hand. The electronic device may have control circuitry that wirelessly controls external equipment such as equipment with a display. By gathering motion information and other user input and wirelessly transmitting this information to the external equipment, the electronic device may serve as a wireless controller that controls content on the display. The electronic device may have multiple structures that move relative to each other such as first and second housing portions. The second housing portion may move to an extended position where the gathering of sensor information on changes in user finger position as the user interacts with real-world objects is enhanced.
    Type: Grant
    Filed: March 24, 2020
    Date of Patent: March 16, 2021
    Assignee: Apple Inc.
    Inventors: Paul X. Wang, Pol Pla I Conesa
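The control loop this abstract outlines, where the device samples motion (and, when its second housing portion is extended, finger position) and forwards the readings to external equipment, can be sketched as packet assembly. The packet format and field names are assumptions made for illustration.

```python
import json


def make_controller_packet(motion, finger_positions, extended):
    """Build one wireless-controller update for the external equipment.

    Finger-position data is included only when the second housing
    portion is in its extended, sensing-enhanced position.
    """
    payload = {"motion": motion}
    if extended:
        payload["fingers"] = finger_positions
    return json.dumps(payload)


packet = make_controller_packet(
    motion={"ax": 0.1, "ay": -0.2, "az": 9.8},
    finger_positions=[0.2, 0.4, 0.1],
    extended=True,
)
print(packet)
```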
  • Publication number: 20200356162
    Abstract: An electronic device may have a housing configured to be worn on a user's body or held in a user's hand. The electronic device may have control circuitry that wirelessly controls external equipment such as equipment with a display. By gathering motion information and other user input and wirelessly transmitting this information to the external equipment, the electronic device may serve as a wireless controller that controls content on the display. The electronic device may have multiple structures that move relative to each other such as first and second housing portions. The second housing portion may move to an extended position where the gathering of sensor information on changes in user finger position as the user interacts with real-world objects is enhanced.
    Type: Application
    Filed: March 24, 2020
    Publication date: November 12, 2020
    Inventors: Paul X. Wang, Pol Pla I Conesa
  • Publication number: 20190369714
    Abstract: The present disclosure relates to techniques for displaying representations of physical input devices and overlaying visual features on those representations in a computer-generated reality (CGR) environment. The techniques include displaying a virtual application in a CGR environment and, in response to detecting an input field in the displayed virtual application, displaying at least a portion of the application, including the detected input field, on a representation of a physical input device. In response to detecting an input received at the physical input device, the input field is updated with the input and the updated input field is displayed.
    Type: Application
    Filed: May 13, 2019
    Publication date: December 5, 2019
    Inventors: Pol Pla I. Conesa, Earl M. Olson, Aaron P. Thompson
  • Patent number: 10298412
    Abstract: In one embodiment, one or more systems may receive input from a user identifying an interactive region of a physical environment. The one or more systems may determine a location of the interactive region relative to a depth sensor and monitor, at least in part by the depth sensor, the interactive region for a predetermined event. The one or more systems may detect, at least in part by the depth sensor, the predetermined event. In response to detecting the predetermined event, the one or more systems may initiate a predetermined action associated with the predetermined event.
    Type: Grant
    Filed: July 19, 2018
    Date of Patent: May 21, 2019
    Assignee: Samsung Electronics Company, Ltd.
    Inventors: Brian Harms, Pol Pla, Yedan Qian, Olivier Bau
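The pipeline shared by this patent family (this grant and the related publications below), monitoring a user-defined interactive region with a depth sensor and firing a predetermined action when a predetermined event is detected, can be sketched as follows. Modeling the region as an axis-aligned box and the event as any depth point entering it are simplifying assumptions for illustration.

```python
def point_in_region(point, region_min, region_max):
    """True if a 3D depth point lies inside the axis-aligned region."""
    return all(lo <= p <= hi
               for p, lo, hi in zip(point, region_min, region_max))


def monitor(depth_points, region_min, region_max, action):
    """Scan depth-sensor points; run the action on the first hit."""
    for point in depth_points:
        if point_in_region(point, region_min, region_max):  # predetermined event
            return action()                                 # predetermined action
    return None


result = monitor(
    depth_points=[(2.0, 0.1, 0.5), (0.4, 0.3, 0.2)],  # second point is inside
    region_min=(0.0, 0.0, 0.0),
    region_max=(0.5, 0.5, 0.5),
    action=lambda: "toggle_lamp",
)
print(result)  # → toggle_lamp
```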
  • Publication number: 20180323992
    Abstract: In one embodiment, one or more systems may receive input from a user identifying an interactive region of a physical environment. The one or more systems may determine a location of the interactive region relative to a depth sensor and monitor, at least in part by the depth sensor, the interactive region for a predetermined event. The one or more systems may detect, at least in part by the depth sensor, the predetermined event. In response to detecting the predetermined event, the one or more systems may initiate a predetermined action associated with the predetermined event.
    Type: Application
    Filed: July 19, 2018
    Publication date: November 8, 2018
    Inventors: Brian Harms, Pol Pla, Yedan Qian, Olivier Bau
  • Patent number: 10057078
    Abstract: In one embodiment, one or more systems may receive input from a user identifying an interactive region of a physical environment. The one or more systems may determine a location of the interactive region relative to a depth sensor and monitor, at least in part by the depth sensor, the interactive region for a predetermined event. The one or more systems may detect, at least in part by the depth sensor, the predetermined event. In response to detecting the predetermined event, the one or more systems may initiate a predetermined action associated with the predetermined event.
    Type: Grant
    Filed: February 3, 2016
    Date of Patent: August 21, 2018
    Assignee: Samsung Electronics Company, Ltd.
    Inventors: Brian Harms, Pol Pla, Yedan Qian, Olivier Bau
  • Publication number: 20170054569
    Abstract: In one embodiment, one or more systems may receive input from a user identifying an interactive region of a physical environment. The one or more systems may determine a location of the interactive region relative to a depth sensor and monitor, at least in part by the depth sensor, the interactive region for a predetermined event. The one or more systems may detect, at least in part by the depth sensor, the predetermined event. In response to detecting the predetermined event, the one or more systems may initiate a predetermined action associated with the predetermined event.
    Type: Application
    Filed: February 3, 2016
    Publication date: February 23, 2017
    Inventors: Brian Harms, Pol Pla, Yedan Qian, Olivier Bau