Patents by Inventor Benjamin R. Blachnitzky

Benjamin R. Blachnitzky has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230325047
    Abstract: A method includes displaying a plurality of computer-generated objects, including a first computer-generated object at a first position within an environment and a second computer-generated object at a second position within the environment. The first computer-generated object corresponds to a first user interface element that includes a first set of controls for modifying a content item. The method includes, while displaying the plurality of computer-generated objects, obtaining extremity tracking data. The method includes moving the first computer-generated object from the first position to a third position within the environment based on the extremity tracking data. The method includes, in accordance with a determination that the third position satisfies a proximity threshold with respect to the second position, merging the first computer-generated object with the second computer-generated object in order to generate a third computer-generated object for modifying the content item.
    Type: Application
    Filed: March 15, 2023
    Publication date: October 12, 2023
    Inventors: Nicolai Georg, Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky
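The merge behavior this abstract describes (an object dragged by extremity tracking fuses with another object once it comes close enough) can be sketched roughly as follows. This is an illustrative sketch only: the function names, the Euclidean-distance criterion, and the threshold value are assumptions, not details from the patent.

```python
import math

PROXIMITY_THRESHOLD = 0.15  # assumed threshold, in arbitrary scene units


def maybe_merge(first_pos, second_pos, first_controls, second_controls):
    """Merge two computer-generated objects when their positions are close enough.

    Returns the combined control set (the "third computer-generated object")
    if the proximity threshold is satisfied, otherwise None so the objects
    remain separate.
    """
    distance = math.dist(first_pos, second_pos)
    if distance <= PROXIMITY_THRESHOLD:
        # Third object: union of both objects' control sets for the content item.
        return first_controls | second_controls
    return None


merged = maybe_merge((0.0, 0.0, 0.1), (0.05, 0.0, 0.1),
                     {"crop", "rotate"}, {"brightness"})
```

Here `merged` combines both control sets, since the two positions are within the assumed threshold.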
  • Patent number: 11782548
    Abstract: Detecting a touch includes receiving image data of a touching object of a user selecting selectable objects of a target surface, determining a rate of movement of the touching object, in response to determining that the rate of movement satisfies a predetermined threshold, modifying a touch detection parameter for detecting a touch event between the touching object and the target surface, and detecting one or more additional touch events using the modified touch detection parameter.
    Type: Grant
    Filed: March 24, 2021
    Date of Patent: October 10, 2023
    Assignee: Apple Inc.
    Inventors: Lejing Wang, Benjamin R. Blachnitzky, Lilli I. Jonsson, Nicolai Georg
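The adaptive touch detection in this abstract (modifying a detection parameter when the touching object moves fast) might look something like the sketch below. The concrete parameter chosen here, a fingertip-to-surface distance threshold, and all numeric values are invented for illustration; the patent does not specify them.

```python
def touch_threshold(rate_of_movement, base_threshold=2.0, fast_rate=50.0,
                    relaxed_threshold=5.0):
    """Return the distance threshold (mm) for deciding a touch occurred.

    When the touching object moves quickly (e.g. rapid typing), depth
    estimates from image data are noisier, so a more permissive threshold
    is used; otherwise the stricter base threshold applies.
    """
    if rate_of_movement >= fast_rate:
        return relaxed_threshold
    return base_threshold


def is_touch(fingertip_to_surface_mm, rate_of_movement):
    """Detect a touch event using the rate-adjusted detection parameter."""
    return fingertip_to_surface_mm <= touch_threshold(rate_of_movement)
```

With these assumed numbers, a 4 mm gap counts as a touch during fast movement but not during slow movement.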
  • Publication number: 20230315202
    Abstract: A method includes displaying a plurality of computer-generated objects, and obtaining finger manipulation data from a finger-wearable device via a communication interface. In some implementations, the method includes receiving an untethered input vector that includes a plurality of untethered input indicator values. Each of the plurality of untethered input indicator values is associated with one of a plurality of untethered input modalities. In some implementations, the method includes obtaining proxy object manipulation data from a physical proxy object via the communication interface. The proxy object manipulation data corresponds to sensor data associated with one or more sensors integrated in the physical proxy object. The method includes registering an engagement event with respect to a first one of the plurality of computer-generated objects based on a combination of the finger manipulation data, the untethered input vector, and the proxy object manipulation data.
    Type: Application
    Filed: February 27, 2023
    Publication date: October 5, 2023
    Inventors: Adam G. Poulos, Aaron M. Burns, Arun Rakesh Yoganandan, Benjamin R. Blachnitzky, Nicolai Georg
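The engagement registration described here combines three input sources: finger manipulation data, an untethered input vector, and proxy object sensor data. A minimal sketch of that combination follows; every field name, modality, and threshold is a hypothetical stand-in, since the abstract does not enumerate them.

```python
from dataclasses import dataclass, field


@dataclass
class InputSnapshot:
    # Hypothetical signals; the patent does not specify concrete fields.
    finger_pinch: bool                 # from the finger-wearable device
    untethered_scores: dict = field(default_factory=dict)  # modality -> indicator value
    proxy_acceleration: float = 0.0    # from sensors in the physical proxy object


def register_engagement(snapshot, target_object):
    """Register an engagement event with a computer-generated object when
    the combined inputs agree that the user intends to engage it."""
    untethered_ok = snapshot.untethered_scores.get("gaze", 0.0) > 0.5
    proxy_ok = snapshot.proxy_acceleration > 1.0  # proxy was moved or tapped
    if snapshot.finger_pinch and untethered_ok and proxy_ok:
        return ("engaged", target_object)
    return None
```

The engagement fires only when all three sources agree, which is one plausible reading of "based on a combination of" the three data streams.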
  • Publication number: 20230297168
    Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, and a communication interface provided to communicate with a finger-wearable device. The method includes, while displaying a plurality of content items on the display, obtaining finger manipulation data from the finger-wearable device via the communication interface. The method includes selecting a first one of the plurality of content items based on a first portion of the finger manipulation data in combination with satisfaction of a proximity threshold. The first one of the plurality of content items is associated with a first dimensional representation. The method includes changing the first one of the plurality of content items from the first dimensional representation to a second dimensional representation based on a second portion of the finger manipulation data.
    Type: Application
    Filed: March 15, 2023
    Publication date: September 21, 2023
    Inventors: Nicolai Georg, Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky
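The two-stage interaction in this abstract (select a content item when a first portion of the finger manipulation data coincides with a proximity condition, then promote it to another dimensional representation on a second portion) can be sketched as below. The gesture names, the item structure, and the 2D-to-3D interpretation of "dimensional representation" are assumptions for illustration.

```python
import math


def select_item(items, fingertip_pos, gesture, proximity_threshold=0.1):
    """Select the content item nearest the fingertip, provided the fingertip
    is within the proximity threshold and the finger-wearable device
    reports a selection gesture (here assumed to be a tap)."""
    if gesture != "tap":
        return None
    nearest = min(items, key=lambda it: math.dist(it["pos"], fingertip_pos))
    if math.dist(nearest["pos"], fingertip_pos) <= proximity_threshold:
        return nearest
    return None


def change_representation(item, second_gesture):
    """Change a selected item from a first (2D) to a second (3D)
    dimensional representation on a second gesture (assumed: a lift)."""
    if second_gesture == "lift" and item["dims"] == 2:
        item["dims"] = 3
    return item
```

A tap near an item selects it; a subsequent lift gesture promotes the selected item from its 2D to a 3D representation.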
  • Publication number: 20230297172
    Abstract: A method includes, while displaying a computer-generated object at a first position within an environment, obtaining extremity tracking data from an extremity tracker. The first position is outside of a drop region that is viewable using the display. The method includes moving the computer-generated object from the first position to a second position within the environment based on the extremity tracking data. The method includes, in response to determining that the second position satisfies a proximity threshold with respect to the drop region, detecting an input that is associated with a spatial region of the environment. The method includes moving the computer-generated object from the second position to a third position that is within the drop region, based on determining that the spatial region satisfies a focus criterion associated with the drop region.
    Type: Application
    Filed: March 21, 2023
    Publication date: September 21, 2023
    Inventors: Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky, Jordan A. Cazamias, Nicolai Georg
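The drop logic described here, where an object near a drop region snaps into it only if a spatial region of the environment satisfies the region's focus criterion, might be modeled as follows. Representing the spatial-region input as a gaze point and both criteria as distance checks is an assumption made for this sketch.

```python
import math


def try_drop(object_pos, drop_region_center, gaze_point,
             proximity_threshold=0.2, focus_radius=0.1):
    """Move a dragged object into the drop region once it is near the
    region AND the user's spatial focus satisfies the focus criterion.

    Returns the object's new position: the drop region center (the third
    position) on success, otherwise the unchanged current position.
    """
    near_region = math.dist(object_pos, drop_region_center) <= proximity_threshold
    focused = math.dist(gaze_point, drop_region_center) <= focus_radius
    if near_region and focused:
        return drop_region_center
    return object_pos
```

Requiring both conditions prevents accidental drops: proximity alone is not enough unless the user's focus confirms intent.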
  • Publication number: 20230095282
    Abstract: In one implementation, a method displays a first pairing affordance that is world-locked to a first peripheral device. The method may be performed by an electronic device including a non-transitory memory, one or more processors, a display, and one or more input devices. The method includes detecting the first peripheral device within a three-dimensional (3D) environment via a computer vision technique. The method includes receiving, via the one or more input devices, a first user input that is directed to the first peripheral device within the 3D environment. The method includes, in response to receiving the first user input, displaying, on the display, the first pairing affordance that is world-locked to the first peripheral device within the 3D environment.
    Type: Application
    Filed: September 22, 2022
    Publication date: March 30, 2023
    Inventors: Benjamin R. Blachnitzky, Aaron M. Burns, Anette L. Freiin von Kapri, Arun Rakesh Yoganandan, Benjamin H. Boesel, Evgenii Krivoruchko, Jonathan Ravasz, Shih-Sang Chiu
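A world-locked pairing affordance, as described above, stays anchored to the detected peripheral's position in world coordinates rather than to the display. One possible shape of that logic is sketched below; the data layout, the anchor offset, and the label are all invented for illustration.

```python
def pairing_affordances(detected_peripherals, user_inputs):
    """Given peripherals detected in the 3D environment (e.g. via computer
    vision) and user inputs directed at them, return pairing affordances
    world-locked to each targeted peripheral."""
    affordances = []
    for peripheral in detected_peripherals:
        targeted = any(inp["target"] == peripheral["id"] for inp in user_inputs)
        if targeted:
            x, y, z = peripheral["pos"]
            affordances.append({
                "peripheral": peripheral["id"],
                # World-locked: anchored slightly above the device in world
                # coordinates, so it stays in place as the user's head moves.
                "world_pos": (x, y + 0.05, z),
                "label": "Pair",
            })
    return affordances
```

Because the affordance position derives from the peripheral's world position, re-rendering each frame keeps it attached to the device rather than floating with the viewport.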
  • Patent number: 11360558
    Abstract: A system may include finger devices. A touch sensor may be mounted in a finger device housing to gather input from an external object as the object moves along an exterior surface of the housing. The touch sensor may include capacitive sensor electrodes. Sensors such as force sensors, ultrasonic sensors, inertial measurement units, optical sensors, and other components may be used in gathering finger input from a user. Finger input from a user may be used to manipulate virtual objects in a mixed reality or virtual reality environment while a haptic output device in a finger device provides associated haptic output. A user may interact with real-world objects while computer-generated content is overlaid over some or all of the objects. Object rotations and other movements may be converted into input for a mixed reality or virtual reality system using force measurements or other sensor measurements made with the finger devices.
    Type: Grant
    Filed: April 26, 2019
    Date of Patent: June 14, 2022
    Assignee: Apple Inc.
    Inventors: Paul X. Wang, Nicolai Georg, Benjamin R. Blachnitzky, Alhad A. Palkar, Minhazul Islam, Alex J. Lehmann, Madeleine S. Cordier, Joon-Sup Han, Hongcheng Sun, Sang E. Lee, Kevin Z. Lo, Lilli Ing-Marie Jonsson, Luis Deliz Centeno, Yuhao Pan, Stephen E. Dey, Paul N. DuMontelle, Jonathan C. Atler, Tianjia Sun, Jian Li, Chang Zhang
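One way the capacitive touch sensor on the finger-device housing could turn a sliding object into input is a centroid tracker over the electrode array. This sketch is purely illustrative: the electrode model, the centroid estimate, and the gain are assumptions, not details from the patent.

```python
def swipe_to_scroll(electrode_readings, prev_readings, gain=10.0):
    """Convert capacitive electrode readings from the finger-device touch
    sensor into a scroll delta as an object slides along the housing.

    The touch position is estimated as the capacitance-weighted centroid
    of the electrode array; the scroll delta is its frame-to-frame change.
    """
    def centroid(readings):
        total = sum(readings)
        if total == 0:
            return None  # nothing touching the sensor this frame
        return sum(i * r for i, r in enumerate(readings)) / total

    now, before = centroid(electrode_readings), centroid(prev_readings)
    if now is None or before is None:
        return 0.0
    return gain * (now - before)
```

A touch moving from electrode 2 toward electrode 1 yields a negative delta, i.e. a scroll in the opposite direction.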
  • Patent number: 11287886
    Abstract: A finger device may be worn on a user's finger and may serve as a controller for a head-mounted device or other electronic device. The finger device may have a housing having an upper housing portion that extends across a top of the finger and first and second side housing portions that extend down respective first and second sides of the finger. Displacement sensors in the side housing portions may measure movements of the sides of the finger as the finger contacts an external surface. To account for variations in skin compliance, the finger device and/or the electronic device may store calibration data that maps displacement values gathered by the displacement sensors to force values. The calibration data may be based on calibration measurements gathered while the user's finger wearing the finger device contacts a force sensor in an external electronic device.
    Type: Grant
    Filed: September 15, 2020
    Date of Patent: March 29, 2022
    Assignee: Apple Inc.
    Inventors: Adrian Z. Harb, Benjamin R. Blachnitzky, Ahmet Fatih Cihan, Stephen E. Dey, Mengshu Huang, Mark T. Winkler
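The calibration described above maps displacement-sensor values to force values using measurements gathered against an external force sensor. A minimal sketch, assuming a simple per-user linear fit via least squares (the patent does not specify the model), could look like this:

```python
def build_calibration(samples):
    """Fit a linear map from sensor displacement to force.

    `samples` is a list of (displacement, force) pairs, where force was
    measured by the external device's force sensor during calibration.
    Returns a function mapping a displacement reading to an estimated force.
    """
    n = len(samples)
    sx = sum(d for d, _ in samples)
    sy = sum(f for _, f in samples)
    sxx = sum(d * d for d, _ in samples)
    sxy = sum(d * f for d, f in samples)
    # Ordinary least-squares slope and intercept.
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return lambda displacement: slope * displacement + intercept


# Hypothetical calibration samples for one user's skin compliance.
to_force = build_calibration([(0.0, 0.0), (1.0, 2.0), (2.0, 4.0)])
```

Storing `slope` and `intercept` per user is one way such calibration data could account for the skin-compliance variation the abstract mentions.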
  • Publication number: 20200026352
    Abstract: A system may include finger devices. A touch sensor may be mounted in a finger device housing to gather input from an external object as the object moves along an exterior surface of the housing. The touch sensor may include capacitive sensor electrodes. Sensors such as force sensors, ultrasonic sensors, inertial measurement units, optical sensors, and other components may be used in gathering finger input from a user. Finger input from a user may be used to manipulate virtual objects in a mixed reality or virtual reality environment while a haptic output device in a finger device provides associated haptic output. A user may interact with real-world objects while computer-generated content is overlaid over some or all of the objects. Object rotations and other movements may be converted into input for a mixed reality or virtual reality system using force measurements or other sensor measurements made with the finger devices.
    Type: Application
    Filed: April 26, 2019
    Publication date: January 23, 2020
    Inventors: Paul X. Wang, Nicolai Georg, Benjamin R. Blachnitzky, Alhad A. Palkar, Minhazul Islam, Alex J. Lehmann, Madeleine S. Cordier, Joon-Sup Han, Hongcheng Sun, Sang E. Lee, Kevin Z. Lo, Lilli Ing-Marie Jonsson, Luis Deliz Centeno, Yuhao Pan, Stephen E. Dey, Paul N. DuMontelle, Jonathan C. Atler, Tianjia Sun, Jian Li, Chang Zhang