Patents by Inventor Julia Schwarz

Julia Schwarz has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11320957
    Abstract: A computing system is provided. The computing system includes a head mounted display (HMD) device including a display, a processor configured to execute one or more programs, and associated memory. The processor is configured to display a virtual object at least partially within a field of view of a user on the display, identify a plurality of control points associated with the virtual object, and determine that one or more of the control points associated with the virtual object are further than a predetermined threshold distance from the user. The processor is configured to, based on the determination, invoke a far interaction mode for the virtual object and receive a trigger input from the user. In response to the trigger input in the far interaction mode, the processor is configured to invoke a near interaction mode and display a virtual interaction object within the predetermined threshold distance from the user.
    Type: Grant
    Filed: March 25, 2019
    Date of Patent: May 3, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sheng Kai Tang, Julia Schwarz, Thomas Matthew Gable, Casey Leon Meekhof, Nahil Tawfik Sharkasi, Nicholas Ferianc Kamuda, Joshua Kyle Neff, Alton Kwok
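    A minimal Python sketch of the mode-selection logic described in patent 11320957's abstract: a distance test over the object's control points decides between near and far interaction, and a trigger input in far mode brings a proxy object within reach. The function names and the 1.5 m threshold are illustrative assumptions, not taken from the patent.

    ```python
    import math

    NEAR_THRESHOLD_M = 1.5  # hypothetical reach distance; not specified in the patent

    def interaction_mode(user_pos, control_points, threshold=NEAR_THRESHOLD_M):
        """'far' if any of the object's control points lies beyond the
        threshold distance from the user, else 'near'."""
        if any(math.dist(user_pos, p) > threshold for p in control_points):
            return "far"
        return "near"

    def on_trigger_input(mode, display_proxy_object):
        """In far mode, a trigger input invokes near interaction by
        displaying a virtual interaction object within reach."""
        if mode == "far":
            display_proxy_object()
            return "near"
        return mode
    ```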
  • Patent number: 11320911
    Abstract: Systems and methods are provided for detecting user-object interaction in mixed-reality environments. A mixed-reality system detects a controller gesture with an associated controller orientation in the mixed-reality environment. The mixed-reality system then determines an interaction region for the controller gesture and identifies one or more virtual objects within the interaction region. The virtual objects each have an associated orientation affinity. Subsequently, the mixed-reality system determines an orientation similarity score between the controller orientation and the orientation affinity for each virtual object within the interaction region. In response to determining that at least one orientation similarity score exceeds a predetermined threshold, the mixed-reality system executes an interaction between the controller and the virtual object that has the greatest orientation similarity score.
    Type: Grant
    Filed: March 8, 2019
    Date of Patent: May 3, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Julia Schwarz, Jason Michael Ray, Casey Leon Meekhof
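    The orientation-affinity matching in patent 11320911 can be read as cosine similarity between direction vectors. A hedged sketch, assuming objects carry a unit "affinity" direction and that 0.8 stands in for the patent's predetermined threshold:

    ```python
    import math

    def normalize(v):
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)

    def orientation_similarity(controller_dir, affinity_dir):
        # Cosine similarity between two 3D direction vectors.
        a, b = normalize(controller_dir), normalize(affinity_dir)
        return sum(x * y for x, y in zip(a, b))

    def pick_target(controller_dir, objects_in_region, threshold=0.8):
        """Return the in-region object whose orientation affinity best
        matches the controller, provided its score exceeds the threshold."""
        best, best_score = None, threshold
        for obj in objects_in_region:
            score = orientation_similarity(controller_dir, obj["affinity"])
            if score > best_score:
                best, best_score = obj, score
        return best
    ```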
  • Patent number: 11294472
    Abstract: A method for augmenting a two-stage hand gesture input comprises receiving hand tracking data for a hand of a user. A gesture recognition machine recognizes that the user has performed a first-stage gesture based on one or more parameters derived from the received hand tracking data satisfying first-stage gesture criteria. An affordance cueing a second-stage gesture is provided to the user responsive to recognizing the first-stage gesture. The gesture recognition machine recognizes that the user has performed the second-stage gesture based on one or more parameters derived from the received hand tracking data satisfying second-stage gesture criteria. A graphical user interface element is displayed responsive to recognizing the second-stage gesture.
    Type: Grant
    Filed: March 26, 2019
    Date of Patent: April 5, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sheng Kai Tang, Julia Schwarz, Thomas Matthew Gable, Casey Leon Meekhof, Chuan Qin, Nahil Tawfik Sharkasi, Nicholas Ferianc Kamuda, Ramiro S. Torres, Joshua Kyle Neff, Jamie Bryant Kirschenbaum, Neil Richard Kronlage
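    The two-stage flow in patent 11294472 is essentially a small state machine: recognize stage one, cue the user with an affordance, then wait for stage two. A sketch with placeholder criteria (palm-up, pinch) standing in for the patent's gesture criteria:

    ```python
    def meets_first_stage_criteria(params):
        return params.get("palm_up", False)   # placeholder criterion

    def meets_second_stage_criteria(params):
        return params.get("pinch", False)     # placeholder criterion

    class TwoStageGestureRecognizer:
        def __init__(self, show_affordance, show_ui):
            self.stage = 0
            self.show_affordance = show_affordance  # cues the second-stage gesture
            self.show_ui = show_ui                  # displays the GUI element

        def update(self, params):
            # `params` are features derived from hand tracking data.
            if self.stage == 0 and meets_first_stage_criteria(params):
                self.stage = 1
                self.show_affordance()
            elif self.stage == 1 and meets_second_stage_criteria(params):
                self.stage = 0
                self.show_ui()
    ```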
  • Patent number: 11262864
    Abstract: A system for classifying touch events includes a touch screen configured to display an interactive element, one or more acoustic sensors coupled to the touch screen, a touch event detector configured to monitor the one or more acoustic sensors and to save acoustic signals sensed by the one or more acoustic sensors, wherein the touch event detector is further configured to detect touch events in which the interactive element is touched by a first or a second finger part of a user, and wherein the touch events result in generating the acoustic signals, and an acoustic classifier configured to classify the acoustic signals.
    Type: Grant
    Filed: December 5, 2017
    Date of Patent: March 1, 2022
    Assignee: QEEXO, CO.
    Inventors: Christopher Harrison, Julia Schwarz, Robert Xiao
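    Patent 11262864 does not specify a classifier, so here is one plausible shape of the pipeline: extract a coarse spectral fingerprint from the acoustic signal around each touch and classify it against per-class centroids. Everything in this sketch (features, labels, the nearest-centroid model) is an illustrative assumption:

    ```python
    import numpy as np

    def features(signal):
        """Toy acoustic features: 16 averaged magnitude-spectrum bands."""
        spectrum = np.abs(np.fft.rfft(signal))
        return np.array([band.mean() for band in np.array_split(spectrum, 16)])

    class AcousticTouchClassifier:
        """Nearest-centroid stand-in, e.g. for fingertip vs. knuckle touches."""
        def __init__(self):
            self.centroids = {}  # label -> mean feature vector

        def fit(self, labeled_signals):
            grouped = {}
            for label, sig in labeled_signals:
                grouped.setdefault(label, []).append(features(sig))
            self.centroids = {l: np.mean(f, axis=0) for l, f in grouped.items()}

        def classify(self, signal):
            f = features(signal)
            return min(self.centroids,
                       key=lambda l: np.linalg.norm(self.centroids[l] - f))
    ```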
  • Patent number: 11249556
    Abstract: A method for single-handed microgesture input comprises receiving hand tracking data for a hand of a user. A set of microgesture targets that include software functions are assigned to positions along a length of a first finger. A visual affordance including indicators for two or more assigned microgesture targets is provided to the user. The received hand tracking data is analyzed by a gesture recognition machine. A location of a thumbtip of the hand of the user is determined relative to the positions along the first finger. Responsive to determining that the thumbtip is within a threshold distance of the first finger at a first position along the length of the first finger, an indicator for a corresponding first microgesture target is augmented, and then further augmented based on a duration the thumbtip is at the first position. Responsive to detecting a confirmation action, the corresponding microgesture target executes.
    Type: Grant
    Filed: November 30, 2020
    Date of Patent: February 15, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Julia Schwarz, Noe Moreno Barragan, Michael Harley Notter, Sheng Kai Tang, Joshua Kyle Neff
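    The microgesture mechanics in patent 11249556 reduce to a proximity test along the finger plus a dwell counter. A sketch with assumed units (meters, frames) and thresholds:

    ```python
    import math

    def position_along_finger(thumbtip, finger_positions, threshold=0.015):
        """Index of the closest assigned position along the finger if the
        thumbtip is within the threshold distance (meters), else None."""
        dists = [math.dist(thumbtip, p) for p in finger_positions]
        i = min(range(len(dists)), key=dists.__getitem__)
        return i if dists[i] <= threshold else None

    class MicrogestureMenu:
        def __init__(self, targets, dwell_frames=30):
            self.targets = targets            # one callable per assigned target
            self.dwell_frames = dwell_frames
            self.current, self.dwell = None, 0

        def update(self, position, confirmed):
            if position != self.current:
                self.current, self.dwell = position, 0   # augment indicator
            elif position is not None:
                self.dwell += 1                          # further augment on dwell
            if confirmed and self.current is not None:
                self.targets[self.current]()             # execute the target
    ```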
  • Publication number: 20210383594
    Abstract: A head-mounted display comprises a display device and an outward-facing depth camera. A storage machine comprises instructions executable by a logic machine to present one or more virtual objects on the display device, to receive information from the depth camera about an environment, and to determine a position of the head-mounted display within the environment. Based on the position of the head-mounted display, a position of a joint of a user's arm is inferred. Based on the information received from the depth camera, a position of a user's hand is determined. A ray is cast from a portion of the user's hand based on the position of the joint of the user's arm and the position of the user's hand. Responsive to the ray intersecting with one or more control points of a virtual object, the user is provided with an indication that the virtual object is being targeted.
    Type: Application
    Filed: August 23, 2021
    Publication date: December 9, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Sheng Kai TANG, Julia SCHWARZ, Jason Michael RAY, Sophie STELLMACH, Thomas Matthew GABLE, Casey Leon MEEKHOF, Nahil Tawfik SHARKASI, Nicholas Ferianc KAMUDA, Ramiro S. TORRES, Kevin John APPEL, Jamie Bryant KIRSCHENBAUM
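    The ray-casting step shared by this application and patent 11107265 below can be sketched as a line from the inferred arm joint through the hand, tested against control points within a tolerance radius. The 5 cm radius is an assumption:

    ```python
    import numpy as np

    def cast_hand_ray(joint_pos, hand_pos):
        """Origin at the hand; direction along the joint-to-hand line."""
        direction = np.asarray(hand_pos, float) - np.asarray(joint_pos, float)
        return np.asarray(hand_pos, float), direction / np.linalg.norm(direction)

    def ray_targets_point(origin, direction, point, radius=0.05):
        """True if the ray passes within `radius` of a control point, in
        which case the user would be shown a targeting indication."""
        to_point = np.asarray(point, float) - origin
        t = max(0.0, float(np.dot(to_point, direction)))
        closest = origin + t * direction
        return float(np.linalg.norm(np.asarray(point, float) - closest)) <= radius
    ```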
  • Publication number: 20210373672
    Abstract: Examples are disclosed that relate to hand gesture-based emojis. One example provides a method, enacted on a display device, comprising receiving hand tracking data representing a pose of a hand in a coordinate system, based on the hand tracking data, recognizing a hand gesture, and identifying an emoji corresponding to the hand gesture. The method further comprises presenting the emoji on the display device, and sending an instruction to one or more other display devices to present the emoji.
    Type: Application
    Filed: May 29, 2020
    Publication date: December 2, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Julia SCHWARZ, Michael Harley NOTTER, Jenny KAM, Sheng Kai TANG, Kenneth Mitchell JAKUBZAK, Adam Edwin BEHRINGER, Amy Mun HONG, Joshua Kyle NEFF, Sophie STELLMACH, Mathew J. LAMB, Nicholas Ferianc KAMUDA
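    The gesture-to-emoji flow in publication 20210373672 is, at its core, a lookup plus local presentation and a broadcast instruction. The gesture labels, emoji choices, and `Display` stub below are all hypothetical:

    ```python
    GESTURE_TO_EMOJI = {           # illustrative bindings only
        "thumbs_up": "👍",
        "heart_hands": "❤️",
        "clap": "👏",
    }

    class Display:                 # stand-in for a real display device API
        def present(self, emoji):
            print("showing", emoji)
        def send(self, message):
            print("sending", message)

    def on_hand_gesture(gesture, local_display, remote_displays):
        emoji = GESTURE_TO_EMOJI.get(gesture)
        if emoji is None:
            return
        local_display.present(emoji)                 # present locally
        for device in remote_displays:               # instruct peers to present it
            device.send({"type": "present_emoji", "emoji": emoji})
    ```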
  • Patent number: 11107265
    Abstract: A head-mounted display comprises a display device and an outward-facing depth camera. A storage machine comprises instructions executable by a logic machine to present one or more virtual objects on the display device, to receive information from the depth camera about an environment, and to determine a position of the head-mounted display within the environment. Based on the position of the head-mounted display, a position of a joint of a user's arm is inferred. Based on the information received from the depth camera, a position of a user's hand is determined. A ray is cast from a portion of the user's hand based on the position of the joint of the user's arm and the position of the user's hand. Responsive to the ray intersecting with one or more control points of a virtual object, the user is provided with an indication that the virtual object is being targeted.
    Type: Grant
    Filed: March 11, 2019
    Date of Patent: August 31, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sheng Kai Tang, Julia Schwarz, Jason Michael Ray, Sophie Stellmach, Thomas Matthew Gable, Casey Leon Meekhof, Nahil Tawfik Sharkasi, Nicholas Ferianc Kamuda, Ramiro S. Torres, Kevin John Appel, Jamie Bryant Kirschenbaum
  • Publication number: 20210255767
    Abstract: The discussion relates to virtual keyboard engagement. One example can define key volumes relating to keys of a virtual keyboard and detect finger movement of a user through individual key volumes. The example can detect parameter changes associated with detected finger movement through individual key volumes and build potential key sequences from detected parameter changes.
    Type: Application
    Filed: May 5, 2021
    Publication date: August 19, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Christopher M. BECKER, Nazeeh A. ELDIRGHAMI, Kevin W. BARNES, Julia SCHWARZ, Eric CARTER
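    The key-volume idea in this family (see also patent 11029845 and publication 20210011621 below) can be sketched as axis-aligned boxes around keys, emitting a candidate key each time a fingertip path enters a volume. A real implementation would also weigh the parameter changes (e.g. velocity dips) the abstract mentions:

    ```python
    class KeyVolume:
        """Axis-aligned 3D box around one virtual key."""
        def __init__(self, key, lo, hi):
            self.key, self.lo, self.hi = key, lo, hi

        def contains(self, p):
            return all(l <= c <= h for l, c, h in zip(self.lo, p, self.hi))

    def potential_key_sequence(fingertip_path, volumes):
        """Emit each key once per entry of the fingertip into its volume."""
        sequence, inside = [], set()
        for p in fingertip_path:
            for v in volumes:
                if v.contains(p):
                    if v.key not in inside:
                        sequence.append(v.key)
                        inside.add(v.key)
                else:
                    inside.discard(v.key)
        return sequence
    ```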
  • Patent number: 11048355
    Abstract: A method and apparatus for determining pitch and yaw of an elongated interface object as it interacts with a touchscreen surface. A touch image is received, and this touch image has at least a first area that corresponds to an area of the touchscreen that has an elongated interface object positioned at least proximate to it. The elongated interface object has a pitch and a yaw with respect to the touchscreen surface. A first transformation is performed to obtain a first transformation image of the touch image, and a second transformation is performed to obtain a second transformation image of the touch image. The first transformation differs from the second transformation. The yaw is determined for the elongated interface object based on both the first and second transformation images. The pitch is determined based on at least one of the first and second transformation images.
    Type: Grant
    Filed: July 26, 2017
    Date of Patent: June 29, 2021
    Assignee: QEEXO, CO.
    Inventors: Christopher Harrison, Julia Schwarz, Robert Bo Xiao
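    Patent 11048355's two-transformation approach can be illustrated with two thresholdings of the touch image: a tight one isolating the fingertip core and a loose one covering the whole contact blob. Yaw follows from the vector between the two centroids; pitch from that vector's length. The thresholds and the pitch mapping below are guesses, not the patented transformations:

    ```python
    import numpy as np

    def centroid(mask):
        ys, xs = np.nonzero(mask)
        return np.array([xs.mean(), ys.mean()])

    def pitch_and_yaw(touch_image, hi=0.8, lo=0.2):
        core = touch_image >= hi * touch_image.max()   # first transformation
        blob = touch_image >= lo * touch_image.max()   # second transformation
        offset = centroid(blob) - centroid(core)       # core-to-blob "shadow"
        yaw = np.degrees(np.arctan2(offset[1], offset[0]))
        # Longer shadow ~ shallower pitch; the linear mapping is illustrative.
        pitch = max(0.0, 90.0 - 10.0 * np.linalg.norm(offset))
        return pitch, yaw
    ```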
  • Patent number: 11029845
    Abstract: The discussion relates to virtual keyboard engagement. One example can define key volumes relating to keys of a virtual keyboard and detect finger movement of a user through individual key volumes. The example can detect parameter changes associated with detected finger movement through individual key volumes and build potential key sequences from detected parameter changes.
    Type: Grant
    Filed: July 11, 2019
    Date of Patent: June 8, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Christopher M. Becker, Nazeeh A. Eldirghami, Kevin W. Barnes, Julia Schwarz, Eric Carter
  • Patent number: 11029785
    Abstract: A method of classifying touch screen events uses known non-random patterns of touch events over short periods of time to increase the accuracy of analyzing such events. The method takes advantage of the fact that after one touch event, certain actions are more likely to follow than others. Thus if a touch event is classified as a knock, and then within 500 ms a new event in a similar location occurs, but the classification confidence is low (e.g., 60% nail, 40% knuckle), the classifier may add weight to the knuckle classification since this touch sequence is far more likely. Knowledge about the probabilities of follow-on touch events can be used to bias subsequent classification, adding weight to particular events.
    Type: Grant
    Filed: February 21, 2020
    Date of Patent: June 8, 2021
    Assignee: QEEXO, CO.
    Inventors: Julia Schwarz, Christopher Harrison
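    The abstract's 60/40 example makes the biasing concrete: multiply classifier scores by follow-on likelihoods and renormalize. A sketch with made-up transition weights:

    ```python
    def rebias(scores, prior_event, elapsed_ms, transition_weights, window_ms=500):
        """Reweight classifier scores by how likely each touch type is to
        follow `prior_event` within the time window."""
        if prior_event is None or elapsed_ms > window_ms:
            return scores
        weights = transition_weights.get(prior_event, {})
        biased = {label: s * weights.get(label, 1.0) for label, s in scores.items()}
        total = sum(biased.values())
        return {label: s / total for label, s in biased.items()}

    TRANSITIONS = {"knock": {"knuckle": 2.0, "nail": 1.0}}  # illustrative weights
    print(rebias({"nail": 0.6, "knuckle": 0.4}, "knock", 200, TRANSITIONS))
    # -> {'nail': ~0.43, 'knuckle': ~0.57}: the low-confidence call flips to knuckle
    ```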
  • Patent number: 10969957
    Abstract: An electronic device includes a touch-sensitive surface, for example a touch pad or touch screen. The user interacts with the touch-sensitive surface, producing touch interactions. The resulting actions taken depend at least in part on the touch type. For example, the same touch interactions performed by three different touch types of a finger pad, a finger nail and a knuckle, may result in the execution of different actions.
    Type: Grant
    Filed: February 12, 2020
    Date of Patent: April 6, 2021
    Assignee: QEEXO, CO.
    Inventors: Christopher Harrison, Julia Schwarz, Robert Bo Xiao
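    Patent 10969957's core behavior is a dispatch on the classified touch type. The bindings below are hypothetical; the patent only requires that different touch types can trigger different actions:

    ```python
    ACTIONS_BY_TOUCH_TYPE = {
        "pad":     lambda: print("select"),        # illustrative bindings
        "nail":    lambda: print("context menu"),
        "knuckle": lambda: print("scroll / lasso"),
    }

    def on_touch(touch_type):
        action = ACTIONS_BY_TOUCH_TYPE.get(touch_type)
        if action is not None:
            action()
    ```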
  • Patent number: 10969937
    Abstract: Systems and methods are provided for controlling the position of an interactive movable menu in a mixed-reality environment. In some instances, a mixed-reality display device presents a mixed-reality environment to a user. The mixed-reality device then detects a first gesture associated with a user controller while presenting the mixed-reality environment and, in response to the first gesture, triggers a display of an interactive movable menu within the mixed-reality environment as a tethered hologram that is dynamically moved within the mixed-reality environment relative to and corresponding with movement of the user controller within the mixed-reality environment. Then, in response to a second detected gesture, the mixed-reality device selectively locks a display of the interactive movable menu at a fixed position that is not tethered to the user controller.
    Type: Grant
    Filed: March 11, 2019
    Date of Patent: April 6, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Julia Schwarz, Casey Leon Meekhof, Alon Farchy, Sheng Kai Tang, Nicholas F. Kamuda
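    The tether-then-lock behavior of patent 10969937 fits a small state object: follow the controller while tethered, stop following once the lock gesture arrives. The gesture names and the offset are assumptions:

    ```python
    class MovableMenu:
        def __init__(self, offset=(0.0, 0.1, 0.2)):   # hypothetical hand-relative pose
            self.offset = offset
            self.locked = False
            self.position = None

        def on_gesture(self, gesture):
            if gesture == "open":      # first gesture: display tethered menu
                self.locked = False
            elif gesture == "lock":    # second gesture: pin at current pose
                self.locked = True

        def update(self, controller_pos):
            if not self.locked:        # tethered: track the controller each frame
                self.position = tuple(c + o for c, o in
                                      zip(controller_pos, self.offset))
    ```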
  • Patent number: 10949029
    Abstract: A system for classifying touch events of different interaction layers includes a touch screen configured to display an interactive element, one or more vibro-acoustic sensors coupled to the touch screen, a touch event detector configured to monitor the one or more vibro-acoustic sensors and to save vibro-acoustic signals sensed by the one or more vibro-acoustic sensors, wherein the touch event detector is further configured to detect touch events in which the interactive element is touched by a first or a second finger part of a user, and wherein the touch events result in generating the vibro-acoustic signals, and a vibro-acoustic classifier configured to classify the vibro-acoustic signals and activate corresponding functions in the different layers dependent upon which finger part is used.
    Type: Grant
    Filed: January 15, 2017
    Date of Patent: March 16, 2021
    Assignee: QEEXO, CO.
    Inventors: Chris Harrison, Julia Schwarz, Leandro Damian Zungri
  • Publication number: 20210041971
    Abstract: A system for classifying touch events of different interaction layers includes a touch screen configured to display an interactive element, one or more vibro-acoustic sensors coupled to the touch screen, a touch event detector configured to monitor the one or more vibro-acoustic sensors and to save vibro-acoustic signals sensed by the one or more vibro-acoustic sensors, wherein the touch event detector is further configured to detect touch events in which the interactive element is touched by a first or a second finger part of a user, and wherein the touch events result in generating the vibro-acoustic signals, and a vibro-acoustic classifier configured to classify the vibro-acoustic signals and activate corresponding functions in the different layers dependent upon which finger part is used.
    Type: Application
    Filed: January 15, 2017
    Publication date: February 11, 2021
    Inventors: Chris HARRISON, Julia SCHWARZ, Leandro Damian ZUNGRI
  • Patent number: 10901495
    Abstract: Systems and methods are provided for detecting user input in a mixed-reality environment being rendered with one or more holograms. An input receiver that includes a plurality of input elements is presented. An input controller that includes the individual actuators is identified, wherein each corresponding actuator of the individual actuators is configured to, when interacting with one or more input elements of the input receiver that are mapped to the corresponding actuator and when the input state of the corresponding actuator is an active state, provide user input within the mixed-reality environment. Subsequently, the presence of a triggering attribute of the input controller is detected, and in response to detecting the triggering attribute, the input state of a corresponding actuator is changed from an inactive state to an active state for providing user input.
    Type: Grant
    Filed: March 2, 2019
    Date of Patent: January 26, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Julia Schwarz
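    Patent 10901495's actuator gating can be read as: an actuator produces input only for its mapped elements, and only after a triggering attribute flips it active. A minimal sketch with assumed names:

    ```python
    class Actuator:
        """One controller actuator (e.g. a fingertip) mapped to input
        elements (e.g. keys); inert until made active."""
        def __init__(self, mapped_elements):
            self.mapped_elements = set(mapped_elements)
            self.active = False

        def interact(self, element):
            if self.active and element in self.mapped_elements:
                return f"input: {element}"   # produces user input
            return None                      # ignored while inactive or unmapped

    def on_triggering_attribute(actuator):
        """E.g. a detected pose or proximity attribute activates the actuator."""
        actuator.active = True
    ```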
  • Publication number: 20210011621
    Abstract: The discussion relates to virtual keyboard engagement. One example can define key volumes relating to keys of a virtual keyboard and detect finger movement of a user through individual key volumes. The example can detect parameter changes associated with detected finger movement through individual key volumes and build potential key sequences from detected parameter changes.
    Type: Application
    Filed: July 11, 2019
    Publication date: January 14, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Christopher M. BECKER, Nazeeh A. ELDIRGHAMI, Kevin W. BARNES, Julia SCHWARZ, Eric CARTER
  • Patent number: 10890967
    Abstract: A method for improving user interaction with a virtual environment includes presenting the virtual environment to a user on a display, measuring a gaze location of a user's gaze relative to the virtual environment, casting an input ray from an input device, measuring an input ray location at a distal point of the input ray, and snapping a presented ray location to the gaze location when the input ray location is within a snap threshold distance of the gaze location.
    Type: Grant
    Filed: July 9, 2018
    Date of Patent: January 12, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sophie Stellmach, Sheng Kai Tang, Casey Leon Meekhof, Julia Schwarz, Nahil Tawfik Sharkasi, Thomas Matthew Gable
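    The snapping rule in patent 10890967 is a one-line distance test. A sketch, with the 10 cm snap threshold as a stand-in value:

    ```python
    import math

    def presented_ray_endpoint(input_ray_end, gaze_point, snap_threshold=0.1):
        """Show the ray at the gaze location whenever the input ray's
        distal point falls within the snap threshold of it; otherwise
        show the ray where the input device actually points."""
        if math.dist(input_ray_end, gaze_point) <= snap_threshold:
            return gaze_point
        return input_ray_end
    ```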
  • Patent number: 10831265
    Abstract: A method for improving user interaction with a virtual environment includes measuring a first position of the user's gaze relative to a virtual element, selecting the virtual element in the virtual environment at an origin when the user's gaze overlaps the virtual element, measuring a second position of a user's gaze relative to the virtual element, presenting a visual placeholder at a second position of the user's gaze when the second position of the user's gaze is beyond a threshold distance from the origin, and moving the visual placeholder relative to a destination using a secondary input device.
    Type: Grant
    Filed: April 20, 2018
    Date of Patent: November 10, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sophie Stellmach, Casey Leon Meekhof, Julia Schwarz
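    Patent 10831265 combines coarse gaze selection with fine adjustment from a secondary device. A hedged sketch of that hand-off, with an assumed drift threshold and 3D positions:

    ```python
    import math

    class GazePlaceholder:
        def __init__(self, threshold=0.15):   # hypothetical drift threshold (meters)
            self.threshold = threshold
            self.origin = None                # where gaze selected the element
            self.placeholder = None           # visual placeholder position

        def on_gaze(self, gaze_pos, element_hit):
            if self.origin is None and element_hit:
                self.origin = gaze_pos        # select the element at the origin
            elif self.origin is not None and \
                    math.dist(gaze_pos, self.origin) > self.threshold:
                self.placeholder = gaze_pos   # present the visual placeholder

        def on_secondary_input(self, delta):
            if self.placeholder is not None:  # nudge toward the destination
                self.placeholder = tuple(p + d for p, d in
                                         zip(self.placeholder, delta))
    ```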