Patents by Inventor Julia Schwarz

Julia Schwarz has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20180107333
    Abstract: An apparatus classifies touch events. The apparatus includes a touch sensitive surface configured to generate a touch event when an object touches the touch sensitive surface. The touch event entails a mechanical vibration upon contact with the surface. The apparatus includes a touch event detector configured to detect the onset of a touch, and a touch event classifier configured to classify the touch event to identify the object used for the touch event. The mechanical vibration is created by any one of several finger parts, including the tip, the pad, a fingernail, or a knuckle, each of which produces a signal with features that distinguish it from the others.
    Type: Application
    Filed: December 8, 2017
    Publication date: April 19, 2018
    Inventors: Christopher Harrison, Julia Schwarz, Scott E. Hudson
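The pipeline this abstract describes (sense the mechanical vibration of a touch, extract features, identify the finger part that made contact) can be sketched in a few lines. This is a hypothetical illustration rather than the patented implementation; the two features (signal energy and zero-crossing rate), the class centroids, and the nearest-centroid rule are all simplifying assumptions.

```python
import math

def extract_features(waveform):
    """Return (energy, zero_crossing_rate) for a sampled vibration signal."""
    energy = sum(s * s for s in waveform) / len(waveform)
    crossings = sum(
        1 for a, b in zip(waveform, waveform[1:]) if (a < 0) != (b < 0)
    )
    return energy, crossings / (len(waveform) - 1)

# Illustrative class centroids a trained model might hold: knuckle taps tend
# to be high-energy with little oscillation; pad touches are softer and noisier.
CENTROIDS = {"knuckle": (0.9, 0.1), "pad": (0.2, 0.4)}

def classify(waveform):
    """Assign the touch to whichever finger-part centroid is nearest."""
    e, z = extract_features(waveform)
    return min(CENTROIDS, key=lambda c: math.dist(CENTROIDS[c], (e, z)))
```

A production classifier would use richer spectral features and a trained model; the sketch only shows the detect, extract, classify structure the abstract outlines.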
  • Publication number: 20180095595
    Abstract: A system for classifying touch events includes a touch screen configured to display an interactive element, one or more acoustic sensors coupled to the touch screen, a touch event detector configured to monitor the one or more acoustic sensors and to save acoustic signals sensed by the one or more acoustic sensors, wherein the touch event detector is further configured to detect touch events in which the interactive element is touched by a first or a second finger part of a user, and wherein the touch events result in generating the acoustic signals, and an acoustic classifier configured to classify the acoustic signals.
    Type: Application
    Filed: December 5, 2017
    Publication date: April 5, 2018
    Inventors: Christopher Harrison, Julia Schwarz, Robert Xiao
  • Publication number: 20180046245
    Abstract: In various embodiments, computerized methods and systems for mediating interaction methodologies with virtual objects rendered in an immersive environment are provided. An intended target is identified from one or more virtual objects rendered in an at least partially-virtual environment. A relative proximity of the intended target to the user, or an extension of the user, is determined. An interaction methodology is selected for interaction with the intended target based on the determined relative proximity to the intended target, among other things. An indication of the selected interaction methodology is then provided to the user.
    Type: Application
    Filed: August 11, 2016
    Publication date: February 15, 2018
    Inventors: Julia Schwarz, James Tichenor, Yasaman Sheri, David J. Calabrese, Bharat Ahluwalia, Robert Pengelly
  • Patent number: 9864453
    Abstract: A classification system treats edge contact with a touch screen as a separate class of touch events, so that touches occurring near the edge of the screen are processed by a classifier configured for edge contacts rather than by the classifier that handles fully digitized contacts near the middle of the screen. An apparatus may employ two separate and distinct classifiers: a full touch classifier and an edge touch classifier. The touch screen may be configured with two different sensing regions to determine which of the two classifiers is appropriate for a touch event.
    Type: Grant
    Filed: September 22, 2014
    Date of Patent: January 9, 2018
    Assignee: Qeexo, Co.
    Inventors: Taihei Munemoto, Julia Schwarz, Chris Harrison
  • Patent number: 9864454
    Abstract: A system for classifying touch events includes a touch screen configured to display an interactive element, one or more vibro-acoustic sensors coupled to the touch screen, a touch event detector configured to monitor the one or more vibro-acoustic sensors and to save vibro-acoustic signals sensed by the one or more vibro-acoustic sensors, wherein the touch event detector is further configured to detect touch events in which the interactive element is touched by a first or a second finger part of a user, and wherein the touch events result in generating the vibro-acoustic signals, and a vibro-acoustic classifier configured to classify the vibro-acoustic signals.
    Type: Grant
    Filed: February 2, 2015
    Date of Patent: January 9, 2018
    Assignee: Qeexo, Co.
    Inventors: Christopher Harrison, Julia Schwarz, Robert Xiao
  • Publication number: 20180004319
    Abstract: A system for classifying touch events of different interaction layers includes a touch screen configured to display an interactive element, one or more vibro-acoustic sensors coupled to the touch screen, a touch event detector configured to monitor the one or more vibro-acoustic sensors and to save vibro-acoustic signals sensed by the one or more vibro-acoustic sensors, wherein the touch event detector is further configured to detect touch events in which the interactive element is touched by a first or a second finger part of a user, and wherein the touch events result in generating the vibro-acoustic signals, and a vibro-acoustic classifier configured to classify the vibro-acoustic signals and activate corresponding functions in the different layers depending on which finger part is used.
    Type: Application
    Filed: January 15, 2017
    Publication date: January 4, 2018
    Inventors: Chris Harrison, Julia Schwarz, Leandro Damian Zungri
  • Patent number: 9851841
    Abstract: An apparatus classifies touch events. The apparatus includes a touch sensitive surface configured to generate a touch event when an object touches the touch sensitive surface. The touch event entails a mechanical vibration upon contact with the surface. The apparatus includes a touch event detector configured to detect the onset of a touch, and a touch event classifier configured to classify the touch event to identify the object used for the touch event. The mechanical vibration is created by any one of several finger parts, including the tip, the pad, a fingernail, or a knuckle, each of which produces a signal with features that distinguish it from the others.
    Type: Grant
    Filed: July 11, 2016
    Date of Patent: December 26, 2017
    Assignee: Carnegie Mellon University
    Inventors: Christopher Harrison, Julia Schwarz, Scott E. Hudson
  • Publication number: 20170358144
    Abstract: Altering properties of rendered objects and/or mixed reality environments utilizing control points associated with the rendered objects and/or mixed reality environments is described. Techniques described can include detecting a gesture performed by or in association with a control object. Based at least in part on detecting the gesture, techniques described can identify a target control point that is associated with a rendered object and/or a mixed reality environment. As the control object moves within the mixed reality environment, the target control point can track the movement of the control object. Based at least in part on the movement of the control object, a property of the rendered object and/or the mixed reality environment can be altered. A rendering of the rendered object and/or the mixed reality environment can be modified to reflect any alterations to the property.
    Type: Application
    Filed: June 13, 2016
    Publication date: December 14, 2017
    Inventors: Julia Schwarz, Bharat Ahluwalia, David Calabrese, Robert CJ Pengelly, Yasaman Sheri, James Tichenor
  • Patent number: 9785228
    Abstract: A natural user interface (NUI) system to provide user input to a computer system. The NUI system includes a logic machine and an instruction-storage machine. The instruction-storage machine holds instructions that, when executed by the logic machine, cause the logic machine to detect an engagement gesture from a human subject or to compute an engagement metric reflecting the degree of the subject's engagement. The instructions also cause the logic machine to direct gesture-based user input from the subject to the computer system as soon as the engagement gesture is detected or the engagement metric exceeds a threshold.
    Type: Grant
    Filed: February 11, 2013
    Date of Patent: October 10, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Eduardo Escardo Raffo, Oscar Murillo, David Bastien, Matthew H. Ahn, Mauro Giusti, Kevin Endres, Christian Klein, Julia Schwarz, Charles Claudius Marais
  • Patent number: 9778783
    Abstract: A method and apparatus for determining pitch and yaw of an elongated interface object as it interacts with a touchscreen surface. A touch image is received, and this touch image has at least a first area that corresponds to an area of the touchscreen that has an elongated interface object positioned at least proximate to it. The elongated interface object has a pitch and a yaw with respect to the touchscreen surface. A first transformation is performed to obtain a first transformation image of the touch image, and a second transformation is performed to obtain a second transformation image of the touch image. The first transformation differs from the second transformation. The yaw is determined for the elongated interface object based on both the first and second transformation images. The pitch is determined based on at least one of the first and second transformation images.
    Type: Grant
    Filed: September 30, 2015
    Date of Patent: October 3, 2017
    Assignee: Qeexo, Co.
    Inventors: Christopher Harrison, Julia Schwarz, Robert Bo Xiao
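The two-transformation idea in this abstract can be illustrated with a toy model. A plausible (but assumed, not taken from the patent) pair of transformations is thresholding the touch image at a low and a high intensity level: the vector between the centroids of the two resulting blobs gives yaw, and the centroid separation gives a rough pitch estimate, since a finger held flatter smears contact along its length.

```python
import math

def centroid(image, threshold):
    """Centroid of all pixels at or above threshold ("one transformation")."""
    pts = [
        (x, y)
        for y, row in enumerate(image)
        for x, v in enumerate(row)
        if v >= threshold
    ]
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

def pitch_and_yaw(image, low=0.2, high=0.8):
    """Estimate (pitch_degrees, yaw_degrees) from a 2D touch-intensity grid."""
    (x1, y1), (x2, y2) = centroid(image, low), centroid(image, high)
    yaw = math.degrees(math.atan2(y2 - y1, x2 - x1))  # direction tip points
    spread = math.hypot(x2 - x1, y2 - y1)             # flat finger -> large
    pitch = max(0.0, 90.0 - 30.0 * spread)            # toy mapping to degrees
    return pitch, yaw
```

The low/high thresholds and the linear spread-to-pitch mapping are placeholders; the point is only that yaw needs both transformation images while pitch can be read from their relationship, matching the structure the abstract describes.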
  • Patent number: 9612689
    Abstract: A system for classifying touch events of different interaction layers includes a touch screen configured to display an interactive element, one or more vibro-acoustic sensors coupled to the touch screen, a touch event detector configured to monitor the one or more vibro-acoustic sensors and to save vibro-acoustic signals sensed by the one or more vibro-acoustic sensors, wherein the touch event detector is further configured to detect touch events in which the interactive element is touched by a first or a second finger part of a user, and wherein the touch events result in generating the vibro-acoustic signals, and a vibro-acoustic classifier configured to classify the vibro-acoustic signals and activate corresponding functions in the different layers depending on which finger part is used.
    Type: Grant
    Filed: June 26, 2015
    Date of Patent: April 4, 2017
    Assignee: Qeexo, Co.
    Inventors: Chris Harrison, Julia Schwarz, Leandro Damian Zungri
  • Publication number: 20170024073
    Abstract: The present invention is a palm rejection technique utilizing temporal features, iterative classification, and probabilistic voting. Touch events are classified based on features periodically extracted from time windows of increasing size, always centered at the birth of the event. The classification process uses a series of decision trees acting on said features.
    Type: Application
    Filed: April 14, 2015
    Publication date: January 26, 2017
    Inventors: Julia Schwarz, Christopher Harrison
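The voting scheme this abstract describes (features from time windows of increasing size centered at the event's birth, classified iteratively, with the results combined probabilistically) can be sketched as follows. The `stylus_score` stand-in for a trained decision tree, the window sizes, and the contact-radius feature are all hypothetical.

```python
def stylus_score(points):
    """Toy stand-in for a decision tree: small, stable contacts look like a
    stylus tip; large contacts look like a resting palm."""
    max_radius = max(r for _, r in points)
    return 1.0 if max_radius < 3.0 else 0.0

def classify_touch(event, window_sizes=(50, 100, 200)):
    """event: list of (timestamp_ms, contact_radius_mm) samples, with the
    touch born at t=0. Classify over windows [-w, +w] of increasing size,
    then combine the per-window votes into a final label."""
    votes = []
    for w in window_sizes:
        window = [(t, r) for t, r in event if -w <= t <= w]
        votes.append(stylus_score(window))
    mean = sum(votes) / len(votes)
    return "stylus" if mean >= 0.5 else "palm"
```

Re-running the classifier as the window grows lets an early, low-latency guess be corrected once more of the contact's history is visible, which is the motivation for the iterative design the abstract names.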
  • Publication number: 20170024892
    Abstract: Methods and apparatuses are provided for determining a pitch and yaw of an elongated interface object relative to a proximity sensitive surface. In one aspect, a proximity image is received having proximity image data from which it can be determined which areas of the proximity sensitive surface sensed the elongated interface object during a period of time. A proximity blob is identified in the proximity image and the proximity image is transformed using a plurality of different transformations to obtain a plurality of differently transformed proximity images. A plurality of features is determined for the identified blob in the transformed proximity images and the pitch of the elongated interface object relative to the proximity sensitive surface is determined based upon the determined features and a multi-dimensional heuristic regression model of the proximity sensitive surface; and a yaw is determined based upon the pitch.
    Type: Application
    Filed: June 30, 2016
    Publication date: January 26, 2017
    Inventors: Christopher Harrison, Julia Schwarz, Robert Bo Xiao
  • Publication number: 20170024055
    Abstract: Some embodiments of the present invention include a method of differentiating touch screen users based on features derived from the acoustics and mechanical impact of a touch event. The method includes detecting a touch event on a touch sensitive surface, generating a vibro-acoustic waveform signal using at least one sensor that detects the touch event, converting the waveform signal into at least a domain signal, extracting distinguishing features from the domain signal, and classifying those features to associate the domain signal with a particular user.
    Type: Application
    Filed: March 21, 2016
    Publication date: January 26, 2017
    Inventors: Julia Schwarz, Chris Harrison
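The pipeline in this abstract (detect a touch, capture its waveform, convert it to a domain signal, extract features, classify to a user) might look like the following sketch, where the frequency domain plays the role of the "domain signal". The DFT features, the enrolled profiles, and the nearest-neighbour match are assumptions for illustration, not the patented method.

```python
import cmath

def magnitude_spectrum(waveform):
    """Convert the time-domain waveform to a frequency-domain signal via a
    naive DFT, keeping normalized magnitudes for the lower half of the bins."""
    n = len(waveform)
    return [
        abs(sum(s * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, s in enumerate(waveform))) / n
        for k in range(n // 2)
    ]

# Illustrative enrolled profiles; a real system would learn these per user.
PROFILES = {
    "alice": [0.0, 0.5],
    "bob": [0.5, 0.0],
}

def identify_user(waveform):
    """Match the extracted spectral features to the closest stored profile."""
    spec = magnitude_spectrum(waveform)
    def dist(profile):
        return sum((a - b) ** 2 for a, b in zip(profile, spec))
    return min(PROFILES, key=lambda u: dist(PROFILES[u]))
```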
  • Publication number: 20160320905
    Abstract: An apparatus classifies touch events. The apparatus includes a touch sensitive surface configured to generate a touch event when an object touches the touch sensitive surface. The touch event entails a mechanical vibration upon contact with the surface. The apparatus includes a touch event detector configured to detect the onset of a touch, and a touch event classifier configured to classify the touch event to identify the object used for the touch event. The mechanical vibration is created by any one of several finger parts, including the tip, the pad, a fingernail, or a knuckle, each of which produces a signal with features that distinguish it from the others.
    Type: Application
    Filed: July 11, 2016
    Publication date: November 3, 2016
    Inventors: Christopher Harrison, Julia Schwarz, Scott E. Hudson
  • Publication number: 20160299615
    Abstract: Touch sensitive devices, methods and computer readable recording mediums are provided that allow for improved classification of objects against a touch sensitive surface of a touch sensitive device based upon analysis of subdivisions of data representing contact with the touch sensitive surface during a period of time.
    Type: Application
    Filed: April 12, 2015
    Publication date: October 13, 2016
    Inventors: Julia Schwarz, Robert Bo Xiao, Chris Harrison
  • Patent number: 9465494
    Abstract: An apparatus classifies touch events. The apparatus includes a touch sensitive surface configured to generate a touch event when an object touches the touch sensitive surface. The touch event entails a mechanical vibration upon contact with the surface. The apparatus includes a touch event detector configured to detect the onset of a touch, and a touch event classifier configured to classify the touch event to identify the object used for the touch event. The mechanical vibration is created by any one of several finger parts, including the tip, the pad, a fingernail, or a knuckle, each of which produces a signal with features that distinguish it from the others.
    Type: Grant
    Filed: April 1, 2014
    Date of Patent: October 11, 2016
    Assignee: Carnegie Mellon University
    Inventors: Christopher Harrison, Julia Schwarz, Scott E. Hudson
  • Publication number: 20160231865
    Abstract: A method and apparatus for determining pitch and yaw of an elongated interface object as it interacts with a touchscreen surface. A touch image is received, and this touch image has at least a first area that corresponds to an area of the touchscreen that has an elongated interface object positioned at least proximate to it. The elongated interface object has a pitch and a yaw with respect to the touchscreen surface. A first transformation is performed to obtain a first transformation image of the touch image, and a second transformation is performed to obtain a second transformation image of the touch image. The first transformation differs from the second transformation. The yaw is determined for the elongated interface object based on both the first and second transformation images. The pitch is determined based on at least one of the first and second transformation images.
    Type: Application
    Filed: September 30, 2015
    Publication date: August 11, 2016
    Inventors: Christopher Harrison, Julia Schwarz, Robert Bo Xiao
  • Publication number: 20160224145
    Abstract: A system for classifying touch events of different interaction layers includes a touch screen configured to display an interactive element, one or more vibro-acoustic sensors coupled to the touch screen, a touch event detector configured to monitor the one or more vibro-acoustic sensors and to save vibro-acoustic signals sensed by the one or more vibro-acoustic sensors, wherein the touch event detector is further configured to detect touch events in which the interactive element is touched by a first or a second finger part of a user, and wherein the touch events result in generating the vibro-acoustic signals, and a vibro-acoustic classifier configured to classify the vibro-acoustic signals and activate corresponding functions in the different layers depending on which finger part is used.
    Type: Application
    Filed: June 26, 2015
    Publication date: August 4, 2016
    Inventors: Chris Harrison, Julia Schwarz, Leandro Damian Zungri
  • Publication number: 20160162552
    Abstract: Described herein are various technologies pertaining to presenting search results to a user, wherein the search results are messages generated by way of social networking applications. An interactive graphical object is presented together with retrieved messages, and messages are filtered responsive to interactions with the interactive graphical object. Additionally, a graphical object that is indicative of credibility of a message is presented together with the message.
    Type: Application
    Filed: February 12, 2016
    Publication date: June 9, 2016
    Inventors: Meredith June Morris, Scott Joseph Counts, Asta Jane Roseway, Julia Schwarz