Patents by Inventor Jacob O. Wobbrock

Jacob O. Wobbrock has filed for patents to protect the following inventions. This listing includes published patent applications as well as patents granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10540083
    Abstract: A system for classifying a user touch event by a user interacting with a device as an intended key is provided. For different hand postures (e.g., holding the device with the right hand and entering text with the right thumb), the system provides a touch pattern model indicating how the user interacts using that hand posture. The system receives an indication of a user touch event and identifies the hand posture of the user. The system then determines the intended key based on the user touch event and a touch pattern model for the identified hand posture. A system is also provided for determining the amount of pressure a user is applying to the device based on the dampening of vibrations as measured by an inertial sensor. A system is also provided that uses motion of the device, as measured by an inertial sensor, to improve the accuracy of text entry.
    Type: Grant
    Filed: December 11, 2013
    Date of Patent: January 21, 2020
    Assignee: University of Washington
    Inventors: Mayank Goel, Jacob O. Wobbrock, Shwetak N. Patel, Leah Findlater
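
The classification step described in the abstract above can be illustrated with a short sketch. The example below is minimal and hypothetical, assuming a per-posture touch pattern model in which each key is a 2-D Gaussian over touch locations; the posture names, keys, and parameter values are invented for illustration, not taken from the patent.

```python
# Hypothetical sketch of posture-conditioned key classification: given the
# identified hand posture, pick the key whose touch model best explains the
# observed touch point. All models and values below are illustrative.
import math

# Touch pattern model: for each hand posture, each key maps to the mean touch
# point and an isotropic variance observed for that posture.
TOUCH_MODELS = {
    "right_thumb": {
        "q": ((12.0, 8.0), 9.0),   # ((mean_x, mean_y), variance)
        "w": ((24.0, 8.0), 9.0),
        "e": ((36.0, 8.0), 9.0),
    },
    "index_finger": {
        "q": ((10.0, 6.0), 4.0),
        "w": ((22.0, 6.0), 4.0),
        "e": ((34.0, 6.0), 4.0),
    },
}

def log_likelihood(touch, key_model):
    """Log-likelihood of a touch point under a key's 2-D Gaussian model."""
    (mx, my), var = key_model
    dx, dy = touch[0] - mx, touch[1] - my
    return -((dx * dx + dy * dy) / (2.0 * var)) - math.log(2.0 * math.pi * var)

def classify_touch(touch, posture):
    """Return the key most likely intended under the identified posture."""
    model = TOUCH_MODELS[posture]
    return max(model, key=lambda key: log_likelihood(touch, model[key]))

print(classify_touch((13.5, 9.0), "right_thumb"))  # -> 'q'
```

The point of conditioning on posture is that the same touch coordinates can decode to different keys under different grips, because each posture shifts the per-key touch distributions differently.
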
  • Publication number: 20180046704
    Abstract: Described herein are systems and methods for selection-based contextual help retrieval. One example method involves (a) receiving first-query data including contextual data, the contextual data indicating a user-interface element type, a user-interface element location, and user-interface element text; (b) determining at least one first-query response based on at least the contextual data; and (c) causing an indication of the determined at least one first-query response to be provided via an output device.
    Type: Application
    Filed: October 2, 2017
    Publication date: February 15, 2018
    Inventors: Parmit K. Chilana, Andrew J. Ko, Jacob O. Wobbrock
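
A minimal sketch of the three claimed steps follows, assuming a small in-memory help index and a simple exact-match scoring rule; the field names, index entries, and weights are illustrative assumptions, not the patented retrieval method.

```python
# Hypothetical sketch of selection-based contextual help retrieval: (a) receive
# first-query data with contextual data, (b) determine a first-query response,
# (c) provide an indication of the response. The index and scoring are toys.
from dataclasses import dataclass

@dataclass
class ContextualData:
    element_type: str        # e.g., "button"
    location: tuple          # (x, y) of the selected UI element
    text: str                # label text of the element

HELP_INDEX = [
    {"element_type": "button", "text": "Save",
     "answer": "Saves the current document."},
    {"element_type": "button", "text": "Export",
     "answer": "Writes the document out in another format."},
    {"element_type": "menu", "text": "View",
     "answer": "Controls how the document is displayed."},
]

def determine_responses(query, limit=1):
    """Step (b): rank indexed help entries against the contextual data."""
    def score(entry):
        s = 2 if entry["element_type"] == query.element_type else 0
        s += 3 if entry["text"].lower() == query.text.lower() else 0
        return s
    ranked = sorted(HELP_INDEX, key=score, reverse=True)
    return [entry["answer"] for entry in ranked[:limit]]

# Steps (a) and (c): receive the first-query data, then surface the response.
query = ContextualData("button", (120, 40), "Save")
for answer in determine_responses(query):
    print(answer)  # -> "Saves the current document."
```
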
  • Patent number: 9811583
    Abstract: Described herein are systems and methods for selection-based contextual help retrieval. One example method involves (a) receiving first-query data including contextual data, the contextual data indicating a user-interface element type, a user-interface element location, and user-interface element text; (b) determining at least one first-query response based on at least the contextual data; and (c) causing an indication of the determined at least one first-query response to be provided via an output device.
    Type: Grant
    Filed: June 18, 2012
    Date of Patent: November 7, 2017
    Assignee: University of Washington Through Its Center for Commercialization
    Inventors: Parmit K. Chilana, Andrew J. Ko, Jacob O. Wobbrock
  • Publication number: 20150317076
    Abstract: A system for classifying a user touch event by a user interacting with a device as an intended key is provided. For different hand postures (e.g., holding the device with the right hand and entering text with the right thumb), the system provides a touch pattern model indicating how the user interacts using that hand posture. The system receives an indication of a user touch event and identifies the hand posture of the user. The system then determines the intended key based on the user touch event and a touch pattern model for the identified hand posture. A system is also provided for determining the amount of pressure a user is applying to the device based on the dampening of vibrations as measured by an inertial sensor. A system is also provided that uses motion of the device, as measured by an inertial sensor, to improve the accuracy of text entry.
    Type: Application
    Filed: December 11, 2013
    Publication date: November 5, 2015
    Inventors: Mayank Goel, Jacob O. Wobbrock, Shwetak N. Patel, Leah Findlater
  • Publication number: 20140149432
    Abstract: Described herein are systems and methods for selection-based contextual help retrieval. One example method involves (a) receiving first-query data including contextual data, the contextual data indicating a user-interface element type, a user-interface element location, and user-interface element text; (b) determining at least one first-query response based on at least the contextual data; and (c) causing an indication of the determined at least one first-query response to be provided via an output device.
    Type: Application
    Filed: June 18, 2012
    Publication date: May 29, 2014
    Applicant: UNIVERSITY OF WASHINGTON THROUGH ITS CENTER FOR COMMERCIALIZATION
    Inventors: Parmit K. Chilana, Andrew J. Ko, Jacob O. Wobbrock
  • Patent number: 7729542
    Abstract: A new unistroke text entry method for handheld or wearable devices is designed to provide high accuracy and stability of motion. The user makes characters by traversing the edges and diagonals of a geometric pattern, e.g., a square, imposed over the usual text input area. Gesture recognition is accomplished not through pattern recognition but through the sequence of corners that are hit. This means that the full stroke path is unimportant and the recognition is highly deterministic, enabling better accuracy than other gestural alphabets. This input technique works well using a template with a square hole placed over a touch-sensitive surface, such as on a Personal Digital Assistant (PDA), and with a square boundary surrounding a joystick, which might be used on a cell phone or game controller. Another feature of the input technique is that capital letters are made by ending the stroke in a particular corner, rather than through a mode change as in other gestural input techniques.
    Type: Grant
    Filed: March 29, 2004
    Date of Patent: June 1, 2010
    Assignee: Carnegie Mellon University
    Inventors: Jacob O. Wobbrock, Brad A. Myers
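
Because recognition depends only on the ordered corners a stroke hits, it can be sketched in a few lines. The example below assumes a unit-square input area in screen coordinates and a fixed hit radius around each corner; the two corner-to-character mappings are invented placeholders, not entries from the actual alphabet.

```python
# Hypothetical sketch of corner-sequence recognition: reduce a stroke to the
# ordered corners it hits, then look that sequence up in an alphabet table.
# The alphabet entries and the hit radius below are illustrative assumptions.

CORNERS = {  # unit-square input area, screen coordinates (y grows downward)
    "NW": (0.0, 0.0), "NE": (1.0, 0.0), "SW": (0.0, 1.0), "SE": (1.0, 1.0),
}
HIT_RADIUS = 0.2  # a point this close to a corner counts as hitting it

ALPHABET = {  # corner sequence -> character (hypothetical entries)
    ("NW", "SW", "SE"): "l",
    ("NW", "SE", "NE"): "v",
}

def corner_sequence(stroke):
    """Reduce a stroke (list of (x, y) points) to the ordered corners it hits."""
    seq = []
    for x, y in stroke:
        for name, (cx, cy) in CORNERS.items():
            if (x - cx) ** 2 + (y - cy) ** 2 <= HIT_RADIUS ** 2:
                if not seq or seq[-1] != name:  # collapse repeated hits
                    seq.append(name)
    return tuple(seq)

def recognize(stroke):
    """Deterministic recognition: the path between corners is irrelevant."""
    return ALPHABET.get(corner_sequence(stroke))

# A wobbly down-then-right stroke still reads as 'l' because only the order
# of corner hits matters, not the exact path taken between them.
stroke = [(0.05, 0.10), (0.12, 0.50), (0.04, 0.95), (0.50, 0.93), (0.96, 0.97)]
print(recognize(stroke))  # -> 'l'
```

This is also why the physical template or joystick boundary described in the abstract helps: it guides the stroke into the corners, so the recognizer never needs to tolerate path variation.
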
  • Publication number: 20100031203
    Abstract: The claimed subject matter provides a system and/or a method that facilitates generating an intuitive set of gestures for employment with surface computing. A gesture set creator can prompt two or more users with a potential effect for a portion of displayed data. An interface component can receive at least one surface input from each user in response to the prompted potential effect. A surface detection component can track the surface input utilizing a computer vision-based sensing technique. The gesture set creator collects the surface inputs from the two or more users in order to identify a user-defined gesture based upon a correlation between the respective surface inputs, wherein the user-defined gesture is defined as an input that initiates the potential effect for the portion of displayed data.
    Type: Application
    Filed: June 24, 2009
    Publication date: February 4, 2010
    Applicant: Microsoft Corporation
    Inventors: Meredith J. Morris, Jacob O. Wobbrock, Andrew David Wilson
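
The agreement-based core of this method can be sketched as follows, assuming each user's surface input has already been reduced to a discrete gesture label (the computer-vision matching of raw inputs is assumed away); the effects and gesture names are illustrative.

```python
# Hypothetical sketch of building a user-defined gesture set: prompt users
# with a potential effect, collect the gesture each user proposes, and keep
# the gesture the largest share of users agreed on for each effect.
from collections import Counter, defaultdict

# (effect, proposed gesture) pairs collected from the prompted users.
proposals = [
    ("zoom in", "two-finger spread"), ("zoom in", "two-finger spread"),
    ("zoom in", "double tap"),
    ("delete", "drag off-screen"), ("delete", "scratch out"),
    ("delete", "drag off-screen"),
]

def user_defined_gesture_set(proposals):
    """For each effect, pick the gesture most users proposed for it."""
    by_effect = defaultdict(Counter)
    for effect, gesture in proposals:
        by_effect[effect][gesture] += 1
    return {effect: counts.most_common(1)[0][0]
            for effect, counts in by_effect.items()}

print(user_defined_gesture_set(proposals))
# -> {'zoom in': 'two-finger spread', 'delete': 'drag off-screen'}
```
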
  • Publication number: 20100031202
    Abstract: The claimed subject matter provides a system and/or a method that facilitates generating an intuitive set of gestures for employment with surface computing. A gesture set creator can prompt two or more users with a potential effect for a portion of displayed data. An interface component can receive at least one surface input from each user in response to the prompted potential effect. A surface detection component can track the surface input utilizing a computer vision-based sensing technique. The gesture set creator collects the surface inputs from the two or more users in order to identify a user-defined gesture based upon a correlation between the respective surface inputs, wherein the user-defined gesture is defined as an input that initiates the potential effect for the portion of displayed data.
    Type: Application
    Filed: August 4, 2008
    Publication date: February 4, 2010
    Applicant: Microsoft Corporation
    Inventors: Meredith J. Morris, Jacob O. Wobbrock, Andrew David Wilson
  • Patent number: 7519223
    Abstract: An interactive display table has a display surface for displaying images and upon or adjacent to which various objects, including a user's hand(s) and finger(s), can be detected. A video camera within the interactive display table responds to infrared (IR) light reflected from the objects to detect any connected components. Connected components correspond to portions of the object(s) that are either in contact with, or proximate to, the display surface. Using these connected components, the interactive display table senses and infers natural hand or finger positions, or movement of an object, to detect gestures. Specific gestures are used to execute applications, carry out functions in an application, create a virtual object, or perform other interactions, each of which is associated with a different gesture. A gesture can be a static pose or a more complex configuration, and/or a movement made with one or both hands or other objects.
    Type: Grant
    Filed: June 28, 2004
    Date of Patent: April 14, 2009
    Assignee: Microsoft Corporation
    Inventors: Joel P. Dehlin, Christina Summer Chen, Andrew D. Wilson, Daniel C. Robbins, Eric J. Horvitz, Kenneth P. Hinckley, Jacob O. Wobbrock
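
The connected-component step in this abstract amounts to region labeling on the camera image. Below is a minimal sketch using a flood fill over a thresholded IR frame; the threshold, the toy frame, and the blob-count "gesture evidence" at the end are illustrative assumptions, not the patented pipeline.

```python
# Hypothetical sketch: threshold the IR reflectance image, label 4-connected
# bright regions (connected components), and use simple properties of those
# regions, here just their count, as evidence for a gesture.
from collections import deque

def connected_components(image, threshold=128):
    """Label 4-connected bright regions; return a list of pixel-coordinate sets."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    components = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                comp, queue = set(), deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    comp.add((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                components.append(comp)
    return components

# Toy 5x6 IR frame: two bright blobs, as if two fingertips touch the surface.
frame = [
    [0,   0,   0,   0,   0,   0],
    [0, 200, 210,   0,   0,   0],
    [0, 190,   0,   0, 220,   0],
    [0,   0,   0,   0, 230, 215],
    [0,   0,   0,   0,   0,   0],
]
blobs = connected_components(frame)
print(len(blobs))  # -> 2: e.g., evidence for a two-finger gesture
```
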
  • Publication number: 20040196256
    Abstract: A new unistroke text entry method for handheld or wearable devices is designed to provide high accuracy and stability of motion. The user makes characters by traversing the edges and diagonals of a geometric pattern, e.g., a square, imposed over the usual text input area. Gesture recognition is accomplished not through pattern recognition but through the sequence of corners that are hit. This means that the full stroke path is unimportant and the recognition is highly deterministic, enabling better accuracy than other gestural alphabets. This input technique works well using a template with a square hole placed over a touch-sensitive surface, such as on a Personal Digital Assistant (PDA), and with a square boundary surrounding a joystick, which might be used on a cell phone or game controller. Another feature of the input technique is that capital letters are made by ending the stroke in a particular corner, rather than through a mode change as in other gestural input techniques.
    Type: Application
    Filed: March 29, 2004
    Publication date: October 7, 2004
    Inventors: Jacob O. Wobbrock, Brad A. Myers