Patents by Inventor Jacob O. Wobbrock
Jacob O. Wobbrock has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10540083
Abstract: A system for classifying a user touch event by a user interacting with a device as an intended key is provided. For different hand postures (e.g., holding the device with the right hand and entering text with the right thumb), the system provides a touch pattern model indicating how the user interacts using that hand posture. The system receives an indication of a user touch event and identifies the hand posture of the user. The system then determines the intended key based on the user touch event and a touch pattern model for the identified hand posture. A system is also provided for determining the amount of pressure a user is applying to the device based on the dampening of vibrations as measured by an inertial sensor. A system is further provided that uses motion of the device, as measured by an inertial sensor, to improve the accuracy of text entry.
Type: Grant
Filed: December 11, 2013
Date of Patent: January 21, 2020
Assignee: University of Washington
Inventors: Mayank Goel, Jacob O. Wobbrock, Shwetak N. Patel, Leah Findlater
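The posture-conditioned key classification this abstract describes can be sketched as follows. The hand postures, key layout, and per-key Gaussian touch model below are illustrative assumptions for the sketch, not the patented method:

```python
import math

# Hypothetical touch pattern models: for each hand posture, each key maps
# to the mean (x, y) of observed touches and a variance, capturing how
# touch points for that key shift under that posture.
MODELS = {
    "right_thumb": {
        "q": ((0.05, 0.10), 0.02),  # right-thumb touches drift slightly
        "w": ((0.15, 0.10), 0.02),
        "e": ((0.25, 0.10), 0.02),
    },
    "index_finger": {
        "q": ((0.07, 0.08), 0.01),
        "w": ((0.17, 0.08), 0.01),
        "e": ((0.27, 0.08), 0.01),
    },
}

def classify_touch(posture, touch_xy):
    """Return the most likely intended key for a touch event, using the
    touch pattern model for the identified hand posture."""
    model = MODELS[posture]
    best_key, best_score = None, float("-inf")
    for key, ((mx, my), var) in model.items():
        # Log-likelihood under an isotropic Gaussian centered on the
        # posture-specific mean touch point for this key.
        d2 = (touch_xy[0] - mx) ** 2 + (touch_xy[1] - my) ** 2
        score = -d2 / (2 * var) - math.log(2 * math.pi * var)
        if score > best_score:
            best_key, best_score = key, score
    return best_key
```

A touch at (0.14, 0.11) under the `right_thumb` posture would resolve to "w" here, even though the raw point might fall outside the nominal key bounds.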
-
Publication number: 20180046704
Abstract: Described herein are systems and methods for selection-based contextual help retrieval. One example method involves (a) receiving first-query data including contextual data, the contextual data indicating a user-interface element type, a user-interface element location, and user-interface element text; (b) determining at least one first-query response based on at least the contextual data; and (c) causing an indication of the determined at least one first-query response to be provided via an output device.
Type: Application
Filed: October 2, 2017
Publication date: February 15, 2018
Inventors: Parmit K. Chilana, Andrew J. Ko, Jacob O. Wobbrock
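Steps (a)–(c) of this method can be sketched as a lookup driven by the contextual data. The help index and its entries below are hypothetical, not part of the claimed system:

```python
# Hypothetical help index keyed by UI element type and element text; the
# contextual data captured from a user's selection drives retrieval.
HELP_INDEX = {
    ("button", "Save"): "Saves the current document to disk.",
    ("menu", "Export"): "Exports the document in another format.",
}

def first_query_response(contextual_data):
    """Determine a first-query response from contextual data (element
    type, location, and text), as in steps (a) and (b) of the method."""
    key = (contextual_data["element_type"], contextual_data["element_text"])
    response = HELP_INDEX.get(key)
    if response is None:
        # Fall back to matching on the element type alone.
        matches = [v for (etype, _), v in HELP_INDEX.items()
                   if etype == contextual_data["element_type"]]
        response = matches[0] if matches else "No help found."
    # Step (c) would cause this response to be shown via an output device.
    return response
```

A query built from selecting a "Save" button, for example, retrieves the matching help entry directly; an unindexed element falls back to type-level help.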
-
Patent number: 9811583
Abstract: Described herein are systems and methods for selection-based contextual help retrieval. One example method involves (a) receiving first-query data including contextual data, the contextual data indicating a user-interface element type, a user-interface element location, and user-interface element text; (b) determining at least one first-query response based on at least the contextual data; and (c) causing an indication of the determined at least one first-query response to be provided via an output device.
Type: Grant
Filed: June 18, 2012
Date of Patent: November 7, 2017
Assignee: University of Washington Through Its Center for Commercialization
Inventors: Parmit K. Chilana, Andrew J. Ko, Jacob O. Wobbrock
-
Publication number: 20150317076
Abstract: A system for classifying a user touch event by a user interacting with a device as an intended key is provided. For different hand postures (e.g., holding the device with the right hand and entering text with the right thumb), the system provides a touch pattern model indicating how the user interacts using that hand posture. The system receives an indication of a user touch event and identifies the hand posture of the user. The system then determines the intended key based on the user touch event and a touch pattern model for the identified hand posture. A system is also provided for determining the amount of pressure a user is applying to the device based on the dampening of vibrations as measured by an inertial sensor. A system is further provided that uses motion of the device, as measured by an inertial sensor, to improve the accuracy of text entry.
Type: Application
Filed: December 11, 2013
Publication date: November 5, 2015
Inventors: Mayank Goel, Jacob O. Wobbrock, Shwetak N. Patel, Leah Findlater
-
Publication number: 20140149432
Abstract: Described herein are systems and methods for selection-based contextual help retrieval. One example method involves (a) receiving first-query data including contextual data, the contextual data indicating a user-interface element type, a user-interface element location, and user-interface element text; (b) determining at least one first-query response based on at least the contextual data; and (c) causing an indication of the determined at least one first-query response to be provided via an output device.
Type: Application
Filed: June 18, 2012
Publication date: May 29, 2014
Applicant: University of Washington Through Its Center for Commercialization
Inventors: Parmit K. Chilana, Andrew J. Ko, Jacob O. Wobbrock
-
Patent number: 7729542
Abstract: A new unistroke text entry method for handheld or wearable devices is designed to provide high accuracy and stability of motion. The user makes characters by traversing the edges and diagonals of a geometric pattern, e.g., a square, imposed over the usual text input area. Gesture recognition is accomplished not through pattern recognition but through the sequence of corners that are hit. This means that the full stroke path is unimportant and the recognition is highly deterministic, enabling better accuracy than other gestural alphabets. This input technique works well using a template with a square hole placed over a touch-sensitive surface, such as on a Personal Digital Assistant (PDA), and with a square boundary surrounding a joystick, which might be used on a cell phone or game controller. Another feature of the input technique is that capital letters are made by ending the stroke in a particular corner, rather than through a mode change as in other gestural input techniques.
Type: Grant
Filed: March 29, 2004
Date of Patent: June 1, 2010
Assignee: Carnegie Mellon University
Inventors: Jacob O. Wobbrock, Brad A. Myers
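The corner-sequence recognition described here (the full stroke path does not matter, only the order of corners hit) can be sketched as below. The corner-to-letter table is an illustrative placeholder, not the actual patented alphabet:

```python
# A minimal sketch of corner-sequence recognition over a square input
# area: a character is recognized purely from the order in which the
# stroke visits the four corners, making recognition deterministic.
CORNERS = {"NW": (0, 0), "NE": (1, 0), "SW": (0, 1), "SE": (1, 1)}

# Hypothetical mapping from corner sequences to letters.
ALPHABET = {
    ("NW", "SW"): "i",
    ("NW", "NE", "SE", "SW"): "o",
}

def corner_sequence(points, radius=0.3):
    """Reduce a stroke (a list of (x, y) points in the unit square) to
    the sequence of corners it passes through."""
    seq = []
    for x, y in points:
        for name, (cx, cy) in CORNERS.items():
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                if not seq or seq[-1] != name:  # ignore repeated hits
                    seq.append(name)
                break
    return tuple(seq)

def recognize(points):
    """Look the corner sequence up in the alphabet; '?' if unknown."""
    return ALPHABET.get(corner_sequence(points), "?")
```

Because only corner order matters, a wobbly stroke down the left edge and a perfectly straight one both yield ("NW", "SW") and recognize identically, which is the source of the claimed accuracy and stability.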
-
Publication number: 20100031203
Abstract: The claimed subject matter provides a system and/or a method that facilitates generating an intuitive set of gestures for employment with surface computing. A gesture set creator can prompt two or more users with a potential effect for a portion of displayed data. An interface component can receive at least one surface input from the user in response to the prompted potential effect. A surface detection component can track the surface input utilizing a computer vision-based sensing technique. The gesture set creator collects the surface input from the two or more users in order to identify a user-defined gesture based upon a correlation between the respective surface inputs, wherein the user-defined gesture is defined as an input that initiates the potential effect for the portion of displayed data.
Type: Application
Filed: June 24, 2009
Publication date: February 4, 2010
Applicant: Microsoft Corporation
Inventors: Meredith J. Morris, Jacob O. Wobbrock, Andrew David Wilson
-
Publication number: 20100031202
Abstract: The claimed subject matter provides a system and/or a method that facilitates generating an intuitive set of gestures for employment with surface computing. A gesture set creator can prompt two or more users with a potential effect for a portion of displayed data. An interface component can receive at least one surface input from the user in response to the prompted potential effect. A surface detection component can track the surface input utilizing a computer vision-based sensing technique. The gesture set creator collects the surface input from the two or more users in order to identify a user-defined gesture based upon a correlation between the respective surface inputs, wherein the user-defined gesture is defined as an input that initiates the potential effect for the portion of displayed data.
Type: Application
Filed: August 4, 2008
Publication date: February 4, 2010
Applicant: Microsoft Corporation
Inventors: Meredith J. Morris, Jacob O. Wobbrock, Andrew David Wilson
-
Patent number: 7519223
Abstract: An interactive display table has a display surface for displaying images and upon or adjacent to which various objects, including a user's hand(s) and finger(s), can be detected. A video camera within the interactive display table responds to infrared (IR) light reflected from the objects to detect any connected components. Connected components correspond to portions of the object(s) that are either in contact with, or proximate to, the display surface. Using these connected components, the interactive display table senses and infers natural hand or finger positions, or movement of an object, to detect gestures. Specific gestures are used to execute applications, carry out functions in an application, create a virtual object, or perform other interactions, each of which is associated with a different gesture. A gesture can be a static pose, or a more complex configuration, and/or movement made with one or both hands or other objects.
Type: Grant
Filed: June 28, 2004
Date of Patent: April 14, 2009
Assignee: Microsoft Corporation
Inventors: Joel P. Dehlin, Christina Summer Chen, Andrew D. Wilson, Daniel C. Robbins, Eric J. Horvitz, Kenneth P. Hinckley, Jacob O. Wobbrock
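The connected-component step that underlies this detection pipeline can be sketched as standard labeling over a thresholded IR reflectance image. The 4-connectivity, the tiny test frame, and the flood-fill approach are assumptions for the sketch, not the patented implementation:

```python
def connected_components(grid):
    """Label 4-connected components in a binary IR reflectance image
    (1 = reflected IR above threshold). Returns a list of components,
    each a set of (row, col) pixels."""
    seen, comps = set(), []
    rows, cols = len(grid), len(grid[0])
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and (r, c) not in seen:
                stack, comp = [(r, c)], set()
                while stack:  # iterative flood fill from this seed pixel
                    y, x = stack.pop()
                    if (y, x) in seen:
                        continue
                    if not (0 <= y < rows and 0 <= x < cols) or not grid[y][x]:
                        continue
                    seen.add((y, x))
                    comp.add((y, x))
                    stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
                comps.append(comp)
    return comps

# Fingertips touching or hovering near the surface reflect IR back to
# the camera as bright blobs; each blob becomes one connected component.
frame = [
    [1, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 0, 0, 1],
]
```

Running this on `frame` yields two components (one three-pixel blob at the top left, one two-pixel blob at the right), which a gesture recognizer could then track over time.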
-
Publication number: 20040196256
Abstract: A new unistroke text entry method for handheld or wearable devices is designed to provide high accuracy and stability of motion. The user makes characters by traversing the edges and diagonals of a geometric pattern, e.g., a square, imposed over the usual text input area. Gesture recognition is accomplished not through pattern recognition but through the sequence of corners that are hit. This means that the full stroke path is unimportant and the recognition is highly deterministic, enabling better accuracy than other gestural alphabets. This input technique works well using a template with a square hole placed over a touch-sensitive surface, such as on a Personal Digital Assistant (PDA), and with a square boundary surrounding a joystick, which might be used on a cell phone or game controller. Another feature of the input technique is that capital letters are made by ending the stroke in a particular corner, rather than through a mode change as in other gestural input techniques.
Type: Application
Filed: March 29, 2004
Publication date: October 7, 2004
Inventors: Jacob O. Wobbrock, Brad A. Myers