Patents by Inventor Koji Yatani
Koji Yatani has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 8239785
Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the number of gestures that are made available to initiate operations of a computing device.
Type: Grant
Filed: January 27, 2010
Date of Patent: August 7, 2012
Assignee: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani, Georg F. Petschnigg
-
Patent number: 8122384
Abstract: One embodiment of the present invention provides a system that facilitates invoking a command. During operation, the system uses a graphic element to suggest a gesture that can be used to invoke a command. The system then receives the gesture from a user at a device. Note that the gesture is received via an input mechanism and is a predetermined manipulation of that mechanism. The system then determines a graphic element within the user interface that is associated with the gesture. Finally, upon determining the graphic element associated with the gesture, the system invokes the command associated with that element.
Type: Grant
Filed: September 18, 2007
Date of Patent: February 21, 2012
Assignee: Palo Alto Research Center Incorporated
Inventors: Kurt E. Partridge, Koji Yatani, Mark W. Newman, David Goldberg
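The gesture-to-command flow this abstract describes can be sketched as follows. This is a hypothetical illustration only, not the patented implementation; all gesture names, element names, and mappings here are invented for the example.

```python
# Hypothetical sketch: a received gesture is matched to a graphic element
# in the user interface, and the command associated with that element is
# then invoked. All mappings below are invented for illustration.

gesture_to_element = {
    "tap-top-right": "close_button",   # assumed gesture-to-element bindings
    "circle": "document_body",
}

element_to_command = {
    "close_button": lambda: "window closed",
    "document_body": lambda: "text selected",
}

def invoke_gesture(gesture):
    element = gesture_to_element.get(gesture)
    if element is None:
        return None                      # gesture matches no graphic element
    return element_to_command[element]() # invoke the element's command
```

For example, `invoke_gesture("circle")` resolves the gesture to the `document_body` element and runs that element's command, while an unbound gesture returns `None`.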
-
Publication number: 20110209058
Abstract: Embodiments of a multi-screen hold and tap gesture are described. In various embodiments, a hold input is recognized at a first screen of a multi-screen system, and the hold input is recognized when held to select a displayed object on the first screen. A tap input is recognized at a second screen of the multi-screen system, and the tap input is recognized while the displayed object continues being selected. A hold and tap gesture can then be determined from the recognized hold and tap inputs.
Type: Application
Filed: February 25, 2010
Publication date: August 25, 2011
Applicant: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani
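The two-screen hold-and-tap determination described in this abstract could be sketched roughly as below. The event model and field names are assumptions made for the example, not the patented implementation.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: a hold on one screen selects a displayed object;
# a tap on the other screen, while the selection is held, completes a
# hold-and-tap gesture. Event types and fields are invented for illustration.

@dataclass
class TouchEvent:
    screen: int                   # which screen of the multi-screen system
    kind: str                     # "hold", "tap", or "release"
    target: Optional[str] = None  # displayed object under the contact, if any

class HoldAndTapRecognizer:
    def __init__(self):
        self.active_hold = None   # the hold currently selecting an object

    def feed(self, event):
        if event.kind == "hold" and event.target is not None:
            self.active_hold = event      # hold selects the displayed object
            return None
        if event.kind == "release":
            self.active_hold = None       # selection ends when the hold lifts
            return None
        if (event.kind == "tap" and self.active_hold is not None
                and event.screen != self.active_hold.screen):
            # tap arrived on the other screen while the object stayed selected
            return ("hold-and-tap", self.active_hold.target)
        return None
```

Feeding a hold on screen 1 and then a tap on screen 2 yields the gesture with the selected object; a tap with no active hold yields nothing.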
-
Publication number: 20110209103
Abstract: Embodiments of a multi-screen hold and drag gesture are described. In various embodiments, a hold input is recognized at a first screen of a multi-screen system when the hold input is held in place. A motion input is recognized at a second screen of the multi-screen system, and the motion input is recognized to select a displayed object while the hold input remains held in place. A hold and drag gesture can then be determined from the recognized hold and motion inputs.
Type: Application
Filed: February 25, 2010
Publication date: August 25, 2011
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Publication number: 20110209102
Abstract: Embodiments of a multi-screen dual tap gesture are described. In various embodiments, a first tap input to a displayed object is recognized at a first screen of a multi-screen system. A second tap input to the displayed object is recognized at a second screen of the multi-screen system, and the second tap input is recognized approximately when the first tap input is recognized. A dual tap gesture can then be determined from the recognized first and second tap inputs.
Type: Application
Filed: February 25, 2010
Publication date: August 25, 2011
Applicant: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Publication number: 20110209100
Abstract: Embodiments of multi-screen pinch and expand gestures are described. In various embodiments, a first input is recognized at a first screen of a multi-screen system, and the first input includes a first motion input. A second input is recognized at a second screen of the multi-screen system, and the second input includes a second motion input. A pinch gesture or an expand gesture can then be determined from the first and second motion inputs that are associated with the recognized first and second inputs.
Type: Application
Filed: February 25, 2010
Publication date: August 25, 2011
Applicant: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani
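One simple way to determine pinch versus expand from two motion inputs, one per screen, is to compare the separation of the contacts at the start and end of the motions. This is a minimal sketch under that assumption, not the patented method; the shared coordinate space spanning both screens is also an assumption.

```python
# Hypothetical sketch: classify a pinch vs. an expand gesture from two
# motion inputs, one per screen, by comparing the start and end separation
# of the two contacts in a shared coordinate space spanning both screens.

def classify_pinch_expand(first_motion, second_motion):
    """Each motion is ((x0, y0), (x1, y1)): its start and end points."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    start_gap = dist(first_motion[0], second_motion[0])
    end_gap = dist(first_motion[1], second_motion[1])
    if end_gap < start_gap:
        return "pinch"    # contacts moved toward each other
    if end_gap > start_gap:
        return "expand"   # contacts moved apart
    return None           # separation unchanged: neither gesture
```

Two contacts converging from a 10-unit gap to a 2-unit gap classify as a pinch; the reverse motion classifies as an expand.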
-
Publication number: 20110209099
Abstract: Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
Type: Application
Filed: February 19, 2010
Publication date: August 25, 2011
Applicant: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani
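A common heuristic for detecting that a stroke entered from off-screen across the bezel is to check whether its first sampled point lies within a small margin of the screen edge. The sketch below illustrates that idea only; the screen dimensions, margin, and detection rule are assumptions for the example, not the patented technique.

```python
# Hypothetical sketch: treat a stroke whose first sampled point lies within
# a small margin of the screen edge as a bezel gesture, i.e. one that
# entered the display from off-screen across the bezel. The screen size
# and margin below are assumed values for illustration.

SCREEN_W, SCREEN_H = 1280, 800   # assumed screen size in pixels
EDGE_MARGIN = 8                  # pixels; tunable detection threshold

def is_bezel_gesture(stroke):
    """stroke: list of (x, y) samples in screen coordinates, in time order."""
    x, y = stroke[0]             # only the entry point matters here
    return (x <= EDGE_MARGIN or x >= SCREEN_W - EDGE_MARGIN or
            y <= EDGE_MARGIN or y >= SCREEN_H - EDGE_MARGIN)
```

A stroke starting 2 pixels from the left edge is classified as a bezel gesture, while one starting mid-screen is treated as ordinary touch input.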
-
Publication number: 20110209104
Abstract: Embodiments of a multi-screen synchronous slide gesture are described. In various embodiments, a first motion input is recognized at a first screen of a multi-screen system, and the first motion input is recognized when moving in a particular direction across the first screen. A second motion input is recognized at a second screen of the multi-screen system, where the second motion input is recognized when moving in the particular direction across the second screen and approximately when the first motion input is recognized. A synchronous slide gesture can then be determined from the recognized first and second motion inputs.
Type: Application
Filed: February 25, 2010
Publication date: August 25, 2011
Applicant: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Publication number: 20110209101
Abstract: Embodiments of a multi-screen pinch-to-pocket gesture are described. In various embodiments, a first motion input to a first screen region is recognized at a first screen of a multi-screen system, and the first motion input is recognized to select a displayed object. A second motion input to a second screen region is recognized at a second screen of the multi-screen system, and the second motion input is recognized to select the displayed object. A pinch-to-pocket gesture can then be determined from the recognized first and second motion inputs within the respective first and second screen regions, the pinch-to-pocket gesture effective to pocket the displayed object.
Type: Application
Filed: February 25, 2010
Publication date: August 25, 2011
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Publication number: 20110209089
Abstract: Embodiments of a multi-screen object-hold and page-change gesture are described. In various embodiments, a hold input is recognized at a first screen of a multi-screen system, and the hold input is recognized when held in place to select a displayed object on the first screen. A motion input is recognized at a second screen of the multi-screen system, where the motion input is recognized while the displayed object remains held in place and is effective to change one or more journal pages. An object-hold and page-change gesture can then be determined from the recognized hold and motion inputs.
Type: Application
Filed: February 25, 2010
Publication date: August 25, 2011
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Publication number: 20110209098
Abstract: Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
Type: Application
Filed: February 19, 2010
Publication date: August 25, 2011
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Publication number: 20110209039
Abstract: Embodiments of a multi-screen bookmark hold gesture are described. In various embodiments, a hold input is recognized at a first screen of a multi-screen system, and the hold input is recognized when held in place proximate an edge of a journal page that is displayed on the first screen. A motion input is recognized at a second screen of the multi-screen system while the hold input remains held in place. A bookmark hold gesture can then be determined from the recognized hold and motion inputs, and the bookmark hold gesture is effective to bookmark the journal page at a location of the hold input on the first screen.
Type: Application
Filed: February 25, 2010
Publication date: August 25, 2011
Applicant: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Publication number: 20110209093
Abstract: Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
Type: Application
Filed: February 19, 2010
Publication date: August 25, 2011
Applicant: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Publication number: 20110209088
Abstract: Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
Type: Application
Filed: February 19, 2010
Publication date: August 25, 2011
Applicant: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Publication number: 20110205163
Abstract: Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
Type: Application
Filed: February 19, 2010
Publication date: August 25, 2011
Applicant: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Publication number: 20110209057
Abstract: Embodiments of a multi-screen hold and page-flip gesture are described. In various embodiments, a hold input is recognized at a first screen of a multi-screen system, and the hold input is recognized when held to select a journal page that is displayed on the first screen. A motion input is recognized at a second screen of the multi-screen system, and the motion input is recognized while the hold input remains held in place. A hold and page-flip gesture can then be determined from the recognized hold and motion inputs, and the hold and page-flip gesture is effective to maintain the display of the journal page while one or more additional journal pages are flipped for display on the second screen.
Type: Application
Filed: February 25, 2010
Publication date: August 25, 2011
Applicant: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Publication number: 20110209097
Abstract: Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
Type: Application
Filed: February 19, 2010
Publication date: August 25, 2011
Inventors: Kenneth P. Hinckley, Koji Yatani
-
Publication number: 20110191719
Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the number of gestures that are made available to initiate operations of a computing device.
Type: Application
Filed: February 4, 2010
Publication date: August 4, 2011
Applicant: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani, Andrew S. Allen, Jonathan R. Harris, Georg F. Petschnigg, Michel Pahud
-
Publication number: 20110191704
Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the number of gestures that are made available to initiate operations of a computing device.
Type: Application
Filed: February 4, 2010
Publication date: August 4, 2011
Applicant: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani, Jonathan R. Harris, Andrew S. Allen, Georg F. Petschnigg, Michel Pahud
-
Publication number: 20110191718
Abstract: Techniques involving gestures and other functionality are described. In one or more implementations, the techniques describe gestures that are usable to provide inputs to a computing device. A variety of different gestures are contemplated, including bimodal gestures (e.g., using more than one type of input) and single modal gestures. Additionally, the gesture techniques may be configured to leverage these different input types to increase the number of gestures that are made available to initiate operations of a computing device.
Type: Application
Filed: February 4, 2010
Publication date: August 4, 2011
Applicant: Microsoft Corporation
Inventors: Kenneth P. Hinckley, Koji Yatani, Andrew S. Allen, Jonathan R. Harris, Georg F. Petschnigg