Patents by Inventor Sharath Viswanathan
Sharath Viswanathan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230244352
Abstract: In non-limiting examples of the present disclosure, systems, methods, and devices for executing gesture operations are provided. A touchpad gesture manager and a touchscreen gesture manager may be maintained. Both managers may comprise the identities of gesture operations and conditions for executing the gesture operations. The conditions for one or more touchscreen gesture operations may be the same as the conditions for one or more corresponding touchpad gesture operations. The gestures that have the same conditions for the touchscreen and the touchpad may comprise application window operations and virtual desktop transition operations. In some examples, one or more display elements, animations, or intermediate operations may differ between execution of the touchscreen operations and execution of the touchpad operations.
Type: Application
Filed: March 30, 2023
Publication date: August 3, 2023
Inventors: Elizabeth Picchietti SALOWITZ, Joseph Spencer KING, Nan YANG, Albert Peter YIH, Sharath VISWANATHAN
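The patented implementation is not reproduced here; the following is a minimal, hypothetical sketch of the idea the abstract describes — two gesture managers sharing one condition table so corresponding touchpad and touchscreen gestures trigger the same operations, while device-specific animations differ. All names, finger counts, and table entries are invented for illustration.

```python
# A condition is keyed by (finger_count, direction); the value names the
# operation to execute. Both managers share this table for application
# window and virtual desktop transition gestures.
SHARED_CONDITIONS = {
    (3, "up"): "maximize_window",
    (3, "down"): "minimize_window",
    (4, "left"): "previous_virtual_desktop",
    (4, "right"): "next_virtual_desktop",
}

class GestureManager:
    def __init__(self, device, animation_style):
        self.device = device
        self.animation_style = animation_style  # differs per device
        self.conditions = dict(SHARED_CONDITIONS)

    def resolve(self, finger_count, direction):
        """Return (operation, animation_style) or None if no match."""
        op = self.conditions.get((finger_count, direction))
        return (op, self.animation_style) if op else None

touchpad = GestureManager("touchpad", animation_style="cursor_follow")
touchscreen = GestureManager("touchscreen", animation_style="direct_drag")

# The same condition maps to the same operation on both devices...
assert touchpad.resolve(3, "up")[0] == touchscreen.resolve(3, "up")[0]
# ...while the visual treatment (animation) differs per device.
assert touchpad.resolve(3, "up")[1] != touchscreen.resolve(3, "up")[1]
```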
-
Patent number: 11620030
Abstract: In non-limiting examples of the present disclosure, systems, methods, and devices for executing gesture operations are provided. A touchpad gesture manager and a touchscreen gesture manager may be maintained. Both managers may comprise the identities of gesture operations and conditions for executing the gesture operations. The conditions for one or more touchscreen gesture operations may be the same as the conditions for one or more corresponding touchpad gesture operations. The gestures that have the same conditions for the touchscreen and the touchpad may comprise application window operations and virtual desktop transition operations. In some examples, one or more display elements, animations, or intermediate operations may differ between execution of the touchscreen operations and execution of the touchpad operations.
Type: Grant
Filed: May 4, 2021
Date of Patent: April 4, 2023
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Elizabeth Picchietti Salowitz, Joseph Spencer King, Nan Yang, Albert Peter Yih, Sharath Viswanathan
-
Publication number: 20220357817
Abstract: In non-limiting examples of the present disclosure, systems, methods, and devices for executing gesture operations are provided. A touchpad gesture manager and a touchscreen gesture manager may be maintained. Both managers may comprise the identities of gesture operations and conditions for executing the gesture operations. The conditions for one or more touchscreen gesture operations may be the same as the conditions for one or more corresponding touchpad gesture operations. The gestures that have the same conditions for the touchscreen and the touchpad may comprise application window operations and virtual desktop transition operations. In some examples, one or more display elements, animations, or intermediate operations may differ between execution of the touchscreen operations and execution of the touchpad operations.
Type: Application
Filed: May 4, 2021
Publication date: November 10, 2022
Inventors: Elizabeth Picchietti SALOWITZ, Joseph Spencer KING, Nan YANG, Albert Peter YIH, Sharath VISWANATHAN
-
Patent number: 10775997
Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
Type: Grant
Filed: April 25, 2017
Date of Patent: September 15, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
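As a rough, hypothetical sketch of the mechanism this abstract describes (not the patented method): a finger sensed above the screen via capacitive proximity can be classified as touching, hovering, or dwelling without motion, with the last case triggering presentation of the control interface. The thresholds below are invented for illustration.

```python
HOVER_MAX_MM = 20.0   # hypothetical maximum hover height (mm)
DWELL_SECONDS = 0.5   # hypothetical time-without-motion threshold

def classify(height_mm, seconds_still):
    """Classify a sensed finger state above a capacitive touch screen.

    height_mm: derived finger height above the screen (0 = contact).
    seconds_still: how long the hovering finger has been motionless.
    """
    if height_mm <= 0:
        return "touch"                      # physical contact
    if height_mm <= HOVER_MAX_MM:
        if seconds_still >= DWELL_SECONDS:
            # Absence of motion while hovering: show the control interface.
            return "show_control_interface"
        return "hover"                      # hovering, still moving
    return "out_of_range"

assert classify(0, 0) == "touch"
assert classify(10.0, 1.0) == "show_control_interface"
```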
-
Publication number: 20170228150
Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
Type: Application
Filed: April 25, 2017
Publication date: August 10, 2017
Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
-
Patent number: 9645651
Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
Type: Grant
Filed: September 24, 2013
Date of Patent: May 9, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
-
Patent number: 9501218
Abstract: Techniques are described herein that are capable of increasing touch and/or hover accuracy on a touch-enabled device. For example, attribute(s) of a hand or a portion thereof (e.g., one or more fingers) may be used to determine a location on a touch screen to which a user intends to point. Such attribute(s) may be derived, measured, etc. For instance, a value corresponding to a distance between the hand/portion and the touch screen may be derived from a magnitude of a measurement of an interaction between the hand/portion and the touch screen. In another example, virtual elements displayed on the touch screen may be mapped to respective areas in a plane that is parallel (e.g., coincident) with the touch screen. In accordance with this example, receiving a touch and/or hover command with regard to an area in the plane may indicate selection of the corresponding virtual element.
Type: Grant
Filed: January 10, 2014
Date of Patent: November 22, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan
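A hypothetical sketch of the two examples in this abstract (not the patented implementation): virtual elements are mapped to rectangular areas in a plane parallel to the screen, a touch/hover point selects the element whose area contains it, and finger height is derived from the magnitude of a capacitive measurement. Element names, coordinates, and the signal model are all invented.

```python
ELEMENT_AREAS = {
    # name: (x0, y0, x1, y1) in screen coordinates (hypothetical)
    "ok_button": (0, 0, 100, 40),
    "cancel_button": (110, 0, 210, 40),
}

def element_at(x, y):
    """Resolve a touch/hover point to the element whose area contains it."""
    for name, (x0, y0, x1, y1) in ELEMENT_AREAS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def hover_height_from_signal(signal, full_scale=100.0, max_mm=25.0):
    """Derive an approximate finger height from the magnitude of a
    capacitive measurement: stronger signal -> closer finger."""
    signal = max(0.0, min(signal, full_scale))
    return max_mm * (1.0 - signal / full_scale)

assert element_at(50, 20) == "ok_button"
assert hover_height_from_signal(100.0) == 0.0   # full signal: contact
```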
-
Publication number: 20150199101
Abstract: Techniques are described herein that are capable of increasing touch and/or hover accuracy on a touch-enabled device. For example, attribute(s) of a hand or a portion thereof (e.g., one or more fingers) may be used to determine a location on a touch screen to which a user intends to point. Such attribute(s) may be derived, measured, etc. For instance, a value corresponding to a distance between the hand/portion and the touch screen may be derived from a magnitude of a measurement of an interaction between the hand/portion and the touch screen. In another example, virtual elements displayed on the touch screen may be mapped to respective areas in a plane that is parallel (e.g., coincident) with the touch screen. In accordance with this example, receiving a touch and/or hover command with regard to an area in the plane may indicate selection of the corresponding virtual element.
Type: Application
Filed: January 10, 2014
Publication date: July 16, 2015
Applicant: Microsoft Corporation
Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan
-
Publication number: 20150089419
Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
Type: Application
Filed: September 24, 2013
Publication date: March 26, 2015
Applicant: Microsoft Corporation
Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
-
Patent number: 8943092
Abstract: Disclosed herein are representative embodiments of tools and techniques for performing contextual searches using text determined based on digital-ink data. According to one exemplary technique, digital-ink data is received at a computing device and text is determined based on the digital-ink data. Additionally, by an application of the computing device, a contextual search is performed using the text.
Type: Grant
Filed: March 4, 2013
Date of Patent: January 27, 2015
Assignee: Microsoft Corporation
Inventors: Lynn Dai, Daniel J. Hwang, Zafeiria Anagnostopoulou, Benjamin Westbrook, Peter Gregory Davis, Sharath Viswanathan
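A minimal sketch of the flow this abstract describes, with stand-ins for the parts the abstract leaves unspecified: the "recognizer" here simply reads pre-labeled strokes, and the search backend is a caller-supplied function. None of these names come from the patent.

```python
def recognize_ink(strokes):
    """Stand-in for handwriting recognition: each hypothetical stroke
    record already carries the text it represents."""
    return " ".join(s["label"] for s in strokes)

def contextual_search(strokes, context, search_fn):
    """Determine text from digital-ink data, then have an application
    perform a contextual search using that text."""
    text = recognize_ink(strokes)
    query = f"{text} {context}".strip()
    return search_fn(query)

results = contextual_search(
    [{"label": "space"}, {"label": "needle"}],
    context="images",
    search_fn=lambda q: f"results for: {q}",
)
assert results == "results for: space needle images"
```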
-
Publication number: 20140354553
Abstract: Techniques are described for automatically determining a touch input mode for a computing device. The computing device can detect whether touch is being performed by a user's finger or by an object. The computing device can then enable a different interaction model depending on whether a finger or an object is detected. For example, the computing device can automatically switch to a finger touch input mode when touch input is detected using the user's finger, and automatically switch to an object touch input mode when touch input is detected using an object. The finger touch input mode can perform user interface manipulation. The object touch input mode can perform input using digital ink. Different feedback models can be provided depending on which touch input mode is currently being used.
Type: Application
Filed: May 29, 2013
Publication date: December 4, 2014
Applicant: MICROSOFT CORPORATION
Inventors: Juan Dai, Daniel J. Hwang, Wenqi Shen, Sharath Viswanathan, Pu Li
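As an invented illustration of the mode switch this abstract describes (the patent does not specify how finger vs. object is detected; contact width is one plausible signal, and the threshold below is hypothetical):

```python
FINGER_MIN_CONTACT_MM = 4.0  # hypothetical: fingers make wide contacts

def select_input_mode(contact_width_mm):
    """Switch the interaction model based on what made the contact."""
    if contact_width_mm >= FINGER_MIN_CONTACT_MM:
        return "finger_mode"  # UI manipulation (scroll, tap, zoom)
    return "object_mode"      # digital ink from a pen-like object

assert select_input_mode(8.0) == "finger_mode"
assert select_input_mode(1.5) == "object_mode"
```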
-
Publication number: 20140337804
Abstract: Techniques are described for automatically performing application-specific actions based on global pre-defined symbols entered using digital ink. For example, a computing device supporting digital ink input can receive digital ink content from a user (e.g., via a digitizer and/or touchscreen), process the digital ink input to recognize text and/or graphical content, determine whether global pre-defined symbols are present in the recognized text and/or graphical content, and perform application-specific actions associated with the global pre-defined symbols that are present. The application-specific actions can be associated with built-in and/or third-party applications.
Type: Application
Filed: May 10, 2013
Publication date: November 13, 2014
Inventors: Daniel J. Hwang, Juan Dai, Wenqi Shen, Sharath Viswanathan, Pu Li
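A hypothetical sketch of the dispatch step this abstract describes: recognized ink text is scanned for global pre-defined symbols, and each symbol found triggers whatever action the current application has registered for it. The symbol set and action table are invented for illustration.

```python
GLOBAL_SYMBOLS = ["@", "#"]  # hypothetical global pre-defined symbols

ACTIONS = {
    # (symbol, application) -> application-specific action name
    ("@", "mail"): "insert_contact",
    ("#", "notes"): "create_tag",
}

def actions_for(recognized_text, application):
    """Return the application-specific actions for every global
    pre-defined symbol present in the recognized ink text."""
    found = [s for s in GLOBAL_SYMBOLS if s in recognized_text]
    return [ACTIONS[(s, application)]
            for s in found if (s, application) in ACTIONS]

assert actions_for("email @alice", "mail") == ["insert_contact"]
assert actions_for("todo #urgent", "mail") == []  # no action registered
```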
-
Publication number: 20140267094
Abstract: Techniques are described herein that are capable of performing an action on a touch-enabled device based on a gesture. A gesture (e.g., a hover gesture, a gaze gesture, a look-and-blink gesture, a voice gesture, a touch gesture, etc.) can be detected and an action performed in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers, palm, etc. are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
Type: Application
Filed: June 14, 2013
Publication date: September 18, 2014
Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez
-
Publication number: 20140267130
Abstract: Various embodiments herein provide for a method of receiving user input on a touch screen. A hover gesture can be detected and an action performed in response to the detection. The hover gesture can occur without a user physically touching a touch screen. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering.
Type: Application
Filed: March 13, 2013
Publication date: September 18, 2014
Applicant: Microsoft Corporation
Inventors: Daniel J. Hwang, Sharath Viswanathan, Wenqi Shen, Lynn Dai
-
Publication number: 20140250143
Abstract: Disclosed herein are representative embodiments of tools and techniques for performing contextual searches using text determined based on digital-ink data. According to one exemplary technique, digital-ink data is received at a computing device and text is determined based on the digital-ink data. Additionally, by an application of the computing device, a contextual search is performed using the text.
Type: Application
Filed: March 4, 2013
Publication date: September 4, 2014
Applicant: Microsoft Corporation
Inventors: Lynn Dai, Daniel J. Hwang, Zafeiria Anagnostopoulou, Benjamin Westbrook, Peter Gregory Davis, Sharath Viswanathan
-
Patent number: 7930760
Abstract: This disclosure describes techniques of using a centralized rule database to control the abilities of software processes to perform actions with regard to resources provided by a computer. As described herein, each software process executing in a computer executes within a chamber and each resource provided by the computer is associated with a canonical name that uniquely identifies the resource. Furthermore, the computer stores a set of security rules in a centralized rule database. In addition, this disclosure describes techniques of enforcing the rules stored in the centralized rule database.
Type: Grant
Filed: June 27, 2008
Date of Patent: April 19, 2011
Assignee: Microsoft Corporation
Inventors: Neil Coles, Yadhu Gopalan, Christopher Jordan, Matthew Lyons, Andrew Rogers, Upender Sandadi, Scott Shell, Zoheb Vacheri, Angelo Vals, Sharath Viswanathan, Loren M. Kohnfelder
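A rough, hypothetical sketch of the model this abstract describes (not the patented enforcement mechanism): each process runs in a chamber, resources carry canonical names, and a centralized rule table decides whether an action is permitted, denying by default. The chamber names, canonical names, and rules below are invented.

```python
RULES = [
    # (chamber, canonical_resource_name, action, allowed)
    ("standard_app", "file://user/documents", "read", True),
    ("standard_app", "registry://system/boot", "write", False),
    ("system", "registry://system/boot", "write", True),
]

def is_allowed(chamber, resource, action):
    """Enforce the centralized rules: deny by default, and allow only
    when a matching rule explicitly permits the action."""
    for c, r, a, allowed in RULES:
        if (c, r, a) == (chamber, resource, action):
            return allowed
    return False

assert is_allowed("standard_app", "file://user/documents", "read")
assert not is_allowed("standard_app", "registry://system/boot", "write")
```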
-
Publication number: 20090249436
Abstract: This disclosure describes techniques of using a centralized rule database to control the abilities of software processes to perform actions with regard to resources provided by a computer. As described herein, each software process executing in a computer executes within a chamber and each resource provided by the computer is associated with a canonical name that uniquely identifies the resource. Furthermore, the computer stores a set of security rules in a centralized rule database. In addition, this disclosure describes techniques of enforcing the rules stored in the centralized rule database.
Type: Application
Filed: June 27, 2008
Publication date: October 1, 2009
Applicant: MICROSOFT CORPORATION
Inventors: Neil Coles, Yadhu Gopalan, Christopher Jordan, Matthew Lyons, Andrew Rogers, Upender Sandadi, Scott Shell, Zoheb Vacheri, Angelo Vals, Sharath Viswanathan, Loren M. Kohnfelder