Patents by Inventor Sharath Viswanathan

Sharath Viswanathan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11899898
    Abstract: In non-limiting examples of the present disclosure, systems, methods and devices for executing gesture operations are provided. A touchpad gesture manager and a touchscreen gesture manager may be maintained. Both managers may comprise the identities of gesture operations and conditions for executing the gesture operations. The conditions for one or more touchscreen gesture operations may be the same as the conditions for one or more corresponding touchpad gesture operations. The gestures that have the same conditions for the touchscreen and the touchpad may comprise application window operations and virtual desktop transition operations. In some examples, one or more display elements, animations, or intermediate operations may be different in executing the touchscreen operations than in executing the touchpad operations.
    Type: Grant
    Filed: March 30, 2023
    Date of Patent: February 13, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Elizabeth Picchietti Salowitz, Joseph Spencer King, Nan Yang, Albert Peter Yih, Sharath Viswanathan
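
The abstract above describes a shared condition table: a gesture with the same conditions triggers the same operation from the touchpad and the touchscreen, while animations may differ per input source. A minimal sketch of that idea follows; all gesture names, finger counts, and animation labels are invented for illustration and do not come from the patent.

```python
# Conditions are shared: a given multi-finger swipe maps to the same operation
# whether it arrives from the touchpad or the touchscreen.
SHARED_CONDITIONS = {
    ("swipe", "up", 3): "maximize_window",         # application window operation
    ("swipe", "down", 3): "minimize_window",
    ("swipe", "left", 4): "next_virtual_desktop",  # virtual desktop transition
    ("swipe", "right", 4): "previous_virtual_desktop",
}

# Display details may differ per input source even when the operation is the same.
ANIMATIONS = {
    "touchpad": {"next_virtual_desktop": "slide"},
    "touchscreen": {"next_virtual_desktop": "slide_with_drag_preview"},
}

class GestureManager:
    def __init__(self, source):
        self.source = source

    def resolve(self, kind, direction, finger_count):
        """Return the operation for a gesture plus a source-specific animation."""
        op = SHARED_CONDITIONS.get((kind, direction, finger_count))
        if op is None:
            return None
        return op, ANIMATIONS.get(self.source, {}).get(op, "default")

touchpad = GestureManager("touchpad")
touchscreen = GestureManager("touchscreen")
```

Both managers resolve a four-finger left swipe to the same virtual-desktop transition, but each attaches its own animation.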
  • Publication number: 20230244352
    Abstract: In non-limiting examples of the present disclosure, systems, methods and devices for executing gesture operations are provided. A touchpad gesture manager and a touchscreen gesture manager may be maintained. Both managers may comprise the identities of gesture operations and conditions for executing the gesture operations. The conditions for one or more touchscreen gesture operations may be the same as the conditions for one or more corresponding touchpad gesture operations. The gestures that have the same conditions for the touchscreen and the touchpad may comprise application window operations and virtual desktop transition operations. In some examples, one or more display elements, animations, or intermediate operations may be different in executing the touchscreen operations than in executing the touchpad operations.
    Type: Application
    Filed: March 30, 2023
    Publication date: August 3, 2023
    Inventors: Elizabeth Picchietti Salowitz, Joseph Spencer King, Nan Yang, Albert Peter Yih, Sharath Viswanathan
  • Patent number: 11620030
    Abstract: In non-limiting examples of the present disclosure, systems, methods and devices for executing gesture operations are provided. A touchpad gesture manager and a touchscreen gesture manager may be maintained. Both managers may comprise the identities of gesture operations and conditions for executing the gesture operations. The conditions for one or more touchscreen gesture operations may be the same as the conditions for one or more corresponding touchpad gesture operations. The gestures that have the same conditions for the touchscreen and the touchpad may comprise application window operations and virtual desktop transition operations. In some examples, one or more display elements, animations, or intermediate operations may be different in executing the touchscreen operations than in executing the touchpad operations.
    Type: Grant
    Filed: May 4, 2021
    Date of Patent: April 4, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Elizabeth Picchietti Salowitz, Joseph Spencer King, Nan Yang, Albert Peter Yih, Sharath Viswanathan
  • Publication number: 20220357817
    Abstract: In non-limiting examples of the present disclosure, systems, methods and devices for executing gesture operations are provided. A touchpad gesture manager and a touchscreen gesture manager may be maintained. Both managers may comprise the identities of gesture operations and conditions for executing the gesture operations. The conditions for one or more touchscreen gesture operations may be the same as the conditions for one or more corresponding touchpad gesture operations. The gestures that have the same conditions for the touchscreen and the touchpad may comprise application window operations and virtual desktop transition operations. In some examples, one or more display elements, animations, or intermediate operations may be different in executing the touchscreen operations than in executing the touchpad operations.
    Type: Application
    Filed: May 4, 2021
    Publication date: November 10, 2022
    Inventors: Elizabeth Picchietti Salowitz, Joseph Spencer King, Nan Yang, Albert Peter Yih, Sharath Viswanathan
  • Patent number: 10775997
    Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
    Type: Grant
    Filed: April 25, 2017
    Date of Patent: September 15, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
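
The hover-gesture abstract above describes presenting a control interface when fingers are sensed at a spaced distance above the screen, without physical touch. The following sketch illustrates one possible shape of that logic; the distance thresholds, dwell count, and class names are assumptions, not the patent's implementation.

```python
TOUCH_DISTANCE = 0.0   # finger in contact with the screen
HOVER_DISTANCE = 2.0   # max height (cm) still detectable via capacitive sensing

class HoverDetector:
    def __init__(self, dwell_samples=3):
        self.dwell_samples = dwell_samples   # consecutive hover samples required
        self._hover_run = 0
        self.control_interface_visible = False

    def on_sample(self, finger_distance):
        """Feed one proximity sample; present the interface on a sustained hover."""
        hovering = TOUCH_DISTANCE < finger_distance <= HOVER_DISTANCE
        self._hover_run = self._hover_run + 1 if hovering else 0
        if self._hover_run >= self.dwell_samples:
            self.control_interface_visible = True
        return self.control_interface_visible

detector = HoverDetector()
events = [detector.on_sample(d) for d in (1.0, 1.2, 0.9, 0.9)]
```

A finger held about 1 cm above the screen for three consecutive samples brings up the interface, while direct touch (distance 0) or an out-of-range finger does not count as a hover.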
  • Publication number: 20170228150
    Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
    Type: Application
    Filed: April 25, 2017
    Publication date: August 10, 2017
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
  • Patent number: 9645651
    Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
    Type: Grant
    Filed: September 24, 2013
    Date of Patent: May 9, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
  • Patent number: 9501218
    Abstract: Techniques are described herein that are capable of increasing touch and/or hover accuracy on a touch-enabled device. For example, attribute(s) of a hand or a portion thereof (e.g., one or more fingers) may be used to determine a location on a touch screen to which a user intends to point. Such attribute(s) may be derived, measured, etc. For instance, a value corresponding to a distance between the hand/portion and the touch screen may be derived from a magnitude of a measurement of an interaction between the hand/portion and the touch screen. In another example, virtual elements displayed on the touch screen may be mapped to respective areas in a plane that is parallel (e.g., coincident) with the touch screen. In accordance with this example, receiving a touch and/or hover command with regard to an area in the plane may indicate selection of the corresponding virtual element.
    Type: Grant
    Filed: January 10, 2014
    Date of Patent: November 22, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan
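
The abstract above combines two ideas: deriving a finger-to-screen distance from the magnitude of a capacitive measurement, and mapping virtual elements to selection areas in a plane parallel to the touch screen. A hypothetical sketch of both, with invented constants and element names:

```python
def distance_from_magnitude(magnitude, full_touch=100.0, max_distance=2.0):
    """A stronger capacitive interaction implies the finger is closer to the screen."""
    magnitude = max(0.0, min(magnitude, full_touch))
    return max_distance * (1.0 - magnitude / full_touch)

# Each virtual element owns a rectangular area (x0, y0, x1, y1) in the
# plane parallel to (here, coincident with) the touch screen.
ELEMENT_AREAS = {
    "ok_button": (0, 0, 50, 30),
    "cancel_button": (60, 0, 110, 30),
}

def element_at(x, y):
    """Return the virtual element whose mapped area contains the touch/hover point."""
    for name, (x0, y0, x1, y1) in ELEMENT_AREAS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```

A touch or hover landing inside an element's mapped area then indicates selection of that element.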
  • Publication number: 20150199101
    Abstract: Techniques are described herein that are capable of increasing touch and/or hover accuracy on a touch-enabled device. For example, attribute(s) of a hand or a portion thereof (e.g., one or more fingers) may be used to determine a location on a touch screen to which a user intends to point. Such attribute(s) may be derived, measured, etc. For instance, a value corresponding to a distance between the hand/portion and the touch screen may be derived from a magnitude of a measurement of an interaction between the hand/portion and the touch screen. In another example, virtual elements displayed on the touch screen may be mapped to respective areas in a plane that is parallel (e.g., coincident) with the touch screen. In accordance with this example, receiving a touch and/or hover command with regard to an area in the plane may indicate selection of the corresponding virtual element.
    Type: Application
    Filed: January 10, 2014
    Publication date: July 16, 2015
    Applicant: Microsoft Corporation
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan
  • Publication number: 20150089419
    Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
    Type: Application
    Filed: September 24, 2013
    Publication date: March 26, 2015
    Applicant: Microsoft Corporation
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
  • Patent number: 8943092
    Abstract: Disclosed herein are representative embodiments of tools and techniques for performing contextual searches using text determined based on digital-ink data. According to one exemplary technique, digital-ink data is received at a computing device and text is determined based on the digital-ink data. Additionally, by an application of the computing device, a contextual search is performed using the text.
    Type: Grant
    Filed: March 4, 2013
    Date of Patent: January 27, 2015
    Assignee: Microsoft Corporation
    Inventors: Lynn Dai, Daniel J. Hwang, Zafeiria Anagnostopoulou, Benjamin Westbrook, Peter Gregory Davis, Sharath Viswanathan
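
The flow in the abstract above is: digital-ink data is received, text is determined from it, and an application performs a contextual search with that text. A minimal sketch follows; the recognizer is a stand-in lookup table rather than a real handwriting engine, and all names are assumptions.

```python
# Stand-in for handwriting recognition: ink data -> recognized text.
INK_TO_TEXT = {
    "stroke-set-1": "seattle weather",
    "stroke-set-2": "pythagorean theorem",
}

def recognize_ink(ink_data):
    """Determine text based on digital-ink data (stubbed for illustration)."""
    return INK_TO_TEXT.get(ink_data, "")

def contextual_search(ink_data, search_fn):
    """Recognize the ink, then let the application run a contextual search on it."""
    text = recognize_ink(ink_data)
    return search_fn(text) if text else None

results = contextual_search("stroke-set-1", lambda query: f"results for '{query}'")
```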
  • Publication number: 20140354553
    Abstract: Techniques are described for automatically determining a touch input mode for a computing device. The computing device can detect whether touch is being performed by a user's finger or by an object. The computing device can then enable a different interaction model depending on whether a finger or an object is detected. For example, the computing device can automatically switch to a finger touch input mode when touch input is detected using the user's finger, and automatically switch to an object touch input mode when touch input is detected using an object. The finger touch input mode can perform user interface manipulation. The object touch input mode can perform input using digital ink. Different feedback models can be provided depending on which touch input mode is currently being used.
    Type: Application
    Filed: May 29, 2013
    Publication date: December 4, 2014
    Applicant: Microsoft Corporation
    Inventors: Juan Dai, Daniel J. Hwang, Wenqi Shen, Sharath Viswanathan, Pu Li
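
The abstract above describes switching interaction models automatically based on whether a finger or an object is touching. One plausible discriminator is contact size; the sketch below uses that, with an invented area threshold and invented feedback labels.

```python
# A wide, soft contact is treated as a finger; a small, sharp contact as an
# object such as a stylus. The threshold is an illustrative assumption.
FINGER_MIN_CONTACT_AREA_MM2 = 20.0

def select_input_mode(contact_area_mm2):
    """Pick the touch input mode from the detected contact's size."""
    if contact_area_mm2 >= FINGER_MIN_CONTACT_AREA_MM2:
        return "finger_touch"   # user interface manipulation
    return "object_touch"       # input using digital ink

def handle_contact(contact_area_mm2):
    """Return the selected mode and its mode-specific feedback model."""
    mode = select_input_mode(contact_area_mm2)
    feedback = "ripple" if mode == "finger_touch" else "ink_trail"
    return mode, feedback
```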
  • Publication number: 20140337804
    Abstract: Techniques are described for automatically performing application-specific actions based on global pre-defined symbols entered using digital ink. For example, a computing device supporting digital ink input can receive digital ink content from a user (e.g., via a digitizer and/or touchscreen), process the digital ink input to recognize text and/or graphical content, determine whether global pre-defined symbols are present in the recognized text and/or graphical content, and perform application-specific actions associated with the global pre-defined symbols that are present. The application-specific actions can be associated with built-in and/or third-party applications.
    Type: Application
    Filed: May 10, 2013
    Publication date: November 13, 2014
    Inventors: Daniel J. Hwang, Juan Dai, Wenqi Shen, Sharath Viswanathan, Pu Li
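
The abstract above describes globally pre-defined ink symbols that trigger application-specific actions. A hypothetical dispatch table makes the idea concrete; the symbols, application names, and actions here are invented for illustration.

```python
# A global symbol can carry a different action in each application.
SYMBOL_ACTIONS = {
    "@": {"mail": "insert_recipient", "calendar": "invite_attendee"},
    "#": {"notes": "add_tag"},
}

def recognize_symbols(recognized_text):
    """Pick out global pre-defined symbols present in recognized ink content."""
    return [s for s in SYMBOL_ACTIONS if s in recognized_text]

def dispatch(recognized_text, application):
    """Perform the application-specific action for each symbol that is present."""
    return [SYMBOL_ACTIONS[s][application]
            for s in recognize_symbols(recognized_text)
            if application in SYMBOL_ACTIONS[s]]
```

The same "@" symbol resolves to one action in a mail application and another in a calendar application, matching the abstract's built-in/third-party framing.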
  • Publication number: 20140267130
    Abstract: Various embodiments herein provide for a method of receiving user input on a touch screen. A hover gesture can be detected and an action performed in response to the detection. The hover gesture can occur without a user physically touching a touch screen. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering.
    Type: Application
    Filed: March 13, 2013
    Publication date: September 18, 2014
    Applicant: Microsoft Corporation
    Inventors: Daniel J. Hwang, Sharath Viswanathan, Wenqi Shen, Lynn Dai
  • Publication number: 20140267094
    Abstract: Techniques are described herein that are capable of performing an action on a touch-enabled device based on a gesture. A gesture (e.g., a hover gesture, a gaze gesture, a look-and-blink gesture, a voice gesture, a touch gesture, etc.) can be detected and an action performed in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers, palm, etc. are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
    Type: Application
    Filed: June 14, 2013
    Publication date: September 18, 2014
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez
  • Publication number: 20140250143
    Abstract: Disclosed herein are representative embodiments of tools and techniques for performing contextual searches using text determined based on digital-ink data. According to one exemplary technique, digital-ink data is received at a computing device and text is determined based on the digital-ink data. Additionally, by an application of the computing device, a contextual search is performed using the text.
    Type: Application
    Filed: March 4, 2013
    Publication date: September 4, 2014
    Applicant: Microsoft Corporation
    Inventors: Lynn Dai, Daniel J. Hwang, Zafeiria Anagnostopoulou, Benjamin Westbrook, Peter Gregory Davis, Sharath Viswanathan
  • Patent number: 7930760
    Abstract: This disclosure describes techniques of using a centralized rule database to control the abilities of software processes to perform actions with regard to resources provided by a computer. As described herein, each software process executing in a computer executes within a chamber and each resource provided by the computer is associated with a canonical name that uniquely identifies the resource. Furthermore, the computer stores a set of security rules in a centralized rule database. In addition, this disclosure describes techniques of enforcing the rules stored in the centralized rule database.
    Type: Grant
    Filed: June 27, 2008
    Date of Patent: April 19, 2011
    Assignee: Microsoft Corporation
    Inventors: Neil Coles, Yadhu Gopalan, Christopher Jordan, Matthew Lyons, Andrew Rogers, Upender Sandadi, Scott Shell, Zoheb Vacheri, Angelo Vals, Sharath Viswanathan, Loren M. Kohnfelder
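
The security model in the abstract above has three parts: each process runs in a chamber, each resource has a canonical name, and a centralized rule database decides which actions a chamber may perform on a resource. The sketch below illustrates that shape; the chamber names, canonical names, and rule schema are invented assumptions, not the patent's design.

```python
# Centralized rule database: (chamber, canonical resource name, action) -> allowed?
RULE_DATABASE = {
    ("standard_app", "/registry/user/settings", "read"): True,
    ("standard_app", "/registry/system/boot", "write"): False,
    ("system", "/registry/system/boot", "write"): True,
}

def is_allowed(chamber, resource, action):
    """Enforce the centralized rules; anything not explicitly granted is denied."""
    return RULE_DATABASE.get((chamber, resource, action), False)
```

Because enforcement consults a single database keyed by canonical names, policy changes apply uniformly to every process without modifying the processes themselves.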
  • Publication number: 20090249436
    Abstract: This disclosure describes techniques of using a centralized rule database to control the abilities of software processes to perform actions with regard to resources provided by a computer. As described herein, each software process executing in a computer executes within a chamber and each resource provided by the computer is associated with a canonical name that uniquely identifies the resource. Furthermore, the computer stores a set of security rules in a centralized rule database. In addition, this disclosure describes techniques of enforcing the rules stored in the centralized rule database.
    Type: Application
    Filed: June 27, 2008
    Publication date: October 1, 2009
    Applicant: Microsoft Corporation
    Inventors: Neil Coles, Yadhu Gopalan, Christopher Jordan, Matthew Lyons, Andrew Rogers, Upender Sandadi, Scott Shell, Zoheb Vacheri, Angelo Vals, Sharath Viswanathan, Loren M. Kohnfelder