Patents by Inventor Juan (Lynn) Dai

Juan (Lynn) Dai is named as an inventor on the following patent filings. This listing includes published patent applications as well as patents granted by the United States Patent and Trademark Office (USPTO). Brief, illustrative sketches of the techniques described in the abstracts appear after the listing.

  • Patent number: 10775997
    Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
    Type: Grant
    Filed: April 25, 2017
    Date of Patent: September 15, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
  • Publication number: 20170228150
    Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
    Type: Application
    Filed: April 25, 2017
    Publication date: August 10, 2017
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
  • Patent number: 9645651
    Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
    Type: Grant
    Filed: September 24, 2013
    Date of Patent: May 9, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
  • Patent number: 9501218
    Abstract: Techniques are described herein that are capable of increasing touch and/or hover accuracy on a touch-enabled device. For example, attribute(s) of a hand or a portion thereof (e.g., one or more fingers) may be used to determine a location on a touch screen to which a user intends to point. Such attribute(s) may be derived, measured, etc. For instance, a value corresponding to a distance between the hand/portion and the touch screen may be derived from a magnitude of a measurement of an interaction between the hand/portion and the touch screen. In another example, virtual elements displayed on the touch screen may be mapped to respective areas in a plane that is parallel (e.g., coincident) with the touch screen. In accordance with this example, receiving a touch and/or hover command with regard to an area in the plane may indicate selection of the corresponding virtual element.
    Type: Grant
    Filed: January 10, 2014
    Date of Patent: November 22, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan
  • Publication number: 20150199101
    Abstract: Techniques are described herein that are capable of increasing touch and/or hover accuracy on a touch-enabled device. For example, attribute(s) of a hand or a portion thereof (e.g., one or more fingers) may be used to determine a location on a touch screen to which a user intends to point. Such attribute(s) may be derived, measured, etc. For instance, a value corresponding to a distance between the hand/portion and the touch screen may be derived from a magnitude of a measurement of an interaction between the hand/portion and the touch screen. In another example, virtual elements displayed on the touch screen may be mapped to respective areas in a plane that is parallel (e.g., coincident) with the touch screen. In accordance with this example, receiving a touch and/or hover command with regard to an area in the plane may indicate selection of the corresponding virtual element.
    Type: Application
    Filed: January 10, 2014
    Publication date: July 16, 2015
    Applicant: Microsoft Corporation
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan
  • Publication number: 20150089419
    Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
    Type: Application
    Filed: September 24, 2013
    Publication date: March 26, 2015
    Applicant: Microsoft Corporation
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
  • Publication number: 20140267094
    Abstract: Techniques are described herein that are capable of performing an action on a touch-enabled device based on a gesture. A gesture (e.g., a hover gesture, a gaze gesture, a look-and-blink gesture, a voice gesture, a touch gesture, etc.) can be detected and an action performed in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers, palm, etc. are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
    Type: Application
    Filed: June 14, 2013
    Publication date: September 18, 2014
    Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez
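
The abstracts of U.S. Patent Nos. 10,775,997 and 9,645,651 (and the related publications 20170228150 and 20150089419) describe presenting a control interface when a hover gesture is detected, i.e., when a finger is sensed near the touch screen without touching it. The following sketch illustrates that idea only in broad strokes; the class names, thresholds, and dwell-time logic are assumptions for illustration, not the patented implementation.

```python
"""Illustrative sketch: present a control interface when a finger hovers
above a touch screen without touching it (cf. the abstracts of US 10,775,997
and US 9,645,651). All names and thresholds here are hypothetical."""

from dataclasses import dataclass


@dataclass
class ProximitySample:
    """One capacitive-sensing reading for a finger near the touch screen."""
    x: float            # screen coordinates of the detected finger
    y: float
    distance_mm: float  # estimated finger-to-screen distance (0 == touching)
    timestamp_s: float


class HoverGestureDetector:
    """Shows a control interface after a finger hovers, without touching,
    within HOVER_RANGE_MM of the screen for at least DWELL_S seconds."""

    HOVER_RANGE_MM = 20.0   # assumed capacitive-sensing range
    DWELL_S = 0.3           # assumed dwell time before the gesture fires

    def __init__(self, show_control_interface):
        self._show = show_control_interface  # callback supplied by the UI layer
        self._hover_started_at = None
        self._presented = False

    def on_sample(self, s: ProximitySample) -> None:
        hovering = 0.0 < s.distance_mm <= self.HOVER_RANGE_MM
        if not hovering:
            # Finger touched the screen or moved out of sensing range: reset.
            self._hover_started_at = None
            self._presented = False
            return
        if self._hover_started_at is None:
            self._hover_started_at = s.timestamp_s
        if (s.timestamp_s - self._hover_started_at) >= self.DWELL_S and not self._presented:
            self._presented = True
            self._show(s.x, s.y)  # present the control interface near the finger


if __name__ == "__main__":
    detector = HoverGestureDetector(
        show_control_interface=lambda x, y: print(f"show control interface at ({x}, {y})")
    )
    # Simulated capacitive samples: finger hovering about 10 mm above the screen.
    for t in range(6):
        detector.on_sample(ProximitySample(x=120, y=300, distance_mm=10.0,
                                           timestamp_s=t * 0.1))
```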
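
The abstract of U.S. Patent No. 9,501,218 (publication 20150199101) describes deriving a finger-to-screen distance from the magnitude of a capacitive measurement and mapping on-screen virtual elements to areas in a plane parallel to the screen, so that a touch or hover over an area selects the corresponding element. The sketch below models only that mapping idea; the linear falloff, function names, and constants are illustrative assumptions.

```python
"""Illustrative sketch of the hover-accuracy idea in US 9,501,218: derive a
distance from a capacitive signal magnitude, and map virtual elements to areas
in a plane coincident with the touch screen. Names and constants are assumed."""

from dataclasses import dataclass
from typing import Optional


def distance_from_magnitude(magnitude: float, full_scale: float = 100.0,
                            max_range_mm: float = 25.0) -> float:
    """Derive finger-to-screen distance from a capacitive signal magnitude,
    assuming (for illustration only) a linear falloff: full_scale at contact,
    zero signal at max_range_mm."""
    magnitude = max(0.0, min(magnitude, full_scale))
    return max_range_mm * (1.0 - magnitude / full_scale)


@dataclass
class VirtualElement:
    name: str
    # Area, in a plane coincident with the touch screen, mapped to this element.
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom


def element_under_hover(elements: list[VirtualElement],
                        x: float, y: float) -> Optional[VirtualElement]:
    """Return the element whose mapped area contains the touch/hover point."""
    for element in elements:
        if element.contains(x, y):
            return element
    return None


if __name__ == "__main__":
    keys = [VirtualElement("A", 0, 0, 50, 50), VirtualElement("B", 50, 0, 100, 50)]
    print(distance_from_magnitude(60.0))   # roughly 10 mm above the screen
    hit = element_under_hover(keys, 72.0, 20.0)
    print(hit.name if hit else None)       # hover over the right area selects "B"
```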
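
Publication 20140267094 describes performing an action in response to a detected gesture of various kinds (hover, gaze, look-and-blink, voice, touch, etc.). The small dispatcher below is one way to sketch that gesture-to-action mapping; the registry pattern and gesture names are assumptions, not the claimed implementation.

```python
"""Illustrative sketch of gesture-to-action dispatch (cf. publication
20140267094): a detected gesture triggers a registered action. Hypothetical."""

from typing import Callable, Dict


class GestureActionDispatcher:
    """Maps gesture names to actions and performs the action when detected."""

    def __init__(self) -> None:
        self._actions: Dict[str, Callable[[], None]] = {}

    def register(self, gesture: str, action: Callable[[], None]) -> None:
        self._actions[gesture] = action

    def on_gesture_detected(self, gesture: str) -> None:
        action = self._actions.get(gesture)
        if action is not None:
            action()


if __name__ == "__main__":
    dispatcher = GestureActionDispatcher()
    dispatcher.register("hover", lambda: print("open quick-actions menu"))
    dispatcher.register("look_and_blink", lambda: print("answer incoming call"))
    dispatcher.on_gesture_detected("hover")
```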