Patents by Inventor Lynn Dai
Lynn Dai has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10775997
Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
Type: Grant
Filed: April 25, 2017
Date of Patent: September 15, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
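The hover-gesture mechanism described above can be illustrated with a minimal Python sketch. All names and the sensing threshold here are hypothetical, not taken from the patent: a finger that is proximate to, but not touching, the screen counts as a hover, and a detected hover causes the control interface to be presented.

```python
# Hypothetical capacitive-sensing range within which a hover is registered.
HOVER_THRESHOLD_MM = 20.0

def detect_hover(finger_height_mm, is_touching):
    """Report a hover when the finger is near, but not touching, the screen."""
    return (not is_touching) and finger_height_mm <= HOVER_THRESHOLD_MM

def update_control_interface(finger_height_mm, is_touching, interface_visible):
    """Present the control interface in response to a detected hover gesture."""
    if detect_hover(finger_height_mm, is_touching):
        return True  # show the control interface
    return interface_visible

print(update_control_interface(12.0, False, False))  # hover detected: True
print(update_control_interface(55.0, False, False))  # finger too far: False
```

A real implementation would derive the finger height from capacitive sensor readings rather than receive it directly; the claims also cover the inverse case, where an *absence* of motion triggers presentation.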
-
Patent number: 10521105
Abstract: Example apparatus and methods concern establishing, managing, or dis-establishing a primary hover-point for a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may identify where the object is located, how the object is moving, what the object is doing, or other attributes of the object. The apparatus may assign a hover point designation to the object as a function of the characterization data. The apparatus selectively controls input actions associated with the object based on the hover point designation. The apparatus may accept input actions associated with a primary hover point and ignore actions associated with a non-primary hover point.
Type: Grant
Filed: June 14, 2018
Date of Patent: December 31, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Lynn Dai, Dan Hwang
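The primary hover-point scheme can be sketched as follows. This is a simplified assumption-laden illustration, not the patented method: here the first object characterized in the hover-space becomes the primary hover point, and only its input actions are accepted.

```python
class HoverPointManager:
    """Track hover points; only the primary one generates input actions."""

    def __init__(self):
        self.primary_id = None

    def characterize(self, object_id, position, velocity):
        """Assign a hover point designation as a function of characterization
        data. Simplifying assumption: the first object seen becomes primary."""
        if self.primary_id is None:
            self.primary_id = object_id
        return "primary" if object_id == self.primary_id else "non-primary"

    def handle_action(self, object_id, action):
        """Accept actions from the primary hover point; ignore all others."""
        if object_id == self.primary_id:
            return action  # accepted input action
        return None        # ignored non-primary action

mgr = HoverPointManager()
mgr.characterize(1, position=(10, 20, 5), velocity=(0, 0, 0))  # primary
mgr.characterize(2, position=(80, 40, 8), velocity=(1, 0, 0))  # non-primary
print(mgr.handle_action(1, "tap"))  # tap
print(mgr.handle_action(2, "tap"))  # None
```

The abstract's characterization data (location, movement, activity) would in practice feed a richer designation policy than first-come-first-served, and the primary designation could also be dis-established when the object leaves the hover-space.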
-
Publication number: 20190138178
Abstract: Example apparatus and methods concern establishing, managing, or dis-establishing a primary hover-point for a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may identify where the object is located, how the object is moving, what the object is doing, or other attributes of the object. The apparatus may assign a hover point designation to the object as a function of the characterization data. The apparatus selectively controls input actions associated with the object based on the hover point designation. The apparatus may accept input actions associated with a primary hover point and ignore actions associated with a non-primary hover point.
Type: Application
Filed: June 14, 2018
Publication date: May 9, 2019
Inventors: Lynn Dai, Dan Hwang
-
Patent number: 10120568
Abstract: Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state.
Type: Grant
Filed: October 8, 2015
Date of Patent: November 6, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Lynn Dai, Dan Hwang, Bo-June Hsu, Raymond Quan, Eric Badger, Jose Rodriguez, Peter Gregory Davis
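The idea of driving user-interface elements from location-independent characterization data plus interface state can be sketched as a small decision function. The attribute names, thresholds, and states below are hypothetical illustrations, not drawn from the patent.

```python
def control_ui(characterization, interface_state):
    """Choose a UI-element action from location-independent characterization
    data (object type, dwell time) and the current interface state."""
    obj = characterization.get("object")
    dwell_ms = characterization.get("dwell_ms", 0)
    if interface_state == "hidden" and obj == "finger" and dwell_ms > 300:
        return "activate"    # display the element
    if interface_state == "displayed" and obj is None:
        return "deactivate"  # object left the hover-space: hide the element
    return "no-op"

print(control_ui({"object": "finger", "dwell_ms": 400}, "hidden"))  # activate
print(control_ui({"object": None}, "displayed"))                    # deactivate
```

Note that neither input depends on *where* in the hover-space the object is, matching the abstract's point that the characterization data may be position-independent.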
-
Patent number: 10025489
Abstract: Example apparatus and methods concern establishing, managing, or dis-establishing a primary hover-point for a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may identify where the object is located, how the object is moving, what the object is doing, or other attributes of the object. The apparatus may assign a hover point designation to the object as a function of the characterization data. The apparatus selectively controls input actions associated with the object based on the hover point designation. The apparatus may accept input actions associated with a primary hover point and ignore actions associated with a non-primary hover point.
Type: Grant
Filed: September 16, 2013
Date of Patent: July 17, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Lynn Dai, Dan Hwang
-
Publication number: 20170228150
Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
Type: Application
Filed: April 25, 2017
Publication date: August 10, 2017
Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
-
Patent number: 9645651
Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
Type: Grant
Filed: September 24, 2013
Date of Patent: May 9, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
-
Patent number: 9501218
Abstract: Techniques are described herein that are capable of increasing touch and/or hover accuracy on a touch-enabled device. For example, attribute(s) of a hand or a portion thereof (e.g., one or more fingers) may be used to determine a location on a touch screen to which a user intends to point. Such attribute(s) may be derived, measured, etc. For instance, a value corresponding to a distance between the hand/portion and the touch screen may be derived from a magnitude of a measurement of an interaction between the hand/portion and the touch screen. In another example, virtual elements displayed on the touch screen may be mapped to respective areas in a plane that is parallel (e.g., coincident) with the touch screen. In accordance with this example, receiving a touch and/or hover command with regard to an area in the plane may indicate selection of the corresponding virtual element.
Type: Grant
Filed: January 10, 2014
Date of Patent: November 22, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan
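Two ideas from this abstract can be sketched briefly: deriving a distance value from the magnitude of a measured interaction, and mapping virtual elements to areas in a plane parallel to the screen so a hover command over an area selects the element. The inverse-magnitude model, element names, and coordinates are all hypothetical.

```python
def distance_from_magnitude(magnitude, k=100.0):
    """Derive a distance value from a measured interaction magnitude,
    assuming (hypothetically) that signal strength falls off as 1/distance."""
    return k / magnitude if magnitude > 0 else float("inf")

# Map each virtual element to a rectangle (x0, y0, x1, y1) in a plane
# parallel (here, coincident) with the touch screen.
ELEMENT_AREAS = {
    "play":  (0, 0, 50, 50),
    "pause": (60, 0, 110, 50),
}

def select_element(x, y):
    """A touch/hover command at (x, y) selects the element whose area it hits."""
    for name, (x0, y0, x1, y1) in ELEMENT_AREAS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

print(select_element(25, 25))  # play
print(distance_from_magnitude(50.0))  # 2.0
```

The accuracy gain claimed in the abstract would come from shifting or enlarging these areas based on the derived hand attributes, which this sketch does not attempt.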
-
Patent number: 9262012
Abstract: Example apparatus and methods concern detecting an angle at which an object is interacting with a hover-sensitive input/output interface. An example apparatus may include a proximity detector configured to detect an object in a hover-space associated with the hover-sensitive input/output interface. The proximity detector may provide three dimensional position information for the object (e.g., x,y,z). The angle may be determined from a first (x,y,z) measurement associated with a first portion (e.g., tip) of the object and a second (x,y,z) measurement associated with a second portion (e.g., end) of the object. The position of the object may determine a hover point on the interface while the position and angle may determine an intersection point on the interface. User interface elements or other information displayed on the interface may be manipulated based, at least in part, on the intersection point. Multiple objects interacting at multiple angles may be detected and responded to.
Type: Grant
Filed: January 3, 2014
Date of Patent: February 16, 2016
Inventors: Dan Hwang, Lynn Dai, Muhammad Usman
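The geometry described above — an angle derived from two (x, y, z) measurements and an intersection point found by extending the object's axis to the screen plane — can be sketched directly. The coordinate convention (screen at z = 0, z increasing away from it) is an assumption for illustration.

```python
import math

def hover_angle_and_intersection(tip, end):
    """Given (x, y, z) measurements for the tip and end of a hovering object,
    return its angle to the screen plane (degrees) and the point where the
    object's axis intersects the screen (z = 0)."""
    dx, dy, dz = (t - e for t, e in zip(tip, end))
    # Angle between the object's axis and the screen plane.
    angle = math.degrees(math.atan2(abs(dz), math.hypot(dx, dy)))
    if dz == 0:
        return angle, None  # axis parallel to the screen: no intersection
    # Extend the axis from the tip until z reaches 0.
    t = -tip[2] / dz
    intersection = (tip[0] + t * dx, tip[1] + t * dy)
    return angle, intersection

# A stylus tilted 45 degrees whose tip hovers 5 units above the screen.
angle, point = hover_angle_and_intersection(tip=(10, 10, 5), end=(10, 20, 15))
print(angle, point)  # 45.0 (10.0, 5.0)
```

Because position alone gives the hover point but position plus angle gives the intersection point, a tilted stylus can target a screen location offset from the point directly beneath its tip, which is what lets the interface respond to the intersection point rather than the hover point.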
-
Publication number: 20160026385
Abstract: Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state.
Type: Application
Filed: October 8, 2015
Publication date: January 28, 2016
Inventors: Lynn Dai, Dan Hwang, Bo-June Hsu, Raymond Quan, Eric Badger, Jose Rodriguez, Peter Gregory Davis
-
Patent number: 9170736
Abstract: Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state.
Type: Grant
Filed: September 16, 2013
Date of Patent: October 27, 2015
Inventors: Lynn Dai, Dan Hwang, Bo-June Hsu, Raymond Quan, Eric Badger, Jose Rodriguez, Peter Gregory Davis
-
Publication number: 20150231491
Abstract: Example apparatus and methods provide a virtual control for a video game played on a hover-sensitive device. A method may establish a hover point for an object located in a hover space produced by the device. The hover point may be associated with a three dimensional virtual joystick that may or may not be displayed. The virtual joystick processes inputs from the three dimensional hover space. The inputs have a z component. The z component may characterize, for example, a distance between the object and the device, or a rate at which the object is approaching or moving away from the device. The video game may be controlled in a third dimension based on the z component. For example, a character may crouch down or stand up based on the z component, or the area of a spell may be expanded or contracted based on the z component.
Type: Application
Filed: February 19, 2014
Publication date: August 20, 2015
Applicant: Microsoft Corporation
Inventors: Dan Hwang, Lynn Dai
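The z-component joystick idea — using hover distance and approach rate as a third control axis — can be sketched as a tiny mapping function. The threshold, sign convention (negative rate means approaching), and game actions are hypothetical, chosen to mirror the crouch/stand example in the abstract.

```python
def z_axis_input(z_mm, z_rate_mm_s, crouch_threshold=30.0):
    """Map the hover z component to a third control dimension: a near,
    approaching finger crouches the character; a receding finger stands
    it back up; otherwise the current pose is held."""
    if z_mm < crouch_threshold and z_rate_mm_s < 0:
        return "crouch"
    if z_rate_mm_s > 0:
        return "stand"
    return "hold"

print(z_axis_input(20.0, -5.0))  # crouch
print(z_axis_input(40.0, 8.0))   # stand
```

The same z signal could just as well drive a continuous quantity, such as the spell-area radius mentioned in the abstract, instead of a discrete pose.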
-
Publication number: 20150234468
Abstract: Example apparatus and methods support interactions between a hover-sensitive apparatus and other apparatus. A hover action performed in the hover space of one apparatus can control that apparatus or another apparatus. The interactions may depend on the positions of the apparatus. For example, a user may virtually pick up an item on a first hover-sensitive apparatus and virtually toss it to another apparatus using a hover gesture. A directional gesture may selectively send content to a target apparatus while a directionless gesture may send content to a distribution list or to any apparatus in range. A shared display may be produced for multiple interconnected devices and coordinated information may be presented on the shared display. For example, a chessboard that spans two smartphones may be displayed and a hover gesture may virtually lift a chess piece from one of the displays and deposit it on another of the displays.
Type: Application
Filed: February 19, 2014
Publication date: August 20, 2015
Applicant: Microsoft Corporation
Inventors: Dan Hwang, Lynn Dai
-
Publication number: 20150199101
Abstract: Techniques are described herein that are capable of increasing touch and/or hover accuracy on a touch-enabled device. For example, attribute(s) of a hand or a portion thereof (e.g., one or more fingers) may be used to determine a location on a touch screen to which a user intends to point. Such attribute(s) may be derived, measured, etc. For instance, a value corresponding to a distance between the hand/portion and the touch screen may be derived from a magnitude of a measurement of an interaction between the hand/portion and the touch screen. In another example, virtual elements displayed on the touch screen may be mapped to respective areas in a plane that is parallel (e.g., coincident) with the touch screen. In accordance with this example, receiving a touch and/or hover command with regard to an area in the plane may indicate selection of the corresponding virtual element.
Type: Application
Filed: January 10, 2014
Publication date: July 16, 2015
Applicant: Microsoft Corporation
Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan
-
Publication number: 20150193040
Abstract: Example apparatus and methods concern detecting an angle at which an object is interacting with a hover-sensitive input/output interface. An example apparatus may include a proximity detector configured to detect an object in a hover-space associated with the hover-sensitive input/output interface. The proximity detector may provide three dimensional position information for the object (e.g., x,y,z). The angle may be determined from a first (x,y,z) measurement associated with a first portion (e.g., tip) of the object and a second (x,y,z) measurement associated with a second portion (e.g., end) of the object. The position of the object may determine a hover point on the interface while the position and angle may determine an intersection point on the interface. User interface elements or other information displayed on the interface may be manipulated based, at least in part, on the intersection point. Multiple objects interacting at multiple angles may be detected and responded to.
Type: Application
Filed: January 3, 2014
Publication date: July 9, 2015
Applicant: Microsoft Corporation
Inventors: Dan Hwang, Lynn Dai, Muhammad Usman
-
Publication number: 20150089419
Abstract: Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection. Alternatively, absence of a motion can be detected and the control interface presented in response to the detection. A hover gesture can occur without a user physically touching a touch screen of a touch-enabled device. Instead, the user's finger or fingers can be positioned at a spaced distance above the touch screen. The touch screen can detect that the user's fingers are proximate to the touch screen, such as through capacitive sensing. Additionally, finger movement can be detected while the fingers are hovering to expand the existing options for gesture input.
Type: Application
Filed: September 24, 2013
Publication date: March 26, 2015
Applicant: Microsoft Corporation
Inventors: Daniel J. Hwang, Juan (Lynn) Dai, Sharath Viswanathan, Joseph B. Tobens, Jose A. Rodriguez, Peter G. Davis
-
Publication number: 20150077338
Abstract: Example apparatus and methods concern establishing, managing, or dis-establishing a primary hover-point for a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may identify where the object is located, how the object is moving, what the object is doing, or other attributes of the object. The apparatus may assign a hover point designation to the object as a function of the characterization data. The apparatus selectively controls input actions associated with the object based on the hover point designation. The apparatus may accept input actions associated with a primary hover point and ignore actions associated with a non-primary hover point.
Type: Application
Filed: September 16, 2013
Publication date: March 19, 2015
Applicant: Microsoft Corporation
Inventors: Lynn Dai, Dan Hwang
-
Publication number: 20150082216
Abstract: Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state.
Type: Application
Filed: September 16, 2013
Publication date: March 19, 2015
Applicant: Microsoft Corporation
Inventors: Lynn Dai, Dan Hwang, Paul Hsu, Raymond Quan, Eric Badger, Jose Rodriguez, Peter Gregory Davis
-
Publication number: 20150077345
Abstract: Example apparatus and methods concern processing simultaneous touch and hover actions for a touch-sensitive and hover-sensitive input/output (i/o) interface. One example apparatus includes a touch detector that detects an object that touches the i/o interface. The example apparatus includes a proximity detector that detects an object in a hover-space associated with the i/o interface. The apparatus produces characterization data concerning the touch action and the hover action. The proximity detector and the touch detector may share a set of capacitive sensing nodes. Example apparatus and methods selectively control input/output actions on the i/o interface based on a combination of the touch action(s) and the hover action(s).
Type: Application
Filed: September 16, 2013
Publication date: March 19, 2015
Applicant: Microsoft Corporation
Inventors: Dan Hwang, Lynn Dai
-
Patent number: 8943092
Abstract: Disclosed herein are representative embodiments of tools and techniques for performing contextual searches using text determined based on digital-ink data. According to one exemplary technique, digital-ink data is received at a computing device and text is determined based on the digital-ink data. Additionally, by an application of the computing device, a contextual search is performed using the text.
Type: Grant
Filed: March 4, 2013
Date of Patent: January 27, 2015
Assignee: Microsoft Corporation
Inventors: Lynn Dai, Daniel J. Hwang, Zafeiria Anagnostopoulou, Benjamin Westbrook, Peter Gregory Davis, Sharath Viswanathan
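The pipeline in this abstract — digital-ink data in, recognized text out, then an application-level contextual search on that text — can be outlined as follows. The recognizer here is a pure stand-in (it just reads a pre-recognized field), since actual handwriting recognition over stroke data is far outside the scope of a sketch; all names are hypothetical.

```python
def recognize_ink(digital_ink):
    """Stand-in for handwriting recognition: return text for ink data.
    A real system would run a recognizer over the stroke coordinates."""
    return digital_ink.get("recognized", "")

def contextual_search(digital_ink, search_fn):
    """Determine text from digital-ink data, then have the application
    perform a contextual search using that text."""
    text = recognize_ink(digital_ink)
    return search_fn(text) if text else []

results = contextual_search({"strokes": [(0, 0), (5, 9)], "recognized": "coffee shops"},
                            lambda query: [f"result for {query!r}"])
print(results)
```

The "contextual" aspect in the patent comes from the search being performed by an application on the device, so `search_fn` stands in for whatever application-specific search the device would invoke.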