Patents by Inventor Dan Hwang
Dan Hwang has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10521105
Abstract: Example apparatus and methods concern establishing, managing, or dis-establishing a primary hover-point for a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may identify where the object is located, how the object is moving, what the object is doing, or other attributes of the object. The apparatus may assign a hover point designation to the object as a function of the characterization data. The apparatus selectively controls input actions associated with the object based on the hover point designation. The apparatus may accept input actions associated with a primary hover point and ignore actions associated with a non-primary hover point.
Type: Grant
Filed: June 14, 2018
Date of Patent: December 31, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Lynn Dai, Dan Hwang
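The control flow described in this abstract (characterize detected objects, designate one as the primary hover point, then accept input only from that point) can be illustrated with a short sketch. The TypeScript below is a minimal illustration under assumed types and a closest-and-steadiest heuristic; it is not the patented method.

```typescript
// Hypothetical characterization data for an object detected in the hover-space.
interface HoverObject {
  id: number;
  x: number;      // position over the interface, in pixels
  y: number;
  z: number;      // distance from the screen, in millimeters
  speed: number;  // how fast the object is moving, in mm/s
}

interface HoverInput {
  objectId: number;
  action: string; // e.g. "select", "scroll"
}

// Assign the "primary" designation to the object closest to the screen
// that is moving slowly enough to be considered intentional.
function designatePrimary(objects: HoverObject[]): number | null {
  const candidates = objects.filter(o => o.speed < 50); // assumed threshold
  if (candidates.length === 0) return null;
  candidates.sort((a, b) => a.z - b.z);
  return candidates[0].id;
}

// Accept input actions only from the primary hover point; ignore the rest.
function filterInputs(inputs: HoverInput[], primaryId: number | null): HoverInput[] {
  return inputs.filter(i => i.objectId === primaryId);
}

// Example: two fingers hover, but only the closer, steadier one drives input.
const objects: HoverObject[] = [
  { id: 1, x: 120, y: 300, z: 8, speed: 10 },
  { id: 2, x: 240, y: 310, z: 20, speed: 120 },
];
const primary = designatePrimary(objects);
console.log(filterInputs(
  [{ objectId: 1, action: "select" }, { objectId: 2, action: "scroll" }],
  primary,
));
```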
-
Publication number: 20190138178
Abstract: Example apparatus and methods concern establishing, managing, or dis-establishing a primary hover-point for a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may identify where the object is located, how the object is moving, what the object is doing, or other attributes of the object. The apparatus may assign a hover point designation to the object as a function of the characterization data. The apparatus selectively controls input actions associated with the object based on the hover point designation. The apparatus may accept input actions associated with a primary hover point and ignore actions associated with a non-primary hover point.
Type: Application
Filed: June 14, 2018
Publication date: May 9, 2019
Inventors: Lynn Dai, Dan Hwang
-
Patent number: 10120568
Abstract: Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state.
Type: Grant
Filed: October 8, 2015
Date of Patent: November 6, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Lynn Dai, Dan Hwang, Bo-June Hsu, Raymond Quan, Eric Badger, Jose Rodriguez, Peter Gregory Davis
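As a rough illustration of how UI elements might be activated, displayed, and deactivated as a function of characterization data and interface state, the sketch below uses assumed states, object kinds, and rules; none of these names come from the patent.

```typescript
type InterfaceState = "idle" | "videoPlaying" | "textEntry";

// Characterization data that is independent of where the object is located,
// e.g. whether it looks like a finger or a stylus (illustrative assumption).
interface Characterization {
  objectKind: "finger" | "stylus" | "unknown";
  approaching: boolean; // moving toward the screen
}

// Decide which UI elements to show for the current state and hover characterization.
function visibleElements(state: InterfaceState, c: Characterization): string[] {
  if (state === "videoPlaying" && c.approaching) {
    // Reveal transport controls only when something approaches the screen.
    return ["playPause", "seekBar", "volume"];
  }
  if (state === "textEntry" && c.objectKind === "stylus") {
    return ["handwritingPanel"];
  }
  return []; // otherwise keep the interface uncluttered
}

console.log(visibleElements("videoPlaying", { objectKind: "finger", approaching: true }));
```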
-
Patent number: 10025489
Abstract: Example apparatus and methods concern establishing, managing, or dis-establishing a primary hover-point for a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may identify where the object is located, how the object is moving, what the object is doing, or other attributes of the object. The apparatus may assign a hover point designation to the object as a function of the characterization data. The apparatus selectively controls input actions associated with the object based on the hover point designation. The apparatus may accept input actions associated with a primary hover point and ignore actions associated with a non-primary hover point.
Type: Grant
Filed: September 16, 2013
Date of Patent: July 17, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Lynn Dai, Dan Hwang
-
Patent number: 9262012
Abstract: Example apparatus and methods concern detecting an angle at which an object is interacting with a hover-sensitive input/output interface. An example apparatus may include a proximity detector configured to detect an object in a hover-space associated with the hover-sensitive input/output interface. The proximity detector may provide three dimensional position information for the object (e.g., x,y,z). The angle may be determined from a first (x,y,z) measurement associated with a first portion (e.g., tip) of the object and a second (x,y,z) measurement associated with a second portion (e.g., end) of the object. The position of the object may determine a hover point on the interface while the position and angle may determine an intersection point on the interface. User interface elements or other information displayed on the interface may be manipulated based, at least in part, on the intersection point. Multiple objects interacting at multiple angles may be detected and responded to.
Type: Grant
Filed: January 3, 2014
Date of Patent: February 16, 2016
Inventors: Dan Hwang, Lynn Dai, Muhammad Usman
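The geometry implied here (derive an angle and an on-screen intersection point from a tip measurement and an end measurement) is ordinary vector math. The sketch below projects the line through the two (x, y, z) samples onto the screen plane z = 0 and reports the tilt from the screen normal; the units and example values are assumptions.

```typescript
interface Point3D { x: number; y: number; z: number; }

// Given a measurement at the tip and at the end of the object (e.g. a stylus),
// project the line through them onto the screen plane z = 0 to find where the
// object is "pointing", and report its tilt from vertical.
// Assumes the two samples differ in z (the object is not lying flat).
function intersectionAndAngle(tip: Point3D, end: Point3D) {
  const dx = tip.x - end.x;
  const dy = tip.y - end.y;
  const dz = tip.z - end.z; // negative when the tip is closer to the screen

  // Parametric line: P(t) = tip + t * (tip - end); solve for z = 0.
  const t = -tip.z / dz;
  const intersection = { x: tip.x + t * dx, y: tip.y + t * dy };

  // Tilt angle between the object's axis and the screen normal, in degrees.
  const axisLength = Math.hypot(dx, dy, dz);
  const tiltDeg = (Math.acos(Math.abs(dz) / axisLength) * 180) / Math.PI;

  return { intersection, tiltDeg };
}

// Example: a stylus tip hovering 10 mm above (100, 100), leaning to the right.
console.log(intersectionAndAngle({ x: 100, y: 100, z: 10 }, { x: 140, y: 100, z: 90 }));
```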
-
Publication number: 20160026385
Abstract: Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state.
Type: Application
Filed: October 8, 2015
Publication date: January 28, 2016
Inventors: Lynn Dai, Dan Hwang, Bo-June Hsu, Raymond Quan, Eric Badger, Jose Rodriguez, Peter Gregory Davis
-
Patent number: 9170736
Abstract: Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state.
Type: Grant
Filed: September 16, 2013
Date of Patent: October 27, 2015
Inventors: Lynn Dai, Dan Hwang, Bo-June Hsu, Raymond Quan, Eric Badger, Jose Rodriguez, Peter Gregory Davis
-
Publication number: 20150231491
Abstract: Example apparatus and methods provide a virtual control for a video game played on a hover-sensitive device. A method may establish a hover point for an object located in a hover space produced by the device. The hover point may be associated with a three dimensional virtual joystick that may or may not be displayed. The virtual joystick processes inputs from the three dimensional hover space. The inputs have a z component. The z component may characterize, for example, a distance between the object and the device, or a rate at which the object is approaching or moving away from the device. The video game may be controlled in a third dimension based on the z component. For example, a character may crouch down or stand up based on the z component, or the area of a spell may be expanded or contracted based on the z component.
Type: Application
Filed: February 19, 2014
Publication date: August 20, 2015
Applicant: Microsoft Corporation
Inventors: Dan Hwang, Lynn Dai
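A minimal sketch of the z-component idea: map the hover height above the screen into a normalized third-axis control value that a game could use for crouch height or spell radius. The 5–50 mm working range and the field names are assumptions for illustration.

```typescript
// Illustrative hover sample: the z value is the object's height above the screen.
interface HoverSample { x: number; y: number; z: number; } // z in millimeters

// Map the hover height into a normalized 0..1 control value for the game's
// third axis. The 5..50 mm working range is an assumption for this sketch.
function zAxisControl(sample: HoverSample, minZ = 5, maxZ = 50): number {
  const clamped = Math.min(Math.max(sample.z, minZ), maxZ);
  return (clamped - minZ) / (maxZ - minZ);
}

// Example uses: character height between crouching and standing,
// or the radius of a spell's area of effect.
const control = zAxisControl({ x: 160, y: 420, z: 20 });
console.log(`character height: ${control.toFixed(2)}, spell radius: ${(10 + 40 * control).toFixed(1)} px`);
```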
-
Publication number: 20150234468
Abstract: Example apparatus and methods support interactions between a hover-sensitive apparatus and other apparatus. A hover action performed in the hover space of one apparatus can control that apparatus or another apparatus. The interactions may depend on the positions of the apparatus. For example, a user may virtually pick up an item on a first hover-sensitive apparatus and virtually toss it to another apparatus using a hover gesture. A directional gesture may selectively send content to a target apparatus while a directionless gesture may send content to a distribution list or to any apparatus in range. A shared display may be produced for multiple interconnected devices and coordinated information may be presented on the shared display. For example, a chessboard that spans two smartphones may be displayed and a hover gesture may virtually lift a chess piece from one of the displays and deposit it on another of the displays.
Type: Application
Filed: February 19, 2014
Publication date: August 20, 2015
Applicant: Microsoft Corporation
Inventors: Dan Hwang, Lynn Dai
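One way to picture the directional-versus-directionless distinction is below: a directional toss picks the in-range device whose bearing best matches the gesture, while a directionless gesture sends to every device in range. The device bearings and the 30-degree tolerance are invented for this sketch.

```typescript
interface Device { name: string; bearingDeg: number; } // direction from the sending device

interface Gesture {
  directional: boolean;
  bearingDeg?: number; // direction of the "toss", when the gesture has one
}

// Pick the recipients for a piece of content based on the gesture.
function chooseTargets(gesture: Gesture, inRange: Device[]): Device[] {
  if (!gesture.directional || gesture.bearingDeg === undefined) {
    return inRange; // directionless: send to every device in range
  }
  const toss = gesture.bearingDeg;
  // Directional: send to the single device whose bearing is within an
  // assumed 30-degree tolerance of the toss direction.
  const matches = inRange.filter(
    d => Math.abs(((d.bearingDeg - toss + 540) % 360) - 180) < 30
  );
  return matches.slice(0, 1);
}

const devices: Device[] = [{ name: "tv", bearingDeg: 10 }, { name: "tablet", bearingDeg: 200 }];
console.log(chooseTargets({ directional: true, bearingDeg: 15 }, devices)); // -> [tv]
console.log(chooseTargets({ directional: false }, devices));                // -> both devices
```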
-
Publication number: 20150205400
Abstract: Example apparatus and methods detect how a portable (e.g., handheld) device (e.g., phone, tablet) is gripped (e.g., held, supported). Detecting the grip may include detecting and characterizing touch points for fingers, thumbs, palms, or surfaces that are involved in supporting and positioning the apparatus. Example apparatus and methods may determine whether and how an apparatus is being held and then may exercise control based on the grip detection. For example, a display on an input/output interface may be reconfigured, physical controls (e.g., push buttons) on the apparatus may be remapped, user interface elements may be repositioned, resized, or repurposed, portions of the input/output interface may be desensitized or hyper-sensitized, virtual controls may be remapped, or other actions may be taken. Touch sensors may detect the pressure with which a smart phone is being gripped and produce control events (e.g., on/off, louder/quieter, brighter/dimmer, press and hold) based on the pressure.
Type: Application
Filed: January 21, 2014
Publication date: July 23, 2015
Applicant: Microsoft Corporation
Inventors: Dan Hwang, Muhammad Usman, Scott Greenlay, Moshe Sapir
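A simplified sketch of grip detection and the control remapping it could drive: classify the grip from edge touch points, pick a layout for that grip, and treat a high-pressure squeeze as a control event. The sensor model, thresholds, and layout names are assumptions, not the patented implementation.

```typescript
// Simplified touch-point report from edge sensors (illustrative).
interface EdgeTouch { side: "left" | "right" | "back"; pressure: number; } // pressure 0..1

type Grip = "leftHand" | "rightHand" | "twoHands" | "unknown";

// Guess the grip from where the supporting touches are: a thumb on one side
// usually means fingers wrap around the opposite side.
function detectGrip(touches: EdgeTouch[]): Grip {
  const left = touches.filter(t => t.side === "left").length;
  const right = touches.filter(t => t.side === "right").length;
  if (left >= 3 && right >= 3) return "twoHands";
  if (right >= 3) return "leftHand"; // fingers on the right edge -> held in the left hand
  if (left >= 3) return "rightHand";
  return "unknown";
}

// Remap the UI layout based on the detected grip, e.g. move primary buttons
// under the thumb of the holding hand.
function layoutFor(grip: Grip): string {
  switch (grip) {
    case "leftHand": return "controls-left";
    case "rightHand": return "controls-right";
    case "twoHands": return "controls-split";
    default: return "controls-centered";
  }
}

// A firm squeeze (high average pressure) could also raise a control event.
function isSqueeze(touches: EdgeTouch[]): boolean {
  const avg = touches.reduce((s, t) => s + t.pressure, 0) / Math.max(touches.length, 1);
  return avg > 0.8; // assumed threshold
}

const touches: EdgeTouch[] = [
  { side: "right", pressure: 0.4 }, { side: "right", pressure: 0.5 },
  { side: "right", pressure: 0.6 }, { side: "left", pressure: 0.3 },
];
console.log(layoutFor(detectGrip(touches)), isSqueeze(touches));
```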
-
Publication number: 20150199030
Abstract: Example apparatus and methods concern a first device (e.g., phone, tablet) having a touch and hover-sensitive display. The first device may detect a second device (e.g., television, monitor) that has a second display. After establishing a communication link and a context between the first and second device, the first device may provide a first output (e.g., movie, game) to be displayed on the second device. In response to identifying a hover point produced in a hover space associated with the first device, the first device may provide a second output (e.g., user interface element, cursor) for display on the second display. The second output may be based on the context and on a hover action associated with the hover point. The user may then cause a control event to be generated by interacting with the second display using the second output in relation to the cursor.
Type: Application
Filed: January 10, 2014
Publication date: July 16, 2015
Applicant: Microsoft Corporation
Inventors: Petteri Mikkola, Dan Hwang
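The two outputs described here can be sketched as a content stream plus a cursor overlay whose position is the hover point mapped from the phone's coordinates to the second display's resolution. The link interface below is hypothetical.

```typescript
interface Size { width: number; height: number; }
interface HoverPoint { x: number; y: number; } // coordinates on the phone's screen

// Map a hover point on the phone to a cursor position on the connected display.
function mapToSecondDisplay(p: HoverPoint, phone: Size, display: Size): HoverPoint {
  return {
    x: (p.x / phone.width) * display.width,
    y: (p.y / phone.height) * display.height,
  };
}

// Hypothetical link to the second device: the first output is the content
// stream, the second output is the cursor overlay derived from the hover point.
interface SecondScreenLink {
  sendContentFrame(frame: string): void;
  sendCursor(pos: HoverPoint | null): void;
}

function onHoverChanged(point: HoverPoint | null, link: SecondScreenLink,
                        phone: Size, display: Size): void {
  link.sendCursor(point ? mapToSecondDisplay(point, phone, display) : null);
}

// Example with a stub link.
const link: SecondScreenLink = {
  sendContentFrame: f => console.log("frame:", f),
  sendCursor: pos => console.log("cursor:", pos),
};
link.sendContentFrame("movie frame 0");
onHoverChanged({ x: 540, y: 960 }, link, { width: 1080, height: 1920 }, { width: 3840, height: 2160 });
```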
-
Publication number: 20150193040
Abstract: Example apparatus and methods concern detecting an angle at which an object is interacting with a hover-sensitive input/output interface. An example apparatus may include a proximity detector configured to detect an object in a hover-space associated with the hover-sensitive input/output interface. The proximity detector may provide three dimensional position information for the object (e.g., x,y,z). The angle may be determined from a first (x,y,z) measurement associated with a first portion (e.g., tip) of the object and a second (x,y,z) measurement associated with a second portion (e.g., end) of the object. The position of the object may determine a hover point on the interface while the position and angle may determine an intersection point on the interface. User interface elements or other information displayed on the interface may be manipulated based, at least in part, on the intersection point. Multiple objects interacting at multiple angles may be detected and responded to.
Type: Application
Filed: January 3, 2014
Publication date: July 9, 2015
Applicant: Microsoft Corporation
Inventors: Dan Hwang, Lynn Dai, Muhammad Usman
-
Publication number: 20150177866
Abstract: Example apparatus and methods concern detecting and responding to a multiple hover point gesture performed for a hover-sensitive device. An example apparatus may include a hover-sensitive input/output interface configured to detect multiple objects in a hover-space associated with the hover-sensitive input/output interface. The apparatus may include logics configured to identify an object in the hover-space, to characterize an object in the hover-space, to track an object in the hover-space, to identify a multiple hover point gesture based on the identification, characterization, and tracking, and to control a device, application, interface, or object based on the multiple hover point gesture. In different embodiments, multiple hover point gestures may be performed in one, two, three, or four dimensions. In one embodiment, the apparatus may be event driven with respect to handling gestures.
Type: Application
Filed: December 23, 2013
Publication date: June 25, 2015
Inventors: Dan Hwang, Scott Greenlay, Christopher Fellowes, Bob Schriver
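Below is a small, event-driven sketch of the identify/characterize/track/classify pipeline for a two-point hover gesture (pinch versus spread), with assumed frame and event names and pixel thresholds.

```typescript
interface TrackedPoint { id: number; x: number; y: number; z: number; }

// Snapshot of all hover points at one instant.
type Frame = TrackedPoint[];

// Classify a simple two-point gesture by comparing the distance between the
// points at the start and end of the tracked motion.
function classifyTwoPointGesture(first: Frame, last: Frame): "spread" | "pinch" | "none" {
  if (first.length !== 2 || last.length !== 2) return "none";
  const dist = (f: Frame) => Math.hypot(f[0].x - f[1].x, f[0].y - f[1].y);
  const delta = dist(last) - dist(first);
  if (delta > 40) return "spread"; // assumed pixel thresholds
  if (delta < -40) return "pinch";
  return "none";
}

// Event-driven use: keep the first frame when tracking starts, classify on end.
let startFrame: Frame | null = null;
function onHoverFrame(frame: Frame, tracking: "start" | "move" | "end"): void {
  if (tracking === "start") startFrame = frame;
  if (tracking === "end" && startFrame) {
    console.log("gesture:", classifyTwoPointGesture(startFrame, frame));
    startFrame = null;
  }
}

onHoverFrame([{ id: 1, x: 100, y: 100, z: 10 }, { id: 2, x: 140, y: 100, z: 10 }], "start");
onHoverFrame([{ id: 1, x: 60, y: 100, z: 10 }, { id: 2, x: 180, y: 100, z: 10 }], "end");
```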
-
Publication number: 20150160819
Abstract: Example apparatus and methods concern detecting and responding to a crane gesture performed for a touch or hover-sensitive device. An example apparatus may include a hover-sensitive input/output interface configured to display an object that can be manipulated using a crane gesture. The apparatus may include a proximity detector configured to detect an object in a hover-space associated with the hover-sensitive input/output interface. The apparatus may include logics configured to change a state of the object from untouched to target to pinched to lifted to released in response to detecting the appearance and movement of bracket points. The appearance of the object may change in response to detecting the state changes.
Type: Application
Filed: December 6, 2013
Publication date: June 11, 2015
Inventors: Dan Hwang, Scott Greenlay, Christopher Fellows, Thamer Abanami, Jose Rodriguez, Joe Tobens
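The state progression named in the abstract (untouched, target, pinched, lifted, released) can be modeled as a small state machine. The driving events below (bracket points closing, rising, opening) are assumptions chosen to match the description.

```typescript
// States named in the abstract; the events are assumptions for this sketch.
type CraneState = "untouched" | "target" | "pinched" | "lifted" | "released";
type CraneEvent = "hoverOver" | "bracketsClose" | "bracketsRise" | "bracketsOpen" | "hoverAway";

const transitions: Record<CraneState, Partial<Record<CraneEvent, CraneState>>> = {
  untouched: { hoverOver: "target" },
  target:    { bracketsClose: "pinched", hoverAway: "untouched" },
  pinched:   { bracketsRise: "lifted", bracketsOpen: "target" },
  lifted:    { bracketsOpen: "released" },
  released:  { hoverOver: "target" },
};

function step(state: CraneState, event: CraneEvent): CraneState {
  // Stay in the current state if the event is not meaningful there.
  return transitions[state][event] ?? state;
}

// Example: pick up an on-screen object and drop it somewhere else.
let state: CraneState = "untouched";
for (const e of ["hoverOver", "bracketsClose", "bracketsRise", "bracketsOpen"] as CraneEvent[]) {
  state = step(state, e);
  console.log(e, "->", state);
}
```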
-
Publication number: 20150077345
Abstract: Example apparatus and methods concern processing simultaneous touch and hover actions for a touch-sensitive and hover-sensitive input/output (i/o) interface. One example apparatus includes a touch detector that detects an object that touches the i/o interface. The example apparatus includes a proximity detector that detects an object in a hover-space associated with the i/o interface. The apparatus produces characterization data concerning the touch action and the hover action. The proximity detector and the touch detector may share a set of capacitive sensing nodes. Example apparatus and methods selectively control input/output actions on the i/o interface based on a combination of the touch action(s) and the hover action(s).
Type: Application
Filed: September 16, 2013
Publication date: March 19, 2015
Applicant: Microsoft Corporation
Inventors: Dan Hwang, Lynn Dai
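A minimal sketch of combining a concurrent touch action and hover action into one compound input, where the touched element anchors the action and the hover's z value modifies it. The types and the scaling example are assumptions.

```typescript
interface TouchAction { kind: "touch"; x: number; y: number; }
interface HoverAction { kind: "hover"; x: number; y: number; z: number; }

// A compound input produced when a touch and a hover happen together,
// e.g. touch-and-hold with one finger while hovering another to adjust a value.
interface CombinedAction {
  anchor: TouchAction;   // the touched element being acted on
  modifier: HoverAction; // the hovering object modifying the action
}

function combine(touches: TouchAction[], hovers: HoverAction[]): CombinedAction | null {
  if (touches.length === 0 || hovers.length === 0) return null;
  return { anchor: touches[0], modifier: hovers[0] };
}

const combined = combine(
  [{ kind: "touch", x: 100, y: 500 }],
  [{ kind: "hover", x: 300, y: 200, z: 15 }],
);
// The hover height could, for example, scale the item anchored by the touch.
if (combined) console.log("scale factor:", 1 + combined.modifier.z / 100);
```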
-
Publication number: 20150077338
Abstract: Example apparatus and methods concern establishing, managing, or dis-establishing a primary hover-point for a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may identify where the object is located, how the object is moving, what the object is doing, or other attributes of the object. The apparatus may assign a hover point designation to the object as a function of the characterization data. The apparatus selectively controls input actions associated with the object based on the hover point designation. The apparatus may accept input actions associated with a primary hover point and ignore actions associated with a non-primary hover point.
Type: Application
Filed: September 16, 2013
Publication date: March 19, 2015
Applicant: Microsoft Corporation
Inventors: Lynn Dai, Dan Hwang
-
Publication number: 20150082216
Abstract: Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state.
Type: Application
Filed: September 16, 2013
Publication date: March 19, 2015
Applicant: Microsoft Corporation
Inventors: Lynn Dai, Dan Hwang, Paul Hsu, Raymond Quan, Eric Badger, Jose Rodriguez, Peter Gregory Davis
-
Patent number: D833350
Type: Grant
Filed: February 26, 2016
Date of Patent: November 13, 2018
Assignee: DENSO International America, Inc.
Inventors: Heather Windel, Gareth Webb, Ronald Clogg, Feng Xue, Robert Wunsche, III, Dan Hwang, Ronald Woo